Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki
2014-09-01
Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors, and analyzes its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
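The bottom-up workflow the abstract describes (a reaction network converted into a dynamic model and then simulated) can be illustrated with a minimal sketch; the two-step pathway, species names, and rate constants below are invented for illustration and this is not CADLIVE's actual conversion algorithm.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical two-step pathway A -> B -> C with mass-action kinetics;
# the ODE right-hand side is assembled automatically from the reaction list.
reactions = [
    ({"A": 1}, {"B": 1}, 0.8),   # A -> B, k = 0.8
    ({"B": 1}, {"C": 1}, 0.3),   # B -> C, k = 0.3
]
species = ["A", "B", "C"]
index = {s: i for i, s in enumerate(species)}

def rhs(t, x):
    dx = np.zeros_like(x)
    for reactants, products, k in reactions:
        rate = k * np.prod([x[index[s]] ** n for s, n in reactants.items()])
        for s, n in reactants.items():
            dx[index[s]] -= n * rate
        for s, n in products.items():
            dx[index[s]] += n * rate
    return dx

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0])
print(dict(zip(species, sol.y[:, -1])))   # concentrations at t = 20
```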
Automatic network coupling analysis for dynamical systems based on detailed kinetic models.
Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich
2005-10-01
We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
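As a rough sketch of the idea (not the authors' error-controlled algorithm), one can estimate a local sensitivity matrix along a trajectory by finite differences and count its dominant singular values as a guess of the number of locally active modes; the Michaelis-Menten-style model, rate constants, step sizes, and tolerance below are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Estimate the local sensitivity matrix dx(t0+dt)/dx(t0) by finite
# differences, then count dominant singular values.
def mm_rhs(t, x, k1=100.0, km1=50.0, k2=1.0):
    s, c, e, p = x                     # substrate, complex, enzyme, product
    v1 = k1 * s * e - km1 * c          # fast binding equilibrium
    v2 = k2 * c                        # slow product formation
    return [-v1, v1 - v2, -v1 + v2, v2]

def n_active_modes(x0, dt=0.5, eps=1e-6, tol=1e-4):
    kw = dict(rtol=1e-10, atol=1e-12)
    base = solve_ivp(mm_rhs, (0, dt), x0, **kw).y[:, -1]
    S = np.empty((len(x0), len(x0)))
    for j in range(len(x0)):
        xp = np.array(x0, float)
        xp[j] += eps
        S[:, j] = (solve_ivp(mm_rhs, (0, dt), xp, **kw).y[:, -1] - base) / eps
    sv = np.linalg.svd(S, compute_uv=False)
    return int(np.sum(sv / sv[0] > tol)), sv

print(n_active_modes([1.0, 0.0, 0.2, 0.0]))
```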
ERIC Educational Resources Information Center
Ma, Dongmei; Yu, Xiaoru; Zhang, Haomin
2017-01-01
The present study aimed to investigate second language (L2) word-level and sentence-level automatic processing among English as a foreign language students through a comparative analysis of students with different proficiency levels. As a multidimensional and dynamic construct, automaticity is conceptualized as processing speed, stability, and…
2D Automatic body-fitted structured mesh generation using advancing extraction method
USDA-ARS?s Scientific Manuscript database
This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...
Dynamic simulation of train derailments
DOT National Transportation Integrated Search
2006-11-05
This paper describes a planar rigid-body model to examine the gross motions of rail cars in a train derailment. The model is implemented using a commercial software package called ADAMS (Automatic Dynamic Analysis of Mechanical Systems). The results ...
NASA Technical Reports Server (NTRS)
Hsieh, Shang-Hsien
1993-01-01
The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.
Bifurcation analysis of an automatic dynamic balancing mechanism for eccentric rotors
NASA Astrophysics Data System (ADS)
Green, K.; Champneys, A. R.; Lieven, N. J.
2006-04-01
We present a nonlinear bifurcation analysis of the dynamics of an automatic dynamic balancing mechanism for rotating machines. The principle of operation is to deploy two or more masses that are free to travel around a race at a fixed distance from the hub and, subsequently, balance any eccentricity in the rotor. Mathematically, we start from a Lagrangian description of the system. It is then shown how under isotropic conditions a change of coordinates into a rotating frame turns the problem into a regular autonomous dynamical system, amenable to a full nonlinear bifurcation analysis. Using numerical continuation techniques, curves are traced of steady states, limit cycles and their bifurcations as parameters are varied. These results are augmented by simulations of the system trajectories in phase space. Taking the case of a balancer with two free masses, broad trends are revealed on the existence of a stable, dynamically balanced steady-state solution for specific rotation speeds and eccentricities. However, the analysis also reveals other potentially attracting states—non-trivial steady states, limit cycles, and chaotic motion—which are not in balance. The transient effects which lead to these competing states, which in some cases coexist, are investigated.
Gait analysis--precise, rapid, automatic, 3-D position and orientation kinematics and dynamics.
Mann, R W; Antonsson, E K
1983-01-01
A fully automatic optoelectronic photogrammetric technique is presented for measuring the spatial kinematics of human motion (both position and orientation) and estimating the inertial (net) dynamics. Calibration and verification showed that in a two-meter cube viewing volume, the system achieves one millimeter of accuracy and resolution in translation and 20 milliradians in rotation. Since double differentiation of generalized position data to determine accelerations amplifies noise, the frequency domain characteristics of the system were investigated. It was found that the noise and all other errors in the kinematic data contribute less than five percent error to the resulting dynamics.
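Because double differentiation amplifies noise, kinematic data are typically low-pass filtered before accelerations are computed. The sketch below illustrates that step only; the sampling rate, cutoff frequency, and synthetic marker trajectory are assumptions, not the system described above.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Low-pass filter a marker position signal, then differentiate twice.
fs = 200.0                                # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
pos = 0.1 * np.sin(2 * np.pi * 1.5 * t) + 0.001 * np.random.randn(t.size)

b, a = butter(4, 6.0 / (fs / 2))          # 6 Hz cutoff (assumed)
pos_f = filtfilt(b, a, pos)               # zero-phase filtering

vel = np.gradient(pos_f, 1 / fs)
acc = np.gradient(vel, 1 / fs)            # accelerations for inverse dynamics
```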
NASA Technical Reports Server (NTRS)
Hou, Gene
1998-01-01
Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require substantial memory. This project applies an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
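The idea behind automatic differentiation can be sketched with forward-mode dual numbers; ADIFOR itself transforms Fortran source code, so the small Python illustration below is only an analogy, and the response function f is invented.

```python
# Minimal forward-mode automatic differentiation with dual numbers.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def f(x):                      # example response function (hypothetical)
    return 3.0 * x * x + 2.0 * x + 1.0

x = Dual(2.0, 1.0)             # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.der)            # value 17.0, exact derivative 14.0
```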
Characterizing chaotic melodies in automatic music composition
NASA Astrophysics Data System (ADS)
Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang
2010-09-01
In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
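As a toy illustration of composing a melody from a chaotic dynamical system (not the authors' algorithm), the sketch below maps a logistic-map orbit onto an assumed pitch set expressed as MIDI note numbers.

```python
import numpy as np

# Map a chaotic logistic-map orbit onto a small scale of MIDI pitches.
def logistic_orbit(r=3.9, x0=0.5, n=32, burn=100):
    x = x0
    for _ in range(burn):                # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1 - x)
        orbit.append(x)
    return np.array(orbit)

scale = [60, 62, 64, 67, 69, 72, 74, 76]   # assumed pitch set (C pentatonic-ish)
orbit = logistic_orbit()
melody = [scale[int(v * len(scale)) % len(scale)] for v in orbit]
print(melody)
```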
Combinatorial-topological framework for the analysis of global dynamics.
Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł
2012-12-01
We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
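A minimal sketch of the combinatorial idea, using an assumed one-dimensional map and grid resolution rather than the authors' database machinery: cover phase space with boxes, build the induced directed graph, and take nontrivial strongly connected components as candidate recurrent (Morse) sets.

```python
import numpy as np
import networkx as nx

def f(x, r=3.2):
    return r * x * (1 - x)              # example map on [0, 1]

n_boxes = 200
edges = set()
for i in range(n_boxes):
    # sample each box and record which boxes its image hits
    for x in np.linspace(i / n_boxes, (i + 1) / n_boxes, 5):
        j = min(int(f(x) * n_boxes), n_boxes - 1)
        edges.add((i, j))

G = nx.DiGraph(edges)
morse = [c for c in nx.strongly_connected_components(G)
         if len(c) > 1 or any((i, i) in edges for i in c)]
print(morse)   # boxes around the period-2 orbit, fixed points, etc.
```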
Trust, control strategies and allocation of function in human-machine systems.
Lee, J; Moray, N
1992-10-01
As automated controllers supplant human intervention in controlling complex systems, the operators' role often changes from that of an active controller to that of a supervisory controller. Acting as supervisors, operators can choose between automatic and manual control. Improperly allocating function between automatic and manual control can have negative consequences for the performance of a system. Previous research suggests that the decision to perform the job manually or automatically depends, in part, upon the trust the operators invest in the automatic controllers. This paper reports an experiment to characterize the changes in operators' trust during an interaction with a semi-automatic pasteurization plant, and investigates the relationship between changes in operators' control strategies and trust. A regression model identifies the causes of changes in trust, and a 'trust transfer function' is developed using time series analysis to describe the dynamics of trust. Based on a detailed analysis of operators' strategies in response to system faults we suggest a model for the choice between manual and automatic control, based on trust in automatic controllers and self-confidence in the ability to control the system manually.
Oscillatory brain dynamics associated with the automatic processing of emotion in words.
Wang, Lin; Bastiaansen, Marcel
2014-10-01
This study examines the automaticity of processing the emotional aspects of words, and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time-frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha power suppressions were found for the emotional compared to neutral words, in the time range from 500 to 1000 ms post-stimulus. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant.
Ibraheem; Hasan, Naimul; Hussein, Arkan Ahmed
2014-01-01
This paper presents the design of a decentralized automatic generation controller for an interconnected power system using PID, Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). The designed controllers are tested on identical two-area interconnected power systems consisting of thermal power plants. The area interconnections between the two areas are considered as (i) an AC tie-line only and (ii) an asynchronous tie-line. The dynamic response analysis is carried out for a 1% load perturbation. The performance of the intelligent controllers based on GA and PSO has been compared with the conventional PID controller. The investigations of the system dynamic responses reveal that PSO gives the best dynamic response compared with the PID and GA controllers for both types of area interconnection.
Adaptive pseudolinear compensators of dynamic characteristics of automatic control systems
NASA Astrophysics Data System (ADS)
Skorospeshkin, M. V.; Sukhodoev, M. S.; Timoshenko, E. A.; Lenskiy, F. V.
2016-04-01
Adaptive pseudolinear gain and phase compensators of dynamic characteristics of automatic control systems are suggested. The automatic control system performance with adaptive compensators has been explored. The efficiency of pseudolinear adaptive compensators in the automatic control systems with time-varying parameters has been demonstrated.
An analysis of general chain systems
NASA Technical Reports Server (NTRS)
Passerello, C. E.; Huston, R. L.
1972-01-01
A general analysis of dynamic systems consisting of connected rigid bodies is presented. The number of bodies and their manner of connection is arbitrary so long as no closed loops are formed. The analysis represents a dynamic finite element method, which is computer-oriented and designed so that nonworking, internal constraint forces are automatically eliminated. The method is based upon Lagrange's form of d'Alembert's principle. Shifter matrix transformations are used with the geometrical aspects of the analysis. The method is illustrated with a space manipulator.
DOT National Transportation Integrated Search
1980-06-01
Volume 3 contains the application of the three-dimensional (3-D) finite element program, Automatic Dynamic Incremental Nonlinear Analysis (ADINA), which was designed to replace the traditional 2-D plane strain analysis, to a specific location. The lo...
Application of automatic threshold in dynamic target recognition with low contrast
NASA Astrophysics Data System (ADS)
Miao, Hua; Guo, Xiaoming; Chen, Yu
2014-11-01
A hybrid photoelectric joint transform correlator can achieve automatic, high-precision, real-time recognition through the combination of optical and electronic devices. When recognizing low-contrast targets with a photoelectric joint transform correlator, differences in attitude, brightness, and grayscale between target and template mean that only four to five frames of a dynamic target sequence can be recognized without any processing. A CCD camera captures the dynamic target images at 25 frames per second. Automatic thresholding has many advantages, such as fast processing, effective suppression of noise interference, enhancement of the diffraction energy of useful information, and better preservation of the outlines of target and template, so it plays an important role in target recognition by optical correlation. However, the threshold obtained automatically by a standard program does not achieve the best recognition results for dynamic targets, because outline information is partially lost; in most cases the optimal threshold is obtained by manual intervention. Considering the characteristics of dynamic targets, an improved automatic thresholding procedure is implemented by multiplying the Otsu threshold of target and template by a scale coefficient of the processed image and combining the result with mathematical morphology. The optimal threshold for dynamic low-contrast target images can then be obtained automatically. The recognition rate of dynamic targets is improved by reducing the effect of background noise and increasing the correlation information. A series of dynamic tank images moving at about 70 km/h was used as the target set. Without any processing, the first frame of this series correlates only with the 3rd frame; with the Otsu threshold, the 80th frame can be recognized; with the improved automatic thresholding of the joint images, this increases to 89 frames. Experimental results show that the improved automatic thresholding has particular application value for the recognition of dynamic low-contrast targets.
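The thresholding step described above can be sketched as follows; the scale coefficient, morphological kernel size, and synthetic test image are assumptions, and OpenCV's Otsu implementation stands in for the authors' processing chain.

```python
import cv2
import numpy as np

# Scale the Otsu threshold, binarize, then clean up with morphology.
def scaled_otsu(img, scale=0.85, kernel=3):
    otsu_t, _ = cv2.threshold(img, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    _, binary = cv2.threshold(img, scale * otsu_t, 255, cv2.THRESH_BINARY)
    k = np.ones((kernel, kernel), np.uint8)
    return cv2.morphologyEx(binary, cv2.MORPH_OPEN, k)

# Synthetic low-contrast frame with a slightly brighter "target" patch.
rng = np.random.default_rng(0)
img = rng.normal(100, 20, (240, 320))
img[100:140, 150:200] += 30
frame = img.clip(0, 255).astype(np.uint8)
mask = scaled_otsu(frame)
```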
Automatic Conflict Detection on Contracts
NASA Astrophysics Data System (ADS)
Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo
Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.
Hsu, Li-Yueh; Wragg, Andrew; Anderson, Stasia A; Balaban, Robert S; Boehm, Manfred; Arai, Andrew E
2008-02-01
This study presents computerized automatic image analysis for quantitatively evaluating dynamic contrast-enhanced MRI in an ischemic rat hindlimb model. MRI at 7 T was performed on animals in a blinded placebo-controlled experiment comparing multipotent adult progenitor cell-derived progenitor cell (MDPC)-treated, phosphate buffered saline (PBS)-injected, and sham-operated rats. Ischemic and non-ischemic limb regions of interest were automatically segmented from time-series images for detecting changes in perfusion and late enhancement. In correlation analysis of the time-signal intensity histograms, the MDPC-treated limbs correlated well with their corresponding non-ischemic limbs. However, the correlation coefficient of the PBS control group was significantly lower than that of the MDPC-treated and sham-operated groups. In semi-quantitative parametric maps of contrast enhancement, there was no significant difference in hypo-enhanced area between the MDPC and PBS groups at early perfusion-dependent time frames. However, the late-enhancement area was significantly larger in the PBS than the MDPC group. The results of this exploratory study show that MDPC-treated rats could be objectively distinguished from PBS controls. The differences were primarily determined by late contrast enhancement of PBS-treated limbs. These computerized methods appear promising for assessing perfusion and late enhancement in dynamic contrast-enhanced MRI.
Harms, Hendrik Johannes; Tolbod, Lars Poulsen; Hansson, Nils Henrik Stubkjær; Kero, Tanja; Orndahl, Lovisa Holm; Kim, Won Yong; Bjerner, Tomas; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiær, Jørgen; Sörensen, Jens
2015-12-01
The aim of this study was to develop and validate an automated method for extracting forward stroke volume (FSV) using indicator dilution theory directly from dynamic positron emission tomography (PET) studies for two different tracers and scanners. 35 subjects underwent a dynamic ¹¹C-acetate PET scan on a Siemens Biograph TruePoint-64 PET/CT (scanner I). In addition, 10 subjects underwent both dynamic ¹⁵O-water PET and ¹¹C-acetate PET scans on a GE Discovery-ST PET/CT (scanner II). The left ventricular (LV)-aortic time-activity curve (TAC) was extracted automatically from PET data using cluster analysis. The first-pass peak was isolated by automatic extrapolation of the downslope of the TAC. FSV was calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold standard FSV was measured using phase-contrast cardiovascular magnetic resonance (CMR). FSV from PET correlated highly with FSV from CMR (r = 0.87, slope = 0.90 for scanner I; r = 0.87, slope = 1.65, and r = 0.85, slope = 1.69 for scanner II for ¹⁵O-water and ¹¹C-acetate, respectively), although a systematic bias was observed for both scanners (p < 0.001 for all). FSV based on ¹¹C-acetate and ¹⁵O-water correlated highly (r = 0.99, slope = 1.03) with no significant difference between FSV estimates (p = 0.14). FSV can be obtained automatically using dynamic PET/CT and cluster analysis. Results are almost identical for ¹¹C-acetate and ¹⁵O-water. A scanner-dependent bias was observed, and a scanner calibration factor is required for multi-scanner studies. Generalization of the method to other tracers and scanners requires further validation.
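A hedged sketch of the indicator-dilution calculation described above: isolate the first-pass peak by extrapolating the downslope of the LV-aortic time-activity curve, then compute FSV as injected dose divided by (heart rate × first-pass area under the curve). The synthetic curve, mono-exponential extrapolation window, and units are assumptions.

```python
import numpy as np

def forward_stroke_volume(t, tac, dose_bq, heart_rate_bpm):
    peak = np.argmax(tac)
    # fit the early downslope (a few frames after the peak) and
    # extrapolate it to isolate the first pass from recirculation
    seg = slice(peak, peak + 4)
    slope, intercept = np.polyfit(t[seg], np.log(tac[seg]), 1)
    first_pass = tac.copy()
    late = t > t[seg][-1]
    first_pass[late] = np.exp(intercept + slope * t[late])
    auc = np.trapz(first_pass, t)                  # Bq/mL * min
    return dose_bq / (heart_rate_bpm * auc)        # mL per beat

t = np.linspace(0, 2, 60)                          # minutes
tac = 5e4 * (t / 0.3) * np.exp(-t / 0.3)           # synthetic LV-aortic TAC
print(forward_stroke_volume(t, tac, dose_bq=4e8, heart_rate_bpm=70))
```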
Formalization and analysis of reasoning by assumption.
Bosse, Tibor; Jonker, Catholijn M; Treur, Jan
2006-01-02
This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners.
Video enhancement workbench: an operational real-time video image processing system
NASA Astrophysics Data System (ADS)
Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.
1993-01-01
Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low-contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.
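Two of the enhancements mentioned above, unsharp masking and frame averaging, can be sketched with OpenCV and NumPy; the blur radius and gain are assumed parameters rather than the workbench's settings.

```python
import cv2
import numpy as np

def unsharp_mask(frame, sigma=3.0, amount=1.5):
    # sharpen by adding back the difference between the frame and its blur
    blur = cv2.GaussianBlur(frame, (0, 0), sigma)
    return cv2.addWeighted(frame, 1 + amount, blur, -amount, 0)

def frame_average(frames):
    # temporal averaging suppresses zero-mean noise by roughly sqrt(N)
    return np.mean(np.stack(frames).astype(np.float32), axis=0).astype(np.uint8)
```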
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1985-01-01
Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on optimal control frequency domain techniques. These techniques stem from an optimal control approach of a Neal-Smith-like analysis of aircraft attitude dynamics, extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in-depth analysis of the effect of the experimental variables, such as the prefilter, is conducted to gain further insight into the flared landing task for this class of vehicle dynamics.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Expedited Rulemaking To Establish Dynamic Automatic Suppression System Test Procedures for Federal Motor... subpart, the following definitions apply: (a) Dynamic automatic suppression system (DASS) means a portion of an air bag system that automatically controls whether or not the air bag deploys during a crash by...
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote control interface is studied and an automatic verification system is developed. By using extensible markup language (XML) to build the protocol instruction set and the database of data analysis methods in the system software, the design becomes configurable and the diversity of device interfaces and protocols is accommodated. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.
Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.
Echinaka, Yuki; Ozeki, Yukiyasu
2016-10-01
The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Applying this method, the bootstrap method is introduced and a numerical discrimination for the transition type is proposed.
Vibration analysis on automatic take-up device of belt conveyor
NASA Astrophysics Data System (ADS)
Qin, Tailong; Wei, Jin
2008-10-01
After introducing the application conditions of belt conveyors in the modern mining industry, the paper shows that the dynamic processes of starting, braking, and loading produce traveling tension and elastic waves, and analyzes the factors that cause the automatic take-up device of a belt conveyor to vibrate: the structure of the take-up device and the elastic wave. Finally, the paper proposes measures to reduce vibration and carries out modeling and simulation of the tension buffer device.
G-DYN Multibody Dynamics Engine
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Blackmore, James C.; Broderick, Daniel
2011-01-01
G-DYN is a multi-body dynamic simulation software engine that automatically assembles and integrates equations of motion for arbitrarily connected multibody dynamic systems. The algorithm behind G-DYN is based on a primal-dual formulation of the dynamics that captures the position and velocity vectors (primal variables) of each body and the interaction forces (dual variables) between bodies, which are particularly useful for control and estimation analysis and synthesis. It also takes full advantage of the sparse matrix structure resulting from the system dynamics to numerically integrate the equations of motion efficiently. Furthermore, the dynamic model for each body can easily be replaced without re-deriving the overall equations of motion, and the assembly of the equations of motion is done automatically. G-DYN proved an essential software tool in the simulation of spacecraft systems used for small celestial body surface sampling, specifically in simulating touch-and-go (TAG) maneuvers of a robotic sampling system from a comet and asteroid. It is used extensively in validating mission concepts for small body sample return, such as Comet Odyssey and Galahad New Frontiers proposals.
Research in Parallel Algorithms and Software for Computational Aerosciences
DOT National Transportation Integrated Search
1996-04-01
Phase I is complete for the development of a Computational Fluid Dynamics code with automatic grid generation and adaptation for the Euler analysis of flow over complex geometries. SPLITFLOW, an unstructured Cartesian grid code developed at Lockheed...
A vibration-based health monitoring program for a large and seismically vulnerable masonry dome
NASA Astrophysics Data System (ADS)
Pecorelli, M. L.; Ceravolo, R.; De Lucia, G.; Epicoco, R.
2017-05-01
Vibration-based health monitoring of monumental structures must rely on efficient and, as far as possible, automatic modal analysis procedures. Relatively low excitation energy provided by traffic, wind and other sources is usually sufficient to detect structural changes, as those produced by earthquakes and extreme events. Above all, in-operation modal analysis is a non-invasive diagnostic technique that can support optimal strategies for the preservation of architectural heritage, especially if complemented by model-driven procedures. In this paper, the preliminary steps towards a fully automated vibration-based monitoring of the world’s largest masonry oval dome (internal axes of 37.23 by 24.89 m) are presented. More specifically, the paper reports on signal treatment operations conducted to set up the permanent dynamic monitoring system of the dome and to realise a robust automatic identification procedure. Preliminary considerations on the effects of temperature on dynamic parameters are finally reported.
Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L
2014-01-01
Phase-contrast illumination is a simple and commonly used microscopy method for observing unstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for analysis of low magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method that is based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. When compared to the commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions and, due to their relatively light computational requirements, they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells.
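A minimal sketch of the detection step on a synthetic phase-contrast-like frame, using OpenCV's MSER detector with default parameters; the frame and blob sizes are invented, and the Kalman-filter data association described in the paper would be layered on top of these detections.

```python
import cv2
import numpy as np

# Synthetic frame with dark cell-like blobs on a bright background.
frame = np.full((240, 320), 200, np.uint8)
for cx, cy in [(60, 80), (160, 120), (240, 200)]:
    cv2.circle(frame, (cx, cy), 12, 90, -1)

mser = cv2.MSER_create()                 # region parameters left at defaults
regions, bboxes = mser.detectRegions(frame)
centroids = [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in bboxes]
print(len(centroids), "candidate regions")
```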
Development of an Automatic Ground Collision Avoidance System Using a Digital Terrain Database
1989-12-01
The purpose of this study was to develop a working control system that would perform automatic ground collision avoidance.
Dynamic Information and Library Processing.
ERIC Educational Resources Information Center
Salton, Gerard
This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…
Unsupervised analysis of small animal dynamic Cerenkov luminescence imaging
NASA Astrophysics Data System (ADS)
Spinelli, Antonello E.; Boschi, Federico
2011-12-01
Clustering analysis (CA) and principal component analysis (PCA) were applied to dynamic Cerenkov luminescence images (dCLI). In order to investigate the performances of the proposed approaches, two distinct dynamic data sets obtained by injecting mice with 32P-ATP and 18F-FDG were acquired using the IVIS 200 optical imager. The k-means clustering algorithm was applied to dCLI and was implemented using Interactive Data Language 8.1. We show that cluster analysis allows us to obtain good agreement between the clustered and the corresponding emission regions like the bladder, the liver, and the tumor. We also show a good correspondence between the time activity curves of the different regions obtained by using CA and manual region of interest analysis on dCLI and PCA images. We conclude that CA provides an automatic unsupervised method for the analysis of preclinical dynamic Cerenkov luminescence image data.
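The k-means step can be sketched as clustering per-pixel time-activity curves of a dynamic image stack; the number of clusters and the array layout below are assumptions, not the authors' IDL implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_dynamic_images(frames, k=4):
    # frames: array of shape (time, ny, nx)
    nt, ny, nx = frames.shape
    tacs = frames.reshape(nt, -1).T               # one time-activity curve per pixel
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(tacs)
    label_map = labels.reshape(ny, nx)            # cluster membership image
    mean_tacs = np.array([tacs[labels == j].mean(axis=0) for j in range(k)])
    return label_map, mean_tacs
```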
An independent software system for the analysis of dynamic MR images.
Torheim, G; Lombardi, M; Rinck, P A
1997-01-01
A computer system for the manual, semi-automatic, and automatic analysis of dynamic MR images was to be developed on UNIX and personal computer platforms. The system was to offer an integrated and standardized way of performing both image processing and analysis that was independent of the MR unit used. The system consists of modules that are easily adaptable to special needs. Data from MR units or other diagnostic imaging equipment in techniques such as CT, ultrasonography, or nuclear medicine can be processed through the ACR-NEMA/DICOM standard file formats. A full set of functions is available, among them cine-loop visual analysis, and generation of time-intensity curves. Parameters such as cross-correlation coefficients, area under the curve, peak/maximum intensity, wash-in and wash-out slopes, time to peak, and relative signal intensity/contrast enhancement can be calculated. Other parameters can be extracted by fitting functions like the gamma-variate function. Region-of-interest data and parametric values can easily be exported. The system has been successfully tested in animal and patient examinations.
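One of the analysis options listed above, fitting a gamma-variate function to a region-of-interest time-intensity curve, can be sketched with SciPy; the initial guesses and the derived quantities shown are illustrative choices.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    dt = np.clip(t - t0, 1e-12, None)
    return A * dt ** alpha * np.exp(-dt / beta)

def fit_time_intensity(t, intensity):
    p0 = [intensity.max(), 0.5 * t[np.argmax(intensity)], 2.0, 5.0]
    popt, _ = curve_fit(gamma_variate, t, intensity, p0=p0, maxfev=20000)
    A, t0, alpha, beta = popt
    time_to_peak = t0 + alpha * beta              # analytic peak of the fit
    auc = np.trapz(gamma_variate(t, *popt), t)    # area under the fitted curve
    return popt, time_to_peak, auc
```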
Turner, Alexander P; Caves, Leo S D; Stepney, Susan; Tyrrell, Andy M; Lones, Michael A
2017-01-01
This paper describes the artificial epigenetic network, a recurrent connectionist architecture that is able to dynamically modify its topology in order to automatically decompose and solve dynamical problems. The approach is motivated by the behavior of gene regulatory networks, particularly the epigenetic process of chromatin remodeling that leads to topological change and which underlies the differentiation of cells within complex biological organisms. We expected this approach to be useful in situations where there is a need to switch between different dynamical behaviors, and do so in a sensitive and robust manner in the absence of a priori information about problem structure. This hypothesis was tested using a series of dynamical control tasks, each requiring solutions that could express different dynamical behaviors at different stages within the task. In each case, the addition of topological self-modification was shown to improve the performance and robustness of controllers. We believe this is due to the ability of topological changes to stabilize attractors, promoting stability within a dynamical regime while allowing rapid switching between different regimes. Post hoc analysis of the controllers also demonstrated how the partitioning of the networks could provide new insights into problem structure.
Sensitivity analysis of automatic flight control systems using singular value concepts
NASA Technical Reports Server (NTRS)
Herrera-Vaillard, A.; Paduano, J.; Downing, D.
1985-01-01
A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
Towards a Certified Lightweight Array Bound Checker for Java Bytecode
NASA Technical Reports Server (NTRS)
Pichardie, David
2009-01-01
Dynamic array bound checks are crucial elements for the security of a Java Virtual Machine. These dynamic checks are however expensive, and several static analysis techniques have been proposed to eliminate explicit bounds checks. Such analyses require advanced numerical and symbolic manipulations that 1) penalize bytecode loading or dynamic compilation, and 2) complexify the trusted computing base. Following the Foundational Proof Carrying Code methodology, our goal is to provide a lightweight bytecode verifier for eliminating array bound checks that is both efficient and trustable. In this work, we define a generic relational program analysis for an imperative, stack-oriented bytecode language with procedures, arrays and global variables, and instantiate it with a relational abstract domain of polyhedra. The analysis has automatic inference of loop invariants and method pre-/post-conditions, and efficient checking of analysis results by a simple checker. Invariants, which can be large, can be specialized for proving a safety policy using an automatic pruning technique which reduces their size. The result of the analysis can be checked efficiently by annotating the program with parts of the invariant together with certificates of polyhedral inclusions. The resulting checker is sufficiently simple to be entirely certified within the Coq proof assistant for a simple fragment of the Java bytecode language. During the talk, we will also report on our ongoing effort to scale this approach to the full sequential JVM.
Sensitivity analysis of dynamic biological systems with time-delays.
Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang
2010-10-15
Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations either by the analytic method or by symbolic manipulation is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human error in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with less user intervention. Compared with direct-coupled methods in theory, the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to do dynamic sensitivity analysis on complex biological systems with time-delays.
Aeroelastic Analysis for Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, W.
1982-01-01
Aeroelastic-analysis computer program incorporates an analytical model of aeroelastic behavior of a wide range of rotorcraft. Such an analytical model is desirable for both pretest predictions and posttest correlations. The program can be applied in investigations of isolated rotor aeroelasticity and helicopter-flight dynamics and could be employed as the basis for more-extensive investigations of aeroelastic behavior, such as automatic control system design.
The symbolic computation and automatic analysis of trajectories
NASA Technical Reports Server (NTRS)
Grossman, Robert
1991-01-01
Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.
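The kind of symbolic manipulation involved can be illustrated with SymPy by computing the Lie bracket of two vector fields, an operation that arises in the local analysis of nonlinear control systems; the pendulum-like example fields are invented and this is not the rewriting system described above.

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
X = sp.Matrix([x1, x2])
f = sp.Matrix([x2, -sp.sin(x1)])        # drift vector field (example)
g = sp.Matrix([0, 1])                   # control vector field (example)

def lie_bracket(f, g, X):
    # [f, g] = (dg/dx) f - (df/dx) g
    return g.jacobian(X) * f - f.jacobian(X) * g

print(sp.simplify(lie_bracket(f, g, X)))   # Matrix([[-1], [0]])
```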
Rocketdyne automated dynamics data analysis and management system
NASA Technical Reports Server (NTRS)
Tarn, Robert B.
1988-01-01
An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, fast Fourier transform analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
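A windowed, averaged power spectral density is one building block of such a pipeline; the sketch below uses SciPy's Welch estimator with a Hann window, which limits the spectral leakage that the system above corrects for explicitly. The sample rate and test signal are assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 20480.0                       # assumed sample rate, Hz
t = np.arange(0, 2, 1 / fs)
signal = np.sin(2 * np.pi * 600 * t) + 0.5 * np.random.randn(t.size)

# Hann window and segment averaging reduce leakage and variance.
freqs, psd = welch(signal, fs=fs, window="hann", nperseg=4096)
peak_hz = freqs[np.argmax(psd)]    # dominant sinusoid (about 600 Hz)
print(peak_hz)
```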
Automated region selection for analysis of dynamic cardiac SPECT data
NASA Astrophysics Data System (ADS)
Di Bella, E. V. R.; Gullberg, G. T.; Barclay, A. B.; Eisner, R. L.
1997-06-01
Dynamic cardiac SPECT using Tc-99m labeled teboroxime can provide kinetic parameters (washin, washout) indicative of myocardial blood flow. A time-consuming and subjective step of the data analysis is drawing regions of interest to delineate blood pool and myocardial tissue regions. The time-activity curves of the regions are then used to estimate local kinetic parameters. In this work, the appropriate regions are found automatically, in a manner similar to that used for calculating maximum count circumferential profiles in conventional static cardiac studies. The drawbacks to applying standard static circumferential profile methods are the high noise level and high liver uptake common in dynamic teboroxime studies. Searching along each ray for maxima to locate the myocardium does not typically provide useful information. Here we propose an iterative scheme in which constraints are imposed on the radii searched along each ray. The constraints are based on the shape of the time-activity curves of the circumferential profile members and on an assumption that the short axis slices are approximately circular. The constraints eliminate outliers and help to reduce the effects of noise and liver activity. Kinetic parameter estimates from the automatically generated regions were comparable to estimates from manually selected regions in dynamic canine teboroxime studies.
Automatic Whistler Detector and Analyzer system: Implementation of the analyzer algorithm
NASA Astrophysics Data System (ADS)
Lichtenberger, János; Ferencz, Csaba; Hamar, Daniel; Steinbach, Peter; Rodger, Craig J.; Clilverd, Mark A.; Collier, Andrew B.
2010-12-01
The full potential of whistlers for monitoring plasmaspheric electron density variations has not yet been realized. The primary reason is the vast human effort required for the analysis of whistler traces. Recently, the first part of a complete whistler analysis procedure was successfully automated, i.e., the automatic detection of whistler traces from the raw broadband VLF signal was achieved. This study describes a new algorithm developed to determine plasmaspheric electron density measurements from whistler traces, based on a Virtual (Whistler) Trace Transformation, using a 2-D fast Fourier transform transformation. This algorithm can be automated and can thus form the final step to complete an Automatic Whistler Detector and Analyzer (AWDA) system. In this second AWDA paper, the practical implementation of the Automatic Whistler Analyzer (AWA) algorithm is discussed and a feasible solution is presented. The practical implementation of the algorithm is able to track the variations of plasmasphere in quasi real time on a PC cluster with 100 CPU cores. The electron densities obtained by the AWA method can be used in investigations such as plasmasphere dynamics, ionosphere-plasmasphere coupling, or in space weather models.
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
Physical quantities using various units of measurement can be well represented in Ada by the use of abstract types. Computation involving these quantities (electric potential, mass, volume) can also automatically invoke the computation and checking of some of the implicitly associable attributes of measurements. Quantities can be held internally in SI units, transparently to the user, with automatic conversion. Through dimensional analysis, the type of the derived quantity resulting from a computation is known, thereby allowing dynamic checks of the equations used. The impact of the possible implementation of these techniques in integration and test applications is discussed. The overhead of computing and transporting measurement attributes is weighed against the advantages gained by their use. The construction of a run time interpreter using physical quantities in equations can be aided by the dynamic equation checks provided by dimensional analysis. The effects of high levels of abstraction on the generation and maintenance of software used in integration and test applications are also discussed.
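The abstract's idea can be sketched in Python (the original work is in Ada): carry dimension exponents with every quantity so that adding incompatible units fails immediately and the dimensions of derived quantities follow from the arithmetic.

```python
class Quantity:
    def __init__(self, value, dims):          # dims = (m, kg, s) exponents
        self.value, self.dims = value, tuple(dims)
    def __add__(self, other):
        if self.dims != other.dims:
            raise TypeError("dimension mismatch")
        return Quantity(self.value + other.value, self.dims)
    def __truediv__(self, other):
        return Quantity(self.value / other.value,
                        tuple(a - b for a, b in zip(self.dims, other.dims)))
    def __repr__(self):
        return f"{self.value} {self.dims}"

length = Quantity(3.0, (1, 0, 0))             # 3 m
time_s = Quantity(2.0, (0, 0, 1))             # 2 s
print(length / time_s)                        # 1.5 (1, 0, -1), i.e. m/s
# length + time_s would raise TypeError: dimension mismatch
```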
Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S
2014-12-09
Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.
2D automatic body-fitted structured mesh generation using advancing extraction method
NASA Astrophysics Data System (ADS)
Zhang, Yaoxin; Jia, Yafei
2018-01-01
This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsula or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain in convex polygon shape in each level can be extracted in an advancing scheme. In this paper, several examples were used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.
NASA Technical Reports Server (NTRS)
Klein, M.; Reynolds, J.; Ricks, E.
1989-01-01
Load and stress recovery from transient dynamic studies are improved upon using an extended acceleration vector in the modal acceleration technique applied to structural analysis. Extension of the normal LTM (load transformation matrices) stress recovery to automatically compute margins of safety is presented with an application to the Hubble space telescope.
Effectiveness Testing of Embedded User Support for U.S. Army Installation-Level Software
1991-06-01
under what conditions Dynamic Help could influence user performance and satisfaction. The ACIFS program was modified to provide automatic collection of all... This chapter reports the design, implementation, and analysis of...ambiguous or is hidden in the body of the message. The ACIFS program has many user interface deficiencies, but it does allow the user to use trial and
Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method allows dynamic test requirements to be expressed in dynamic models, so that traceability of dynamic test requirements can be generated easily. It automatically produces standardized test requirements and test documentation, addresses inconsistencies and gaps in document content, and improves efficiency.
Automatic segmentation and supervised learning-based selection of nuclei in cancer tissue images.
Nandy, Kaustav; Gudla, Prabhakar R; Amundsen, Ryan; Meaburn, Karen J; Misteli, Tom; Lockett, Stephen J
2012-09-01
Analysis of preferential localization of certain genes within the cell nuclei is emerging as a new technique for the diagnosis of breast cancer. Quantitation requires accurate segmentation of 100-200 cell nuclei in each tissue section to draw a statistically significant result. Thus, for large-scale analysis, manual processing is too time consuming and subjective. Fortuitously, acquired images generally contain many more nuclei than are needed for analysis. Therefore, we developed an integrated workflow that selects, following automatic segmentation, a subpopulation of accurately delineated nuclei for positioning of fluorescence in situ hybridization-labeled genes of interest. Segmentation was performed by a multistage watershed-based algorithm and screening by an artificial neural network-based pattern recognition engine. The performance of the workflow was quantified in terms of the fraction of automatically selected nuclei that were visually confirmed as well segmented and by the boundary accuracy of the well-segmented nuclei relative to a 2D dynamic programming-based reference segmentation method. Application of the method was demonstrated for discriminating normal and cancerous breast tissue sections based on the differential positioning of the HES5 gene. Automatic results agreed with manual analysis in 11 out of 14 cancers, all four normal cases, and all five noncancerous breast disease cases, thus showing the accuracy and robustness of the proposed approach.
Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George
2017-06-26
We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations. Ideally, automatic image segmentation techniques make real-time measurement possible. However, existing automatic unsupervised segmentation techniques have poor performance when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique based on a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, which allows ZPF to compute the self-conjugated phase to compensate for most aberrations.
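The compensation step can be sketched as a least-squares fit of a low-order surface to the phase values in the detected background region, subtracted from the full phase map; here a plain 2-D polynomial basis stands in for the Zernike basis used in the paper, and the mask and order are assumed inputs.

```python
import numpy as np

def compensate(phase, background_mask, order=2):
    # fit a low-order surface to background phase, subtract it everywhere
    ny, nx = phase.shape
    y, x = np.mgrid[0:ny, 0:nx]
    x = (x - nx / 2) / (nx / 2)
    y = (y - ny / 2) / (ny / 2)
    terms = [x ** i * y ** j
             for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.stack([term[background_mask] for term in terms], axis=1)
    coef, *_ = np.linalg.lstsq(A, phase[background_mask], rcond=None)
    fitted = sum(c * term for c, term in zip(coef, terms))
    return phase - fitted
```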
STARS: A general-purpose finite element computer program for analysis of engineering structures
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1984-01-01
STARS (Structural Analysis Routines) is primarily an interactive, graphics-oriented, finite-element computer program for analyzing the static, stability, free vibration, and dynamic responses of damped and undamped structures, including rotating systems. The element library consists of one-dimensional (1-D) line elements, two-dimensional (2-D) triangular and quadrilateral shell elements, and three-dimensional (3-D) tetrahedral and hexahedral solid elements. These elements enable the solution of structural problems that include truss, beam, space frame, plane, plate, shell, and solid structures, or any combination thereof. Zero, finite, and interdependent deflection boundary conditions can be implemented by the program. The associated dynamic response analysis capability provides for initial deformation and velocity inputs, whereas the transient excitation may be either forces or accelerations. An effective in-core or out-of-core solution strategy is automatically employed by the program, depending on the size of the problem. Data input may be at random within a data set, and the program offers certain automatic data-generation features. Input data are formatted as an optimal combination of free and fixed formats. Interactive graphics capabilities enable convenient display of nodal deformations, mode shapes, and element stresses.
A method for automatic control of cardiopulmonary resuscitation procedures
NASA Astrophysics Data System (ADS)
Bureev, A. Sh.; Zhdanov, D. S.; Kiseleva, E. Yu.; Kutsov, M. S.; Trifonov, A. Yu.
2015-11-01
This study presents the results of work on the creation of methods for automatic control of cardiopulmonary resuscitation (CPR) procedures. A method for automatic control of the CPR procedure is presented that evaluates acoustic data on the dynamics of blood flow at the bifurcation of the carotid arteries and on the dynamics of air flow in the trachea according to the current guidelines for CPR. The patient is evaluated by analyzing the respiratory noise and blood flow in the intervals between chest compressions and artificial pulmonary ventilation. The operating algorithm of a device for automatic control of CPR procedures and its block diagram have been developed.
Buffet test in the National Transonic Facility
NASA Technical Reports Server (NTRS)
Young, Clarence P., Jr.; Hergert, Dennis W.; Butler, Thomas W.; Herring, Fred M.
1992-01-01
A buffet test of a commercial transport model was accomplished in the National Transonic Facility at the NASA Langley Research Center. This aeroelastic test was unprecedented for this wind tunnel and posed a high risk for the facility. Presented here are the test results from a structural dynamics and aeroelastic response point of view. The activities required for the safety analysis and risk assessment are described. The test was conducted in the same manner as a flutter test and employed on-board dynamic instrumentation, real time dynamic data monitoring, and automatic and manual tunnel interlock systems for protecting the model.
Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.
1998-01-01
This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well-suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.
Visualization of system dynamics using phasegrams
Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh
2013-01-01
A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
1987-09-01
have shown that gun barrel heating, and hence thermal expansion, is both axially and circumferentially asymmetric. Circumferential, or cross-barrel...element code, which ended in the selection of ABAQUS. The code will perform static, dynamic, and thermal analysis on a broad range of structures...analysis may be performed by a user-supplied FORTRAN subroutine which is automatically linked to the code and supplements the standard ABAQUS
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries, together with discussions, are also presented.
Vehicle-to-Grid Automatic Load Sharing with Driver Preference in Micro-Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yubo; Nazaripouya, Hamidreza; Chu, Chi-Cheng
Integration of Electrical Vehicles (EVs) with the power grid not only brings new challenges for load management, but also opportunities for distributed storage and generation. This paper comprehensively models and analyzes distributed Vehicle-to-Grid (V2G) for automatic load sharing with driver preference. In a micro-grid with limited communications, V2G EVs need to decide load sharing based on their own power and voltage profiles. A droop-based controller taking into account driver preference is proposed in this paper to address the distributed control of EVs. Simulations are designed for three fundamental V2G automatic load sharing scenarios that include all system dynamics of such applications. Simulation results demonstrate that active power sharing is achieved proportionally among V2G EVs with consideration of driver preference. In addition, the results also verify the system stability and reactive power sharing analysis in the system modelling, which sheds light on large-scale V2G automatic load sharing in more complicated cases.
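A minimal sketch of droop-based sharing with a driver-preference weight is shown below; the gains, limits and preference scaling are illustrative assumptions, not the controller proposed in the paper.

# Minimal sketch of voltage-droop based V2G load sharing. Each EV scales its
# droop gain by a driver-preference weight in [0, 1]; power contributions then
# split roughly in proportion to (rated power x preference). Illustrative only.

def droop_dispatch(v_bus, v_nom, evs, droop=0.05):
    """evs: list of dicts with 'p_rated' (kW) and 'pref' (0..1)."""
    dv = (v_nom - v_bus) / v_nom                      # per-unit voltage sag
    out = []
    for ev in evs:
        gain = ev["pref"] * ev["p_rated"] / droop
        p = max(0.0, min(ev["p_rated"], gain * dv))   # clamp to rated power
        out.append(p)
    return out

if __name__ == "__main__":
    fleet = [{"p_rated": 10.0, "pref": 1.0},
             {"p_rated": 10.0, "pref": 0.5},          # this driver limits discharge
             {"p_rated": 6.0,  "pref": 1.0}]
    print(droop_dispatch(v_bus=0.98 * 230, v_nom=230, evs=fleet))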
Musical structure analysis using similarity matrix and dynamic programming
NASA Astrophysics Data System (ADS)
Shiu, Yu; Jeong, Hong; Kuo, C.-C. Jay
2005-10-01
Automatic music segmentation and structure analysis from audio waveforms based on a three-level hierarchy is examined in this research, where the three-level hierarchy includes notes, measures and parts. The pitch class profile (PCP) feature is first extracted at the note level. Then, a similarity matrix is constructed at the measure level, where a dynamic time warping (DTW) technique is used to enhance the similarity computation by taking the temporal distortion of similar audio segments into account. By processing the similarity matrix, we can obtain a coarse-grain music segmentation result. Finally, dynamic programming is applied to the coarse-grain segments so that a song can be decomposed into several major parts such as intro, verse, chorus, bridge and outro. The performance of the proposed music structure analysis system is demonstrated for pop and rock music.
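The measure-level similarity matrix at the heart of this pipeline can be sketched as follows; the PCP extraction, the DTW refinement and the final dynamic-programming pass are omitted, and the fixed frames-per-measure grouping is an assumption.

# Sketch of the measure-level self-similarity matrix used for coarse music
# segmentation: cosine similarity between PCP (chroma) vectors averaged per
# measure. DTW alignment and the part-level dynamic programming are omitted.
import numpy as np

def similarity_matrix(pcp_frames, frames_per_measure):
    """pcp_frames: array of shape (n_frames, 12) of pitch class profiles."""
    n_measures = len(pcp_frames) // frames_per_measure
    measures = pcp_frames[:n_measures * frames_per_measure]
    measures = measures.reshape(n_measures, frames_per_measure, 12).mean(axis=1)
    norms = np.linalg.norm(measures, axis=1, keepdims=True) + 1e-12
    unit = measures / norms
    return unit @ unit.T            # (n_measures, n_measures) cosine similarities

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_pcp = np.abs(rng.normal(size=(800, 12)))
    S = similarity_matrix(fake_pcp, frames_per_measure=40)
    print(S.shape)                  # (20, 20)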
Dynamic analysis of elastic rubber-tired car wheel braking under variable normal load
NASA Astrophysics Data System (ADS)
Fedotov, A. I.; Zedgenizov, V. G.; Ovchinnikova, N. I.
2017-10-01
The purpose of the paper is to analyze the dynamics of wheel braking under normal load variations. The paper uses a mathematical simulation method in which the calculation model of the object as a mechanical system is associated with a dynamically equivalent schematic structure of the automatic control system. Transfer function tools for analyzing the structural and technical characteristics of the object, as well as force disturbances, were used. It was shown that the analysis of the dynamic characteristics of a wheel subjected to external force disturbances has to take into account its amplitude- and phase-frequency characteristics. Normal load variations affect the braking of a car wheel subjected to disturbances: the closer the slip is to the critical point, the greater the impact. In the super-critical region, load variations cause rapid wheel locking.
López Pérez, David; Leonardi, Giuseppe; Niedźwiecka, Alicja; Radkowska, Alicja; Rączaszek-Leonardi, Joanna; Tomalski, Przemysław
2017-01-01
The analysis of parent-child interactions is crucial for the understanding of early human development. Manual coding of interactions is a time-consuming task, which is a limitation in many projects. This becomes especially demanding if a frame-by-frame categorization of movement needs to be achieved. To overcome this, we present a computational approach for studying movement coupling in natural settings, which is a combination of a state-of-the-art automatic tracker, Tracking-Learning-Detection (TLD), and nonlinear time-series analysis, Cross-Recurrence Quantification Analysis (CRQA). We investigated the use of TLD to extract and automatically classify movement of each partner from 21 video recordings of interactions, where 5.5-month-old infants and mothers engaged in free play in laboratory settings. As a proof of concept, we focused on those face-to-face episodes, where the mother animated an object in front of the infant, in order to measure the coordination between the infants' head movement and the mothers' hand movement. We also tested the feasibility of using such movement data to study behavioral coupling between partners with CRQA. We demonstrate that movement can be extracted automatically from standard definition video recordings and used in subsequent CRQA to quantify the coupling between movement of the parent and the infant. Finally, we assess the quality of this coupling using an extension of CRQA called anisotropic CRQA and show asymmetric dynamics between the movement of the parent and the infant. When combined these methods allow automatic coding and classification of behaviors, which results in a more efficient manner of analyzing movements than manual coding. PMID:29312075
NASA Astrophysics Data System (ADS)
Meng, Fei; Tao, Gang; Zhang, Tao; Hu, Yihuai; Geng, Peng
2015-08-01
Shifting quality is a crucial factor throughout the automobile industry. To ensure an optimal gear-shifting strategy with the best fuel economy for a stepped automatic transmission, the controller must be designed despite the lack of a feedback sensor for measuring the relevant variables. This paper focuses on a new kind of automatic transmission that uses a proportional solenoid valve to control the clutch pressure; a control strategy based on the clutch speed difference is designed for shift control during the inertia phase. First, the mechanical system is described and the system dynamic model is built. Second, the control strategy is designed based on the characterization analysis of models derived from the dynamics of the drive line and the electro-hydraulic actuator. The controller then uses conventional Proportional-Integral-Derivative control theory, and a robust two-degree-of-freedom controller is also designed to determine the optimal control parameters and further improve the system performance. Finally, the designed control strategy with the different controllers is implemented on a simulation model. The compared results show that the clutch speed difference can track the desired trajectory well and improve the shift quality effectively.
Automatic programming via iterated local search for dynamic job shop scheduling.
Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen
2015-01-01
Dispatching rules have been commonly used in practice for making sequencing and scheduling decisions. Due to specific characteristics of each manufacturing system, there is no universal dispatching rule that can dominate in all situations. Therefore, it is important to design specialized dispatching rules to enhance the scheduling performance for each manufacturing environment. Evolutionary computation approaches such as tree-based genetic programming (TGP) and gene expression programming (GEP) have been proposed to facilitate the design task through automatic design of dispatching rules. However, these methods are still limited by their high computational cost and low exploitation ability. To overcome this problem, we develop a new approach to automatic programming via iterated local search (APRILS) for dynamic job shop scheduling. The key idea of APRILS is to perform multiple local searches started with programs modified from the best obtained programs so far. The experiments show that APRILS outperforms TGP and GEP in most simulation scenarios in terms of effectiveness and efficiency. The analysis also shows that programs generated by APRILS are more compact than those obtained by genetic programming. An investigation of the behavior of APRILS suggests that the good performance of APRILS comes from the balance between exploration and exploitation in its search mechanism.
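A generic iterated-local-search skeleton in the spirit of APRILS is sketched below; the perturbation, neighbourhood and evaluation callables are placeholders rather than the paper's dispatching-rule operators.

# Generic iterated-local-search skeleton in the spirit of APRILS: repeatedly
# perturb the best-so-far candidate and run a local search from it. The
# perturb/neighbours/evaluate callables are placeholders, not the paper's
# dispatching-rule representation or simulation-based evaluation.
import random

def iterated_local_search(initial, evaluate, neighbours, perturb,
                          n_iterations=50, seed=0):
    rng = random.Random(seed)
    best = initial
    best_cost = evaluate(best)
    for _ in range(n_iterations):
        candidate = perturb(best, rng)           # escape the current optimum
        improved = True
        while improved:                          # simple best-improvement descent
            improved = False
            for nb in neighbours(candidate, rng):
                if evaluate(nb) < evaluate(candidate):
                    candidate, improved = nb, True
                    break
        if evaluate(candidate) < best_cost:      # accept only strict improvement
            best, best_cost = candidate, evaluate(candidate)
    return best, best_cost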
NASA Technical Reports Server (NTRS)
Hicks, John W.; Moulton, Bryan J.
1988-01-01
The camber control loop of the X-29A FSW aircraft was designed to furnish the optimum L/D for trimmed, stabilized flight. A marked difference was noted between automatic wing camber control loop behavior in dynamic maneuvers and in stabilized flight conditions, which in turn affected subsonic aerodynamic performance. The degree of drag level increase was a direct function of maneuver rate. Attention is given to the aircraft flight drag polar effects of maneuver dynamics in light of wing camber control loop schedule. The effect of changing camber scheduling to better track the optimum automatic camber control L/D schedule is discussed.
Dynamic multiplexed analysis method using ion mobility spectrometer
Belov, Mikhail E [Richland, WA
2010-05-18
A method for multiplexed analysis using an ion mobility spectrometer in which the effectiveness and efficiency of the multiplexed method are optimized by automatically adjusting the rates of passage of analyte materials through an IMS drift tube during operation of the system. This automatic adjustment is performed by the IMS instrument itself after determining the appropriate levels of adjustment according to the method of the present invention. In one example, the adjustment of the rates of passage for these materials is determined by quantifying the total number of analyte molecules delivered to the ion trap in a preselected period of time, comparing this number to the charge capacity of the ion trap, selecting a gate opening sequence, and implementing the selected gate opening sequence to obtain a preselected rate of analytes within said IMS drift tube.
Alex, J; Kolisch, G; Krause, K
2002-01-01
The objective of this project is to use the results of a CFD simulation to automatically, systematically and reliably generate an appropriate model structure for simulation of the biological processes using CSTR activated sludge compartments. Models and dynamic simulation have become important tools for research, but also increasingly for the design and optimisation of wastewater treatment plants. Besides the biological models, several cases have been reported of the application of computational fluid dynamics (CFD) to wastewater treatment plants. One aim of the presented method of deriving model structures from CFD results is to exclude the influence of empirical structure selection on the results of dynamic simulation studies of WWTPs. The second application of the approach developed is the analysis of badly performing treatment plants where the suspicion arises that bad flow behaviour, such as short-cut flows, is part of the problem. The suggested method requires as a first step the calculation of the fluid dynamics of the biological treatment step at different loading situations by means of 3-dimensional CFD simulation. This information is then used to automatically generate a suitable model structure for conventional dynamic simulation of the treatment plant, consisting of a number of CSTR modules with a pattern of exchange flows between the tanks. The method is explained in detail and the application to the WWTP Wuppertal Buchenhofen is presented.
Wind modeling and lateral control for automatic landing
NASA Technical Reports Server (NTRS)
Holley, W. E.; Bryson, A. E., Jr.
1975-01-01
For the purposes of aircraft control system design and analysis, the wind can be characterized by a mean component which varies with height and by turbulent components which are described by the von Karman correlation model. The aircraft aerodynamic forces and moments depend linearly on uniform and gradient gust components obtained by averaging over the aircraft's length and span. The correlations of the averaged components are then approximated by the outputs of linear shaping filters forced by white noise. The resulting model of the crosswind shear and turbulence effects is used in the design of a lateral control system for the automatic landing of a DC-8 aircraft.
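The shaping-filter idea can be sketched with a first-order (Dryden-style) filter driven by white noise; the actual design uses von Karman correlation filters, so the form and parameters below are a simplified stand-in.

# Sketch of a shaping filter: white noise passed through a first-order low-pass
# filter approximates a gust velocity with a prescribed intensity and length
# scale (a Dryden-style simplification of the von Karman model used above).
import numpy as np
from scipy import signal

def gust_time_series(sigma_w, length_scale, airspeed, dt, n_samples, seed=0):
    tau = length_scale / airspeed                 # filter time constant [s]
    # Discretised first-order filter: w[k] = a*w[k-1] + b*noise[k]
    a = np.exp(-dt / tau)
    b = sigma_w * np.sqrt(1.0 - a**2)             # keeps output variance sigma_w^2
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_samples)
    return signal.lfilter([b], [1.0, -a], white)

if __name__ == "__main__":
    gust = gust_time_series(sigma_w=1.5, length_scale=300.0,
                            airspeed=75.0, dt=0.05, n_samples=2000)
    print(gust.std())                             # close to 1.5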
Assume-Guarantee Abstraction Refinement Meets Hybrid Systems
NASA Technical Reports Server (NTRS)
Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas
2014-01-01
Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction-refinement in the context of hybrid automata.
Automated quantification of the synchrogram by recurrence plot analysis.
Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart
2012-04-01
Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
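A minimal recurrence-plot construction is sketched below for a generic time series; the synchrogram-specific phase embedding and the attractor counting of the proposed technique are not reproduced.

# Sketch of a recurrence plot: a point (i, j) is marked when two delay-embedded
# states are closer than a threshold. The synchrogram-specific embedding and
# attractor counting described above are not reproduced here.
import numpy as np

def recurrence_plot(series, dim=3, delay=2, threshold=0.2):
    n = len(series) - (dim - 1) * delay
    # Delay embedding: rows are reconstructed state vectors
    states = np.column_stack([series[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    eps = threshold * dists.max()                 # threshold relative to max distance
    return (dists <= eps).astype(np.uint8)        # binary recurrence matrix

if __name__ == "__main__":
    t = np.linspace(0, 20 * np.pi, 1000)
    rp = recurrence_plot(np.sin(t))
    print(rp.shape, rp.mean())                    # size and recurrence rate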
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1986-01-01
The topics of research in this program include pilot/vehicle analysis techniques, identification of pilot dynamics, and control and display synthesis techniques for optimizing aircraft handling qualities. The project activities are discussed. The current technical activity is directed at extending and validating the active display synthesis procedure, and the pilot/vehicle analysis of the NLR rate-command flight configurations in the landing task. Two papers published by the researchers are attached as appendices.
Learning Enterprise Malware Triage from Automatic Dynamic Analysis
2013-03-01
Kolter and Maloof n-gram method, Dube's malware target recognition (MaTR) static method performs significantly more accurately at the 95% confidence...from the static method as in Kolter and Maloof. The MIST approach with behavior sequences [9] allows researchers to tailor the level of analysis to the...citations, none publish work that implements it. Only Kolter and Maloof use nearly as long gram structures, although that research uses static grams rather
Bechar, Ikhlef; Trubuil, Alain
2006-01-01
We describe a novel automatic approach for vesicle trafficking analysis in 3D+T videomicroscopy. Tracking individual objects over time in 3D+T videomicroscopy is known to be a very tedious task and generally leads to unreliable results. Instead, our method proceeds by first identifying trafficking regions in the 3D volume and then analysing the vesicle trafficking within them. The latter is viewed as a significant change in the fluorescence of a region of the image. We embed the problem in a model selection framework and solve it using dynamic programming. We applied the proposed approach to analyse the vesicle dynamics related to the trafficking of the RAB6A protein between the Golgi apparatus and the ER cell compartments.
Vasconcelos, Maria J M; Ventura, Sandra M R; Freitas, Diamantino R S; Tavares, João Manuel R S
2012-03-01
The morphological and dynamic characterisation of the vocal tract during speech production has been gaining greater attention owing to the latest improvements in magnetic resonance (MR) imaging, namely the use of higher magnetic fields such as 3.0 Tesla. In this work, the automatic study of the vocal tract from 3.0 Tesla MR images was assessed through the application of statistical deformable models. The primary goal was the analysis of the shape of the vocal tract during the articulation of European Portuguese sounds, followed by the evaluation of the results concerning automatic segmentation, i.e. identification of the vocal tract in new MR images. As far as speech production is concerned, this is the first attempt to automatically characterise and reconstruct the vocal tract shape from 3.0 Tesla MR images using deformable models, in particular active shape and appearance models. The achieved results clearly evidence the adequacy and advantage of these deformable models for the automatic analysis of 3.0 Tesla MR images in order to extract the vocal tract shape and assess the articulatory movements involved. Such capabilities are required, for example, for a better knowledge of speech production, particularly in patients suffering from articulatory disorders, and to build enhanced speech synthesizer models.
Complex Networks Analysis of Manual and Machine Translations
NASA Astrophysics Data System (ADS)
Amancio, Diego R.; Antiqueira, Lucas; Pardo, Thiago A. S.; da F. Costa, Luciano; Oliveira, Osvaldo N.; Nunes, Maria G. V.
Complex networks have been increasingly used in text analysis, including in connection with natural language processing tools, as important text features appear to be captured by the topology and dynamics of the networks. Following previous works that apply complex networks concepts to text quality measurement, summary evaluation, and author characterization, we now focus on machine translation (MT). In this paper we assess the possible representation of texts as complex networks to evaluate cross-linguistic issues inherent in manual and machine translation. We show that different quality translations generated by MT tools can be distinguished from their manual counterparts by means of metrics such as in- (ID) and out-degrees (OD), clustering coefficient (CC), and shortest paths (SP). For instance, we demonstrate that the average OD in networks of automatic translations consistently exceeds the values obtained for manual ones, and that the CC values of source texts are not preserved for manual translations, but are for good automatic translations. This probably reflects the text rearrangements humans perform during manual translation. We envisage that such findings could lead to better MT tools and automatic evaluation metrics.
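The metrics named above can be computed on a simple word-adjacency network with networkx, as in the sketch below; the network construction is a simplification of the text models used in the study.

# Sketch of the network metrics named above (out-degree, clustering coefficient,
# shortest paths), computed on a simple word-adjacency network in which
# consecutive tokens are connected by directed edges. The construction is a
# simplification of the complex-network text models in the paper.
import networkx as nx

def text_metrics(tokens):
    g = nx.DiGraph()
    g.add_edges_from(zip(tokens, tokens[1:]))     # word-adjacency network
    avg_out = sum(d for _, d in g.out_degree()) / g.number_of_nodes()
    und = g.to_undirected()
    cc = nx.average_clustering(und)
    sp = nx.average_shortest_path_length(und)     # assumes a connected graph
    return {"avg_out_degree": avg_out, "clustering": cc, "shortest_path": sp}

if __name__ == "__main__":
    sentence = "the cat sat on the mat and the cat slept".split()
    print(text_metrics(sentence))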
NASA Astrophysics Data System (ADS)
Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi
Digital cameras have recently been advancing rapidly. However, the captured image differs from the image perceived when the same scene is viewed with the naked eye. Images of scenes with a wide dynamic range contain blown-out highlights and crushed blacks, problems that hardly occur in the perceived image; these artifacts are a major cause of the difference between the captured and perceived images. Blown-out highlights and crushed blacks are caused by the difference in dynamic range between the image sensor installed in a digital camera, such as a CCD or CMOS sensor, and the human visual system: the dynamic range of the captured image is narrower than that of the perceived image. To solve this problem, we propose an automatic method that decides an effective exposure range based on the superposition of edges. We integrate multi-step exposure images using this method. In addition, we attempt to remove pseudo-edges using a process that blends exposure values. As a result, a pseudo wide dynamic range image is obtained automatically.
Finite Element Analysis of New Crankshaft Automatic Adjustment Mechanism of Pumping Unit
NASA Astrophysics Data System (ADS)
Wu, Jufei; Wang, Qian
2017-12-01
In this paper, the crankshaft automatic adjustment mechanism designed for the CYJY10-4.2-53HF pumping unit is taken as the research object. The friction and bending moment of the crank are simulated in ANSYS Workbench, and the finite element results are compared with the theoretical calculations to verify the theory; the finite element analysis of the crank friction is essentially consistent with the theoretical calculation. The stress and deformation of the guide platform are also analysed and calculated for its two limiting conditions. In addition, a dynamic analysis of the mechanism is carried out to obtain the vibration modes and natural frequencies of the different parts of the counterweight under the condition of no preload force, so that the operating frequency can be kept away from the natural frequencies and resonance can be effectively avoided; for the different modes, the stiffness of the structure can be improved accordingly.
NASA automatic system for computer program documentation, volume 2
NASA Technical Reports Server (NTRS)
Simmons, D. B.
1972-01-01
The DYNASOR 2 program is used for the dynamic nonlinear analysis of shells of revolution. The equations of motion of the shell are solved using Houbolt's numerical procedure. The displacements and stress resultants are determined for both symmetrical and asymmetrical loading conditions. Asymmetrical dynamic buckling can be investigated. Solutions can be obtained for highly nonlinear problems utilizing as many as five of the harmonics generated by the SAMMSOR program. A restart capability allows the user to restart the program at a specified time. For Vol. 1, see N73-22129.
ERIC Educational Resources Information Center
Dorça, Fabiano A.; Araújo, Rafael D.; de Carvalho, Vitor C.; Resende, Daniel T.; Cattelan, Renan G.
2016-01-01
Content personalization in educational systems is an increasing research area. Studies show that students tend to have better performances when the content is customized according to his/her preferences. One important aspect of students particularities is how they prefer to learn. In this context, students learning styles should be considered, due…
ERIC Educational Resources Information Center
Salton, Gerald; And Others
The present report is the twenty-first in a series describing research in information storage and retrieval conducted by the Department of Computer Science at Cornell University. The report covering work carried out by the SMART project for approximately two years (summer 1970 to summer 1972) is separated into five parts: automatic content…
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.
Parametric diagnosis of the adaptive gas path in the automatic control system of the aircraft engine
NASA Astrophysics Data System (ADS)
Kuznetsova, T. A.
2017-01-01
The paper presents an adaptive multimode mathematical model of a gas-turbine aircraft engine (GTE) embedded in its automatic control system (ACS). The mathematical model is based on the throttle performances and is characterized by high accuracy of engine parameter identification in stationary and dynamic modes. The proposed on-board engine model is a linearized low-level state-space simulation. Engine health is identified through the influence-coefficient matrix, which is determined by a high-level GTE mathematical model based on measurements of gas-dynamic parameters. In the automatic control algorithm, the sum of squares of the deviations between the parameters of the mathematical model and the real GTE is minimized. The proposed mathematical model is effectively used for detecting gas-path defects in on-line GTE health monitoring. The accuracy of the on-board mathematical model embedded in the ACS determines the quality of adaptive control and the reliability of the engine. To improve the accuracy of the identification solutions and to ensure their robustness, the Monte Carlo numerical method was used. A parametric diagnostic algorithm based on the LPτ sequence was developed and tested. Analysis of the results suggests that the developed algorithms achieve higher identification accuracy and reliability than similar models used in practice.
de Souza Baptista, Roberto; Bo, Antonio P L; Hayashibe, Mitsuhiro
2017-06-01
Performance assessment of human movement is critical in diagnosis and motor-control rehabilitation. Recent developments in portable sensor technology enable clinicians to measure spatiotemporal aspects to aid in the neurological assessment. However, the extraction of quantitative information from such measurements is usually done manually through visual inspection. This paper presents a novel framework for automatic human movement assessment that executes segmentation and motor performance parameter extraction in time series of measurements from a sequence of human movements. We use the elements of a Switching Linear Dynamic System model as building blocks to translate formal definitions and procedures from human movement analysis. Our approach provides a method for users with no expertise in signal processing to create models for movements using a labeled dataset and later use them for automatic assessment. We validated our framework in preliminary tests involving six healthy adult subjects who executed common movements in functional tests and rehabilitation exercise sessions, such as sit-to-stand and lateral elevation of the arms, and five elderly subjects, two of whom had limited mobility, who executed the sit-to-stand movement. The proposed method worked on random motion sequences for the dual purpose of movement segmentation (accuracy of 72%-100%) and motor performance assessment (mean error of 0%-12%).
Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization
NASA Astrophysics Data System (ADS)
Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin
This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. In order to carry out this study it has been necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for prediction of the fluid-dynamic forces. The process integration makes it possible to compute, for each geometrical configuration, a set of aerodynamic coefficients that are then used in the multibody simulation for the computation of the lap time. Finally, an automatic optimization procedure is started and the lap time is minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.
Papenmeier, Frank; Huff, Markus
2010-02-01
Analyzing gaze behavior with dynamic stimulus material is of growing importance in experimental psychology; however, there is still a lack of efficient analysis tools that are able to handle dynamically changing areas of interest. In this article, we present DynAOI, an open-source tool that allows for the definition of dynamic areas of interest. It works automatically with animations that are based on virtual three-dimensional models. When one is working with videos of real-world scenes, a three-dimensional model of the relevant content needs to be created first. The recorded eye-movement data are matched with the static and dynamic objects in the model underlying the video content, thus creating static and dynamic areas of interest. A validation study asking participants to track particular objects demonstrated that DynAOI is an efficient tool for handling dynamic areas of interest.
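The core idea of dynamic areas of interest can be sketched as a per-frame hit test of gaze samples against moving bounding boxes; DynAOI's projection of three-dimensional object models is abstracted away here, and the data layout is an assumption.

# Sketch of dynamic area-of-interest matching: for every frame, test whether a
# gaze sample falls inside the (moving) bounding box of each object. The 3D
# model projection used by DynAOI is abstracted away; boxes are given directly.

def match_gaze_to_aois(gaze, aoi_boxes):
    """gaze: list of (x, y) per frame.
    aoi_boxes: dict name -> list of (xmin, ymin, xmax, ymax) per frame."""
    hits = []
    for frame, (gx, gy) in enumerate(gaze):
        frame_hits = [name for name, boxes in aoi_boxes.items()
                      if boxes[frame][0] <= gx <= boxes[frame][2]
                      and boxes[frame][1] <= gy <= boxes[frame][3]]
        hits.append(frame_hits or ["background"])
    return hits

if __name__ == "__main__":
    gaze = [(100, 120), (300, 305)]
    boxes = {"ball": [(90, 100, 150, 160), (280, 290, 340, 350)]}
    print(match_gaze_to_aois(gaze, boxes))   # [['ball'], ['ball']]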
Static Methods in the Design of Nonlinear Automatic Control Systems,
1984-06-27
Chapter VI. Ways of Decrease of the Number of Statistical Nodes During the Research of Nonlinear Systems...at present occupies the central place. This area of research was called the statistical dynamics of nonlinear automatic control systems...receives further development in the numerous research of Soviet and foreign scientists. A special role in the development of the statistical dynamics of
New insight in spiral drawing analysis methods - Application to action tremor quantification.
Legrand, André Pierre; Rivals, Isabelle; Richard, Aliénor; Apartis, Emmanuelle; Roze, Emmanuel; Vidailhet, Marie; Meunier, Sabine; Hainque, Elodie
2017-10-01
Spiral drawing is one of the standard tests used to assess tremor severity for the clinical evaluation of medical treatments. Tremor severity is estimated through visual rating of the drawings by movement disorders experts. Different approaches based on mathematical signal analysis of the recorded spiral drawings have been proposed to replace this rater-dependent estimate. The objective of the present study is to propose new numerical methods and to evaluate them in terms of agreement with visual rating and reproducibility. A series of spiral drawings of patients with essential tremor was visually rated by a board of experts. In addition to the usual velocity analysis, three new numerical methods were tested and compared, namely static and dynamic unraveling, and empirical mode decomposition. The reproducibility of both visual and numerical ratings was estimated, and their agreement was evaluated. The statistical analysis demonstrated excellent agreement between visual and numerical ratings, and more reproducible results with numerical methods than with visual ratings. The velocity method and the new numerical methods are in good agreement. Among the latter, static and dynamic unraveling both display a smaller dispersion and are easier for automatic analysis. The reliable scores obtained with the proposed numerical methods suggest that their implementation on a digitizing tablet, whether connected to a computer or standalone, provides an efficient automatic tool for tremor severity assessment. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
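A hedged sketch of the "static unraveling" idea follows: the drawn spiral is converted to polar coordinates, the smooth radius-versus-angle trend is fitted, and the residual is taken as a tremor index. The filtering, centring and rating details differ from the published methods.

# Sketch of the "static unraveling" idea: convert the drawn spiral to polar
# coordinates, fit the smooth radius-vs-angle trend, and treat the residual as
# the tremor component. Details (filtering, rating scale) differ from the paper.
import numpy as np

def tremor_index(x, y):
    xc, yc = x.mean(), y.mean()                    # rough spiral centre
    r = np.hypot(x - xc, y - yc)
    theta = np.unwrap(np.arctan2(y - yc, x - xc))  # unrolled drawing angle
    trend = np.polyval(np.polyfit(theta, r, deg=3), theta)  # smooth radius growth
    residual = r - trend                           # tremor-related deviation
    return np.std(residual)

if __name__ == "__main__":
    t = np.linspace(0, 6 * np.pi, 2000)
    x = t * np.cos(t) + 0.2 * np.sin(40 * t)       # spiral plus synthetic tremor
    y = t * np.sin(t) + 0.2 * np.cos(40 * t)
    print(tremor_index(x, y))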
A new method for automatic tracking of facial landmarks in 3D motion captured images (4D).
Al-Anezi, T; Khambay, B; Peng, M J; O'Leary, E; Ju, X; Ayoub, A
2013-01-01
The aim of this study was to validate the automatic tracking of facial landmarks in 3D image sequences. 32 subjects (16 males and 16 females) aged 18-35 years were recruited. 23 anthropometric landmarks were marked on the face of each subject with non-permanent ink using a 0.5 mm pen. The subjects were asked to perform three facial animations (maximal smile, lip purse and cheek puff) from rest position. Each animation was captured by the 3D imaging system. A single operator manually digitised the landmarks on the 3D facial models and their locations were compared with those of the automatically tracked ones. To investigate the accuracy of manual digitisation, the operator re-digitised the same set of 3D images of 10 subjects (5 male and 5 female) at a 1 month interval. The discrepancies in x, y and z coordinates between the 3D positions of the manually digitised landmarks and those of the automatically tracked facial landmarks were within 0.17 mm. The mean distance between the manually digitised and the automatically tracked landmarks using the tracking software was within 0.55 mm. The automatic tracking of facial landmarks demonstrated satisfactory accuracy which would facilitate the analysis of the dynamic motion during facial animations. Copyright © 2012 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Pattern Recognition for a Flight Dynamics Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; Hurtado, John E.
2011-01-01
The design, analysis, and verification and validation of a spacecraft relies heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
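A minimal sketch of the described classifier core is shown below, using scikit-learn's sequential forward selection around a k-nearest-neighbour classifier on synthetic dispersion data; the kernel density estimation stage of the tool is omitted and the data are invented.

# Sketch of the pattern-recognition core described above: sequential forward
# feature selection wrapped around a k-nearest-neighbour classifier, used to
# pick the Monte Carlo dispersion parameters most predictive of a failure flag.
# Data here are synthetic; the kernel density estimation step is omitted.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))                    # 10 dispersed design parameters
y = (X[:, 2] + 0.8 * X[:, 7] > 1.0).astype(int)   # failures driven by params 2 and 7

knn = KNeighborsClassifier(n_neighbors=5)
selector = SequentialFeatureSelector(knn, n_features_to_select=2,
                                     direction="forward", cv=5)
selector.fit(X, y)
print("selected parameters:", np.flatnonzero(selector.get_support()))  # likely [2 7]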
DOT National Transportation Integrated Search
1971-06-01
ATO (automatic train operation) and ATC (automatic train control) systems are evaluated relative to available technology and cost-benefit. The technological evaluation shows that suitable mathematical models of the dynamics of long trains are require...
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability for Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually which requires immense man-hours with extensive human interface. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS will utilize a rule-based Clips expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL will perform automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R
1987-05-01
A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared with that obtained by the region of interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each structure (underlying physiological components), and the associated factor images are estimates of the spatial distribution of each factor. The aim of this study was to assess the reliability of FADS in first-pass RNA and compare the results to those obtained by the ROI method, which is generally considered as the routine procedure.
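As a loose, hedged analogue of FADS, the sketch below factors a non-negative pixels-by-time matrix with NMF into factor images and factor curves; it is not the FADS algorithm itself, only an illustration of separating superimposed structures by their temporal behaviour.

# Loose analogue of factor analysis of dynamic structures: factor a
# non-negative pixels-by-time matrix into factor images and factor curves with
# NMF. Not the FADS algorithm; only an illustration of temporal separation.
import numpy as np
from sklearn.decomposition import NMF

def temporal_factors(frames, n_factors=3):
    """frames: array (n_frames, height, width) of a dynamic study."""
    n_frames, h, w = frames.shape
    data = frames.reshape(n_frames, h * w).T         # pixels x time, non-negative
    model = NMF(n_components=n_factors, init="nndsvda", max_iter=500)
    factor_images = model.fit_transform(data)         # (pixels, factors)
    factor_curves = model.components_                  # (factors, time)
    return factor_images.T.reshape(n_factors, h, w), factor_curves

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    study = rng.random((60, 32, 32))                  # synthetic dynamic frames
    images, curves = temporal_factors(study)
    print(images.shape, curves.shape)                 # (3, 32, 32) (3, 60)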
An automatic frequency control loop using overlapping DFTs (Discrete Fourier Transforms)
NASA Technical Reports Server (NTRS)
Aguirre, S.
1988-01-01
An automatic frequency control (AFC) loop is introduced and analyzed in detail. The new scheme is a generalization of the well known Cross Product AFC loop that uses running overlapping discrete Fourier transforms (DFTs) to create a discriminator curve. Linear analysis is included and supported with computer simulations. The algorithm is tested in a low carrier to noise ratio (CNR) dynamic environment, and the probability of loss of lock is estimated via computer simulations. The algorithm discussed is a suboptimum tracking scheme with a larger frequency error variance compared to an optimum strategy, but offers simplicity of implementation and a very low operating threshold CNR. This technique can be applied during the carrier acquisition and re-acquisition process in the Advanced Receiver.
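The discriminator idea can be sketched as follows: the residual carrier's phase rotation between two half-overlapping DFT windows gives the frequency error, obtained from the cross and dot products of the peak bins. Loop filtering, the lock detector and the actual Advanced Receiver parameters are omitted; in this simple form the unambiguous range is roughly plus or minus fs/n_fft.

# Sketch of a cross-product frequency discriminator built from two overlapping
# DFTs: the carrier's phase rotation between half-overlapping windows yields
# the frequency error. Loop filtering and lock detection are omitted.
import numpy as np

def afc_discriminator(samples, fs, n_fft=256):
    """Estimate the frequency error of a complex baseband carrier in `samples`."""
    w1 = samples[:n_fft]
    w2 = samples[n_fft // 2:n_fft // 2 + n_fft]       # 50% overlapping window
    X1, X2 = np.fft.fft(w1), np.fft.fft(w2)
    k = np.argmax(np.abs(X1))                         # coarse carrier bin
    cross = np.imag(X2[k] * np.conj(X1[k]))           # cross-product term
    dot = np.real(X2[k] * np.conj(X1[k]))
    dphi = np.arctan2(cross, dot)                     # phase advance over n_fft/2 samples
    return dphi * fs / (np.pi * n_fft)                # Hz; unambiguous for |f| < fs/n_fft

if __name__ == "__main__":
    fs, f_err = 8000.0, 20.0
    t = np.arange(1024) / fs
    carrier = np.exp(2j * np.pi * f_err * t)
    print(afc_discriminator(carrier, fs))             # approximately 20 Hz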
NASA Technical Reports Server (NTRS)
Noor, A. K. (Editor); Hayduk, R. J. (Editor)
1985-01-01
Among the topics discussed are developments in structural engineering hardware and software, computation for fracture mechanics, trends in numerical analysis and parallel algorithms, mechanics of materials, advances in finite element methods, composite materials and structures, determinations of random motion and dynamic response, optimization theory, automotive tire modeling methods and contact problems, the damping and control of aircraft structures, and advanced structural applications. Specific topics covered include structural design expert systems, the evaluation of finite element system architectures, systolic arrays for finite element analyses, nonlinear finite element computations, hierarchical boundary elements, adaptive substructuring techniques in elastoplastic finite element analyses, automatic tracking of crack propagation, a theory of rate-dependent plasticity, the torsional stability of nonlinear eccentric structures, a computation method for fluid-structure interaction, the seismic analysis of three-dimensional soil-structure interaction, a stress analysis for a composite sandwich panel, toughness criterion identification for unidirectional composite laminates, the modeling of submerged cable dynamics, and damping synthesis for flexible spacecraft structures.
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Simos, Panagiotis; Mouzaki, Angeliki; Stamovlasis, Dimitrios
2016-01-01
The study explored the moderating role of rapid automatized naming (RAN) in reading achievement through a cusp-catastrophe model grounded on nonlinear dynamic systems theory. Data were obtained from a community sample of 496 second through fourth graders who were followed longitudinally over 2 years and split into 2 random subsamples (validation…
Evolutionary game dynamics of controlled and automatic decision-making
NASA Astrophysics Data System (ADS)
Toupo, Danielle F. P.; Strogatz, Steven H.; Cohen, Jonathan D.; Rand, David G.
2015-07-01
We integrate dual-process theories of human cognition with evolutionary game theory to study the evolution of automatic and controlled decision-making processes. We introduce a model in which agents who make decisions using either automatic or controlled processing compete with each other for survival. Agents using automatic processing act quickly and so are more likely to acquire resources, but agents using controlled processing are better planners and so make more effective use of the resources they have. Using the replicator equation, we characterize the conditions under which automatic or controlled agents dominate, when coexistence is possible and when bistability occurs. We then extend the replicator equation to consider feedback between the state of the population and the environment. Under conditions in which having a greater proportion of controlled agents either enriches the environment or enhances the competitive advantage of automatic agents, we find that limit cycles can occur, leading to persistent oscillations in the population dynamics. Critically, however, these limit cycles only emerge when feedback occurs on a sufficiently long time scale. Our results shed light on the connection between evolution and human cognition and suggest necessary conditions for the rise and fall of rationality.
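A minimal sketch of the baseline replicator dynamic for the fraction of controlled agents is shown below; the payoff functions are simple stand-ins rather than the paper's game, and the environmental-feedback extension that produces limit cycles is not included.

# Sketch of the basic replicator dynamic for the fraction x of "controlled"
# agents competing with "automatic" agents. The payoff functions below are
# simple stand-ins, not the game analysed in the paper.
import numpy as np
from scipy.integrate import solve_ivp

def replicator(t, state, speed_advantage=1.2, planning_benefit=1.5):
    x = state[0]
    f_controlled = planning_benefit * (1.0 - x)   # planners do well against automatics
    f_automatic = speed_advantage * x             # fast agents do well against planners
    return [x * (1.0 - x) * (f_controlled - f_automatic)]

sol = solve_ivp(replicator, (0.0, 50.0), [0.1], dense_output=True)
print("long-run controlled fraction:", sol.y[0, -1])   # interior equilibrium near 0.56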
Gorzalczany, Marian B; Rudzinski, Filip
2017-06-07
This paper presents a generalization of self-organizing maps with 1-D neighborhoods (neuron chains) that can be effectively applied to complex cluster analysis problems. The essence of the generalization consists in introducing mechanisms that allow the neuron chain--during learning--to disconnect into subchains, to reconnect some of the subchains again, and to dynamically regulate the overall number of neurons in the system. These features enable the network--working in a fully unsupervised way (i.e., using unlabeled data without a predefined number of clusters)--to automatically generate collections of multiprototypes that are able to represent a broad range of clusters in data sets. First, the operation of the proposed approach is illustrated on some synthetic data sets. Then, this technique is tested using several real-life, complex, and multidimensional benchmark data sets available from the University of California at Irvine (UCI) Machine Learning repository and the Knowledge Extraction based on Evolutionary Learning data set repository. A sensitivity analysis of our approach to changes in control parameters and a comparative analysis with an alternative approach are also performed.
NASA Technical Reports Server (NTRS)
Smith, G. A.; Meyer, G.
1980-01-01
The results of a simulation study of an alternative design concept for an automatic landing control system are presented, and the design concept itself is described. The design concept is the total aircraft flight control system (TAFCOS). TAFCOS is an open-loop, feed-forward system that commands the proper instantaneous thrust, angle of attack, and roll angle to achieve the forces required to follow the desired trajectory. These dynamic trim conditions are determined by an inversion of the aircraft nonlinear force characteristics. The concept was applied to an A-7E aircraft approaching an aircraft carrier. The implementation details with an airborne digital computer are discussed. The automatic carrier landing situation is described. The simulation results are presented for a carrier approach with atmospheric disturbances, an approach with no disturbances, and for tailwind and headwind gusts.
Zakeri, Fahimeh Sadat; Setarehdan, Seyed Kamaledin; Norouzi, Somayye
2017-10-01
Segmentation of the arterial wall boundaries from intravascular ultrasound images is an important image processing task for quantifying arterial wall characteristics such as shape, area, thickness and eccentricity. Since manual segmentation of these boundaries is a laborious and time-consuming procedure, many researchers have attempted to develop (semi-)automatic segmentation techniques as a powerful tool for educational and clinical purposes, but there is as yet no clinically approved method on the market. This paper presents a deterministic-statistical strategy for automatic media-adventitia border detection by a fourfold algorithm. First, a smoothed initial contour is extracted based on classification in the sparse representation framework combined with the dynamic directional convolution vector field. Next, an active contour model is utilized to propagate the initial contour toward the borders of interest. Finally, the extracted contour is refined in the leakage, side branch opening and calcification regions based on the image texture patterns. The performance of the proposed algorithm is evaluated by comparing the results with borders manually traced by an expert on 312 different IVUS images obtained from four different patients. The statistical analysis of the results demonstrates the efficiency of the proposed method in media-adventitia border detection, with sufficient consistency in the leakage and calcification regions. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
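As a hedged illustration of the general kind of entity extraction described above (not DAC's proprietary NLP Entity Analytics), a minimal sketch using the open-source spaCy library might look as follows; the model name, example sentence, and the simple co-occurrence heuristic are assumptions.

```python
# Minimal sketch of entity extraction from unstructured text, assuming spaCy
# and its small English model are installed (pip install spacy, then
# python -m spacy download en_core_web_sm). Illustrative only; not DAC's
# NLP Entity Analytics implementation.
import spacy
from collections import defaultdict

nlp = spacy.load("en_core_web_sm")

text = ("Analysts reported that a convoy moved from Kabul toward Kandahar "
        "on Tuesday, escorted by members of a local militia group.")

doc = nlp(text)

# Group extracted entities by label (PERSON, ORG, GPE, DATE, ...).
entities = defaultdict(list)
for ent in doc.ents:
    entities[ent.label_].append(ent.text)
print(dict(entities))

# A crude co-occurrence "relationship" heuristic: entities sharing a sentence.
for sent in doc.sents:
    ents = [e.text for e in sent.ents]
    if len(ents) > 1:
        print("co-occurring entities:", ents)
```

In a production pipeline the disambiguation and relationship modeling would of course be far richer; this sketch only shows where automatically extracted entities could enter such a workflow.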
Kinematic and Dynamic Analysis of High-Speed Intermittent-Motion Mechanisms.
1984-01-16
intermittent-motion mechanisms which have potential application to the high-speed automatic weapon system, and an investigation on the workspace of a robotic ... manipulator system. The problems of this investigation belong to a selected group of unsolved or partially solved problems which are relevant and ... design of high-speed machinery and automated manufacturing systems.
Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio
2011-01-01
Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, and x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step forward towards harmonization and standardization. Embedded functionalities render it a suitable tool for education, research, and for receiving distant experts' opinions. Another objective of this effort is to allow the introduction of clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies by selected teaching case studies. The software facilitates a better understanding through practically approaching different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection and automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function through 2 independent methods, the integral or the Rutland-Patlak approaches. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and of numerical outputs. The software package is undergoing quality assurance procedures to verify the accuracy and the interuser reproducibility with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
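For readers unfamiliar with the Rutland-Patlak approach mentioned above, the following is a minimal, hedged sketch of how a Patlak slope and split renal function could be estimated from background-corrected kidney and vascular time-activity curves; the array names, synthetic curves, and regression window are illustrative assumptions, not the package's actual code.

```python
# Hedged sketch of a Rutland-Patlak analysis for one kidney, assuming
# background-corrected kidney counts K(t) and a vascular (blood) curve B(t)
# sampled at times t during the uptake phase. Not the IAEA package's code.
import numpy as np

def rutland_patlak_slope(t, kidney, blood):
    """Return the Patlak slope (proportional to uptake) and intercept.

    The Rutland-Patlak plot regresses K(t)/B(t) against
    integral_0^t B dt / B(t); the slope reflects the renal uptake rate.
    """
    cum_blood = np.concatenate(([0.0], np.cumsum(
        np.diff(t) * 0.5 * (blood[1:] + blood[:-1]))))   # trapezoidal integral
    x = cum_blood / blood
    y = kidney / blood
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

# Differential renal function as the ratio of the two kidneys' slopes
# (purely synthetic curves for illustration).
t = np.arange(0, 120, 10.0)                      # seconds, uptake phase
blood = 1000 * np.exp(-t / 180.0) + 200          # synthetic vascular curve
left = 0.6 * np.cumsum(blood) * 0.05             # synthetic kidney curves
right = 0.4 * np.cumsum(blood) * 0.05
sl, _ = rutland_patlak_slope(t, left, blood)
sr, _ = rutland_patlak_slope(t, right, blood)
print("split function left/right: %.1f%% / %.1f%%" %
      (100 * sl / (sl + sr), 100 * sr / (sl + sr)))
```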
NASA Technical Reports Server (NTRS)
Kirk, R. G.; Gunter, E. J.
1972-01-01
The dynamic unbalance response and transient motion of the single-mass Jeffcott rotor in elastic bearings mounted on damped, flexible supports are discussed. A steady-state analysis of the shaft and bearing housing motion was made by assuming synchronous precession of the system. The conditions under which the support system would act as a dynamic vibration absorber at the rotor critical speed were studied. Plots of the rotor and support amplitudes, phase angles, and forces transmitted were evaluated by the computer and the performance curves were plotted by an automatic plotter unit. Curves are presented on the optimization of the support housing characteristics to attenuate the rotor synchronous unbalance response.
Automatically monitoring driftwood in large rivers: preliminary results
NASA Astrophysics Data System (ADS)
Piegay, H.; Lemaire, P.; MacVicar, B.; Mouquet-Noppe, C.; Tougne, L.
2014-12-01
Driftwood in rivers impacts sediment transport, riverine habitat and human infrastructure. Quantifying it, in particular large wood in fairly large rivers where it can move easily, would improve our knowledge of fluvial transport processes. There are several means of studying this phenomenon, amongst which are RFID sensor tracking and photo and video monitoring. In this abstract, we are interested in the latter, being easier and cheaper to deploy. However, video monitoring of driftwood generates a huge amount of images and manually labeling them is tedious. It is essential to automate such a monitoring process, which is a difficult task in the field of computer vision, and more specifically automatic video analysis. Detecting foreground against a dynamic background remains an open problem to date. We installed a video camera at the riverside of a gauging station on the Ain River, a 3500 km² Piedmont river in France. Several floods were manually annotated by a human operator. We developed software that automatically extracts and characterizes wood blocks within a video stream. This algorithm is based upon a statistical model and combines static, dynamic and spatial data. Segmented wood objects are further described with the help of a skeleton-based approach that helps us to automatically determine their shape, diameter and length. The first detailed comparisons between manual annotations and automatically extracted data show that we can fairly well detect large wood down to a given size (approximately 120 cm in length or 15 cm in diameter), whereas smaller pieces are difficult to detect and tend to be missed by either the human operator or the algorithm. Detection is fairly accurate in high flow conditions where the water channel is usually brown because of suspended sediment transport. In low flow conditions, our algorithm still needs improvement to reduce the number of false positives so as to better distinguish shadow or turbulence structures from wood pieces.
Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.
Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A
2011-04-01
Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
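To make the artery/vein separation step more concrete, here is a small, hedged sketch of how Mahalanobis distances based on a pooled sample covariance matrix could be computed from per-pixel temporal features; the choice of features and the synthetic data are assumptions and not the authors' exact implementation.

```python
# Hedged sketch: classify pixels as artery or vein by Mahalanobis distance
# to artery/vein ROI feature distributions using a pooled covariance matrix.
# Feature vectors (e.g. contrast arrival time, peak enhancement, correlation
# with an arterial reference curve) are assumed; this is not the authors' code.
import numpy as np

def pooled_covariance(a, b):
    """Pooled sample covariance of two feature sets (n_samples x n_features)."""
    ca, cb = np.cov(a, rowvar=False), np.cov(b, rowvar=False)
    na, nb = len(a), len(b)
    return ((na - 1) * ca + (nb - 1) * cb) / (na + nb - 2)

def mahalanobis(x, mean, cov_inv):
    """Row-wise Mahalanobis distance of x (n x d) to a mean with given inverse covariance."""
    d = x - mean
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

rng = np.random.default_rng(0)
artery_roi = rng.normal([5.0, 1.0, 0.9], 0.3, size=(200, 3))   # seed ROI features
vein_roi   = rng.normal([12.0, 0.8, 0.2], 0.3, size=(200, 3))  # seed ROI features
pixels     = rng.normal([6.0, 1.0, 0.8], 0.8, size=(1000, 3))  # pixels to classify

cov_inv = np.linalg.inv(pooled_covariance(artery_roi, vein_roi))
md_art = mahalanobis(pixels, artery_roi.mean(axis=0), cov_inv)
md_vein = mahalanobis(pixels, vein_roi.mean(axis=0), cov_inv)
is_artery = md_art < md_vein
print("pixels labelled artery:", int(is_artery.sum()))
```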
Chénier, Félix; Aissaoui, Rachid; Gauthier, Cindy; Gagnon, Dany H
2017-02-01
The commercially available SmartWheel TM is largely used in research and increasingly used in clinical practice to measure the forces and moments applied on the wheelchair pushrims by the user. However, in some situations (i.e. cambered wheels or increased pushrim weight), the recorded kinetics may include dynamic offsets that affect the accuracy of the measurements. In this work, an automatic method to identify and cancel these offsets is proposed and tested. First, the method was tested on an experimental bench with different cambers and pushrim weights. Then, the method was generalized to wheelchair propulsion. Nine experienced wheelchair users propelled their own wheelchairs instrumented with two SmartWheels with anti-slip pushrim covers. The dynamic offsets were correctly identified using the propulsion acquisition, without needing a separate baseline acquisition. A kinetic analysis was performed with and without dynamic offset cancellation using the proposed method. The most altered kinetic variables during propulsion were the vertical and total forces, with errors of up to 9N (p<0.001, large effect size of 5). This method is simple to implement, fully automatic and requires no further acquisitions. Therefore, we advise to use it systematically to enhance the accuracy of existing and future kinetic measurements. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
An Automatic Medium to High Fidelity Low-Thrust Global Trajectory Toolchain; EMTG-GMAT
NASA Technical Reports Server (NTRS)
Beeson, Ryne T.; Englander, Jacob A.; Hughes, Steven P.; Schadegg, Maximillian
2015-01-01
Solving the global optimization, low-thrust, multiple-flyby interplanetary trajectory problem with high-fidelity dynamical models requires an unreasonable amount of computational resources. A better approach, and one that is demonstrated in this paper, is a multi-step process whereby the aforementioned problem is solved at a lower fidelity and this solution is used as an initial guess for a higher-fidelity solver. The framework presented in this work uses two tools developed by NASA Goddard Space Flight Center: the Evolutionary Mission Trajectory Generator (EMTG) and the General Mission Analysis Tool (GMAT). EMTG is a medium to medium-high fidelity low-thrust interplanetary global optimization solver, which now has the capability to automatically generate GMAT script files for seeding a high-fidelity solution using GMAT's local optimization capabilities. A discussion of the dynamical models as well as thruster and power modeling for both EMTG and GMAT is given in this paper. Current capabilities are demonstrated with examples that highlight the toolchain's ability to efficiently solve the difficult low-thrust global optimization problem with little human intervention.
Motion-aware stroke volume quantification in 4D PC-MRI data of the human aorta.
Köhler, Benjamin; Preim, Uta; Grothoff, Matthias; Gutberlet, Matthias; Fischbach, Katharina; Preim, Bernhard
2016-02-01
4D PC-MRI enables the noninvasive measurement of time-resolved, three-dimensional blood flow data that allow quantification of the hemodynamics. Stroke volumes are essential to assess the cardiac function and evolution of different cardiovascular diseases. The calculation depends on the wall position and vessel orientation, which both change during the cardiac cycle due to the heart muscle contraction and the pumped blood. However, current systems for the quantitative 4D PC-MRI data analysis neglect the dynamic character and instead employ a static 3D vessel approximation. We quantify differences between stroke volumes in the aorta obtained with and without consideration of its dynamics. We describe a method that uses the approximating 3D segmentation to automatically initialize segmentation algorithms that require regions inside and outside the vessel for each temporal position. This enables the use of graph cuts to obtain 4D segmentations, extract vessel surfaces including centerlines for each temporal position and derive motion information. The stroke volume quantification is compared using measuring planes in static (3D) vessels, planes with fixed angulation inside dynamic vessels (this corresponds to the common 2D PC-MRI) and moving planes inside dynamic vessels. Seven datasets with different pathologies such as aneurysms and coarctations were evaluated in close collaboration with radiologists. Compared to the experts' manual stroke volume estimations, motion-aware quantification performs, on average, 1.57% better than calculations without motion consideration. The mean difference between stroke volumes obtained with the different methods is 7.82%. Automatically obtained 4D segmentations overlap by 85.75% with manually generated ones. Incorporating motion information in the stroke volume quantification yields slight but not statistically significant improvements. The presented method is feasible for the clinical routine, since computation times are low and essential parts run fully automatically. The 4D segmentations can be used for other algorithms as well. The simultaneous visualization and quantification may support the understanding and interpretation of cardiac blood flow.
Kim, Yongsoo; Kim, Taek-Kyun; Kim, Yungu; Yoo, Jiho; You, Sungyong; Lee, Inyoul; Carlson, George; Hood, Leroy; Choi, Seungjin; Hwang, Daehee
2011-01-01
Motivation: Systems biology attempts to describe complex systems behaviors in terms of dynamic operations of biological networks. However, there is a lack of tools that can effectively decode complex network dynamics over multiple conditions. Results: We present principal network analysis (PNA) that can automatically capture major dynamic activation patterns over multiple conditions and then generate protein and metabolic subnetworks for the captured patterns. We first demonstrated the utility of this method by applying it to a synthetic dataset. The results showed that PNA correctly captured the subnetworks representing dynamics in the data. We further applied PNA to two time-course gene expression profiles collected from (i) MCF7 cells after treatments of HRG at multiple doses and (ii) brain samples of four strains of mice infected with two prion strains. The resulting subnetworks and their interactions revealed network dynamics associated with HRG dose-dependent regulation of cell proliferation and differentiation and early PrPSc accumulation during prion infection. Availability: The web-based software is available at: http://sbm.postech.ac.kr/pna. Contact: dhhwang@postech.ac.kr; seungjin@postech.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21193522
NASA Astrophysics Data System (ADS)
Meng, Fei; Shi, Peng; Karimi, Hamid Reza; Zhang, Hui
2016-02-01
The main objective of this paper is to investigate the sensitivity analysis and optimal design of a proportional solenoid valve (PSV) operated pressure reducing valve (PRV) for heavy-duty automatic transmission clutch actuators. The nonlinear electro-hydraulic valve model is developed based on fluid dynamics. In order to implement the sensitivity analysis and optimization for the PRV, the PSV model is validated by comparing the results with data obtained from a real test-bench. The sensitivity of the PSV pressure response with regard to the structural parameters is investigated by using Sobol's method. Finally, simulations and experimental investigations are performed on the optimized prototype and the results reveal that the dynamical characteristics of the valve have been improved in comparison with the original valve.
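As a general illustration of Sobol's method (not the authors' electro-hydraulic valve model), a minimal sketch using the SALib package on a stand-in response function might look as follows; the parameter names, bounds, and the surrogate function are assumptions, and the availability of SALib is itself assumed.

```python
# Hedged sketch of a Sobol sensitivity analysis with SALib (assumed installed:
# pip install SALib). The response function below is a toy surrogate, not the
# authors' pressure reducing valve model; names and bounds are assumptions.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["spring_stiffness", "orifice_diameter", "spool_mass"],
    "bounds": [[5e3, 2e4], [0.5e-3, 2e-3], [0.01, 0.05]],
}

def pressure_rise_time(x):
    k, d, m = x
    # Toy surrogate: stiffer spring and larger orifice give a faster response,
    # a heavier spool gives a slower response.
    return 1e-2 * np.sqrt(m / k) / d

X = saltelli.sample(problem, 1024)                 # Saltelli sampling scheme
Y = np.apply_along_axis(pressure_rise_time, 1, X)
Si = sobol.analyze(problem, Y)

for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:18s}  first-order: {s1:+.3f}   total-order: {st:+.3f}")
```

The first-order indices quantify each parameter's direct contribution to the output variance, while the total-order indices include interaction effects, which is how a ranking of influential valve parameters is typically obtained.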
Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle
2009-10-19
Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
DynamicRoots: A Software Platform for the Reconstruction and Analysis of Growing Plant Roots.
Symonova, Olga; Topp, Christopher N; Edelsbrunner, Herbert
2015-01-01
We present a software platform for reconstructing and analyzing the growth of a plant root system from a time series of 3D voxelized shapes. It aligns the shapes with each other, constructs a geometric graph representation together with the function that records the time of growth, and organizes the branches into a hierarchy that reflects the order of creation. The software includes the automatic computation of structural and dynamic traits for each root in the system, enabling the quantification of growth at a fine scale. These are important advances in plant phenotyping, with applications to the study of genetic and environmental influences on growth.
The architecture of Newton, a general-purpose dynamics simulator
NASA Technical Reports Server (NTRS)
Cremer, James F.; Stewart, A. James
1989-01-01
The architecture for Newton, a general-purpose system for simulating the dynamics of complex physical objects, is described. The system automatically formulates and analyzes equations of motion, and performs automatic modification of the system equations when necessitated by changes in kinematic relationships between objects. Impact and temporary contact are handled, although only using simple models. User-directed influence of simulations is achieved using Newton's module, which can be used to experiment with the control of many-degree-of-freedom articulated objects.
Transient Oscillations in Mechanical Systems of Automatic Control with Random Parameters
NASA Astrophysics Data System (ADS)
Royev, B.; Vinokur, A.; Kulikov, G.
2018-04-01
Transient oscillations in mechanical systems of automatic control with random parameters are a relevant but insufficiently studied issue. In this paper, a modified spectral method was applied to investigate the problem. The nature of the dynamic processes and the phase portraits are analyzed depending on the amplitude and frequency of the external influence. It is evident from the obtained results that the dynamic phenomena occurring in systems with random parameters under external influence are complex, and their study requires further investigation.
Prototype Automated Equipment to Perform Poising and Beat Rate Operations on the M577 MTSQ Fuze.
1978-09-30
Regulation Machine, which sets the M577 Fuze Timer beat rate, and the Automatic Poising Machine, which dynamically balances the Timer balance wheel ... in troubleshooting. The Automatic Poising Machine (Figure 3), which inspects and corrects the dynamic balance of the Balance Wheel Assembly, was ... machine is intimately related to the fastening method of the wire to the Timer at one end and the Balance Wheel at the other, a review of the history
Lin, Kun-Ju; Huang, Jia-Yann; Chen, Yung-Sheng
2011-12-01
Glomerular filtration rate (GFR) is a commonly accepted standard estimate of renal function. Gamma camera-based methods for estimating renal uptake of (99m)Tc-diethylenetriaminepentaacetic acid (DTPA) without blood or urine sampling have been widely used. Of these, the method introduced by Gates has been the most common. Currently, most gamma cameras are equipped with a commercial program for GFR determination, a semi-quantitative analysis based on manually drawing a region of interest (ROI) over each kidney. The GFR value can then be computed automatically from the scintigraphic determination of (99m)Tc-DTPA uptake within the kidney. Delineating the kidney area is difficult when applying a fixed threshold value. Moreover, hand-drawn ROIs are tedious, time consuming, and depend highly on operator skill. Thus, we developed a fully automatic renal ROI estimation system based on the temporal changes in intensity counts, an intensity-pair distribution image contrast enhancement method, adaptive thresholding, and morphological operations that can locate the kidney area and obtain the GFR value from a (99m)Tc-DTPA renogram. To evaluate the performance of the proposed approach, 30 clinical dynamic renograms were introduced. The fully automatic approach failed in one patient with very poor renal function. Four patients had a unilateral kidney, and the others had bilateral kidneys. The automatic contours from the remaining 54 kidneys were compared with manually drawn contours. The 54 kidneys were included for area error and boundary error analyses. There was high correlation between two physicians' manual contours and the contours obtained by our approach. For the area error analysis, the mean true positive area overlap is 91%, the mean false negative is 13.4%, and the mean false positive is 9.3%. The boundary error is 1.6 pixels. The GFR calculated using this automatic computer-aided approach is reproducible and may be applied to help nuclear medicine physicians in clinical practice.
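The core ROI steps described above (contrast enhancement, automatic thresholding, and morphological clean-up) can be sketched generically as follows; scikit-image is used here as a stand-in, the summed-frame input and size thresholds are assumptions, and this is not the authors' implementation.

```python
# Hedged sketch of automatic kidney ROI extraction from a summed renogram
# frame: contrast stretch, Otsu threshold, morphological clean-up, and
# selection of the two largest blobs. scikit-image stands in for the authors'
# method; all parameters are illustrative assumptions.
import numpy as np
from skimage import exposure, filters, morphology, measure

def kidney_rois(summed_frame, min_area=50):
    # Contrast enhancement (stand-in for the intensity-pair distribution method).
    img = exposure.rescale_intensity(summed_frame.astype(float), out_range=(0.0, 1.0))
    # Automatic thresholding.
    mask = img > filters.threshold_otsu(img)
    # Morphological operations: remove speckle, fill small holes.
    mask = morphology.binary_opening(mask, morphology.disk(2))
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    mask = morphology.remove_small_holes(mask, area_threshold=min_area)
    # Keep the two largest connected components as left/right kidney ROIs.
    labels = measure.label(mask)
    regions = sorted(measure.regionprops(labels), key=lambda r: r.area,
                     reverse=True)[:2]
    return [(labels == r.label) for r in regions]

# Example with a synthetic frame containing two bright "kidneys".
yy, xx = np.mgrid[:128, :128]
frame = 100 * np.exp(-((yy - 60) ** 2 + (xx - 40) ** 2) / 200.0)
frame += 90 * np.exp(-((yy - 60) ** 2 + (xx - 90) ** 2) / 200.0)
rois = kidney_rois(frame)
print("ROIs found:", len(rois), "areas:", [int(r.sum()) for r in rois])
```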
Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nataf, J.M.; Winkelmann, F.
We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
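The general idea of generating solution code from symbolically entered equations, in the spirit of a symbolic interface like SPARK's, can be illustrated with a hedged sketch using SymPy; the example heat-balance equation and the emitted C expressions are illustrative assumptions and have nothing to do with SPARK itself.

```python
# Hedged sketch of symbolic-to-code generation: take an equation entered
# symbolically, solve it for each variable, and emit code for the inverses.
# Uses SymPy, not SPARK; the example equation is an illustrative assumption.
import sympy as sp

T, T_amb, q, UA = sp.symbols("T T_amb q UA", real=True, positive=True)

# Symbolic form of a simple heat-balance equation: q = UA * (T - T_amb)
equation = sp.Eq(q, UA * (T - T_amb))

# Solve for each unknown so a solver could pick whichever variable is free,
# and print the corresponding C expression.
for unknown in (T, q, UA):
    expr = sp.solve(equation, unknown)[0]
    print(f"// solve for {unknown}")
    print(f"double {unknown} = {sp.ccode(expr)};\n")
```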
DEPEND - A design environment for prediction and evaluation of system dependability
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.; Iyer, Ravishankar K.
1990-01-01
The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.
The unbalanced signal measuring of automotive brake drum
NASA Astrophysics Data System (ADS)
Wang, Xiao-Dong; Ye, Sheng-Hua; Zhang, Bang-Cheng
2005-04-01
For the research and development of an automatic balancing system based on mass removal, this paper deals with the method of measuring the unbalance signal and the design of the automatic balancing equipment and software. The emphasis is on the testing system of the automotive brake drum balancer. A band-pass filter with favorable automatic tracking capability, filtering effect and stability is designed. An automatic balancing system based on mass removal and virtual instrumentation is designed, and a laboratory system has been constructed. The results of comparative experiments indicate the notable effect of single-plane automatic balancing and the high precision of dynamic balancing, and demonstrate the application value of the system.
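As a hedged illustration of extracting the once-per-revolution unbalance component with a band-pass filter tracked to the rotation frequency (not the authors' virtual-instrument code), a sketch with SciPy might look like this; the rotation speed, sampling rate, bandwidth, and synthetic signal are assumptions.

```python
# Hedged sketch: isolate the once-per-revolution unbalance component of a
# vibration signal with a band-pass filter centred on the rotation frequency,
# then estimate its amplitude and phase. SciPy stands in for the virtual
# instrument; speed, sampling rate and bandwidth are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 5000.0            # sampling rate [Hz]
f_rot = 25.0           # rotation frequency [Hz], e.g. from a tachometer pulse
bw = 5.0               # filter bandwidth [Hz]

t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic sensor signal: unbalance component + harmonic + noise.
signal = (0.8 * np.sin(2 * np.pi * f_rot * t + 0.6)
          + 0.3 * np.sin(2 * np.pi * 3 * f_rot * t)
          + 0.2 * np.random.default_rng(1).normal(size=t.size))

# Band-pass "tracking" filter around the rotation frequency.
b, a = butter(4, [f_rot - bw / 2, f_rot + bw / 2], btype="bandpass", fs=fs)
unbalance = filtfilt(b, a, signal)

# Amplitude and phase of the 1x component via correlation with sin/cos references.
a_c = 2 * np.mean(unbalance * np.cos(2 * np.pi * f_rot * t))
a_s = 2 * np.mean(unbalance * np.sin(2 * np.pi * f_rot * t))
print("1x amplitude: %.3f, phase: %.1f deg" %
      (np.hypot(a_c, a_s), np.degrees(np.arctan2(a_c, a_s))))
```

The estimated amplitude and phase of the 1x component are what a mass-removal balancer needs to decide how much material to remove and where.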
A procedure for automating CFD simulations of an inlet-bleed problem
NASA Technical Reports Server (NTRS)
Chyu, Wei J.; Rimlinger, Mark J.; Shih, Tom I.-P.
1995-01-01
A procedure was developed to improve the turn-around time for computational fluid dynamics (CFD) simulations of an inlet-bleed problem involving oblique shock-wave/boundary-layer interactions on a flat plate with bleed into a plenum through one or more circular holes. This procedure is embodied in a preprocessor called AUTOMAT. With AUTOMAT, once data for the geometry and flow conditions have been specified (either interactively or via a namelist), it will automatically generate all input files needed to perform a three-dimensional Navier-Stokes simulation of the prescribed inlet-bleed problem by using the PEGASUS and OVERFLOW codes. The input files automatically generated by AUTOMAT include those for the grid system and those for the initial and boundary conditions. The grid systems automatically generated by AUTOMAT are multi-block structured grids of the overlapping type. Results obtained by using AUTOMAT are presented to illustrate its capability.
Automatic Management of Parallel and Distributed System Resources
NASA Technical Reports Server (NTRS)
Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.
1990-01-01
Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.
Data Point Averaging for Computational Fluid Dynamics Data
NASA Technical Reports Server (NTRS)
Norman, Jr., David (Inventor)
2016-01-01
A system and method for generating fluid flow parameter data for use in aerodynamic heating analysis. Computational fluid dynamics data is generated for a number of points in an area on a surface to be analyzed. Sub-areas corresponding to areas of the surface for which an aerodynamic heating analysis is to be performed are identified. A computer system automatically determines a sub-set of the number of points corresponding to each of the number of sub-areas and determines a value for each of the number of sub-areas using the data for the sub-set of points corresponding to each of the number of sub-areas. The value is determined as an average of the data for the sub-set of points corresponding to each of the number of sub-areas. The resulting parameter values then may be used to perform an aerodynamic heating analysis.
Data Point Averaging for Computational Fluid Dynamics Data
NASA Technical Reports Server (NTRS)
Norman, David, Jr. (Inventor)
2014-01-01
A system and method for generating fluid flow parameter data for use in aerodynamic heating analysis. Computational fluid dynamics data is generated for a number of points in an area on a surface to be analyzed. Sub-areas corresponding to areas of the surface for which an aerodynamic heating analysis is to be performed are identified. A computer system automatically determines a sub-set of the number of points corresponding to each of the number of sub-areas and determines a value for each of the number of sub-areas using the data for the sub-set of points corresponding to each of the number of sub-areas. The value is determined as an average of the data for the sub-set of points corresponding to each of the number of sub-areas. The resulting parameter values then may be used to perform an aerodynamic heating analysis.
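A hedged sketch of the per-sub-area averaging step described in the two records above is shown below; the data layout, with CFD points given as surface coordinates plus a heating-related parameter value and sub-areas given as axis-aligned rectangles, is an assumption and this is not the patented implementation.

```python
# Hedged sketch of averaging CFD point data over user-defined sub-areas.
# Points are (x, y) samples on the analysed surface with an associated
# parameter value; sub-areas are axis-aligned rectangles. Illustrative only.
import numpy as np

def subarea_averages(points, values, subareas):
    """Return the mean parameter value for each rectangular sub-area.

    points   : (N, 2) array of surface coordinates
    values   : (N,) array of a CFD parameter (e.g. heat flux) at those points
    subareas : list of (xmin, xmax, ymin, ymax) rectangles
    """
    averages = []
    for xmin, xmax, ymin, ymax in subareas:
        inside = ((points[:, 0] >= xmin) & (points[:, 0] <= xmax) &
                  (points[:, 1] >= ymin) & (points[:, 1] <= ymax))
        averages.append(values[inside].mean() if inside.any() else np.nan)
    return np.array(averages)

rng = np.random.default_rng(2)
pts = rng.uniform(0, 1, size=(500, 2))
vals = 100 + 50 * pts[:, 0]                  # synthetic heating parameter
areas = [(0.0, 0.5, 0.0, 1.0), (0.5, 1.0, 0.0, 1.0)]
print(subarea_averages(pts, vals, areas))    # roughly [112.5, 137.5]
```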
NASA Astrophysics Data System (ADS)
Gutiérrez-Fragoso, K.; Acosta-Mesa, H. G.; Cruz-Ramírez, N.; Hernández-Jiménez, R.
2013-12-01
Cervical cancer has remained, until now, a serious public health problem in developing countries. The most common method of screening is the Pap test or cytology. When abnormalities are reported in the result, the patient is referred to a dysplasia clinic for colposcopy. During this test, a solution of acetic acid is applied, which produces a color change in the tissue known as the acetowhitening phenomenon. This reaction aims at obtaining a tissue sample, whose histological analysis allows a final diagnosis to be established. During the colposcopy test, digital images can be acquired to analyze the behavior of the acetowhitening reaction from a temporal approach. In this way, we try to identify precursor lesions of cervical cancer through a process of automatic classification of acetowhite temporal patterns. In this paper, we present a performance analysis of three classification methods: kNN, Naïve Bayes and C4.5. The results showed that there is similarity between some acetowhite temporal patterns of normal and abnormal tissues. Therefore, we conclude that it is not sufficient to consider only the temporal dynamics of the acetowhitening reaction to establish a diagnosis by an automatic method. Information from the cytologic, colposcopic and histopathologic disciplines should be integrated as well.
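To illustrate in general terms how kNN, Naïve Bayes, and C4.5-style classifiers can be compared on temporal patterns, a hedged sketch with scikit-learn follows; scikit-learn's DecisionTreeClassifier is a CART-style stand-in for C4.5, and the synthetic acetowhite-like curves and cross-validation setup are assumptions, not the authors' data or protocol.

```python
# Hedged sketch: compare kNN, Naive Bayes and a decision tree (CART as a
# stand-in for C4.5) on synthetic "acetowhite temporal patterns".
# Uses scikit-learn; synthetic data and evaluation setup are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n, timepoints = 200, 20
t = np.linspace(0, 1, timepoints)

# Normal tissue: weak, slow whitening; abnormal: stronger, faster whitening.
normal = 0.3 * (1 - np.exp(-3 * t)) + 0.05 * rng.normal(size=(n // 2, timepoints))
abnormal = 0.7 * (1 - np.exp(-6 * t)) + 0.05 * rng.normal(size=(n // 2, timepoints))
X = np.vstack([normal, abnormal])
y = np.array([0] * (n // 2) + [1] * (n // 2))

classifiers = {
    "kNN (k=5)": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Decision tree (C4.5-like)": DecisionTreeClassifier(max_depth=4, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:26s} accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```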
NASA Astrophysics Data System (ADS)
Xiao, Xiaojun; Du, Chunsheng; Zhou, Rongsheng
2004-04-01
As a result of data traffic's exponential growth, the network is currently evolving from fixed circuit-switched services to dynamic packet-switched services, which has brought unprecedented changes to the existing transport infrastructure. It is generally agreed that the automatic switched optical network (ASON) is one of the promising solutions for next generation optical networks. In this paper, we present the results of our experimental tests and economic analysis of ASON. The intention of this paper is to present our perspective, in terms of an evolution strategy toward ASON, on next generation optical networks. It is shown through experimental tests that the performance of current pre-standard ASON-enabled equipment satisfies the basic requirements of network operators and is ready for initial deployment. The results of the economic analysis show that network operators can benefit from the deployment of ASON in three ways. Firstly, ASON can reduce the CAPEX of network expansion by integrating multiple ADMs and DCSs into one box. Secondly, ASON can reduce the OPEX of network operation by introducing an automatic resource control scheme. Thirdly, ASON can increase margin revenue by providing new optical network services such as Bandwidth on Demand and optical VPN. Finally, the evolution strategy is proposed as our perspective toward next generation optical networks. We hope the evolution strategy introduced may be helpful for network operators to gracefully migrate their fixed ring-based legacy networks to next generation dynamic mesh-based networks.
Neural networks: Alternatives to conventional techniques for automatic docking
NASA Technical Reports Server (NTRS)
Vinz, Bradley L.
1994-01-01
Automatic docking of orbiting spacecraft is a crucial operation involving the identification of vehicle orientation as well as complex approach dynamics. The chaser spacecraft must be able to recognize the target spacecraft within a scene and achieve accurate closing maneuvers. In a video-based system, a target scene must be captured and transformed into a pattern of pixels. Successful recognition lies in the interpretation of this pattern. Due to their powerful pattern recognition capabilities, artificial neural networks offer a potential role in interpretation and automatic docking processes. Neural networks can reduce the computational time required by existing image processing and control software. In addition, neural networks are capable of recognizing and adapting to changes in their dynamic environment, enabling enhanced performance, redundancy, and fault tolerance. Most neural networks are robust to failure, capable of continued operation with a slight degradation in performance after minor failures. This paper discusses the particular automatic docking tasks neural networks can perform as viable alternatives to conventional techniques.
Lerner, Itamar; Bentin, Shlomo; Shriki, Oren
2014-01-01
Semantic priming has long been recognized to reflect, along with automatic semantic mechanisms, the contribution of controlled strategies. However, previous theories of controlled priming were mostly qualitative, lacking common grounds with modern mathematical models of automatic priming based on neural networks. Recently, we have introduced a novel attractor network model of automatic semantic priming with latching dynamics. Here, we extend this work to show how the same model can also account for important findings regarding controlled processes. Assuming the rate of semantic transitions in the network can be adapted using simple reinforcement learning, we show how basic findings attributed to controlled processes in priming can be achieved, including their dependency on stimulus onset asynchrony and relatedness proportion and their unique effect on associative, category-exemplar, mediated and backward prime-target relations. We discuss how our mechanism relates to the classic expectancy theory and how it can be further extended in future developments of the model. PMID:24890261
Bellman Ford algorithm - in Routing Information Protocol (RIP)
NASA Astrophysics Data System (ADS)
Krianto Sulaiman, Oris; Mahmud Siregar, Amir; Nasution, Khairuddin; Haramaini, Tasliyah
2018-04-01
A large-scale network needs routing that can handle a large number of users, and one solution is to use a routing protocol. There are two types of routing: static and dynamic. Static routes are entered manually by the network administrator, while dynamic routes are formed automatically from the existing network. Dynamic routing is efficient for extensive networks because routes are formed automatically. The Routing Information Protocol (RIP) is a dynamic routing protocol that uses the Bellman-Ford algorithm, which searches for the best path through the network by leveraging the cost of each link; with the Bellman-Ford algorithm, RIP can thus optimize routes over existing networks.
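A minimal sketch of the Bellman-Ford relaxation that underlies RIP's distance-vector routing is given below; the toy router topology and link costs are illustrative assumptions.

```python
# Hedged sketch of the Bellman-Ford algorithm used (in distance-vector form)
# by RIP: repeatedly relax every link until the distances stop improving.
# The toy topology and link costs are illustrative assumptions.

def bellman_ford(nodes, edges, source):
    """Return shortest distances and predecessors from `source`.

    edges: iterable of (u, v, cost) directed links.
    """
    dist = {n: float("inf") for n in nodes}
    pred = {n: None for n in nodes}
    dist[source] = 0.0

    for _ in range(len(nodes) - 1):          # at most |V| - 1 relaxation rounds
        changed = False
        for u, v, cost in edges:
            if dist[u] + cost < dist[v]:
                dist[v] = dist[u] + cost
                pred[v] = u
                changed = True
        if not changed:                       # early exit once routes are stable
            break

    # Negative-cycle check (cannot occur with RIP hop counts; shown for completeness).
    for u, v, cost in edges:
        if dist[u] + cost < dist[v]:
            raise ValueError("negative-cost cycle detected")
    return dist, pred

nodes = ["R1", "R2", "R3", "R4"]
edges = [("R1", "R2", 1), ("R2", "R3", 1), ("R1", "R3", 3),
         ("R3", "R4", 1), ("R2", "R4", 4)]
dist, pred = bellman_ford(nodes, edges, "R1")
print(dist)   # {'R1': 0.0, 'R2': 1.0, 'R3': 2.0, 'R4': 3.0}
```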
NASA Astrophysics Data System (ADS)
Romagnan, Jean Baptiste; Aldamman, Lama; Gasparini, Stéphane; Nival, Paul; Aubert, Anaïs; Jamet, Jean Louis; Stemmann, Lars
2016-10-01
The present work aims to show that high throughput imaging systems can be useful to estimate mesozooplankton community size and taxonomic descriptors that can be the basis for consistent large scale monitoring of plankton communities. Such monitoring is required by the European Marine Strategy Framework Directive (MSFD) in order to ensure the Good Environmental Status (GES) of European coastal and offshore marine ecosystems. Time- and cost-effective automatic techniques are of high interest in this context. An imaging-based protocol has been applied to a high frequency time series (every second day between April 2003 and April 2004 on average) of zooplankton obtained in a coastal site of the NW Mediterranean Sea, Villefranche Bay. One hundred eighty-four net-collected mesozooplankton samples were analysed with a Zooscan and an associated semi-automatic classification technique. The constitution of a learning set designed to maximize copepod identification with more than 10,000 objects enabled the automatic sorting of copepods with an accuracy of 91% (true positives) and a contamination of 14% (false positives). Twenty-seven samples were then chosen from the total copepod time series for detailed visual sorting of copepods after automatic identification. This method enabled the description of the dynamics of two well-known copepod species, Centropages typicus and Temora stylifera, and 7 other taxonomically broader copepod groups, in terms of size, biovolume and abundance-size distributions (size spectra). Also, total copepod size spectra underwent significant changes during the sampling period. These changes could be partially related to changes in the copepod assemblage taxonomic composition and size distributions. This study shows that the use of high throughput imaging systems is of great interest to extract relevant coarse (i.e. total abundance, size structure) and detailed (i.e. selected species dynamics) descriptors of zooplankton dynamics. Innovative zooplankton analyses are therefore proposed and open the way for further development of zooplankton community indicators of changes.
An air brake model for longitudinal train dynamics studies
NASA Astrophysics Data System (ADS)
Wei, Wei; Hu, Yang; Wu, Qing; Zhao, Xubao; Zhang, Jun; Zhang, Yuan
2017-04-01
Experience with heavy haul train operation shows that fatigue fracture of couplers and related components, and even accidents, are caused by excessive coupler forces. The most economical and effective way to study longitudinal train impulses and reduce coupler forces is simulation. The characteristics of the train air brake system are an important excitation source for the study of longitudinal impulses. Since it is very difficult to obtain the braking characteristics by testing, a better way to obtain the input parameters of this excitation source for longitudinal train dynamics is to model the train air brake system. In this paper, the air brake model of an integrated air brake and longitudinal dynamics system is introduced, with a focus on the locomotive automatic brake valve and vehicle distributor valve models, and a comparative analysis of simulation and test results for the braking system is given. It is shown that the model can predict the characteristics of the train braking system. This method provides a good solution for the excitation source of a longitudinal dynamics analysis system.
Oost, Elco; Koning, Gerhard; Sonka, Milan; Oemrawsingh, Pranobe V; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2006-09-01
This paper describes a new approach to the automated segmentation of X-ray left ventricular (LV) angiograms, based on active appearance models (AAMs) and dynamic programming. A coupling of shape and texture information between the end-diastolic (ED) and end-systolic (ES) frame was achieved by constructing a multiview AAM. Over-constraining of the model was compensated for by employing dynamic programming, integrating both intensity and motion features in the cost function. Two applications are compared: a semi-automatic method with manual model initialization, and a fully automatic algorithm. The first proved to be highly robust and accurate, demonstrating high clinical relevance. Based on experiments involving 70 patient data sets, the algorithm's success rate was 100% for ED and 99% for ES, with average unsigned border positioning errors of 0.68 mm for ED and 1.45 mm for ES. Calculated volumes were accurate and unbiased. The fully automatic algorithm, with intrinsically less user interaction was less robust, but showed a high potential, mostly due to a controlled gradient descent in updating the model parameters. The success rate of the fully automatic method was 91% for ED and 83% for ES, with average unsigned border positioning errors of 0.79 mm for ED and 1.55 mm for ES.
Sequential visibility-graph motifs
NASA Astrophysics Data System (ADS)
Iacovacci, Jacopo; Lacasa, Lucas
2016-04-01
Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
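For readers who want to experiment with visibility graphs, a hedged sketch of the natural visibility criterion and a brute-force count of motifs over windows of consecutive nodes follows; this is a simplified illustration under an O(n²) construction, not the authors' exact motif-profile estimator.

```python
# Hedged sketch: build the natural visibility graph of a time series and
# count sequential 4-node motifs by their adjacency pattern. A simplified
# O(n^2) illustration, not the authors' exact estimator.
import itertools
from collections import Counter
import numpy as np

def natural_visibility_edges(y):
    """Edge set of the natural visibility graph of series y."""
    n = len(y)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))                 # consecutive points always see each other
        for j in range(i + 2, n):
            # (i, j) are connected if every intermediate point lies strictly
            # below the straight line joining (i, y[i]) and (j, y[j]).
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def sequential_motif_profile(y, size=4):
    """Frequency of each adjacency pattern among windows of consecutive nodes."""
    edges = natural_visibility_edges(y)
    counts = Counter()
    for start in range(len(y) - size + 1):
        window = range(start, start + size)
        pattern = tuple(int((a, b) in edges)
                        for a, b in itertools.combinations(window, 2))
        counts[pattern] += 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

series = np.random.default_rng(3).normal(size=300)
profile = sequential_motif_profile(series, size=4)
for pattern, freq in sorted(profile.items(), key=lambda kv: -kv[1])[:5]:
    print(pattern, round(freq, 3))
```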
Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl
2009-11-01
Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.
Some selected quantitative methods of thermal image analysis in Matlab.
Koprowski, Robert
2016-05-01
The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurement of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment, and a figure shows the main window of the program during dynamic analysis of the foot thermal image. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian
2018-02-01
This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic, techniques. We discuss the development of a pipeline that achieves fully automatic and parameter independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of Mars static and dynamic features. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.
Martínez, Leandro
2015-01-01
The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, resulting possibly in poor quantification of the structural fluctuations and, often, to overlooking important fluctuations associated to biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical, and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and it is available as free-software at: http://leandro.iqm.unicamp.br/mdlovofit PMID:25816325
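The LOVO alignment idea can be sketched in a few lines: rigidly superpose a structure on a reference using only the fraction of atoms that currently fit best, then iterate. The following hedged sketch (not the MDLovoFit code) uses a Kabsch superposition via NumPy's SVD; the 80% fraction and the random test coordinates are assumptions.

```python
# Hedged sketch of a LOVO-style robust alignment: iteratively superpose a
# mobile structure on a reference using only the fraction of atoms with the
# smallest displacements (Kabsch superposition via SVD). Not the MDLovoFit
# implementation; the 80% fraction and test data are assumptions.
import numpy as np

def kabsch_subset(P, Q, idx):
    """Rigid superposition of P onto Q fitted on atoms `idx`, applied to all of P."""
    p_mean, q_mean = P[idx].mean(axis=0), Q[idx].mean(axis=0)
    H = (P[idx] - p_mean).T @ (Q[idx] - q_mean)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return (R @ (P - p_mean).T).T + q_mean

def lovo_align(P, Q, fraction=0.8, iterations=10):
    """Align P to Q using only the best-fitting `fraction` of atoms."""
    subset = np.arange(len(P))                       # start from all atoms
    for _ in range(iterations):
        aligned = kabsch_subset(P, Q, subset)
        disp = np.linalg.norm(aligned - Q, axis=1)
        k = max(3, int(fraction * len(P)))
        subset = np.argsort(disp)[:k]                # keep the least mobile atoms
    return aligned, subset

rng = np.random.default_rng(4)
ref = rng.normal(size=(100, 3))
mobile = ref + 0.05 * rng.normal(size=(100, 3))
mobile[:15] += 2.0                                   # a highly mobile segment
aligned, core = lovo_align(mobile, ref)
core_rmsd = np.sqrt(np.mean(np.sum((aligned[core] - ref[core]) ** 2, axis=1)))
print("core atoms: %d, core RMSD: %.3f" % (len(core), core_rmsd))
```

Because the superposition is fitted only on the least mobile subset, the RMSD of that core stays small even when a flexible segment moves far away, which is the behavior the abstract describes.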
NASA Astrophysics Data System (ADS)
Juromskiy, V. M.
2016-09-01
A mathematical model of an electric drive for a high-speed separation device is developed in the Simulink dynamic-systems modeling environment of MATLAB. The model focuses on the study of automatic control systems for the power factor (cos φ) of an actuator, compensating the reactive component of the total power by switching a capacitor bank in series with the actuator. The model is based on the methodology of the structural modeling of dynamic processes.
Time and frequency domain analysis of sampled data controllers via mixed operation equations
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1981-01-01
Specification of the mathematical equations required to define the dynamic response of a linear continuous plant, subject to sampled data control, is complicated by the fact that the digital components of the control system cannot be modeled via linear ordinary differential equations. This complication can be overcome by introducing two new mathematical operations; namely, the operations of zero-order hold and digital delay. It is shown that by direct utilization of these operations, a set of linear mixed operation equations can be written and used to define the dynamic response characteristics of the controlled system. It also is shown how these linear mixed operation equations lead, in an automatable manner, directly to a set of finite difference equations which are in a format compatible with follow-on time and frequency domain analysis methods.
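The zero-order-hold operation mentioned above is exactly what standard control libraries use to convert a continuous plant into its sampled-data (finite difference) equivalent; a hedged sketch with SciPy on a toy double-integrator plant, with an explicit one-sample digital delay, follows. The plant, gains, and sample time are assumptions and not the paper's model.

```python
# Hedged sketch: discretize a continuous linear plant under a zero-order hold
# and simulate it with a one-sample ("digital") delay in the feedback path.
# The toy double-integrator plant, gains and sample time are assumptions.
import numpy as np
from scipy.signal import cont2discrete

# Continuous plant x' = A x + B u, y = C x (double integrator).
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

dt = 0.05                                           # sample period [s]
Ad, Bd, Cd, Dd, _ = cont2discrete((A, B, C, D), dt, method="zoh")

# Sampled-data state feedback with a one-step computational (digital) delay.
K = np.array([[4.0, 3.0]])                          # assumed feedback gains
x = np.array([[1.0], [0.0]])
u_prev = np.zeros((1, 1))
for _ in range(200):
    u = -K @ x                                      # control computed this step...
    x = Ad @ x + Bd @ u_prev                        # ...but applied one sample later
    u_prev = u
print("final output:", (Cd @ x).item())
```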
NASA MSFC hardware in the loop simulations of automatic rendezvous and capture systems
NASA Technical Reports Server (NTRS)
Tobbe, Patrick A.; Naumann, Charles B.; Sutton, William; Bryan, Thomas C.
1991-01-01
Two complementary hardware-in-the-loop simulation facilities for automatic rendezvous and capture systems at MSFC are described. One, the Flight Robotics Laboratory, uses an 8 DOF overhead manipulator with a work volume of 160 by 40 by 23 feet to evaluate automatic rendezvous algorithms and range/rate sensing systems. The other, the Space Station/Station Operations Mechanism Test Bed, uses a 6 DOF hydraulic table to perform docking and berthing dynamics simulations.
ERIC Educational Resources Information Center
Beale, Ivan L.
2005-01-01
Computer assisted learning (CAL) can involve a computerised intelligent learning environment, defined as an environment capable of automatically, dynamically and continuously adapting to the learning context. One aspect of this adaptive capability involves automatic adjustment of instructional procedures in response to each learner's performance,…
Automatic Construction of English/Chinese Parallel Corpora.
ERIC Educational Resources Information Center
Yang, Christopher C.; Li, Kar Wing
2003-01-01
Discussion of multilingual corpora and cross-lingual information retrieval focuses on research that constructed English/Chinese parallel corpus automatically from the World Wide Web. Presents an alignment method which is based on dynamic programming to identify one-to-one Chinese and English title pairs and discusses results of experiments…
Modeling and Representation of Human Hearts for Volumetric Measurement
Guan, Qiu; Wang, Wanliang; Wu, Guang
2012-01-01
This paper investigates automatic construction of a three-dimensional heart model from a set of medical images, represents it in a deformable shape, and uses it to perform volumetric measurements. This not only significantly improves its reliability and accuracy but also makes it possible to derive valuable novel information, like various assessments and dynamic volumetric measurements. The method is based on a flexible model trained from hundreds of patient image sets by a genetic algorithm, which takes advantage of complete segmentation of the heart shape to form a geometrical heart model. For an image set of a new patient, an interpretation scheme is used to obtain its shape and evaluate some important parameters. Apart from automatic evaluation of traditional heart functions, some new information on cardiovascular diseases may be recognized from the volumetric analysis. PMID:22162723
Automatic Tools for Enhancing the Collaborative Experience in Large Projects
NASA Astrophysics Data System (ADS)
Bourilkov, D.; Rodriquez, J. L.
2014-06-01
With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
NASA Astrophysics Data System (ADS)
Park, Sangwook; Lee, Young-Ran; Hwang, Yoola; Javier Santiago Noguero Galilea
2009-12-01
This paper describes the Flight Dynamics Automation (FDA) system for the COMS Flight Dynamics System (FDS) and its test results in terms of the performance of the automation jobs. FDA controls the flight dynamics functions such as orbit determination, orbit prediction, event prediction, and fuel accounting. The designed FDA is independent of the specific characteristics defined by the spacecraft manufacturer or specific satellite missions. Therefore, FDA could easily link its autonomous job control functions to any satellite mission control system with some interface modification. By adding an autonomous system alongside the flight dynamics system, FDA decreases the operator's tedious and repeated jobs and increases the usability and reliability of the system. Therefore, FDA is used to improve the completeness of the whole mission control system's quality. The FDA is applied to the real flight dynamics system of a geostationary satellite, COMS, and an experimental test is performed. The experimental results show the stability and reliability of the mission control operations through the automatic job control.
Dynamics of flexible bodies in tree topology - A computer oriented approach
NASA Technical Reports Server (NTRS)
Singh, R. P.; Vandervoort, R. J.; Likins, P. W.
1984-01-01
An approach suited for automatic generation of the equations of motion for large mechanical systems (i.e., large space structures, mechanisms, robots, etc.) is presented. The system topology is restricted to a tree configuration. The tree is defined as an arbitrary set of rigid and flexible bodies connected by hinges characterizing relative translations and rotations of two adjoining bodies. The equations of motion are derived via Kane's method. The resulting equation set is of minimum dimension. The dynamical equations are embedded in a computer program called TREETOPS. Extensive control simulation capability is built into the TREETOPS program. The simulation is driven by an interactive set-up program, resulting in an easy-to-use analysis tool.
Optimization of a pressure control valve for high power automatic transmission considering stability
NASA Astrophysics Data System (ADS)
Jian, Hongchao; Wei, Wei; Li, Hongcai; Yan, Qingdong
2018-02-01
The pilot-operated electrohydraulic clutch-actuator system is widely utilized by high power automatic transmissions because of the demand for a large flowrate and its excellent pressure regulating capability. However, a self-excited vibration, induced by the inherent non-linear characteristics of valve spool motion coupled with the fluid dynamics, can be generated during the working state of hydraulic systems due to inappropriate system parameters; this causes sustained instability in the system and leads to unexpected performance deterioration and hardware damage. To ensure a stable and fast response of the clutch actuator system, an optimal design method for the pressure control valve considering stability is proposed in this paper. A non-linear dynamic model of the clutch actuator system is established based on the motion of the valve spool and the coupling fluid dynamics in the system. The stability boundary in the parameter space is obtained by numerical stability analysis. The sensitivity of the stability boundary and of the output pressure response time to the valve parameters is identified using a design of experiments (DOE) approach. The pressure control valve is optimized using a particle swarm optimization (PSO) algorithm with the stability boundary as a constraint. The simulation and experimental results reveal that the optimization method proposed in this paper helps improve the response characteristics while ensuring the stability of the clutch actuator system during the entire gear shift process.
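As a rough illustration of the optimization step described above (not the authors' implementation), the following Python sketch runs a particle swarm search over two hypothetical valve parameters while rejecting candidates outside an assumed stability boundary by means of a penalty term; the objective, constraint, and bounds are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def response_time(x):
    # Hypothetical surrogate for output-pressure response time (to be minimized);
    # x = (spring stiffness, orifice size) in normalized, invented units.
    return (x[0] - 3.0) ** 2 + (x[1] - 1.5) ** 2

def is_stable(x):
    # Hypothetical stability-boundary check (True inside the stable region).
    return x[0] + 2.0 * x[1] < 5.0

def penalized_cost(x):
    # Constrained optimization via a large penalty outside the stable region.
    return response_time(x) + (0.0 if is_stable(x) else 1e6)

def pso(n_particles=30, n_iter=100, bounds=((0.5, 10.0), (0.1, 5.0))):
    lo = np.array([b[0] for b in bounds]); hi = np.array([b[1] for b in bounds])
    x = rng.uniform(lo, hi, size=(n_particles, 2))
    v = np.zeros_like(x)
    pbest = x.copy(); pbest_cost = np.array([penalized_cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)]
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        cost = np.array([penalized_cost(p) for p in x])
        better = cost < pbest_cost
        pbest[better], pbest_cost[better] = x[better], cost[better]
        gbest = pbest[np.argmin(pbest_cost)]
    return gbest, penalized_cost(gbest)

best_params, best_cost = pso()
print("best valve parameters:", best_params, "cost:", best_cost)
```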
Baltzer, Pascal Andreas Thomas; Renz, Diane M; Kullnig, Petra E; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A
2009-04-01
The identification of the most suspect enhancing part of a lesion is regarded as a major diagnostic criterion in dynamic magnetic resonance mammography. Computer-aided diagnosis (CAD) software allows the semi-automatic analysis of the kinetic characteristics of complete enhancing lesions, providing additional information about lesion vasculature. The diagnostic value of this information has not yet been quantified. Consecutive patients from routine diagnostic studies (1.5 T, 0.1 mmol gadopentetate dimeglumine, dynamic gradient-echo sequences at 1-minute intervals) were analyzed prospectively using CAD. Dynamic sequences were processed and reduced to a parametric map. Curve types were classified by initial signal increase (not significant, intermediate, and strong) and the delayed time course of signal intensity (continuous, plateau, and washout). Lesion enhancement was measured using CAD. The most suspect curve, the curve-type distribution percentage, and combined dynamic data were compared. Statistical analysis included logistic regression analysis and receiver-operating characteristic analysis. Fifty-one patients with 46 malignant and 44 benign lesions were enrolled. On receiver-operating characteristic analysis, the most suspect curve showed diagnostic accuracy of 76.7 +/- 5%. In comparison, the curve-type distribution percentage demonstrated accuracy of 80.2 +/- 4.9%. Combined dynamic data had the highest diagnostic accuracy (84.3 +/- 4.2%). These differences did not achieve statistical significance. With appropriate cutoff values, sensitivity and specificity, respectively, were found to be 80.4% and 72.7% for the most suspect curve, 76.1% and 83.6% for the curve-type distribution percentage, and 78.3% and 84.5% for both parameters. The integration of whole-lesion dynamic data tends to improve specificity. However, no statistical significance backs up this finding.
Automatic generation of the non-holonomic equations of motion for vehicle stability analysis
NASA Astrophysics Data System (ADS)
Minaker, B. P.; Rieveley, R. J.
2010-09-01
The mathematical analysis of vehicle stability has been utilised as an important tool in the design, development, and evaluation of vehicle architectures and stability controls. This paper presents a novel method for automatic generation of the linearised equations of motion for mechanical systems that is well suited to vehicle stability analysis. Unlike conventional methods for generating linearised equations of motion in standard linear second order form, the proposed method allows for the analysis of systems with non-holonomic constraints. In the proposed method, the algebraic constraint equations are eliminated after linearisation and reduction to first order. The described method has been successfully applied to an assortment of classic dynamic problems of varying complexity including the classic rolling coin, the planar truck-trailer, and the bicycle, as well as in more recent problems such as a rotor-stator and a benchmark road vehicle with suspension. This method has also been applied in the design and analysis of a novel three-wheeled narrow tilting vehicle with zero roll-stiffness. An application for determining passively stable configurations using the proposed method together with a genetic search algorithm is detailed. The proposed method and software implementation has been shown to be robust and provides invaluable conceptual insight into the stability of vehicles and mechanical systems.
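The constraint-elimination idea can be sketched generically: after linearisation and reduction to first order, a linear constraint C x = 0 can be removed by projecting the system onto the nullspace of C and examining the eigenvalues of the reduced system. The Python fragment below illustrates this with arbitrary, invented matrices; it is a generic nullspace projection, not the authors' specific formulation.

```python
import numpy as np
from scipy.linalg import null_space, eig

# Hypothetical first-order linearised system  M xdot = A x  subject to  C x = 0.
M = np.eye(4)
A = np.array([[0., 1., 0., 0.],
              [-2., -0.1, 1., 0.],
              [0., 0., 0., 1.],
              [1., 0., -3., -0.2]])
C = np.array([[1., 0., -1., 0.]])          # one linear (non-holonomic style) constraint

N = null_space(C)                          # columns span the admissible subspace
M_r = N.T @ M @ N                          # reduced mass-like matrix
A_r = N.T @ A @ N                          # reduced state matrix

# Eigenvalues of the reduced pencil determine stability of the constrained system.
eigvals = eig(A_r, M_r, right=False)
print("reduced eigenvalues:", np.sort_complex(eigvals))
print("asymptotically stable:", np.all(eigvals.real < 0))
```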
An Experiment of GMPLS-Based Dispersion Compensation Control over In-Field Fibers
NASA Astrophysics Data System (ADS)
Seno, Shoichiro; Horiuchi, Eiichi; Yoshida, Sota; Sugihara, Takashi; Onohara, Kiyoshi; Kamei, Misato; Baba, Yoshimasa; Kubo, Kazuo; Mizuochi, Takashi
As ROADMs (Reconfigurable Optical Add/Drop Multiplexers) are becoming widely used in metro/core networks, distributed control of wavelength paths by extended GMPLS (Generalized MultiProtocol Label Switching) protocols has attracted much attention. For the automatic establishment of an arbitrary wavelength path satisfying dynamic traffic demands over a ROADM or WXC (Wavelength Cross Connect)-based network, precise determination of chromatic dispersion over the path and optimized assignment of dispersion compensation capabilities at related nodes are essential. This paper reports an experiment over in-field fibers where GMPLS-based control was applied for the automatic discovery of chromatic dispersion, path computation, and wavelength path establishment with dynamic adjustment of variable dispersion compensation. The GMPLS-based control scheme, which the authors called GMPLS-Plus, extended GMPLS's distributed control architecture with attributes for automatic discovery, advertisement, and signaling of chromatic dispersion. In this experiment, wavelength paths with distances of 24km and 360km were successfully established and error-free data transmission was verified. The experiment also confirmed path restoration with dynamic compensation adjustment upon fiber failure.
Buffet test in the National Transonic Facility
NASA Technical Reports Server (NTRS)
Young, Clarence P., Jr.; Hergert, Dennis W.; Butler, Thomas W.; Herring, Fred M.
1992-01-01
A buffet test of a commercial transport model was accomplished in the National Transonic Facility at the NASA Langley Research Center. This aeroelastic test was unprecedented for this wind tunnel and posed a high risk to the facility. This paper presents the test results from a structural dynamics and aeroelastic response point of view and describes the activities required for the safety analysis and risk assessment. The test was conducted in the same manner as a flutter test and employed onboard dynamic instrumentation, real-time dynamic data monitoring, and automatic and manual tunnel interlock systems for protecting the model. The procedures and test techniques employed for this test are expected to serve as the basis for future aeroelastic testing in the National Transonic Facility. This test program was a cooperative effort between the Boeing Commercial Airplane Company and the NASA Langley Research Center.
Characteristic Analysis and Experiment of a Dynamic Flow Balance Valve
NASA Astrophysics Data System (ADS)
Bin, Li; Song, Guo; Xuyao, Mao; Chao, Wu; Deman, Zhang; Jin, Shang; Yinshui, Liu
2017-12-01
The comprehensive characteristics of a dynamic flow balance valve for a water system were analysed. The flow balance valve changes its drag coefficient automatically according to the condition of the system, keeping the effective control flowrate constant over the working pressure range. The structure of the flow balance valve is introduced, and a theoretical formula for the variable opening of the valve core is derived. A rated pressure range of 20 kPa to 200 kPa and a rated flowrate of 10 m3/h were used in the numerical work. Static and CFX fluid analyses show good behaviour: through optimization of the valve core structure and improved design of the compression spring, the dynamic flow balance valve clearly stabilizes the system flowrate. Experiments show that the flow control accuracy is within 5%.
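A small numerical sketch of the orifice relation underlying constant-flow balancing is given below; it uses the rated flowrate and pressure range quoted in the abstract, while the discharge coefficient is an assumed illustrative value. It shows how the effective opening area must shrink roughly as the inverse square root of the pressure drop, which is what the spring-loaded valve core approximates.

```python
import numpy as np

RHO = 998.0                      # water density, kg/m^3
CD = 0.62                        # assumed illustrative discharge coefficient
Q_RATED = 10.0 / 3600.0          # rated 10 m^3/h expressed in m^3/s

def required_opening_area(dp_pa):
    """Orifice equation Q = Cd * A * sqrt(2*dp/rho), solved for the opening
    area A that keeps the flowrate at its rated value for a given pressure drop."""
    return Q_RATED / (CD * np.sqrt(2.0 * dp_pa / RHO))

for dp_kpa in (20, 50, 100, 150, 200):
    a = required_opening_area(dp_kpa * 1e3)
    print(f"dp = {dp_kpa:3d} kPa -> required opening area = {a*1e6:7.1f} mm^2")
```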
MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity
NASA Technical Reports Server (NTRS)
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
2014-01-01
MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory) and automatically forecasts the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events. MAG4 does not forecast that a flare will occur at a particular time in the next 24 or 48 hours; rather, it forecasts the probability of one occurring.
2013-02-01
Pavlovian drug cues to produce excessive “wanting” to...motivation: Incentive salience boosts of drug or appetite states. Behavioral Brain Science 31:440-441...learning into motivation. In Gutkin, B. and Ahmed, S.H. (Eds.) Computational Neuroscience of Drug
Solid Modeling Aerospace Research Tool (SMART) user's guide, version 2.0
NASA Technical Reports Server (NTRS)
Mcmillin, Mark L.; Spangler, Jan L.; Dahmen, Stephen M.; Rehder, John J.
1993-01-01
The Solid Modeling Aerospace Research Tool (SMART) software package is used in the conceptual design of aerospace vehicles. It provides a highly interactive and dynamic capability for generating geometries with Bezier cubic patches. Features include automatic generation of commonly used aerospace constructs (e.g., wings and multilobed tanks); cross-section skinning; wireframe and shaded presentation; area, volume, inertia, and center-of-gravity calculations; and interfaces to various aerodynamic and structural analysis programs. A comprehensive description of SMART and how to use it is provided.
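For readers unfamiliar with the underlying surface representation, a bicubic Bezier patch is evaluated by weighting a 4 x 4 control net with cubic Bernstein polynomials; the short sketch below uses an invented control net and is independent of the SMART implementation.

```python
import numpy as np

def bernstein3(t):
    """The four cubic Bernstein basis functions evaluated at parameter t."""
    return np.array([(1 - t) ** 3,
                     3 * t * (1 - t) ** 2,
                     3 * t ** 2 * (1 - t),
                     t ** 3])

def bezier_patch_point(ctrl, u, v):
    """Evaluate a bicubic Bezier patch at (u, v) in [0, 1]^2.
    ctrl is a 4x4x3 array of control points."""
    bu, bv = bernstein3(u), bernstein3(v)
    return np.einsum("i,j,ijk->k", bu, bv, ctrl)

# Illustrative 4x4 control net: a gently curved quadrilateral surface.
xs, ys = np.meshgrid(np.linspace(0, 3, 4), np.linspace(0, 3, 4), indexing="ij")
zs = 0.5 * np.sin(xs) * np.cos(ys)
ctrl = np.stack([xs, ys, zs], axis=-1)

print(bezier_patch_point(ctrl, 0.0, 0.0))   # equals the corner control point
print(bezier_patch_point(ctrl, 0.5, 0.5))   # interior point of the patch
```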
How MAG4 Improves Space Weather Forecasting
NASA Technical Reports Server (NTRS)
Falconer, David; Khazanov, Igor; Barghouty, Nasser
2013-01-01
Dangerous space weather is driven by solar flares and Coronal Mass Ejections (CMEs). Forecasting flares and CMEs is the first step to forecasting either dangerous space weather or All Clear. MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory) and automatically forecasts the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events.
Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures
ERIC Educational Resources Information Center
Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin
2006-01-01
Our article presents a pedagogical scenarios-based web application that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures, to edit pedagogical content…
Beyond Prediction: First Steps toward Automatic Intervention in MOOC Student Stopout
ERIC Educational Resources Information Center
Whitehill, Jacob; Williams, Joseph; Lopez, Glenn; Coleman, Cody; Reich, Justin
2015-01-01
High attrition rates in massive open online courses (MOOCs) have motivated growing interest in the automatic detection of student "stopout". Stopout classifiers can be used to orchestrate an intervention before students quit, and to survey students dynamically about why they ceased participation. In this paper we expand on existing…
NASA Astrophysics Data System (ADS)
Tsuchiya, Yuichiro; Kodera, Yoshie
2006-03-01
In the picture archiving and communication system (PACS) environment, it is important that all images be stored in the correct location. However, if information such as the patient's name or identification number has been entered incorrectly, it is difficult to notice the error. The present study was performed to develop a system for automatic patient collation in dynamic radiographic examinations based on kinetic analysis, and to evaluate its performance. Dynamic chest radiographs during respiration were obtained using a modified flat-panel detector system. The computer algorithm developed in this study consisted of two main procedures: kinetic map image processing and collation processing. Kinetic map processing is a new algorithm for visualizing movement in dynamic radiography; it classifies the directions of optical flows and applies an intensity-density transformation. Collation processing consisted of analysis with an artificial neural network (ANN) and discrimination based on Mahalanobis' generalized distance; these procedures evaluate the similarity of examination pairs belonging to the same person. Finally, we investigated the performance of our system using radiographs of eight healthy volunteers. Performance was expressed as sensitivity and specificity, both of which were 100%. This result indicates that our system has excellent performance for patient recognition. Our system will be useful in PACS management for dynamic chest radiography.
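As a simplified, hypothetical illustration of the collation step (feature names, values, and the decision threshold are invented, not the study's), a Mahalanobis-distance check could look like this:

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis' generalized distance of feature vector x from a reference
    distribution with the given mean and inverse covariance."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Hypothetical kinetic-map features for repeated exams of the same patient
# (e.g., summary statistics of diaphragm motion), n_exams x n_features.
rng = np.random.default_rng(1)
same_patient = rng.normal([1.0, 0.5, 2.0], 0.1, size=(20, 3))

mean = same_patient.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(same_patient, rowvar=False))

new_exam_same = rng.normal([1.0, 0.5, 2.0], 0.1)    # plausibly the same person
new_exam_other = rng.normal([1.6, 0.9, 1.2], 0.1)   # plausibly a different person

threshold = 3.0   # assumed decision threshold on the distance
for label, x in [("same", new_exam_same), ("other", new_exam_other)]:
    d = mahalanobis(x, mean, cov_inv)
    print(f"{label}: distance = {d:.2f} -> {'match' if d < threshold else 'mismatch'}")
```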
Automatic anatomical structures location based on dynamic shape measurement
NASA Astrophysics Data System (ADS)
Witkowski, Marcin; Rapp, Walter; Sitnik, Robert; Kujawinska, Malgorzata; Vander Sloten, Jos; Haex, Bart; Bogaert, Nico; Heitmann, Kjell
2005-09-01
New image processing methods and active photonics apparatus have made possible the development of relatively inexpensive optical systems for complex shape and object measurements. We present a dynamic 360° scanning method for the analysis of human lower body biomechanics, with an emphasis on the analysis of the knee joint. The anatomical structure of high medical interest that can be scanned and analyzed is the patella. Tracking the patella position and orientation under dynamic conditions may allow detection of pathological patella movements and help in diagnosing knee joint disease. The processed data is obtained from a dynamic laser triangulation surface measurement system able to capture slow to normal movements with a scan frequency between 15 and 30 Hz. These frequency rates are sufficient to capture the controlled movements used, e.g., for medical examination purposes. The purpose of the work presented is to develop surface analysis methods that may be used to support the diagnosis of motor abilities of the lower limbs. The paper presents the algorithms used to process the acquired lower limb surface data in order to find the position and orientation of the patella. The algorithms implemented include input data preparation, curvature description methods, knee region discrimination, and calculation of the assumed patella position and orientation. Additionally, a method of 4D (3D + time) medical data visualization is proposed. Some exemplary results are also presented.
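A minimal sketch of one ingredient, curvature description of a measured surface, is given below; it computes Gaussian and mean curvature of a synthetic depth map via the Monge-patch formulas and flags a convex, protruding region. The depth map, thresholds, and sign convention are illustrative, not the authors' data or algorithm.

```python
import numpy as np

def curvature_maps(z, spacing=1.0):
    """Gaussian (K) and mean (H) curvature of a surface given as a depth map
    z(x, y), using the Monge-patch formulas and finite differences."""
    zy, zx = np.gradient(z, spacing)            # first derivatives
    zxy, zxx = np.gradient(zx, spacing)
    zyy, _ = np.gradient(zy, spacing)
    denom = 1.0 + zx ** 2 + zy ** 2
    K = (zxx * zyy - zxy ** 2) / denom ** 2
    H = ((1 + zx ** 2) * zyy - 2 * zx * zy * zxy + (1 + zy ** 2) * zxx) / (2 * denom ** 1.5)
    return K, H

# Illustrative "knee-like" depth map: a smooth bump on a plane; the bump stands
# in for a protruding structure such as the patella.
y, x = np.mgrid[-50:50, -50:50] * 0.5
z = 8.0 * np.exp(-(x ** 2 + y ** 2) / 200.0)
K, H = curvature_maps(z, spacing=0.5)
candidate = (K > 1e-4) & (H < 0)   # convex, protruding region (sign-convention dependent)
print("candidate patella-like pixels:", int(candidate.sum()))
```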
Civil (French/US) certification of the Coast Guard's HH-65A Dauphin
NASA Technical Reports Server (NTRS)
Hart, J. C.; Besse, J. M.; Mcelreath, K. W.
1982-01-01
Certification programs with particular emphasis on handling qualities requirements are described. A dynamic simulator was designed and constructed to support and verify the dynamic aspects of the avionics system, particularly the Automatic Flight Control System (AFCS). The role of the Dynamic Simulator is discussed.
Automatic acquisition of motion trajectories: tracking hockey players
NASA Astrophysics Data System (ADS)
Okuma, Kenji; Little, James J.; Lowe, David
2003-12-01
Computer systems that have the capability of analyzing complex and dynamic scenes play an essential role in video annotation. Scenes can be complex in that there are many cluttered objects with different colors, shapes, and sizes, and can be dynamic with multiple interacting moving objects and a constantly changing background. In reality, there are many scenes that are complex, dynamic, and challenging enough for computers to describe. These scenes include sports games, air traffic, car traffic, street intersections, and cloud transformations. Our research addresses the challenge of building a descriptive computer system that analyzes scenes of hockey games in which multiple moving players interact with each other against a constantly moving background caused by camera motion. Ultimately, such a computer system should be able to acquire reliable data by extracting the players' motion as trajectories, query the data by analyzing its descriptive information, and predict the motions of some hockey players based on the result of the query. Among these three major aspects of the system, we primarily focus on the visual information in the scenes, that is, how to automatically acquire motion trajectories of hockey players from video. More specifically, we automatically analyze the hockey scenes by estimating the parameters (i.e., pan, tilt, and zoom) of the broadcast cameras, tracking hockey players in those scenes, and constructing a visual description of the data by displaying the trajectories of those players. Many technical problems in vision, such as fast and unpredictable player motions and rapid camera motions, make this challenge worth tackling. To the best of our knowledge, no automatic video annotation system for hockey has been developed in the past. Although there are many obstacles to overcome, our efforts and accomplishments will hopefully establish the infrastructure of an automatic hockey annotation system and become a milestone for research on automatic video annotation in this domain.
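The authors' tracker is considerably more sophisticated; as a minimal sketch of the underlying predict-and-correct idea for following a player's position through noisy detections, a constant-velocity Kalman filter could look like this (all matrices, noise levels, and the synthetic trajectory are illustrative):

```python
import numpy as np

dt = 1.0 / 30.0                                     # video frame interval, s
F = np.array([[1, 0, dt, 0],                        # constant-velocity state model
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],                         # we observe position only
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-2                                # process noise (illustrative)
R = np.eye(2) * 1.0                                 # measurement noise (illustrative)

def kalman_track(detections):
    """Track one player from a sequence of noisy (x, y) detections."""
    x = np.array([detections[0][0], detections[0][1], 0.0, 0.0])
    P = np.eye(4) * 10.0
    track = []
    for z in detections:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the detection
        y = np.asarray(z) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

# Synthetic player moving diagonally, with detection noise.
rng = np.random.default_rng(0)
truth = np.stack([np.linspace(0, 20, 60), np.linspace(0, 10, 60)], axis=1)
dets = truth + rng.normal(0, 0.8, truth.shape)
print("RMS error raw vs filtered:",
      np.sqrt(((dets - truth) ** 2).mean()),
      np.sqrt(((kalman_track(dets) - truth) ** 2).mean()))
```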
NASA Astrophysics Data System (ADS)
Berngardt, Oleg; Bubnova, Tatyana; Podlesnyi, Aleksey
2018-03-01
We propose and test a method for analyzing ionograms of vertical ionospheric sounding, which is based on detecting deviations of the shape of an ionogram from its regular (averaged) shape. We interpret these deviations in terms of reflection from electron density irregularities at heights corresponding to the effective height. We examine the irregularities thus discovered within the framework of a model of a localized, uniformly moving irregularity, and determine their characteristic parameters: effective heights and observed vertical velocities. We analyze selected experimental data for three seasons (spring, winter, autumn) obtained near Irkutsk with a fast chirp ionosonde of ISTP SB RAS in 2013-2015. The analysis of six days of observations conducted in these seasons has shown that the observed vertical drift of the irregularities exhibits two characteristic distributions: a wide velocity distribution with a mean near 0 m/s and a standard deviation of ∼250 m/s, and a narrow distribution with a mean near -160 m/s. The analysis has demonstrated the effectiveness of the proposed algorithm for the automatic analysis of vertical sounding data with a high repetition rate.
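A toy version of the deviation-detection step (not the operational algorithm) can be written as a cell-wise z-score against the averaged ionogram shape; the grid sizes, noise levels, and threshold below are invented.

```python
import numpy as np

def detect_deviations(ionogram, reference_stack, z_thresh=3.0):
    """Flag (frequency, virtual-height) cells whose amplitude deviates from the
    regular (averaged) ionogram shape by more than z_thresh standard deviations.
    All arrays are 2-D grids of amplitude over frequency x virtual height."""
    mean = reference_stack.mean(axis=0)
    std = reference_stack.std(axis=0) + 1e-9        # avoid division by zero
    z = (ionogram - mean) / std
    return z > z_thresh                              # boolean mask of anomalous cells

# Synthetic example: a stack of quiet-time ionograms plus one with a localized echo.
rng = np.random.default_rng(2)
quiet = rng.normal(0.0, 1.0, size=(30, 64, 128))     # 30 reference ionograms
test = rng.normal(0.0, 1.0, size=(64, 128))
test[40:44, 60:70] += 8.0                             # a localized irregularity echo

mask = detect_deviations(test, quiet)
rows, cols = np.nonzero(mask)
print("anomalous cells:", mask.sum(),
      "height-bin range:", rows.min(), "-", rows.max())
```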
Zone analysis in biology articles as a basis for information extraction.
Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel
2006-06-01
In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.
Trans-dimensional MCMC methods for fully automatic motion analysis in tagged MRI.
Smal, Ihor; Carranza-Herrezuelo, Noemí; Klein, Stefan; Niessen, Wiro; Meijering, Erik
2011-01-01
Tagged magnetic resonance imaging (tMRI) is a well-known noninvasive method allowing quantitative analysis of regional heart dynamics. Its clinical use has so far been limited, in part due to the lack of robustness and accuracy of existing tag tracking algorithms in dealing with low (and intrinsically time-varying) image quality. In this paper, we propose a novel probabilistic method for tag tracking, implemented by means of Bayesian particle filtering and a trans-dimensional Markov chain Monte Carlo (MCMC) approach, which efficiently combines information about the imaging process and tag appearance with prior knowledge about the heart dynamics obtained by means of non-rigid image registration. Experiments using synthetic image data (with ground truth) and real data (with expert manual annotation) from preclinical (small animal) and clinical (human) studies confirm that the proposed method yields higher consistency, accuracy, and intrinsic tag reliability assessment in comparison with other frequently used tag tracking methods.
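As a much-reduced sketch of the particle-filtering ingredient only (scalar state, random-walk motion prior, multinomial resampling, and no MCMC moves or registration-based prior), a bootstrap filter could look like this:

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_particle_filter(observations, n_particles=500,
                              motion_std=0.05, obs_std=0.2):
    """Minimal bootstrap particle filter for a scalar tag position:
    random-walk motion prior, Gaussian observation likelihood,
    multinomial resampling at every frame."""
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, motion_std, n_particles)         # predict
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)           # weight
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)               # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

# Synthetic tag position drifting sinusoidally, observed with noise.
t = np.linspace(0, 2 * np.pi, 100)
truth = 0.5 * np.sin(t)
obs = truth + rng.normal(0, 0.2, t.size)
est = bootstrap_particle_filter(obs)
print("RMS error raw vs filtered:",
      np.sqrt(((obs - truth) ** 2).mean()),
      np.sqrt(((est - truth) ** 2).mean()))
```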
NASA Astrophysics Data System (ADS)
Kalinkina, M. E.; Kozlov, A. S.; Labkovskaia, R. I.; Pirozhnikova, O. I.; Tkalich, V. L.; Shmakov, N. A.
2018-05-01
The objects of the research are the component base of control and automation devices, including annular elastic sensing elements, methods for their modeling, calculation algorithms, and software packages for automating their design. The article is devoted to the development of a computer-aided design system for elastic sensing elements used in weight- and force-measuring automation devices. Based on mathematical modeling of deformation processes in a solid, together with the results of static and dynamic analysis, the elastic elements are calculated using the capabilities of modern numerical-simulation software. In the simulation, the model was discretized into a hexahedral finite-element mesh with a maximum element size not exceeding 2.5 mm. The results of the modal and dynamic analysis are presented in this article.
NASA Technical Reports Server (NTRS)
Kirk, R. G.; Gunter, E. J.
1972-01-01
A steady state analysis of the shaft and the bearing housing motion was made by assuming synchronous precession of the system. The conditions under which the support system would act as a dynamic vibration absorber at the rotor critical speed were studied; plots of the rotor and support amplitudes, phase angles, and forces transmitted were evaluated by the computer, and the performance curves were automatically plotted by a CalComp plotter unit. Curves are presented on the optimization of the support housing characteristics to attenuate the rotor unbalance response over the entire rotor speed range. The complete transient motion including rotor unbalance was examined by integrating the equations of motion numerically using a modified fourth order Runge-Kutta procedure, and the resulting whirl orbits were plotted by the CalComp plotter unit. The results of the transient analysis are discussed with regards to the design optimization procedure derived from the steady-state analysis.
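A minimal sketch of the transient-integration idea, reduced to a single-degree-of-freedom rotor with rotating unbalance and a classical fourth-order Runge-Kutta step, is shown below; the parameters are illustrative, and the report's modified Runge-Kutta procedure and CalComp orbit plotting are not reproduced.

```python
import numpy as np

# Illustrative Jeffcott-type rotor parameters (not those of the report).
m, c, k = 10.0, 40.0, 1.0e5          # mass (kg), damping (N s/m), stiffness (N/m)
e = 1.0e-4                           # mass-centre eccentricity (m)
omega = 90.0                         # spin speed (rad/s), near sqrt(k/m) = 100

def deriv(t, y):
    """State y = [x, xdot]; rotating-unbalance forcing m*e*omega^2*cos(omega*t)."""
    x, xd = y
    force = m * e * omega ** 2 * np.cos(omega * t)
    return np.array([xd, (force - c * xd - k * x) / m])

def rk4_step(t, y, h):
    k1 = deriv(t, y)
    k2 = deriv(t + h / 2, y + h / 2 * k1)
    k3 = deriv(t + h / 2, y + h / 2 * k2)
    k4 = deriv(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

h, y, t = 1e-4, np.array([0.0, 0.0]), 0.0
history = []
while t < 5.0:                       # integrate through the transient
    y = rk4_step(t, y, h)
    t += h
    history.append(y[0])
amp = max(abs(v) for v in history[-int(1.0 / h):])   # amplitude over the last second
print(f"steady-state whirl amplitude ~ {amp*1e3:.3f} mm")
```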
Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark
2017-12-01
The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
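The framework itself combines natural language processing, multiple data sources, and multi-modeling; as a hedged sketch of just one ingredient, classifying short outbreak-related text snippets with the classifier families named above, a scikit-learn pipeline could look like this (the example texts and labels are invented):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Invented training snippets: 1 = possible outbreak signal, 0 = background noise.
texts = ["dozens hospitalized with fever and rash in the district",
         "health officials investigate cluster of dengue cases",
         "new shopping mall opens downtown this weekend",
         "city council debates road maintenance budget",
         "mosquito-borne illness reports rise after flooding",
         "local team wins the regional football championship"]
labels = [1, 1, 0, 0, 1, 0]

for clf in (LinearSVC(), RandomForestClassifier(n_estimators=100),
            GradientBoostingClassifier()):
    model = make_pipeline(TfidfVectorizer(), clf)
    model.fit(texts, labels)
    pred = model.predict(["officials confirm measles outbreak at local school"])
    print(type(clf).__name__, "->", pred[0])
```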
Dynamic analysis of flexible gear trains/transmissions - An automated approach
NASA Technical Reports Server (NTRS)
Amirouche, F. M. L.; Shareef, N. H.; Xie, M.
1992-01-01
In this paper an automated algorithmic method is presented for the dynamic analysis of geared trains/transmissions. These are treated as a system of interconnected flexible bodies. The procedure developed accounts for the switching of constraints with time as a result of the change in the contacting areas at the gear teeth. The elastic behavior of the system is studied through the employment of three-dimensional isoparametric elements having six degrees of freedom at each node. The contact between the bodies is assumed at the various nodes, which could form either a line or a plane. The kinematical expressions, together with the equations of motion derived using Kane's method and strain energy concepts, are presented in a matrix form suitable for computer implementation. The constraint Jacobian matrices are generated automatically based on the contact information between the bodies. The concepts of relative velocity at the contacting points of the tooth pairs and the subsequent use of the transmission ratios in the analysis are presented.
Fulcher, Ben D; Jones, Nick S
2017-11-22
Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data.
NASA Astrophysics Data System (ADS)
Wray, Richard B.
1991-12-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
Großekathöfer, Ulf; Manyakov, Nikolay V.; Mihajlović, Vojkan; Pandina, Gahan; Skalkin, Andrew; Ness, Seth; Bangerter, Abigail; Goodwin, Matthew S.
2017-01-01
A number of recent studies using accelerometer features as input to machine learning classifiers show promising results for automatically detecting stereotypical motor movements (SMM) in individuals with Autism Spectrum Disorder (ASD). However, replicating these results across different types of accelerometers and their position on the body still remains a challenge. We introduce a new set of features in this domain based on recurrence plot and quantification analyses that are orientation invariant and able to capture non-linear dynamics of SMM. Applying these features to an existing published data set containing acceleration data, we achieve up to 9% average increase in accuracy compared to current state-of-the-art published results. Furthermore, we provide evidence that a single torso sensor can automatically detect multiple types of SMM in ASD, and that our approach allows recognition of SMM with high accuracy in individuals when using a person-independent classifier. PMID:28261082
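A simplified sketch of recurrence-based features computed from an acceleration magnitude signal is shown below; the embedding parameters, distance radius, and synthetic signals are illustrative and are not the feature set of the study.

```python
import numpy as np

def recurrence_matrix(signal, dim=3, delay=2, radius=0.5):
    """Orientation-invariant recurrence matrix built from the acceleration
    magnitude: time-delay embedding followed by a distance threshold."""
    n = len(signal) - (dim - 1) * delay
    emb = np.stack([signal[i * delay: i * delay + n] for i in range(dim)], axis=1)
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (d < radius).astype(int)

def rqa_features(R, lmin=2):
    """Recurrence rate and determinism (share of recurrent points that lie on
    diagonal lines of length >= lmin, excluding the main diagonal)."""
    n = R.shape[0]
    rec_rate = R[~np.eye(n, dtype=bool)].mean()
    det_points = total_points = 0
    for k in range(1, n):                       # each off-main diagonal
        diag = np.diagonal(R, offset=k)
        total_points += 2 * diag.sum()          # use symmetry of R
        run = 0
        for v in np.append(diag, 0):            # trailing 0 closes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_points += 2 * run
                run = 0
    det = det_points / total_points if total_points else 0.0
    return rec_rate, det

# Synthetic accelerometer magnitudes: a rhythmic (SMM-like) segment and an aperiodic one.
rng = np.random.default_rng(4)
t = np.arange(400) / 50.0
rhythmic = np.abs(np.sin(2 * np.pi * 2.0 * t)) + 0.05 * rng.normal(size=t.size)
random_move = np.abs(rng.normal(size=t.size))
for name, sig in [("rhythmic", rhythmic), ("aperiodic", random_move)]:
    rr, det = rqa_features(recurrence_matrix(sig))
    print(f"{name}: recurrence rate={rr:.2f}, determinism={det:.2f}")
```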
NASA Astrophysics Data System (ADS)
Sabanin, V. R.; Starostin, A. A.; Repin, A. I.; Popov, A. I.
2017-02-01
The problem of increasing the operating effectiveness of steam boilers is considered. To maintain optimum fuel combustion modes, it is proposed to use an extremal controller (EC) that determines the airflow rate at which a boiler generating the desired amount of heat consumes the minimum amount of fuel. The EC passes the determined airflow rate as a setpoint to the airflow rate controller (ARC). Test results of a dynamic nonlinear numerical model of a steam boiler with a connected system for automatic control of load and combustion efficiency using the EC are presented. The model is created in the Simulink package of MATLAB and can be used to optimize combustion modes. Based on the modeling results, it is concluded that simultaneous boiler load control and optimization of the combustion modes by the EC is possible in principle, even when the heat of combustion of the fuel, the boiler characteristics, and the operating mode change. It is shown that the operating efficiency of steam boilers can be controlled automatically using the EC without standard flue gas analyzers. The article considers the dynamic numerical model of the steam boiler with the control schemes for fuel consumption, airflow rate, steam pressure, and the EC; the purpose of using the EC in a scheme with linear controllers and the requirements for the quality of its operation; the results of operating boiler control schemes without the EC, with an estimate of how the roughness of thermal mode maps affects the static and dynamic coupling of the fuel-consumption and airflow-rate control units; the phase trajectories and transient-process diagrams in the control scheme with the EC under stepwise changes in fuel quality and boiler characteristics; and an analysis of the modeling results and prospects for using the EC in boiler control schemes.
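A minimal sketch of the perturb-and-observe idea behind an extremal controller, applied to an invented static fuel-consumption map, could look like this (the plant map, step size, and noise level are assumptions, not the boiler model of the article):

```python
import numpy as np

def fuel_consumption(airflow, load=1.0):
    """Illustrative static plant map: for a fixed heat load, fuel consumption is
    minimal at some optimal airflow and rises on both sides (too little air ->
    incomplete combustion, too much air -> stack losses)."""
    optimal = 4.0 + 0.5 * load           # hypothetical optimum, arbitrary units
    return load * (1.0 + 0.08 * (airflow - optimal) ** 2)

def extremal_controller(airflow0=6.0, step=0.2, n_iter=40, noise=0.002):
    """Perturb-and-observe extremum seeking: nudge the airflow setpoint, keep the
    direction if fuel consumption dropped, reverse it otherwise."""
    rng = np.random.default_rng(5)
    airflow = airflow0
    prev_fuel = fuel_consumption(airflow) + rng.normal(0, noise)
    direction = -1.0
    for _ in range(n_iter):
        airflow += direction * step
        fuel = fuel_consumption(airflow) + rng.normal(0, noise)
        if fuel > prev_fuel:             # got worse: reverse the search direction
            direction = -direction
        prev_fuel = fuel
    return airflow

print("airflow setpoint found by the EC:", round(extremal_controller(), 2),
      "(optimum of the illustrative map is 4.5)")
```

In practice the search oscillates within one perturbation step of the optimum, which is the usual trade-off between tracking speed and steady-state ripple in this kind of controller.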
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, management of attribute and dynamic information, and the results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that makes it possible to integrate traditional geometry modelling, parametric element management, and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides a platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The paper introduces the plugin under development, which obtains the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data-processing steps and different platforms into the uniform Revit environment.
Measuring patterns in team interaction sequences using a discrete recurrence approach.
Gorman, Jamie C; Cooke, Nancy J; Amazeen, Polemnia G; Fouse, Shannon
2012-08-01
Recurrence-based measures of communication determinism and pattern information are described and validated using previously collected team interaction data. Team coordination dynamics has revealed that "mixing" team membership can lead to flexible interaction processes, but keeping a team "intact" can lead to rigid interaction processes. We hypothesized that communication of intact teams would have greater determinism and higher pattern information compared to that of mixed teams. Determinism and pattern information were measured from three-person Uninhabited Air Vehicle team communication sequences over a series of 40-minute missions. Because team members communicated using push-to-talk buttons, communication sequences were automatically generated during each mission. The Composition x Mission determinism effect was significant. Intact teams' determinism increased over missions, whereas mixed teams' determinism did not change. Intact teams had significantly higher maximum pattern information than mixed teams. Results from these new communication analysis methods converge with content-based methods and support our hypotheses. Because they are not content based, and because they are automatic and fast, these new methods may be amenable to real-time communication pattern analysis.
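A toy calculation of communication determinism from a categorical speaker sequence is sketched below; the speaker labels, sequences, and minimum line length are invented, and this is a generic discrete recurrence measure rather than the authors' exact procedure.

```python
import numpy as np

def determinism(sequence, lmin=2):
    """Percentage of recurrent points (identical speakers at two times) that fall
    on diagonal lines of length >= lmin in the categorical recurrence plot,
    i.e. repeated multi-step communication patterns."""
    s = np.asarray(sequence)
    R = (s[:, None] == s[None, :]).astype(int)
    det_points = total_points = 0
    for k in range(1, len(s)):                  # off-main diagonals only
        diag = np.diagonal(R, offset=k)
        total_points += diag.sum()
        run = 0
        for v in np.append(diag, 0):            # trailing 0 closes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_points += run
                run = 0
    return 100.0 * det_points / total_points if total_points else 0.0

# Hypothetical push-to-talk speaker sequences for a three-person team (roles P, A, D).
rigid = list("PADPADPADPADPADPADPADPAD")            # intact-team-like, repetitive
rng = np.random.default_rng(6)
flexible = rng.choice(list("PAD"), size=len(rigid)) # mixed-team-like, varied
print("determinism rigid   : %.1f%%" % determinism(rigid))
print("determinism flexible: %.1f%%" % determinism(flexible))
```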
NASA Astrophysics Data System (ADS)
Arason, P.; Barsotti, S.; De'Michieli Vitturi, M.; Jónsson, S.; Arngrímsson, H.; Bergsson, B.; Pfeffer, M. A.; Petersen, G. N.; Bjornsson, H.
2016-12-01
Plume height and mass eruption rate are the principal scale parameters of explosive volcanic eruptions. Weather radars are important instruments for estimating plume height, due to their independence of daylight, weather and visibility. The Icelandic Meteorological Office (IMO) operates two fixed-position C-band weather radars and two mobile X-band radars. All volcanoes in Iceland can be monitored by IMO's radar network, and during the initial phases of an eruption all available radars will be set to a more detailed volcano scan. When the radar volume data is retrieved at IMO headquarters in Reykjavík, an automatic analysis is performed on the radar data above the proximity of the volcano. The plume height is automatically estimated taking into account the radar scanning strategy, beam width, and a likely reflectivity gradient at the plume top. This analysis provides a distribution of the likely plume height. The automatically determined plume height estimates from the radar data are used as input to a numerical suite that calculates the eruptive source parameters through an inversion algorithm. This is done by using the coupled system DAKOTA-PlumeMoM, which solves the 1D plume model equations iteratively by varying the input values of vent radius and vertical velocity. The model accounts for the effect of wind on the plume dynamics, using atmospheric vertical profiles extracted from the ECMWF numerical weather prediction model. Finally, the resulting estimates of mass eruption rate are used to initialize the dispersal model VOL-CALPUFF to assess the hazard due to tephra fallout, and are communicated to London VAAC to support their modelling activity for aviation safety purposes.
1981-03-31
logic testing element and a concomitant testability criterion ideally suited to dynamic circuit applications and appropriate for automatic computer...making connections automatically. PF is an experimental feature which provides users with only four different chip sizes (full, half, quarter, and eighth...initial solution is found constructively which is improved by pair-wise swapping. Results show, however, that the constructive initial sorter, which
Automatic speech recognition technology development at ITT Defense Communications Division
NASA Technical Reports Server (NTRS)
White, George M.
1977-01-01
An assessment of the applications of automatic speech recognition to defense communication systems is presented. Future research efforts include investigations into the following areas: (1) dynamic programming; (2) recognition of speech degraded by noise; (3) speaker independent recognition; (4) large vocabulary recognition; (5) word spotting and continuous speech recognition; and (6) isolated word recognition.
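As an illustration of the first listed topic, dynamic programming for isolated word matching, a classic DTW distance over toy feature sequences could be sketched as follows (the "features" here are invented one-dimensional curves, not real speech frames):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming time-warping distance between two feature
    sequences (rows are frames), as used for isolated word matching."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m] / (n + m)        # normalise by a bound on the path length

# Toy "feature sequences" for two reference words and a test utterance that is a
# time-stretched version of word one.
word1 = np.sin(np.linspace(0, 3, 40))[:, None]
word2 = np.cos(np.linspace(0, 5, 55))[:, None]
test = np.sin(np.linspace(0, 3, 60))[:, None]      # same shape as word1, slower

scores = {"word1": dtw_distance(test, word1), "word2": dtw_distance(test, word2)}
print(scores, "-> recognised as", min(scores, key=scores.get))
```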
NASA Astrophysics Data System (ADS)
Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-05-01
A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
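A much-reduced sketch of grid-based kinetic parameter search for a one-tissue compartment model is given below; the input function, noise level, and grids are invented, and the method's machine-learning reference database and candidate-competition step are not represented.

```python
import numpy as np

def one_tissue_model(t, cp, K1, k2):
    """Tissue time-activity curve for a one-tissue compartment model:
    C_T(t) = K1 * integral of Cp(tau) * exp(-k2*(t - tau)) dtau,
    evaluated by discrete convolution on a uniform time grid."""
    dt = t[1] - t[0]
    kernel = np.exp(-k2 * t)
    return K1 * np.convolve(cp, kernel)[: len(t)] * dt

# Synthetic input function and noisy target curve (true K1 = 0.1, k2 = 0.15).
t = np.arange(0, 60, 0.25)                       # minutes
cp = 5.0 * t * np.exp(-t / 2.0)                  # illustrative plasma input
rng = np.random.default_rng(7)
target = one_tissue_model(t, cp, 0.1, 0.15) + rng.normal(0, 0.05, t.size)

# Grid parameter search (the paper describes a multi-threaded search; done serially here).
K1_grid = np.linspace(0.01, 0.3, 30)
k2_grid = np.linspace(0.01, 0.5, 50)
best = min(((np.sum((one_tissue_model(t, cp, K1, k2) - target) ** 2), K1, k2)
            for K1 in K1_grid for k2 in k2_grid))
print("best fit: K1=%.3f, k2=%.3f (true 0.100, 0.150)" % (best[1], best[2]))
```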
Three-Loop Automatic Control System for a Household Solid Waste Landfill
NASA Astrophysics Data System (ADS)
Sereda, T. G.; Kostarev, S. N.
2017-05-01
Models for controlling a municipal solid waste (MSW) landfill are analysed. A distributed (spatio-temporal) landfill control model is considered, and a dynamic multi-loop landfill control model is developed. The controlled parameters (the ratio of CH4 to CO2 emission fluxes and the concentrations of heavy-metal ions) and the control actions (purging the waste array, irrigation, and adding reagents) are defined. Based on laboratory studies and flow analysis, a transfer matrix that takes into account the coupling between the control loops is developed. A system of differential equations is formulated in the frequency and time domains, and numerical approaches for solving it in finite-difference form are given.
Space Construction Experiment Definition Study (SCEDS), part 2. Volume 2: Study results
NASA Technical Reports Server (NTRS)
1982-01-01
The Space Construction Experiment (SCE) was defined for integration into the Space Shuttle. This included development of flight assignment data, revision and update of preliminary mission timelines and test plans, analysis of flight safety issues, and definition of ground operations scenarios. New requirements for the flight experiment and changes for a large space antenna feed mask test article were incorporated. The program plan and cost estimates were updated. Revised SCE structural dynamics characteristics were provided for simulation and analysis of experimental tests to define and verify control limits and interactions effects between the SCE and the Orbiter digital automatic pilot.
Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Wenzel, Sally E.; Lin, Ching-Long
2016-01-01
We propose a method to construct three-dimensional airway geometric models based on airway skeletons, or centerlines (CLs). Given a CT-segmented airway skeleton and surface, the proposed CL-based method automatically constructs subject-specific models that contain anatomical information regarding branches, include bifurcations and trifurcations, and extend from the trachea to terminal bronchioles. The resulting model can be anatomically realistic with the assistance of an image-based surface; alternatively a model with an idealized skeleton and/or branch diameters is also possible. This method systematically identifies and classifies trifurcations to successfully construct the models, which also provides the number and type of trifurcations for the analysis of the airways from an anatomical point of view. We applied this method to 16 normal and 16 severe asthmatic subjects using their computed tomography images. The average distance between the surface of the model and the image-based surface was 11% of the average voxel size of the image. The four most frequent locations of trifurcations were the left upper division bronchus, left lower lobar bronchus, right upper lobar bronchus, and right intermediate bronchus. The proposed method automatically constructed accurate subject-specific three-dimensional airway geometric models that contain anatomical information regarding branches using airway skeleton, diameters, and image-based surface geometry. The proposed method can construct (i) geometry automatically for population-based studies, (ii) trifurcations to retain the original airway topology, (iii) geometry that can be used for automatic generation of computational fluid dynamics meshes, and (iv) geometry based only on a skeleton and diameters for idealized branches. PMID:27704229
A new airborne laser rangefinder dynamic target simulator for non-stationary environment
NASA Astrophysics Data System (ADS)
Ma, Pengge; Pang, Dongdong; Yi, Yang
2017-11-01
For simulating the non-stationary environment in laser range finder product testing, a new dynamic target simulation system is studied. First, the three-pulse laser ranging principle, the composition of the laser target signal, and its mathematical representation are introduced. Then, the actual non-stationary working environment of the laser range finder is analyzed, and it is pointed out that real sunlight background clutter and target shielding effects in the laser echo are the main influencing factors. After that, the dynamic laser target signal simulation method is given. Finally, the implementation of an automatic test system based on an arbitrary waveform generator is described. Practical application shows that the new echo-signal automatic test system can simulate the real laser ranging environment of the laser range finder and is suitable for product performance testing.
General Framework for Animal Food Safety Traceability Using GS1 and RFID
NASA Astrophysics Data System (ADS)
Cao, Weizhu; Zheng, Limin; Zhu, Hong; Wu, Ping
GS1 is a global traceability standard composed of an encoding system (EAN/UCC, EPC), automatically identified data carriers (bar codes, RFID), and electronic data interchange standards (EDI, XML). RFID is a non-contact, multi-object automatic identification technique. Tracing food back to its source, standardizing RFID tags, and sharing dynamic data are urgent problems for current traceability systems. This paper designs a general framework for animal food safety traceability using GS1 and RFID. The framework uses RFID tags encoded according to EPCglobal tag data standards. Each information server has an access tier, a business tier, and a resource tier. These servers are heterogeneous and distributed, providing user access interfaces via SOAP or HTTP protocols. For sharing dynamic data, a discovery service and an object name service are used to locate the dynamic, distributed information servers.
76 FR 66220 - Automatic Underfrequency Load Shedding and Load Shedding Plans Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
..., EPRI Power Systems Dynamics Tutorial, Chapter 4 at page 4-78 (2009), available at http://www.epri.com.... Power systems consist of static components (e.g., transformers and transmission lines) and dynamic... decisions on simulations, both static and dynamic, using area power system models to meet requirements in...
Greco, Alberto; Lanata, Antonio; Valenza, Gaetano; Di Francesco, Fabio; Scilingo, Enzo Pasquale
2016-08-01
This study reports on the development of a gender-specific classification system able to discern between two valence levels of smell from information gathered from electrodermal activity (EDA) dynamics. Specifically, two odorants were administered to 32 healthy volunteers (16 males) while monitoring EDA. The cvxEDA model was used to process the EDA signal and extract features from both its tonic and phasic components. The feature set was used as input to a K-NN classifier implementing a leave-one-subject-out procedure. Results show strong differences in the accuracy of valence recognition between men (62.5%) and women (78%). We can conclude that affective olfactory stimulation significantly affects EDA dynamics, with a highly specific gender dependency.
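A minimal sketch of the leave-one-subject-out evaluation with a K-NN classifier is shown below; it uses synthetic stand-in features rather than cvxEDA outputs, and the subject/trial counts and neighbour number are only loosely modelled on the study.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for tonic/phasic EDA features: 32 subjects x 2 trials each,
# labelled by odour valence (0/1). Real cvxEDA features would replace X.
rng = np.random.default_rng(8)
n_subjects, n_trials, n_features = 32, 2, 6
subjects = np.repeat(np.arange(n_subjects), n_trials)
y = np.tile([0, 1], n_subjects)
X = rng.normal(size=(n_subjects * n_trials, n_features)) + y[:, None] * 0.8

# Leave-one-subject-out: every fold holds out all trials of a single subject.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(model, X, y, groups=subjects, cv=LeaveOneGroupOut())
print("per-subject accuracies:", np.round(scores, 2))
print("mean leave-one-subject-out accuracy: %.2f" % scores.mean())
```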
NASA Astrophysics Data System (ADS)
He, Youmin; Qu, Yueqiao; Zhang, Yi; Ma, Teng; Zhu, Jiang; Miao, Yusi; Humayun, Mark; Zhou, Qifa; Chen, Zhongping
2017-02-01
Age-related macular degeneration (AMD) is an eye condition that is considered to be one of the leading causes of blindness among people over 50. Recent studies suggest that the mechanical properties in retina layers are affected during the early onset of disease. Therefore, it is necessary to identify such changes in the individual layers of the retina so as to provide useful information for disease diagnosis. In this study, we propose using an acoustic radiation force optical coherence elastography (ARF-OCE) system to dynamically excite the porcine retina and detect the vibrational displacement with phase resolved Doppler optical coherence tomography. Due to the vibrational mechanism of the tissue response, the image quality is compromised during elastogram acquisition. In order to properly analyze the images, all signals, including the trigger and control signals for excitation, as well as detection and scanning signals, are synchronized within the OCE software and are kept consistent between frames, making it possible for easy phase unwrapping and elasticity analysis. In addition, a combination of segmentation algorithms is used to accommodate the compromised image quality. An automatic 3D segmentation method has been developed to isolate and measure the relative elasticity of every individual retinal layer. Two different segmentation schemes based on random walker and dynamic programming are implemented. The algorithm has been validated using a 3D region of the porcine retina, where individual layers have been isolated and analyzed using statistical methods. The errors compared to manual segmentation will be calculated.
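As a simplified illustration of the dynamic-programming segmentation ingredient only (not the study's combined random-walker pipeline), tracing a single layer boundary as a minimum-cost path through a synthetic B-scan could look like this; the image, cost definition, and smoothness limit are invented.

```python
import numpy as np

def dp_boundary(cost, max_jump=1):
    """Trace one layer boundary as the minimum-cost left-to-right path through a
    cost image (rows = depth, columns = A-lines), allowing the row index to change
    by at most max_jump between neighbouring columns."""
    n_rows, n_cols = cost.shape
    acc = cost.copy()
    back = np.zeros((n_rows, n_cols), dtype=int)
    for j in range(1, n_cols):
        for i in range(n_rows):
            lo, hi = max(0, i - max_jump), min(n_rows, i + max_jump + 1)
            k = np.argmin(acc[lo:hi, j - 1]) + lo
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(n_cols - 1, 0, -1):
        path.append(back[path[-1], j])
    return np.array(path[::-1])                    # boundary row for each column

# Synthetic B-scan: a bright interface along a gentle curve; the cost is the
# negative axial intensity gradient, so the path locks onto the interface.
rows, cols = 120, 200
truth = (60 + 10 * np.sin(np.linspace(0, 2 * np.pi, cols))).astype(int)
img = np.zeros((rows, cols))
for j, r in enumerate(truth):
    img[r:, j] = 1.0
img += np.random.default_rng(9).normal(0, 0.1, img.shape)
cost = -np.abs(np.diff(img, axis=0, prepend=img[:1]))
boundary = dp_boundary(cost)
print("mean |error| in pixels:", np.abs(boundary - truth).mean())
```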
A VxD-based automatic blending system using multithreaded programming.
Wang, L; Jiang, X; Chen, Y; Tan, K C
2004-01-01
This paper discusses the object-oriented software design for an automatic blending system. By combining the advantages of a programmable logic controller (PLC) and an industrial control PC (ICPC), an automatic blending control system is developed for a chemical plant. The system structure and multithread-based communication approach are first presented in this paper. The overall software design issues, such as system requirements and functionalities, are then discussed in detail. Furthermore, by replacing the conventional dynamic link library (DLL) with virtual X device drivers (VxD's), a practical and cost-effective solution is provided to improve the robustness of the Windows platform-based automatic blending system in small- and medium-sized plants.
Application of automatic image analysis in wood science
Charles W. McMillin
1982-01-01
In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...
Automatic cortical thickness analysis on rodent brain
NASA Astrophysics Data System (ADS)
Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek
2011-03-01
Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the different scales; even adult rodent brains are 50 to 100 times smaller than humans. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step provides the neocortex separation from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate spline based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood and performed t-test between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
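A two-dimensional toy version of the Laplace-based thickness measurement is sketched below: an annulus stands in for the cortex, a Jacobi iteration replaces the paper's PDE solvers, and the transport equation and particle correspondence steps are omitted. The geometry, grid, and iteration counts are all illustrative.

```python
import numpy as np

# 2-D toy "cortex": an annulus between an inner (r=10) and outer (r=20) boundary.
y, x = np.mgrid[-25:25:101j, -25:25:101j]
r = np.hypot(x, y)
inner, outer = r <= 10, r >= 20
cortex = ~inner & ~outer

# Solve Laplace's equation with u=0 on the inner and u=1 on the outer boundary
# by Jacobi iteration; level sets of u are the analogue of cortical laminae.
u = np.where(outer, 1.0, 0.0)
for _ in range(5000):
    u_new = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                    np.roll(u, 1, 1) + np.roll(u, -1, 1))
    u = np.where(cortex, u_new, u)           # keep boundary values fixed

# Thickness at a point: length of the streamline of grad(u) traced from near the
# inner boundary out to the outer boundary (simple Euler integration).
gy, gx = np.gradient(u, 0.5)                 # grid spacing is 0.5

def streamline_length(px, py, h=0.1):
    length = 0.0
    while np.hypot(px, py) < 20:             # march until the outer boundary
        i = int(round((py + 25) / 0.5)); j = int(round((px + 25) / 0.5))
        g = np.array([gx[i, j], gy[i, j]])
        g = g / (np.linalg.norm(g) + 1e-12)
        px, py = px + h * g[0], py + h * g[1]
        length += h
    return length

# Start just outside the inner boundary; the result should be close to the
# analytic gap of 10 between the two circles.
print("estimated thickness:", round(streamline_length(10.2, 0.0), 2))
```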
A superpixel-based framework for automatic tumor segmentation on breast DCE-MRI
NASA Astrophysics Data System (ADS)
Yu, Ning; Wu, Jia; Weinstein, Susan P.; Gaonkar, Bilwaj; Keller, Brad M.; Ashraf, Ahmed B.; Jiang, YunQing; Davatzikos, Christos; Conant, Emily F.; Kontos, Despina
2015-03-01
Accurate and efficient automated tumor segmentation in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is highly desirable for computer-aided tumor diagnosis. We propose a novel automatic segmentation framework which incorporates mean-shift smoothing, superpixel-wise classification, pixel-wise graph-cuts partitioning, and morphological refinement. A set of 15 breast DCE-MR images, obtained from the American College of Radiology Imaging Network (ACRIN) 6657 I-SPY trial, were manually segmented to generate tumor masks (as ground truth) and breast masks (as regions of interest). Four state-of-the-art segmentation approaches based on diverse models were also utilized for comparison. Based on five standard evaluation metrics for segmentation, the proposed framework consistently outperformed all other approaches. The performance of the proposed framework was: 1) 0.83 for Dice similarity coefficient, 2) 0.96 for pixel-wise accuracy, 3) 0.72 for VOC score, 4) 0.79 mm for mean absolute difference, and 5) 11.71 mm for maximum Hausdorff distance, which surpassed the second best method (i.e., adaptive geodesic transformation), a semi-automatic algorithm depending on precise initialization. Our results suggest promising potential applications of our segmentation framework in assisting analysis of breast carcinomas.
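As an illustration of the pipeline shape only, the sketch below runs a much simplified analogue on a synthetic slice: Gaussian smoothing stands in for mean-shift, square blocks stand in for superpixels, a plain intensity rule stands in for the trained classifier and graph cuts, and morphology refines the mask. All thresholds and sizes are invented assumptions, not values from the paper.

```python
"""Simplified sketch of a superpixel-style segmentation pipeline on a synthetic
DCE-MR-like slice. The real framework uses mean-shift smoothing, superpixel-wise
classification and graph-cuts partitioning; cruder stand-ins are used here."""
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (128, 128))                  # background enhancement
yy, xx = np.mgrid[:128, :128]
img[(yy - 60) ** 2 + (xx - 70) ** 2 < 15 ** 2] += 0.6    # synthetic "tumor"

smooth = ndimage.gaussian_filter(img, sigma=2)           # stand-in for mean-shift smoothing

# Crude "superpixels": 8x8 blocks, classified by their mean enhancement.
b = 8
blocks = smooth.reshape(128 // b, b, 128 // b, b).mean(axis=(1, 3))
block_mask = blocks > 0.5                                # block-wise classification
mask = np.repeat(np.repeat(block_mask, b, axis=0), b, axis=1)  # back to pixel grid

# Morphological refinement of the pixel-wise mask.
mask = ndimage.binary_opening(mask, iterations=1)
mask = ndimage.binary_closing(mask, iterations=2)

labels, n = ndimage.label(mask)
print("candidate tumor regions:", n, "- largest area:",
      int(np.bincount(labels.ravel())[1:].max()) if n else 0)
```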
Pereira, Clayton R; Pereira, Danilo R; Rosa, Gustavo H; Albuquerque, Victor H C; Weber, Silke A T; Hook, Christian; Papa, João P
2018-05-01
Parkinson's disease (PD) is considered a degenerative disorder that affects the motor system, which may cause tremors, micrography, and the freezing of gait. Although PD is related to the lack of dopamine, the triggering process of its development is not fully understood yet. In this work, we introduce convolutional neural networks to learn features from images produced by handwritten dynamics, which capture different information during the individual's assessment. Additionally, we make available a dataset composed of images and signal-based data to foster the research related to computer-aided PD diagnosis. The proposed approach was compared against raw data and texture-based descriptors, showing suitable results, mainly in the context of early stage detection, with recognition rates close to 95%. The analysis of handwritten dynamics using deep learning techniques proved to be useful for automatic Parkinson's disease identification and can outperform handcrafted features. Copyright © 2018 Elsevier B.V. All rights reserved.
Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.
Serçinoglu, Onur; Ozbek, Pemra
2018-05-25
Atomistic molecular dynamics (MD) simulations generate a wealth of information related to the dynamics of proteins. If properly analyzed, this information can lead to new insights regarding protein function and assist wet-lab experiments. Aiming to identify interactions between individual amino acid residues and the role played by each in the context of MD simulations, we present a stand-alone software called gRINN (get Residue Interaction eNergies and Networks). gRINN features graphical user interfaces (GUIs) and a command-line interface for generating and analyzing pairwise residue interaction energies and energy correlations from protein MD simulation trajectories. gRINN utilizes the features of NAMD or GROMACS MD simulation packages and automatizes the steps necessary to extract residue-residue interaction energies from user-supplied simulation trajectories, greatly simplifying the analysis for the end-user. A GUI, including an embedded molecular viewer, is provided for visualization of interaction energy time-series, distributions, an interaction energy matrix, interaction energy correlations and a residue correlation matrix. gRINN additionally offers construction and analysis of Protein Energy Networks, providing residue-based metrics such as degrees, betweenness-centralities, closeness centralities as well as shortest path analysis. gRINN is free and open to all users without login requirement at http://grinn.readthedocs.io.
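To make the post-processing step concrete, the sketch below mimics what such a workflow produces downstream of the simulation package: given per-frame pairwise residue interaction energies (random placeholders here instead of NAMD/GROMACS output), it computes an energy correlation matrix and builds a Protein Energy Network with networkx to obtain degree, betweenness and closeness centralities and shortest paths. The edge cutoff and system size are arbitrary assumptions; this is not gRINN's own code.

```python
"""Minimal sketch of residue-interaction-energy post-processing: correlation
matrix plus a Protein Energy Network with standard centrality metrics."""
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_res, n_frames = 20, 500
# energies[i, j, t]: interaction energy of residue pair (i, j) at frame t (placeholder data).
energies = rng.normal(0.0, 1.0, (n_res, n_res, n_frames))
energies = (energies + energies.transpose(1, 0, 2)) / 2       # symmetric pairs

# Mean interaction energy matrix and per-residue energy time series.
mean_ij = energies.mean(axis=2)
per_res = energies.sum(axis=1)                                 # (n_res, n_frames)
corr = np.corrcoef(per_res)                                    # residue correlation matrix
print("max off-diagonal energy correlation:", round(np.max(np.abs(corr - np.eye(n_res))), 2))

# Protein Energy Network: edge where the mean pair energy is "significant".
G = nx.Graph()
G.add_nodes_from(range(n_res))
for i in range(n_res):
    for j in range(i + 1, n_res):
        if abs(mean_ij[i, j]) > 0.05:                          # arbitrary cutoff
            G.add_edge(i, j, weight=abs(mean_ij[i, j]))

print("degrees:", dict(G.degree()))
print("betweenness:", nx.betweenness_centrality(G, weight="weight"))
print("closeness:", nx.closeness_centrality(G))
if nx.has_path(G, 0, n_res - 1):
    print("shortest path 0 ->", n_res - 1, ":",
          nx.shortest_path(G, 0, n_res - 1, weight="weight"))
```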
Application of the ADAMS program to deployable space truss structures
NASA Technical Reports Server (NTRS)
Calleson, R. E.
1985-01-01
The need for a computer program to perform kinematic and dynamic analyses of large truss structures while deploying from a packaged configuration in space led to the evaluation of several existing programs. ADAMS (automatic dynamic analysis of mechanical systems), a generalized program for performing the dynamic simulation of mechanical systems undergoing large displacements, is applied to two concepts of deployable space antenna units. One concept is a one cube folding unit of Martin Marietta's Box Truss Antenna and the other is a tetrahedral truss unit of a Tetrahedral Truss Antenna. Adequate evaluation of dynamic forces during member latch-up into the deployed configuration is not yet available from the present version of ADAMS since it is limited to the assembly of rigid bodies. Included is a method for estimating the maximum bending stress in a surface member at latch-up. Results include member displacement and velocity responses during extension and an example of member bending stresses at latch-up.
Fast and robust segmentation in the SDO-AIA era
NASA Astrophysics Data System (ADS)
Verbeeck, Cis; Delouille, Véronique; Mampaey, Benjamin; Hochedez, Jean-François; Boyes, David; Barra, Vincent
Solar images from the Atmospheric Imaging Assembly (AIA) aboard the Solar Dynamics Observatory (SDO) will flood the solar physics community with a wealth of information on solar variability, of great importance both in solar physics and in view of Space Weather applications. Obtaining this information, however, requires the ability to automatically process large amounts of data in an objective fashion. In previous work, we have proposed an unsupervised spatially-constrained multi-channel fuzzy clustering algorithm (SPoCA) that automatically segments EUV solar images into Active Regions (AR), Coronal Holes (CH), and Quiet Sun (QS). This algorithm will run in near real time on AIA data as part of the SDO Feature Finding Project, a suite of software pipeline modules for automated feature recognition and analysis for the imagery from SDO. After having corrected for the limb brightening effect, SPoCA computes an optimal clustering with respect to the regions of interest using fuzzy logic on a quality criterion to manage the various noises present in the images and the imprecision in the definition of the above regions. Next, the algorithm applies a morphological opening operation, smoothing the cluster edges while preserving their general shape. The process is fast and automatic. A lower size limit is used to distinguish AR from Bright Points. As the algorithm segments the coronal images according to their brightness, it might happen that an AR is detected as several disjoint pieces, if the brightness in between is somewhat lower. Morphological dilation is employed to reconstruct the AR themselves from their constituent pieces. Combining SPoCA's detection of AR, CH, and QS on subsequent images allows automatic tracking and naming of any region of interest. In the SDO software pipeline, SPoCA will automatically populate the Heliophysics Events Knowledgebase (HEK) with Active Region events. Further, the algorithm has a huge potential for correct and automatic identification of AR, CH, and QS in any study that aims to address properties of those specific regions in the corona. SPoCA is now ready and waiting to tackle solar cycle 24 using SDO data. While we presently apply SPoCA to EUV data, the method is generic enough to allow the introduction of other channels or data, e.g., Differential Emission Measure (DEM) maps. Because of the unprecedented challenges brought up by the quantity of SDO data, European partners have gathered within an ISSI team on `Mining and Exploiting the NASA Solar Dynamics Observatory data in Europe' (a.k.a. Soldyneuro). Its aim is to provide automated feature recognition algorithms for scanning the SDO archive, as well as conducting scientific studies that combine different algorithms' outputs. Within the Soldyneuro project, we will use data from the EUV Variability Experiment (EVE) spectrometer in order to estimate the full Sun DEM. This DEM will next be used to estimate the total flux from AIA images so as to provide a validation for the calibration of AIA.
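The sketch below illustrates the core clustering step in a stripped-down form: an unsupervised fuzzy c-means segmentation of pixel intensities in a synthetic EUV-like image into three classes, followed by a morphological opening of the brightest class. The standard fuzzy c-means update rules are implemented directly; limb-brightening correction, the spatial constraint and the multi-channel aspects of SPoCA are deliberately omitted, and all image values are invented.

```python
"""SPoCA-flavoured sketch: fuzzy c-means over intensities into three classes
(quiet Sun, coronal hole, active region) plus a morphological opening."""
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = rng.gamma(shape=2.0, scale=50.0, size=(128, 128))        # quiet-Sun-like background
img[20:50, 20:60] *= 0.2                                       # dark "coronal hole"
img[80:110, 70:110] *= 6.0                                     # bright "active region"

x = img.reshape(-1, 1).astype(float)
c, m = 3, 2.0                                                  # three classes, fuzziness 2
centers = np.percentile(x, [10, 50, 90]).reshape(c, 1)
for _ in range(50):                                            # fuzzy c-means iterations
    d = np.abs(x - centers.T) + 1e-9                           # (n_pixels, c) distances
    u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    w = u ** m
    centers = (w.T @ x) / w.sum(axis=0)[:, None]

labels = np.argmax(u, axis=1).reshape(img.shape)
ar_class = int(np.argmax(centers.ravel()))                     # brightest cluster = AR
ar_mask = ndimage.binary_opening(labels == ar_class, iterations=2)  # smooth cluster edges
print("AR pixel fraction:", round(ar_mask.mean(), 4))
```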
NASA Technical Reports Server (NTRS)
Halyo, N.
1976-01-01
A digital automatic control law to capture a steep glideslope and track the glideslope to a specified altitude is developed for the longitudinal/vertical dynamics of a CTOL aircraft using modern estimation and control techniques. The control law uses a constant gain Kalman filter to process guidance information from the microwave landing system, together with acceleration data from body-mounted accelerometers. The filter outputs navigation data and wind velocity estimates which are used in controlling the aircraft. Results from a digital simulation of the aircraft dynamics and the control law are presented for various wind conditions.
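A hedged sketch of the constant-gain filtering idea is given below: a fixed, precomputed gain blends a position measurement (a stand-in for MLS guidance) with integrated accelerometer data to estimate altitude and sink rate. The dynamics, gain values and noise levels are illustrative assumptions, not the aircraft values from the paper.

```python
"""Constant-gain Kalman filter sketch: predict with an accelerometer input,
correct with a noisy altitude measurement using a fixed gain."""
import numpy as np

dt = 0.05
F = np.array([[1.0, dt], [0.0, 1.0]])        # state: [altitude, vertical speed]
B = np.array([[0.5 * dt ** 2], [dt]])        # accelerometer input matrix
H = np.array([[1.0, 0.0]])                   # MLS-like altitude measurement
K = np.array([[0.20], [0.15]])               # constant Kalman gain (assumed precomputed offline)

rng = np.random.default_rng(3)
x_true = np.array([300.0, -3.0])             # true altitude (m), sink rate (m/s)
x_hat = np.array([280.0, 0.0])               # initial estimate, deliberately off

for k in range(400):
    a = -0.02 * x_true[1]                    # mild damping as a toy vertical dynamic
    x_true = F @ x_true + B.ravel() * a
    z = H @ x_true + rng.normal(0.0, 2.0)    # noisy altitude measurement
    # Constant-gain filter: predict with the (noisy) accelerometer, correct with the measurement.
    x_pred = F @ x_hat + B.ravel() * (a + rng.normal(0.0, 0.05))
    x_hat = x_pred + (K @ (z - H @ x_pred)).ravel()

print("true state:", np.round(x_true, 2), "estimate:", np.round(x_hat, 2))
```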
Automatic control of the Skylab Astronaut Maneuvering Research Vehicle.
NASA Technical Reports Server (NTRS)
Murtagh, T. B.; Goodwin, M. A.; Greenlee, J. E.; Whitsett, C. E.
1973-01-01
The two automatic control modes of the Astronaut Maneuvering Research Vehicle (AMRV) are analyzed: the control moment gyro (CMG) and the rate gyro (RG). The AMRV is an autonomous maneuvering unit which translates and rotates the pilot by means of hand-controller input commands. The CMG normal operation, desaturation, and cage/lock dynamics are described in terms of a realistic AMRV mass property configuration. No propellant is used for normal operation in the CMG mode, and the maximum rotation rate is 5 deg/sec about each AMRV axis. The RG attitude maneuvering and limit-cycle submode dynamics are described in terms of the same AMRV mass property configuration.
Power-based Shift Schedule for Pure Electric Vehicle with a Two-speed Automatic Transmission
NASA Astrophysics Data System (ADS)
Wang, Jiaqi; Liu, Yanfang; Liu, Qiang; Xu, Xiangyang
2016-11-01
This paper introduces a comprehensive shift schedule for a two-speed automatic transmission of a pure electric vehicle. Considering the driving ability and efficiency performance of electric vehicles, the power-based shift schedule is proposed based on three principles. This comprehensive shift schedule takes the current vehicle speed and motor load power as input parameters to satisfy the vehicle's driving power demand with the lowest energy consumption. A simulation model has been established to verify the dynamic and economic performance of the comprehensive shift schedule. Compared with traditional dynamic and economic shift schedules, simulation results indicate that the power-based shift schedule is superior to traditional shift schedules.
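The sketch below illustrates the shape of a power-based shift rule: given vehicle speed and requested motor load power, pick the gear whose motor operating point has the higher estimated efficiency while still meeting the demand. The gear ratios, rpm limit and efficiency map are invented placeholders, not the paper's calibration.

```python
"""Illustrative power-based gear selection for a two-speed EV transmission."""

GEAR_RATIOS = {1: 11.0, 2: 5.0}         # placeholder overall ratios (motor to wheel)
WHEEL_RADIUS = 0.30                     # m
MOTOR_MAX_POWER = 80e3                  # W, placeholder rating
MOTOR_MAX_RPM = 9000.0

def motor_speed_rpm(vehicle_speed_kmh, gear):
    wheel_rps = (vehicle_speed_kmh / 3.6) / (2 * 3.14159 * WHEEL_RADIUS)
    return wheel_rps * GEAR_RATIOS[gear] * 60.0

def motor_efficiency(rpm, power_w):
    """Toy efficiency map: best near mid speed and mid load."""
    speed_term = 1.0 - ((rpm - 4000.0) / 6000.0) ** 2
    load_term = 1.0 - ((power_w - 0.5 * MOTOR_MAX_POWER) / MOTOR_MAX_POWER) ** 2
    return max(0.05, 0.95 * speed_term * load_term)

def select_gear(vehicle_speed_kmh, demand_power_w):
    """Power-based rule: among gears that can deliver the demand, pick the most efficient."""
    candidates = []
    for gear in GEAR_RATIOS:
        rpm = motor_speed_rpm(vehicle_speed_kmh, gear)
        if rpm < MOTOR_MAX_RPM and demand_power_w <= MOTOR_MAX_POWER:   # feasibility checks
            candidates.append((motor_efficiency(rpm, demand_power_w), gear))
    return max(candidates)[1] if candidates else 2                      # default to high gear

for v, p in [(20, 15e3), (60, 30e3), (110, 50e3)]:
    print(f"{v:>4} km/h, {p/1000:>4.0f} kW  ->  gear {select_gear(v, p)}")
```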
An Algorithm for Automatic Checking of Exercises in a Dynamic Geometry System: iGeom
ERIC Educational Resources Information Center
Isotani, Seiji; de Oliveira Brandao, Leonidas
2008-01-01
One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces the advantages of using e-learning environments on a larger scale, as usually happens in Brazil.…
Automatic facial mimicry in response to dynamic emotional stimuli in five-month-old infants.
Isomura, Tomoko; Nakano, Tamami
2016-12-14
Human adults automatically mimic others' emotional expressions, which is believed to contribute to sharing emotions with others. Although this behaviour appears fundamental to social reciprocity, little is known about its developmental process. Therefore, we examined whether infants show automatic facial mimicry in response to others' emotional expressions. Facial electromyographic activity over the corrugator supercilii (brow) and zygomaticus major (cheek) of four- to five-month-old infants was measured while they viewed dynamic clips presenting audiovisual, visual and auditory emotions. The audiovisual bimodal emotion stimuli were a display of a laughing/crying facial expression with an emotionally congruent vocalization, whereas the visual/auditory unimodal emotion stimuli displayed those emotional faces/vocalizations paired with a neutral vocalization/face, respectively. Increased activation of the corrugator supercilii muscle in response to audiovisual cries and the zygomaticus major in response to audiovisual laughter were observed between 500 and 1000 ms after stimulus onset, which clearly suggests rapid facial mimicry. By contrast, both visual and auditory unimodal emotion stimuli did not activate the infants' corresponding muscles. These results revealed that automatic facial mimicry is present as early as five months of age, when multimodal emotional information is present. © 2016 The Author(s).
CAVER 3.0: A Tool for the Analysis of Transport Pathways in Dynamic Protein Structures
Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri
2012-01-01
Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz. PMID:23093919
CAVER 3.0: a tool for the analysis of transport pathways in dynamic protein structures.
Chovancova, Eva; Pavelka, Antonin; Benes, Petr; Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri
2012-01-01
Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz.
2012-03-01
[Abstract not recovered; only report front matter survives, citing Yechout (2003) and Nelson (1998) and listing entries such as "Figure 9: USAFA/Brandt Jet5 Aircraft Modeling Program", "2.5.1 Dynamic Aircraft Stability Modes", "Figure 7: Body-Fixed Reference Frame" and "Figure 8: Static and Dynamic…".]
ERIC Educational Resources Information Center
Kuhn, Stephanie A. Contrucci; Triggs, Mandy
2009-01-01
Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…
Pi, Yiming
2017-01-01
The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profile (HRRP) and Doppler signatures, can be acquired by the terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures were first achieved from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW) extended to multi-modal signals is used to accomplish the classifications. Ten types of gesture signals, collected from a terahertz radar, are applied to validate the analysis and the recognition system. The results of the experiment indicate that the recognition rate reaches more than 91%. This research verifies the potential applications of dynamic gesture recognition using a terahertz radar. PMID:29267249
Zhou, Zhi; Cao, Zongjie; Pi, Yiming
2017-12-21
The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profile (HRRP) and Doppler signatures, can be acquired by the terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures were first achieved from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW) extended to multi-modal signals is used to accomplish the classifications. Ten types of gesture signals, collected from a terahertz radar, are applied to validate the analysis and the recognition system. The results of the experiment indicate that the recognition rate reaches more than 91%. This research verifies the potential applications of dynamic gesture recognition using a terahertz radar.
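The sketch below shows a minimal nearest-neighbour classifier with Dynamic Time Warping extended to two modalities, in the spirit of the approach above: one DTW distance on HRRP sequences, one on Doppler signatures, combined by a weighted sum. The templates are random placeholders standing in for radar echoes, and the weights are arbitrary.

```python
"""Multi-modal DTW sketch: combine DTW distances over HRRP and Doppler sequences."""
import numpy as np

def dtw(a, b):
    """Classic DTW distance between two sequences of feature vectors."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def multimodal_distance(sample, template, w_hrrp=0.5, w_doppler=0.5):
    return (w_hrrp * dtw(sample["hrrp"], template["hrrp"])
            + w_doppler * dtw(sample["doppler"], template["doppler"]))

rng = np.random.default_rng(4)
make = lambda offset: {"hrrp": rng.normal(offset, 1.0, (30, 16)),
                       "doppler": rng.normal(offset, 1.0, (30, 8))}
templates = {"wave": make(0.0), "push": make(1.0), "circle": make(2.0)}

test = make(1.05)   # unlabeled gesture, closest to "push" by construction
scores = {label: multimodal_distance(test, tpl) for label, tpl in templates.items()}
print("predicted gesture:", sorted(scores.items(), key=lambda kv: kv[1])[0][0])
```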
The MOD-OA 200 kilowatt wind turbine generator design and analysis report
NASA Astrophysics Data System (ADS)
Andersen, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-08-01
The project requirements, approach, system description, design requirements, design, analysis, system tests, installation safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the MOD-OA 200 kw wind turbine generator are discussed. The components, the rotor, driven train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems are presented. The rotor includes the blades, hub and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are covered, and dynamic loads and fatigue are analyzed.
The MOD-OA 200 kilowatt wind turbine generator design and analysis report
NASA Technical Reports Server (NTRS)
Andersen, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-01-01
The project requirements, approach, system description, design requirements, design, analysis, system tests, installation safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the MOD-OA 200 kw wind turbine generator are discussed. The components, the rotor, driven train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems are presented. The rotor includes the blades, hub and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are covered, and dynamic loads and fatigue are analyzed.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
Dixit, Sudeepa; Fox, Mark; Pal, Anupam
2014-01-01
Magnetic resonance imaging (MRI) has advantages for the assessment of gastrointestinal structures and functions; however, processing MRI data is time consuming and this has limited uptake to a few specialist centers. This study introduces a semiautomatic image processing system for rapid analysis of gastrointestinal MRI. For assessment of simpler regions of interest (ROI) such as the stomach, the system generates virtual images along arbitrary planes that intersect the ROI edges in the original images. This generates seed points that are joined automatically to form contours on each adjacent two-dimensional image and reconstructed in three dimensions (3D). An alternative thresholding approach is available for rapid assessment of complex structures like the small intestine. For assessment of dynamic gastrointestinal function, such as gastric accommodation and emptying, the initial 3D reconstruction is used as a reference to process adjacent image stacks automatically. This generates four-dimensional (4D) reconstructions of dynamic volume change over time. Compared with manual processing, this semiautomatic system reduced the user input required to analyze an MRI gastric emptying study (estimated 100 vs. 10,000 mouse clicks). This analysis was not subject to the variation in volume measurements seen between three human observers. In conclusion, the image processing platform presented processed large volumes of MRI data, such as that produced by gastric accommodation and emptying studies, with minimal user input. 3D and 4D reconstructions of the stomach and, potentially, other gastrointestinal organs are produced faster and more accurately than manual methods. This system will facilitate the application of MRI in gastrointestinal research and clinical practice. PMID:25540229
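A toy version of the thresholding branch of such a workflow is sketched below: segment a bright organ-like region in each 3D volume of a synthetic 4D (volume-over-time) data set and report its volume per time point. Intensities, shapes, the threshold and the voxel size are all invented; the seed/contour-based 3D reconstruction of the real platform is not reproduced.

```python
"""4D volume-over-time sketch: threshold, clean up and measure the largest region per time point."""
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
nt, nz, ny, nx = 6, 20, 64, 64
zz, yy, xx = np.mgrid[:nz, :ny, :nx]

volumes_ml = []
for t in range(nt):
    radius = 14 - t                                     # "emptying": region shrinks over time
    organ = ((zz - 10) ** 2 * 4 + (yy - 32) ** 2 + (xx - 32) ** 2) < radius ** 2
    image = rng.normal(100.0, 10.0, (nz, ny, nx)) + 150.0 * organ

    mask = image > 180.0                                # global threshold
    mask = ndimage.binary_opening(mask, iterations=1)   # remove speckle
    labels, n = ndimage.label(mask)
    if n:
        largest = np.argmax(np.bincount(labels.ravel())[1:]) + 1
        voxels = int((labels == largest).sum())
    else:
        voxels = 0
    voxel_ml = 0.002                                    # assumed voxel volume (2 mm^3)
    volumes_ml.append(voxels * voxel_ml)

print("volume over time (ml):", [round(v, 1) for v in volumes_ml])
```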
CAD system for automatic analysis of CT perfusion maps
NASA Astrophysics Data System (ADS)
Hachaj, T.; Ogiela, M. R.
2011-03-01
In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (decision about the type of lesion: ischemic/hemorrhagic, and whether the brain tissue is at risk of infarction or not) of visualized symptoms is done by so-called cognitive inference processes allowing for reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.
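A simplified sketch of the quantitative part of such a system is given below: left/right perfusion asymmetries in a CBF-like map are detected by comparing each pixel with its mirrored counterpart and keeping large connected regions of relative difference. The synthetic map, the 30% threshold and the minimum region size are arbitrary; the atlas-based description and the semantic interpretation stage are beyond this sketch.

```python
"""CBF asymmetry detection sketch: mirror across the midline, threshold the
relative difference, keep sizeable connected regions."""
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
cbf = rng.normal(50.0, 4.0, (128, 128))               # ml/100 g/min, synthetic map
cbf[60:90, 20:45] *= 0.5                              # simulated one-sided hypoperfusion

mirrored = cbf[:, ::-1]                               # reflect across the midline
rel_diff = (mirrored - cbf) / (0.5 * (cbf + mirrored) + 1e-6)

asym = rel_diff > 0.3                                 # >30% lower than the contralateral side
asym = ndimage.binary_opening(asym, iterations=1)
labels, n = ndimage.label(asym)
sizes = ndimage.sum(asym, labels, index=list(range(1, n + 1))) if n else []
lesions = [i + 1 for i, s in enumerate(sizes) if s > 100]   # keep sizeable regions only

print("suspected hypoperfused regions:", len(lesions))
for lab in lesions:
    ys, xs = np.where(labels == lab)
    print(f"  region {lab}: {len(ys)} px, centroid ({ys.mean():.0f}, {xs.mean():.0f})")
```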
NASA Technical Reports Server (NTRS)
Stone, H. W.; Powell, R. W.
1977-01-01
A six-degree-of-freedom simulation analysis was conducted to examine the effects of longitudinal static aerodynamic stability and control uncertainties on the performance of the space shuttle orbiter automatic (no manual inputs) entry guidance and control systems. To establish the acceptable boundaries, the static aerodynamic characteristics were varied either by applying a multiplier to the aerodynamic parameter or by adding an increment. With either of two previously identified control system modifications included, the acceptable longitudinal aerodynamic boundaries were determined.
Li, Song-Tao; Liu, Yong; Zhou, Qiang; Lue, Ren-Fa; Song, Lei; Dong, Shi-Wu; Guo, Ping; Kopjar, Branko
2014-03-01
This study introduced a prototype of an axial-stress bioreactor system that supports long-term growth and development of engineered tissues. The main features of this bioreactor are an integrated substance exchanger and feedback control of pH and PO₂. A 21-day study was conducted to validate the system's ability to maintain a stable environment, while remaining sterile. Our results showed that the pH, PO₂, and nutrient (glucose) remained balanced at appropriate levels, while metabolic waste (lactic acid) was removed. No bacteria or fungi were detected in the system or tissue; thus, demonstrating that it was sterile. These data indicate the bioreactor's strong potential for long-term tissue culture. To explore this idea, the effect of dynamic culture, including cyclic compression and automatic substance exchange, on mouse bone-marrow mesenchymal stem cells (BMSCs) seeded in decalcified bone matrix was studied using the bioreactor prototype. Histological sections of the engineered tissues showed higher cell densities in scaffolds in dynamic culture compared to those in static culture, while cell cycle analysis showed that dynamic culture promoted BMSC proliferation (proliferation index, PI=34.02±1.77) more effectively than static culture (PI=26.66±1.81). The results from a methyl thiazolyl tetrazolium assay were consistent with the loading experimental data. Furthermore, elevated alkaline phosphatase activity and calcium content were observed in dynamic condition compared to static culture. In conclusion, this bioreactor system supplies a method of modulating the pH and PO₂ in defined ranges with only small fluctuations; it can be used as a physiological or pathological analog. Automatic control of the environment is a practical solution for long-term, steady-state culture for future commercialization.
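The conceptual feedback idea can be sketched as a simple band-keeping loop: trigger the substance exchanger and gas supply when pH or PO2 leaves a defined window. All process dynamics, set points and bands below are invented placeholders; a real controller would be tuned to the bioreactor hardware.

```python
"""Toy feedback loop for pH/PO2 band control with substance exchange triggers."""
import random

PH_BAND = (7.2, 7.5)          # acceptable pH window (assumed)
PO2_BAND = (120.0, 180.0)     # acceptable PO2 window in mmHg (assumed)

ph, po2 = 7.4, 160.0
random.seed(7)
for hour in range(48):
    # Cellular metabolism slowly acidifies the medium and consumes oxygen.
    ph -= 0.02 + random.uniform(-0.005, 0.005)
    po2 -= 2.5 + random.uniform(-0.5, 0.5)

    actions = []
    if ph < PH_BAND[0]:                 # exchange spent medium for fresh buffered medium
        ph = 7.4
        actions.append("medium exchange")
    if po2 < PO2_BAND[0]:               # increase oxygen supply
        po2 = 170.0
        actions.append("O2 boost")

    if actions:
        print(f"t = {hour:2d} h  pH = {ph:.2f}  PO2 = {po2:5.1f}  ->  {', '.join(actions)}")
```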
Rosende, María; Beesley, Luke; Moreno-Jimenez, Eduardo; Miró, Manuel
2016-02-01
An automatic in-vitro bioaccessibility test based upon dynamic microcolumn extraction in a programmable flow setup is herein proposed as a screening tool to evaluate bio-char based remediation of mine soils contaminated with trace elements as a compelling alternative to conventional phyto-availability tests. The feasibility of the proposed system was evaluated by extracting the readily bioaccessible pools of As, Pb and Zn in two contaminated mine soils before and after the addition of two biochars (9% (w:w)) of diverse source origin (pine and olive). Bioaccessible fractions under worst-case scenarios were measured using 0.001 mol L(-1) CaCl2 as extractant for mimicking plant uptake, and analysis of the extracts by inductively coupled optical emission spectrometry. The t-test of comparison of means revealed an efficient metal (mostly Pb and Zn) immobilization by the action of olive pruning-based biochar against the bare (control) soil at the 0.05 significance level. In-vitro flow-through bioaccessibility tests are compared for the first time with in-vivo phyto-toxicity assays in a microcosm soil study. By assessing seed germination and shoot elongation of Lolium perenne in contaminated soils with and without biochar amendments the dynamic flow-based bioaccessibility data proved to be in good agreement with the phyto-availability tests. Experimental results indicate that the dynamic extraction method is a viable and economical in-vitro tool in risk assessment explorations to evaluate the feasibility of a given biochar amendment for revegetation and remediation of metal contaminated soils in a mere 10 min against 4 days in case of phyto-toxicity assays. Copyright © 2015 Elsevier B.V. All rights reserved.
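The statistical comparison mentioned above amounts to a two-sample t-test on replicate bioaccessible concentrations from the control soil versus the biochar-amended soil, as sketched below. All numbers are invented illustration values, not data from the study.

```python
"""Two-sample t-test sketch for control vs. biochar-amended bioaccessible Pb."""
from scipy import stats

# Replicate bioaccessible Pb fractions (e.g., mg kg-1), invented example values.
control_soil = [12.4, 11.8, 13.1, 12.9, 12.2]
olive_biochar_amended = [8.1, 7.6, 8.4, 7.9, 8.3]

t_stat, p_value = stats.ttest_ind(control_soil, olive_biochar_amended)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Mean bioaccessible Pb differs significantly at the 0.05 level.")
```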
Visual display and alarm system for wind tunnel static and dynamic loads
NASA Technical Reports Server (NTRS)
Hanly, Richard D.; Fogarty, James T.
1987-01-01
A wind tunnel balance monitor and alarm system developed at NASA Ames Research Center will produce several beneficial results. The costs of wind tunnel delays because of inadvertent balance damage and the costs of balance repair or replacement can be greatly reduced or eliminated with better real-time information on the balance static and dynamic loading. The wind tunnel itself will have enhanced utility with the elimination of overly cautious limits on test conditions. The microprocessor-based system features automatic scaling and 16 multicolored LED bargraphs to indicate both static and dynamic components of the signals from eight individual channels. Five individually programmable alarm levels are available with relay closures for internal or external visual and audible warning devices and other functions such as automatic activation of external recording devices, model positioning mechanisms, or tunnel shutdown.
Visual display and alarm system for wind tunnel static and dynamic loads
NASA Technical Reports Server (NTRS)
Hanly, Richard D.; Fogarty, James T.
1987-01-01
A wind tunnel balance monitor and alarm system developed at NASA Ames Research Center will produce several beneficial results. The costs of wind tunnel delays because of inadvertent balance damage and the costs of balance repair or replacement can be greatly reduced or eliminated with better real-time information on the balance static and dynamic loading. The wind tunnel itself will have enhanced utility with the elimination of overly cautious limits on test conditions. The microprocessor-based system features automatic scaling and 16 multicolored LED bargraphs to indicate both static and dynamic components of the signals from eight individual channels. Five individually programmable alarm levels are available with relay closures for internal or external visual and audible warning devices and other functions such as automatic activation of external recording devices, model positioning mechanism, or tunnel shutdown.
Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.
Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod
2017-07-15
There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks-three core neurocognitive systems with central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal flexibility among the three networks. Our proposed methods provide a novel and powerful generative model for investigating dynamic brain connectivity. Copyright © 2017 Elsevier Inc. All rights reserved.
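As a highly simplified stand-in for the kind of analysis above, the sketch below fits a Gaussian hidden Markov model (via the hmmlearn package, an assumption of this sketch) to multivariate "fMRI-like" time series and reports latent states and transition probabilities. BSFA itself additionally embeds factor analysis within each state and selects the number of states by Bayesian model selection, neither of which is done here.

```python
"""Plain Gaussian-HMM sketch for latent brain-state estimation (not BSFA)."""
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(8)
# Synthetic 3-region time series that alternates between two regimes.
means = [np.zeros(3), np.array([1.5, -1.0, 0.5])]
segments = [rng.normal(means[i % 2], 1.0, (100, 3)) for i in range(6)]
X = np.vstack(segments)                      # (600 time points, 3 regions)

model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)

print("estimated transition matrix:\n", np.round(model.transmat_, 3))
print("fraction of time in each state:", np.round(np.bincount(states) / len(states), 3))
```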
Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data
2017-01-01
… files, organized by location. The data were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis … [Remainder is report front matter: ERDC/CHL TR-17-2, Coastal Inlets Research Program, January 2017, "Tidal Analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data", Brandan M. Scully.]
[Study on the automatic parameters identification of water pipe network model].
Jia, Hai-Feng; Zhao, Qi-Feng
2010-01-01
Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as a key bottleneck for the model's application in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The kernel algorithms of the automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte-Carlo Sampling) is used for automatic identification of the parameters; the detailed technical route based on RSA and MCS is presented. A module for automatic identification of water pipe network model parameters is developed. Finally, a typical water pipe network is selected as a case, the case study on automatic identification of model parameters is conducted, and satisfactory results are achieved.
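A toy version of the RSA/MCS idea is sketched below: draw candidate roughness coefficients from prior ranges (Monte Carlo sampling), run a surrogate "network model", and keep the parameter sets whose simulated pressures are consistent with SCADA-like observations (the behavioural sets of an RSA-style analysis). The two-pipe head-loss model, priors and tolerance are deliberately crude placeholders for a real hydraulic solver.

```python
"""Monte Carlo parameter identification sketch for a two-pipe toy network."""
import numpy as np

rng = np.random.default_rng(9)

def simulate_pressures(c1, c2, demand=0.05):
    """Crude Hazen-Williams-style head losses on two pipes in series (placeholder model)."""
    h1 = 10.67 * (demand / c1) ** 1.852 * 500 / 0.3 ** 4.87
    h2 = 10.67 * (demand / c2) ** 1.852 * 300 / 0.25 ** 4.87
    reservoir_head = 60.0
    return np.array([reservoir_head - h1, reservoir_head - h1 - h2])

true_c = (110.0, 95.0)
observed = simulate_pressures(*true_c) + rng.normal(0.0, 0.2, 2)     # SCADA-like readings

samples = rng.uniform([60.0, 60.0], [150.0, 150.0], size=(5000, 2))  # MCS over priors
errors = np.array([np.abs(simulate_pressures(c1, c2) - observed).max()
                   for c1, c2 in samples])

behavioural = samples[errors < 0.5]            # parameter sets consistent with the data
print("behavioural sets:", len(behavioural))
print("identified C factors (median):", np.round(np.median(behavioural, axis=0), 1),
      "vs true", true_c)
```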
Kauppi, Jukka-Pekka; Martikainen, Kalle; Ruotsalainen, Ulla
2010-12-01
The central purpose of passive signal intercept receivers is to perform automatic categorization of unknown radar signals. Currently, there is an urgent need to develop intelligent classification algorithms for these devices due to emerging complexity of radar waveforms. Especially multifunction radars (MFRs) capable of performing several simultaneous tasks by utilizing complex, dynamically varying scheduled waveforms are a major challenge for automatic pattern classification systems. To assist recognition of complex radar emissions in modern intercept receivers, we have developed a novel method to recognize dynamically varying pulse repetition interval (PRI) modulation patterns emitted by MFRs. We use robust feature extraction and classifier design techniques to assist recognition in unpredictable real-world signal environments. We classify received pulse trains hierarchically which allows unambiguous detection of the subpatterns using a sliding window. Accuracy, robustness and reliability of the technique are demonstrated with extensive simulations using both static and dynamically varying PRI modulation patterns. Copyright © 2010 Elsevier Ltd. All rights reserved.
Computerized Interpretation of Dynamic Breast MRI
2006-05-01
… correction, tumor segmentation, extraction of computerized features that help distinguish between benign and malignant lesions, and classification. … The primary feature used for 3D tumor segmentation is the postcontrast enhancement vector. … Investigation of methods for automatic tumor segmentation: we developed an automatic method for assessing tumor extent in 3D.
Gated high speed optical detector
NASA Technical Reports Server (NTRS)
Green, S. I.; Carson, L. M.; Neal, G. W.
1973-01-01
The design, fabrication, and test of two gated, high speed optical detectors for use in high speed digital laser communication links are discussed. The optical detectors used a dynamic crossed field photomultiplier and electronics including dc bias and RF drive circuits, automatic remote synchronization circuits, automatic gain control circuits, and threshold detection circuits. The equipment is used to detect binary encoded signals from a mode locked neodymium laser.
The dynamics of rupture in porous media
NASA Astrophysics Data System (ADS)
Stopiński, Wojciech; Ponomaryov, Aleksandr V.; Loś, Vladimir
1991-05-01
This paper presents a laboratory investigation of the electric resistivity parameter for samples subject to loading in an automatic press of “INOVA” type. The procedure of automatic quasi-continuous measurements of resistivity is briefly outlined. The distribution of mini-electrodes within the sample is described. Also shown is the manner in which reliability can be improved by increasing the repetition of resistivity measurements (every 7–16 s).
NASA Technical Reports Server (NTRS)
Hasler, A. F.; Strong, J.; Woodward, R. H.; Pierce, H.
1991-01-01
Results are presented on an automatic stereo analysis of cloud-top heights from nearly simultaneous satellite image pairs from the GOES and NOAA satellites, using a massively parallel processor computer. Comparisons of computer-derived height fields and manually analyzed fields show that the automatic analysis technique shows promise for performing routine stereo analysis in a real-time environment, providing a useful forecasting tool by augmenting observational data sets of severe thunderstorms and hurricanes. Simulations using synthetic stereo data show that it is possible to automatically resolve small-scale features such as 4000-m-diam clouds to about 1500 m in the vertical.
Features and perspectives of automatized construction crane-manipulators
NASA Astrophysics Data System (ADS)
Stepanov, Mikhail A.; Ilukhin, Peter A.
2018-03-01
Modern construction industry still has a high percentage of manual labor, and the greatest prospects for improving the construction process lie in the field of automatization. In this article automatized construction manipulator-cranes are studied in order to arrive at the most rational design scheme. This is done through formulating a list of general conditions necessary for such cranes and a set of specialized kinematical conditions. A variety of kinematical schemes is evaluated via these conditions, and some are taken for further dynamical analysis. The comparative dynamical analysis of the selected schemes was made and the most rational scheme was defined. This provides a basis for more complex and practical research on manipulator-crane design, and ways to implement such cranes in practice can now be worked out properly. Also, the prospects for implementing automated control systems and information networks on construction sites in order to improve the quality of construction work, labour safety and ecological safety are shown.
Jing, Xueping; Zheng, Xiujuan; Song, Shaoli; Liu, Kai
2017-12-01
Glomerular filtration rate (GFR), which can be estimated by the Gates method with dynamic kidney single photon emission computed tomography (SPECT) imaging, is a key indicator of renal function. In this paper, an automatic computed tomography (CT)-assisted detection method of the kidney region of interest (ROI) is proposed to achieve objective and accurate GFR calculation. In this method, the CT coronal projection image and the enhanced SPECT synthetic image are first generated and registered together. Then, the kidney ROIs are delineated using a modified level set algorithm. Meanwhile, the background ROIs are also obtained based on the kidney ROIs. Finally, the value of GFR is calculated via the Gates method. Compared with the clinical data, the GFR values estimated by the proposed method were consistent with the clinical reports. This automatic method can improve the accuracy and stability of kidney ROI detection for GFR calculation, especially when the kidney function has been severely damaged.
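The Gates computation that follows ROI detection can be sketched as below: background-corrected, attenuation/depth-corrected kidney counts are expressed as a percentage of the injected dose and converted to GFR with the empirical Gates regression. The attenuation coefficient and regression coefficients are the commonly quoted Gates (1982) values reproduced from memory and should be verified against the clinical protocol; all count values are invented.

```python
"""Hedged sketch of the Gates GFR calculation from kidney ROI counts."""
import math

MU_TC99M = 0.153          # assumed linear attenuation coefficient for 99mTc in soft tissue (1/cm)
GATES_SLOPE = 9.8127      # empirical Gates regression coefficients (verify before use)
GATES_INTERCEPT = -6.82525

def kidney_uptake_fraction(kidney_counts, background_counts, depth_cm, injected_counts):
    # Background subtraction and depth (attenuation) correction.
    corrected = (kidney_counts - background_counts) / math.exp(-MU_TC99M * depth_cm)
    return corrected / injected_counts

def gates_gfr(left, right):
    """left/right: dicts with ROI counts, background counts, kidney depth (cm) and injected counts."""
    total_uptake = sum(kidney_uptake_fraction(k["counts"], k["background"],
                                              k["depth_cm"], k["injected"])
                       for k in (left, right))
    return GATES_SLOPE * (100.0 * total_uptake) + GATES_INTERCEPT  # ml/min

example = dict(counts=52000.0, background=9000.0, depth_cm=6.5, injected=2.4e6)
print("GFR estimate (ml/min):", round(gates_gfr(example, dict(example, counts=48000.0)), 1))
```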
Ebrahimi, Farideh; Setarehdan, Seyed-Kamaledin; Ayala-Moyeda, Jose; Nazeran, Homer
2013-10-01
The conventional method for sleep staging is to analyze polysomnograms (PSGs) recorded in a sleep lab. The electroencephalogram (EEG) is one of the most important signals in PSGs but recording and analysis of this signal presents a number of technical challenges, especially at home. Instead, electrocardiograms (ECGs) are much easier to record and may offer an attractive alternative for home sleep monitoring. The heart rate variability (HRV) signal proves suitable for automatic sleep staging. Thirty PSGs from the Sleep Heart Health Study (SHHS) database were used. Three feature sets were extracted from 5- and 0.5-min HRV segments: time-domain features, nonlinear-dynamics features and time-frequency features. The latter was achieved by using empirical mode decomposition (EMD) and discrete wavelet transform (DWT) methods. Normalized energies in important frequency bands of HRV signals were computed using time-frequency methods. ANOVA and t-test were used for statistical evaluations. Automatic sleep staging was based on HRV signal features. The ANOVA followed by a post hoc Bonferroni was used for individual feature assessment. Most features were beneficial for sleep staging. A t-test was used to compare the means of extracted features in 5- and 0.5-min HRV segments. The results showed that the extracted features means were statistically similar for a small number of features. A separability measure showed that time-frequency features, especially EMD features, had larger separation than others. There was not a sizable difference in separability of linear features between 5- and 0.5-min HRV segments but separability of nonlinear features, especially EMD features, decreased in 0.5-min HRV segments. HRV signal features were classified by linear discriminant (LD) and quadratic discriminant (QD) methods. Classification results based on features from 5-min segments surpassed those obtained from 0.5-min segments. The best result was obtained from features using 5-min HRV segments classified by the LD classifier. A combination of linear/nonlinear features from HRV signals is effective in automatic sleep staging. Moreover, time-frequency features are more informative than others. In addition, a separability measure and classification results showed that HRV signal features, especially nonlinear features, extracted from 5-min segments are more discriminative than those from 0.5-min segments in automatic sleep staging. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
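The classification step can be illustrated under simplifying assumptions: compute two classic time-domain HRV features (SDNN and RMSSD) per 5-min segment of RR intervals and feed them to a linear discriminant classifier. This stands in for the richer feature set (nonlinear, EMD and wavelet features) used in the study; the RR data below are synthetic, and real segments would come from ECG-derived RR series.

```python
"""Time-domain HRV features plus linear discriminant classification (sketch)."""
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(10)

def hrv_features(rr_ms):
    sdnn = np.std(rr_ms)                           # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))  # short-term variability
    return [sdnn, rmssd]

def synth_segment(stage):
    # Crude assumption: deep sleep -> slower, less variable heart rate than wake.
    mean, sd = (1050, 25) if stage == "deep" else (850, 60)
    return rng.normal(mean, sd, 300)               # ~5 min of RR intervals (ms)

stages = ["deep", "wake"] * 40
X = np.array([hrv_features(synth_segment(s)) for s in stages])
y = np.array(stages)

clf = LinearDiscriminantAnalysis().fit(X[:60], y[:60])
print("held-out accuracy:", round(clf.score(X[60:], y[60:]), 2))
```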
Integration of scheduling and discrete event simulation systems to improve production flow planning
NASA Astrophysics Data System (ADS)
Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.
2016-08-01
The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling and computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and with the labour-intensive and time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. This approach has been illustrated through examples of practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.
An automatic damage detection algorithm based on the Short Time Impulse Response Function
NASA Astrophysics Data System (ADS)
Auletta, Gianluca; Carlo Ponzo, Felice; Ditommaso, Rocco; Iacovino, Chiara
2016-04-01
Structural Health Monitoring, together with dynamic identification and damage detection techniques, has been increasing in popularity in both the scientific and civil communities in recent years. The basic idea arises from the observation that spectral properties, described in terms of the so-called modal parameters (eigenfrequencies, mode shapes, and modal damping), are functions of the physical properties of the structure (mass, energy dissipation mechanisms and stiffness). Damage detection techniques traditionally consist of visual inspection and/or non-destructive testing. A different approach consists of vibration-based methods that detect changes in features related to damage. Structural damage exhibits its main effects in terms of stiffness and damping variation. Damage detection approaches based on dynamic monitoring of structural properties over time have received considerable attention in the recent scientific literature. We focus on structural damage localization and detection after an earthquake, based on the evaluation of the mode curvature difference. The methodology is based on the acquisition of the structural dynamic response through a three-directional accelerometer installed on the top floor of the structure. It is able to assess the presence of damage in the structure, also providing information about its position and severity. The procedure is based on a Band-Variable Filter (Ditommaso et al., 2012), used to extract the dynamic characteristics of systems that evolve over time by acting simultaneously in both the time and frequency domains. In this paper, using a combined approach based on the Fourier transform and on seismic interferometric analysis, a useful tool for the automatic fundamental frequency evaluation of nonlinear structures has been proposed. Moreover, using this kind of approach it is possible to improve some of the existing methods for automatic damage detection, providing stable results also during the strong-motion phase. This approach helps to overcome the limitation derived from the use of techniques based on the simple Fourier transform, which provide good results when the response of the monitored system is stationary but fail when the system exhibits non-stationary behaviour. The main advantage derived from the use of the proposed approach for Structural Health Monitoring is the simplicity of the interpretation of the nonlinear variations of the fundamental frequency. The proposed methodology has been tested on numerical models of reinforced concrete structures designed for only gravity loads, without and with the presence of infill panels. In order to verify the effectiveness of the proposed approach for the automatic evaluation of the fundamental frequency over time, the results of an experimental campaign of shaking table tests conducted at the seismic laboratory of the University of Basilicata (SISLAB) have been used. Acknowledgements This study was partially funded by the Italian Civil Protection Department within the project DPC-RELUIS 2015 - RS4 ''Seismic observatory of structures and health monitoring''. References Ditommaso, R., Mucciarelli, M., Ponzo, F.C. (2012) Analysis of non-stationary structural systems by using a band-variable filter. Bulletin of Earthquake Engineering. DOI: 10.1007/s10518-012-9338-y.
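A much simpler stand-in for the idea of tracking a structure's fundamental frequency over time is sketched below: a short-time Fourier transform of a synthetic roof-level acceleration record whose resonance drops halfway through the record, mimicking damage-induced softening. The band-variable filter and the interferometric analysis of the paper are not reproduced; signal parameters are invented.

```python
"""Dominant-frequency tracking over time with a short-time Fourier transform."""
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 120, 1 / fs)
# Resonant frequency drifts from 2.0 Hz down to 1.6 Hz halfway through the record.
f0 = np.where(t < 60, 2.0, 1.6)
phase = 2 * np.pi * np.cumsum(f0) / fs
record = np.sin(phase) + 0.3 * np.random.default_rng(11).normal(size=t.size)

f, tt, Z = signal.stft(record, fs=fs, nperseg=1024)
band = (f > 0.5) & (f < 5.0)                       # restrict to a plausible structural band
dominant = f[band][np.argmax(np.abs(Z[band, :]), axis=0)]

for ti, fi in zip(tt[::2], dominant[::2]):
    print(f"t = {ti:6.1f} s   estimated fundamental frequency = {fi:.2f} Hz")
```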
An Expert Assistant for Computer Aided Parallelization
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit
2004-01-01
The prototype implementation of an expert system was developed to assist the user in the computer aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate the successful use in full scale scientific applications.
Tissue Cartography: Compressing Bio-Image Data by Dimensional Reduction
Heemskerk, Idse; Streichan, Sebastian J
2017-01-01
High data volumes produced by state-of-the-art optical microscopes encumber research. Taking advantage of the laminar structure of many biological specimens, we developed a method that reduces data size and processing time by orders of magnitude, while disentangling signal. The Image Surface Analysis Environment that we implemented automatically constructs an atlas of 2D images for arbitrarily shaped, dynamic, and possibly multi-layered “Surfaces of Interest”. Built-in correction for cartographic distortion assures no information on the surface is lost, making it suitable for quantitative analysis. We demonstrate our approach by application to 4D imaging of the D. melanogaster embryo and D. rerio beating heart. PMID:26524242
Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2003-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.
Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders
Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini
2008-01-01
Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693
Detecting similarities among distant homologous proteins by comparison of domain flexibilities.
Pandini, Alessandro; Mauri, Giancarlo; Bordogna, Annalisa; Bonati, Laura
2007-06-01
The aim of this work is to assess the informativeness of protein dynamics in the detection of similarities among distant homologous proteins. To this end, an approach to perform large-scale comparisons of protein domain flexibilities is proposed. CONCOORD is confirmed as a reliable method for fast conformational sampling. The root mean square fluctuation of alpha carbon positions in the essential dynamics subspace is employed as a measure of local flexibility, and a synthetic index of similarity is presented. The dynamics of a large collection of protein domains from ASTRAL/SCOP40 is analyzed and the possibility of identifying relationships, at both the family and the superfamily levels, on the basis of the dynamical features is discussed. The obtained picture is in agreement with the SCOP classification, and furthermore suggests the presence of a distinguishable family-level trend in the flexibility profiles. The results support the complementarity of the dynamical and the structural information, suggesting that information from dynamics analysis can arise from functional similarities, often partially hidden by a static comparison. On the basis of this first test, flexibility annotation can be expected to help in automatically detecting functional similarities otherwise unrecoverable.
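As a rough illustration of the flexibility measure and similarity index described above, the sketch below computes a per-residue RMSF from an aligned conformational ensemble and compares two flexibility profiles with a simple correlation. It is a generic approximation under assumed array shapes, not the CONCOORD/essential-dynamics pipeline itself.

```python
import numpy as np

def rmsf(coords):
    """Per-residue root mean square fluctuation.

    coords : (n_frames, n_residues, 3) array of aligned C-alpha positions (assumed layout).
    """
    coords = np.asarray(coords, dtype=float)
    mean_pos = coords.mean(axis=0)                      # (n_residues, 3)
    sq_dev = ((coords - mean_pos) ** 2).sum(axis=2)     # (n_frames, n_residues)
    return np.sqrt(sq_dev.mean(axis=0))                 # (n_residues,)

def flexibility_similarity(profile_a, profile_b):
    """Illustrative similarity index: Pearson correlation of two RMSF
    profiles of equal length (e.g., after aligning the two domains)."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

# Toy usage with random ensembles of a 50-residue domain.
rng = np.random.default_rng(0)
ens_a = rng.normal(size=(200, 50, 3))
ens_b = ens_a + rng.normal(scale=0.1, size=(200, 50, 3))
print(flexibility_similarity(rmsf(ens_a), rmsf(ens_b)))
```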
1989-08-01
Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis. Final Technical Report, December ... Keywords: pattern recognition, blackboard-oriented symbolic processing, knowledge-based image analysis, image understanding, aerial imagery, urban areas.
Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN
NASA Technical Reports Server (NTRS)
Griffis, H.
1985-01-01
Large general purpose finite element programs require users to develop large quantities of input data. General purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by specific application pre-processors. Automatic Dynamic Aircraft Modeler (ADAM) is one such application specific pre-processor. General purpose pre-processors use points, lines and surfaces to describe geometric shapes. Specifying that ADAM is used only for aircraft structures allows generic structural sections, wing boxes and bodies, to be pre-defined. Hence with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions a complete model of an aircraft can be created.
Self-calibrating models for dynamic monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1994-01-01
The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.
NASA Technical Reports Server (NTRS)
Graves, Sharon S.; Burner, Alpheus W.; Edwards, John W.; Schuster, David M.
2001-01-01
The techniques used to acquire, reduce, and analyze dynamic deformation measurements of an aeroelastic semispan wind tunnel model are presented. Single-camera, single-view video photogrammetry (also referred to as videogrammetric model deformation, or VMD) was used to determine dynamic aeroelastic deformation of the semispan 'Models for Aeroelastic Validation Research Involving Computation' (MAVRIC) model in the Transonic Dynamics Tunnel at the NASA Langley Research Center. Dynamic deformation was determined from optical retroreflective tape targets at five semispan locations located on the wing from the root to the tip. Digitized video images from a charge coupled device (CCD) camera were recorded and processed to automatically determine target image plane locations that were then corrected for sensor, lens, and frame grabber spatial errors. Videogrammetric dynamic data were acquired at a 60-Hz rate for time records of up to 6 seconds during portions of this flutter/Limit Cycle Oscillation (LCO) test at Mach numbers from 0.3 to 0.96. Spectral analysis of the deformation data is used to identify dominant frequencies in the wing motion. The dynamic data will be used to separate aerodynamic and structural effects and to provide time history deflection data for Computational Aeroelasticity code evaluation and validation.
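A minimal sketch of the spectral-analysis step: given a 60-Hz deflection time history such as the 6-second records mentioned above, a Welch periodogram exposes the dominant frequency of the wing motion. Only the sampling rate and record length are taken from the abstract; the 5.5-Hz test signal and noise level are invented for illustration.

```python
import numpy as np
from scipy.signal import welch

fs = 60.0                                   # videogrammetric sampling rate, Hz
t = np.arange(0, 6.0, 1.0 / fs)             # 6-second record
# Synthetic wingtip deflection: a 5.5 Hz LCO-like oscillation plus noise (assumed values).
deflection = 2.0 * np.sin(2 * np.pi * 5.5 * t) + 0.3 * np.random.randn(t.size)

freqs, psd = welch(deflection, fs=fs, nperseg=128)
dominant = freqs[np.argmax(psd)]
print(f"dominant frequency ~ {dominant:.2f} Hz")
```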
Automatic segmentation of trees in dynamic outdoor environments
USDA-ARS?s Scientific Manuscript database
Segmentation in dynamic outdoor environments can be difficult when the illumination levels and other aspects of the scene cannot be controlled. Specifically in agricultural contexts, a background material is often used to shield a camera's field of view from other rows of crops. In this paper, we ...
Self-calibrating models for dynamic monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1996-01-01
A method for automatically building qualitative and semi-quantitative models of dynamic systems, and using them for monitoring and fault diagnosis, is developed and demonstrated. The qualitative approach and semi-quantitative method are applied to monitoring observation streams, and to design of non-linear control systems.
Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.
Frieauff, W; Martus, H J; Suter, W; Elhajouji, A
2013-01-01
The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and a specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure as well as flow cytometric approaches have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as cytotoxicity parameter, as well as for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility to analyse 24 slides within 65h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for the micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS which is supporting various assays at Novartis.
DELINEATING SUBTYPES OF SELF-INJURIOUS BEHAVIOR MAINTAINED BY AUTOMATIC REINFORCEMENT
Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.
2016-01-01
Self-injurious behavior (SIB) is maintained by automatic reinforcement in roughly 25% of cases. Automatically reinforced SIB typically has been considered a single functional category, and is less understood than socially reinforced SIB. Subtyping automatically reinforced SIB into functional categories has the potential to guide the development of more targeted interventions and increase our understanding of its biological underpinnings. The current study involved an analysis of 39 individuals with automatically reinforced SIB and a comparison group of 13 individuals with socially reinforced SIB. Automatically reinforced SIB was categorized into 3 subtypes based on patterns of responding in the functional analysis and the presence of self-restraint. These response features were selected as the basis for subtyping on the premise that they could reflect functional properties of SIB unique to each subtype. Analysis of treatment data revealed important differences across subtypes and provides preliminary support to warrant additional research on this proposed subtyping model. PMID:26223959
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U. S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods is an interactive process between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
Application of automatic gain control for radiometer diagnostic in SST-1 tokamak.
Makwana, Foram R; Siju, Varsha; Edappala, Praveenlal; Pathak, S K
2017-12-01
This paper describes the characterisation of a negative feedback type of automatic gain control (AGC) circuit that will be an integral part of the heterodyne radiometer system operating at a frequency range of 75-86 GHz at SST-1 tokamak. The developed AGC circuit combines a variable gain amplifier and a log amplifier, providing gain and attenuation typically up to 15 dB and 45 dB, respectively, at a fixed set-point voltage, and it has been explored for the first time in a tokamak radiometry application. It also exhibits a very fast response time of 390 ns, suited to resolving the fast dynamics of electron cyclotron emission, and can operate over a very wide input RF power dynamic range of around 60 dB, which keeps the signal level within the dynamic range of the detection system.
Torres, M E; Añino, M M; Schlotthauer, G
2003-12-01
It is well known that, from a dynamical point of view, sudden variations in the physiological parameters which govern certain diseases can cause qualitative changes in the dynamics of the corresponding physiological process. The purpose of this paper is to introduce a technique that allows the automated temporal localization of slight changes in a parameter of the law that governs the nonlinear dynamics of a given signal. This tool takes, from the multiresolution entropies, the ability to show these changes as statistical variations at each scale. These variations are captured in the corresponding principal component. Appropriately combining these techniques with a statistical change detector, a complexity change detection algorithm is obtained. The relevance of the approach, together with its robustness in the presence of moderate noise, is discussed in numerical simulations and the automatic detector is applied to real and simulated biological signals.
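The general idea of tracking multiresolution entropies and flagging complexity changes can be sketched as below, assuming the PyWavelets package for the wavelet decomposition and a simple principal-component z-score detector in place of the authors' statistical detector. The db4 wavelet, window sizes, and threshold k are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def multiresolution_entropy(window, wavelet="db4", level=4):
    """Shannon-like entropy of the energy distribution at each wavelet scale."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    entropies = []
    for c in coeffs:
        e = c ** 2
        p = e / (e.sum() + 1e-12)
        entropies.append(-np.sum(p * np.log(p + 1e-12)))
    return np.array(entropies)

def entropy_track(signal, win=256, step=64):
    """Entropy vector for each sliding window of the signal."""
    return np.array([multiresolution_entropy(signal[i:i + win])
                     for i in range(0, len(signal) - win, step)])

def flag_changes(track, k=3.0):
    """Simple statistical detector: flag windows whose first principal
    component score deviates more than k sigma from its mean."""
    centered = track - track.mean(axis=0)
    pc1 = centered @ np.linalg.svd(centered, full_matrices=False)[2][0]
    z = (pc1 - pc1.mean()) / (pc1.std() + 1e-12)
    return np.where(np.abs(z) > k)[0]

# Toy usage on a signal whose variance changes halfway through.
sig = np.concatenate([np.random.randn(4096), 1.8 * np.random.randn(4096)])
print(flag_changes(entropy_track(sig)))
```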
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Venkat; Das, Trishna
Increasing variable generation penetration and the consequent increase in short-term variability make energy storage technologies look attractive, especially in the ancillary market for providing frequency regulation services. This paper presents slow-dynamics models for compressed air energy storage and battery storage technologies that can be used in automatic generation control studies to assess the system frequency response and quantify the benefits from storage technologies in providing regulation service. The paper also presents the slow-dynamics model of the power system integrated with storage technologies in a complete state-space form. The storage technologies have been integrated into the IEEE 24-bus system with a single area, and a comparative study of various solution strategies, including transmission enhancement and combustion turbines, has been performed in terms of generation cycling and frequency response performance metrics.
Study of the cerrado vegetation in the Federal District area from orbital data. M.S. Thesis
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Aoki, H.; Dossantos, J. R.
1980-01-01
The physiognomic units of cerrado in the area of Distrito Federal (DF) were studied through the visual and automatic analysis of products provided by the Multispectral Scanning System (MSS) of LANDSAT. The visual analysis of the multispectral images in black and white, at the 1:250,000 scale, was made based on texture and tonal patterns. The automatic analysis of the compatible computer tapes (CCT) was made by means of the IMAGE-100 system. The following conclusions were obtained: (1) the delimitation of cerrado vegetation forms can be made by both visual and automatic analysis; (2) in the visual analysis, the principal parameter used to discriminate the cerrado forms was the tonal pattern, independently of the season, and channel 5 gave better information; (3) in the automatic analysis, the data of the four MSS channels can be used to discriminate the cerrado forms; and (4) in the automatic analysis, combinations of the four channels gave more information for separating cerrado units when soil types were considered.
Validation of automatic segmentation of ribs for NTCP modeling.
Stam, Barbara; Peulen, Heike; Rossi, Maddalena M G; Belderbos, José S A; Sonke, Jan-Jakob
2016-03-01
Determination of a dose-effect relation for rib fractures in a large patient group has been limited by the time consuming manual delineation of ribs. Automatic segmentation could facilitate such an analysis. We determine the accuracy of automatic rib segmentation in the context of normal tissue complication probability modeling (NTCP). Forty-one patients with stage I/II non-small cell lung cancer treated with SBRT to 54 Gy in 3 fractions were selected. Using the 4DCT derived mid-ventilation planning CT, all ribs were manually contoured and automatically segmented. Accuracy of segmentation was assessed using volumetric, shape and dosimetric measures. Manual and automatic dosimetric parameters Dx and EUD were tested for equivalence using the Two One-Sided T-test (TOST), and assessed for agreement using Bland-Altman analysis. NTCP models based on manual and automatic segmentation were compared. Automatic segmentation was comparable with the manual delineation in radial direction, but larger near the costal cartilage and vertebrae. Manual and automatic Dx and EUD were significantly equivalent. The Bland-Altman analysis showed good agreement. The two NTCP models were very similar. Automatic rib segmentation was significantly equivalent to manual delineation and can be used for NTCP modeling in a large patient group. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
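The TOST equivalence test mentioned above can be sketched as two one-sided paired t-tests against an equivalence margin; the 1 Gy margin and the simulated paired EUD values below are purely illustrative assumptions, not the study's data or margin.

```python
import numpy as np
from scipy import stats

def tost_paired(x, y, delta):
    """Two One-Sided T-tests for equivalence of paired measurements.

    x, y  : paired dosimetric parameters (e.g., manual vs. automatic EUD).
    delta : equivalence margin; the methods are declared equivalent if the
            mean difference lies within (-delta, +delta).
    Returns the larger of the two one-sided p-values.
    """
    d = np.asarray(x, float) - np.asarray(y, float)
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)
    t_lower = (d.mean() + delta) / se      # tests H0: mean difference <= -delta
    t_upper = (d.mean() - delta) / se      # tests H0: mean difference >= +delta
    p_lower = 1.0 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    return max(p_lower, p_upper)

# Toy usage: hypothetical margin of 1 Gy on simulated paired EUD values for 41 patients.
rng = np.random.default_rng(1)
manual = rng.normal(20.0, 3.0, 41)
auto = manual + rng.normal(0.1, 0.4, 41)
print(tost_paired(manual, auto, delta=1.0))
```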
Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization
NASA Technical Reports Server (NTRS)
Green, Lawrence; Carle, Alan; Fagan, Mike
1999-01-01
Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem, based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop limit is reached, or no further design improvement is possible due to active design variable bounds and/or constraints. The resulting shape parameters are then used by the grid generation code to define a new wing surface and computational grid. The lift-to-drag ratio and its gradient are computed for the new design by the automatically-generated adjoint codes. Several optimization iterations may be required to find an optimum wing shape. Results from two sample cases will be discussed. The reader should note that this work primarily represents a demonstration of use of automatically- generated adjoint code within an aerodynamic shape optimization. As such, little significance is placed upon the actual optimization results, relative to the method for obtaining the results.
Variational Identification of Markovian Transition States
NASA Astrophysics Data System (ADS)
Martini, Linda; Kells, Adam; Covino, Roberto; Hummer, Gerhard; Buchete, Nicolae-Viorel; Rosta, Edina
2017-07-01
We present a method that enables the identification and analysis of conformational Markovian transition states from atomistic or coarse-grained molecular dynamics (MD) trajectories. Our algorithm is presented by using both analytical models and examples from MD simulations of the benchmark system helix-forming peptide Ala5 , and of larger, biomedically important systems: the 15-lipoxygenase-2 enzyme (15-LOX-2), the epidermal growth factor receptor (EGFR) protein, and the Mga2 fungal transcription factor. The analysis of 15-LOX-2 uses data generated exclusively from biased umbrella sampling simulations carried out at the hybrid ab initio density functional theory (DFT) quantum mechanics/molecular mechanics (QM/MM) level of theory. In all cases, our method automatically identifies the corresponding transition states and metastable conformations in a variationally optimal way, with the input of a set of relevant coordinates, by accurately reproducing the intrinsic slowest relaxation rate of each system. Our approach offers a general yet easy-to-implement analysis method that provides unique insight into the molecular mechanism and the rare but crucial (i.e., rate-limiting) transition states occurring along conformational transition paths in complex dynamical systems such as molecular trajectories.
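For context, the slowest relaxation rate that the method reproduces is conventionally read off the second-largest eigenvalue of a Markov transition matrix estimated from a discretized trajectory. The sketch below shows only that standard relation on a toy two-state system; it is not the authors' variational algorithm, and all trajectory parameters are invented.

```python
import numpy as np

def transition_matrix(dtraj, n_states, lag=1):
    """Row-normalized transition count matrix from a discretized trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def slowest_timescale(T, lag_time=1.0):
    """Implied timescale of the slowest relaxation: -tau / ln(lambda_2)."""
    eigvals = np.sort(np.real(np.linalg.eigvals(T)))[::-1]
    return -lag_time / np.log(eigvals[1])

# Toy two-state system with rare transitions (two metastable conformations).
rng = np.random.default_rng(2)
dtraj, state = [], 0
for _ in range(20000):
    if rng.random() < 0.01:          # 1% chance per step to hop between states
        state = 1 - state
    dtraj.append(state)
print(slowest_timescale(transition_matrix(np.array(dtraj), 2)))
```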
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
Hyper-X Stage Separation Trajectory Validation Studies
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Bose, David M.; McMinn, John D.; Martin, John G.; Strovers, Brian K.
2003-01-01
An independent twelve degree-of-freedom simulation of the X-43A separation trajectory was created with the Program to Optimize Simulated Trajectories (POST II). This simulation modeled the multi-body dynamics of the X-43A and its booster and included the effect of two pyrotechnically actuated pistons used to push the vehicles apart as well as aerodynamic interaction forces and moments between the two vehicles. The simulation was developed to validate trajectory studies conducted with a 14 degree-of-freedom simulation created early in the program using the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation software. The POST simulation was less detailed than the official ADAMS-based simulation used by the Project, but was simpler, more concise and ran faster, while providing similar results. The increase in speed provided by the POST simulation provided the Project with an alternate analysis tool. This tool was ideal for performing separation control logic trade studies that required the running of numerous Monte Carlo trajectories.
Contrasting Public Opinion Dynamics and Emotional Response during Crisis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Volkova, Svitlana; Chetviorkin, Ilia; Arendt, Dustin L.
We propose an approach for contrasting spatiotemporal dynamics of public opinions expressed toward targeted entities, also known as the stance detection task, in Russia and Ukraine during crisis. Our analysis relies on a novel corpus constructed from posts on the VKontakte social network, centered on local public opinion of the ongoing Russian-Ukrainian crisis, along with newly annotated resources for predicting expressions of fine-grained emotions including joy, sadness, disgust, anger, surprise and fear. Akin to prior work on sentiment analysis, we align traditional public opinion polls with aggregated automatic predictions of sentiments for contrastive geo-locations. We report interesting observations on emotional response and stance variations across geo-locations. Some of our findings contradict stereotypical misconceptions imposed by media; for example, we found posts from Ukraine that do not support Euromaidan but support Putin, and posts from Russia that are against Putin but in favor of the USA. Furthermore, we are the first to demonstrate contrastive stance variations over time across geo-locations using a storyline visualization technique.
SmartPort: A Platform for Sensor Data Monitoring in a Seaport Based on FIWARE
Fernández, Pablo; Santana, José Miguel; Ortega, Sebastián; Trujillo, Agustín; Suárez, José Pablo; Domínguez, Conrado; Santana, Jaisiel; Sánchez, Alejandro
2016-01-01
Seaport monitoring and management is a significant research area, in which infrastructure automatically collects big data sets that lead the organization in its multiple activities. Thus, this problem is heavily related to the fields of data acquisition, transfer, storage, big data analysis and information visualization. Las Palmas de Gran Canaria port is a good example of how a seaport generates big data volumes through a network of sensors. They are placed on meteorological stations and maritime buoys, registering environmental parameters. Likewise, the Automatic Identification System (AIS) registers several dynamic parameters about the tracked vessels. However, such an amount of data is useless without a system that enables a meaningful visualization and helps make decisions. In this work, we present SmartPort, a platform that offers a distributed architecture for the collection of the port sensors’ data and a rich Internet application that allows the user to explore the geolocated data. The presented SmartPort tool is a representative, promising and inspiring approach to manage and develop a smart system. It covers a demanding need for big data analysis and visualization utilities for managing complex infrastructures, such as a seaport. PMID:27011192
Salmi, T; Sovijärvi, A R; Brander, P; Piirilä, P
1988-11-01
Reliable long-term assessment of cough is necessary in many clinical and scientific settings. A new method for long-term recording and automatic analysis of cough is presented. The method is based on simultaneous recording of two independent signals: high-pass filtered cough sounds and cough-induced fast movements of the body. The acoustic signals are recorded with a dynamic microphone in the acoustic focus of a glass fiber paraboloid mirror. Body movements are recorded with a static charge-sensitive bed located under an ordinary plastic foam mattress. The patient can be studied lying or sitting with no transducers or electrodes attached. A microcomputer is used for sampling of signals, detection of cough, statistical analyses, and on-line printing of results. The method was validated in seven adult patients with a total of 809 spontaneous cough events, using clinical observation as a reference. The sensitivity of the method to detect cough was 99.0 percent, and the positive predictivity was 98.1 percent. The system ignored speaking and snoring. The method provides a convenient means of reliable long-term follow-up of cough in clinical work and research.
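A minimal sketch of the two-signal detection logic: a cough is flagged only when a burst in the filtered sound channel coincides, within a short window, with a spike in the body-movement channel. The thresholds, coincidence window, and signal names are illustrative assumptions, not the validated system's parameters.

```python
import numpy as np

def detect_cough(sound, motion, fs, sound_thr, motion_thr, window_s=0.3):
    """Flag cough events where a burst in the high-pass filtered sound envelope
    coincides (within window_s seconds) with a fast body-movement spike.

    sound, motion : 1-D signals sampled at fs (already filtered/rectified).
    Returns sample indices of detected events.
    """
    win = int(window_s * fs)
    sound_hits = np.flatnonzero(np.abs(sound) > sound_thr)
    motion_hits = np.flatnonzero(np.abs(motion) > motion_thr)
    events = []
    for s in sound_hits:
        if motion_hits.size and np.min(np.abs(motion_hits - s)) <= win:
            if not events or s - events[-1] > win:   # debounce neighbouring samples
                events.append(s)
    return np.array(events)

# Toy usage: a sound burst near sample 1000 coinciding with a motion spike.
fs = 100
sound = np.zeros(3000); sound[1000:1005] = 1.0
motion = np.zeros(3000); motion[1010:1015] = 1.0
print(detect_cough(sound, motion, fs, sound_thr=0.5, motion_thr=0.5))
```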
Bone marrow cavity segmentation using graph-cuts with wavelet-based texture feature.
Shigeta, Hironori; Mashita, Tomohiro; Kikuta, Junichi; Seno, Shigeto; Takemura, Haruo; Ishii, Masaru; Matsuda, Hideo
2017-10-01
Emerging bioimaging technologies enable us to capture various dynamic cellular activities. As large amounts of data are obtained these days and it is becoming unrealistic to manually process massive numbers of images, automatic analysis methods are required. One of the issues for automatic image segmentation is that image-taking conditions are variable. Thus, commonly, many manual inputs are required for each image. In this paper, we propose a bone marrow cavity (BMC) segmentation method for bone images, as BMC is considered to be related to the mechanism of bone remodeling, osteoporosis, and so on. To reduce the manual inputs needed to segment BMC, we classified the texture pattern using wavelet transformation and a support vector machine. We also integrated the result of texture pattern classification into the graph-cuts-based image segmentation method because texture analysis does not consider spatial continuity. Our method is applicable to a particular frame in an image sequence in which the condition of the fluorescent material is variable. In the experiment, we evaluated our method with nine types of mother wavelets and several sets of scale parameters. The proposed method with graph-cuts and texture pattern classification performs well without manual inputs by a user.
NASA Technical Reports Server (NTRS)
Schoppers, Marcel
1994-01-01
The design of a flexible, real-time software architecture for trajectory planning and automatic control of redundant manipulators is described. Emphasis is placed on a technique of designing control systems that are both flexible and robust yet have good real-time performance. The solution presented involves an artificial intelligence algorithm that dynamically reprograms the real-time control system while planning system behavior.
Assessment of linear and nonlinear/complex heartbeat dynamics in subclinical depression (dysphoria).
Greco, Alberto; Messerotti Benvenuti, Simone; Gentili, Claudio; Palomba, Daniela; Scilingo, Enzo Pasquale; Valenza, Gaetano
2018-03-29
Depression is one of the leading causes of disability worldwide. Most previous studies have focused on major depression, and studies on subclinical depression, such as those on so-called dysphoria, have been overlooked. Indeed, dysphoria is associated with a high prevalence of somatic disorders, and a reduction of quality of life and life expectancy. In current clinical practice, dysphoria is assessed using psychometric questionnaires and structured interviews only, without taking into account objective pathophysiological indices. To address this problem, in this study we investigated heartbeat linear and nonlinear dynamics to derive objective autonomic nervous system biomarkers of dysphoria. Sixty undergraduate students participated in the study: according to clinical evaluation, 24 of them were dysphoric. Extensive group-wise statistics was performed to characterize the pathological and control groups. Moreover, a recursive feature elimination algorithm based on a K-NN classifier was carried out for the automatic recognition of dysphoria at a single-subject level. The results showed that the most significant group-wise differences referred to increased heartbeat complexity (particularly for fractal dimension, sample entropy and recurrence plot analysis) with regards to the healthy controls, confirming dysfunctional nonlinear sympatho-vagal dynamics in mood disorders. Furthermore, a balanced accuracy of 79.17% was achieved in automatically distinguishing dysphoric patients from controls, with the most informative power attributed to nonlinear, spectral and polyspectral quantifiers of cardiovascular variability. This study experimentally supports the assessment of dysphoria as a defined clinical condition with specific characteristics which are different both from healthy, fully euthymic controls and from full-blown major depression.
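The recursive feature elimination around a K-NN classifier can be approximated with a generic wrapper-style backward elimination driven by cross-validated accuracy, as sketched below. The feature count, number of neighbors, and simulated heartbeat-variability features are assumptions, and the authors' exact elimination criterion may differ.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def knn_backward_elimination(X, y, n_keep=5, k=5):
    """Greedy wrapper-style feature elimination around a K-NN classifier:
    repeatedly drop the feature whose removal hurts CV accuracy the least."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        scores = []
        for f in features:
            trial = [g for g in features if g != f]
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                                  X[:, trial], y, cv=5).mean()
            scores.append((acc, f))
        best_acc, least_useful = max(scores)   # removing this feature costs least
        features.remove(least_useful)
    return features

# Toy usage: 60 subjects, 12 simulated HRV-like features, binary dysphoria label.
rng = np.random.default_rng(3)
X = rng.normal(size=(60, 12))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=60) > 0).astype(int)
print(knn_backward_elimination(X, y, n_keep=3))
```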
Fidelity of Automatic Speech Processing for Adult and Child Talker Classifications.
VanDam, Mark; Silbert, Noah H
2016-01-01
Automatic speech processing (ASP) has recently been applied to very large datasets of naturalistically collected, daylong recordings of child speech via an audio recorder worn by young children. The system developed by the LENA Research Foundation analyzes children's speech for research and clinical purposes, with special focus on identifying and tagging family speech dynamics and the at-home acoustic environment from the auditory perspective of the child. A primary issue for researchers, clinicians, and families using the Language ENvironment Analysis (LENA) system is to what degree the segment labels are valid. This classification study evaluates the performance of the computer ASP output against 23 trained human judges who made about 53,000 judgements of classification of segments tagged by the LENA ASP. Results indicate performance consistent with modern ASP such as those using HMM methods, with acoustic characteristics of fundamental frequency and segment duration most important for both human and machine classifications. Results are likely to be important for interpreting and improving ASP output. PMID:27529813
Applications of automatic differentiation in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.
1994-01-01
Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or in sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables with an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
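The contrast drawn above, exact chain-rule derivatives versus centered divided differences, can be illustrated in miniature with forward-mode dual numbers in Python. This is only a toy analogue of forward-mode AD, not ADIFOR itself, which operates on FORTRAN source code; the example function is invented.

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together,
    propagating exact derivatives through the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1      # any code path built from + and * works

x0 = 2.0
exact = f(Dual(x0, 1.0)).der                      # seed dx/dx = 1
h = 1e-5
central = (f(x0 + h) - f(x0 - h)) / (2 * h)       # centered divided difference
print(exact, central)                              # 14.0 vs. ~14.0 (plus truncation/round-off error)
```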
Dynamic user data analysis and web composition technique using big data
NASA Astrophysics Data System (ADS)
Soundarya, P.; Vanitha, M.; Sumaiya Thaseen, I.
2017-11-01
Building a reliable service-oriented system is more important than building a traditional standalone system given the unpredictability of internet services, and it is also a challenging task. In the proposed system, fault tolerance is determined using a proposed heuristic algorithm with two kinds of strategies, active and passive, and user requirements are formulated as local and global constraints. Different services are deployed in the modification process: two bus reservation services and two train reservation services, along with a hotel reservation service. The user can choose either bus reservation service and specify a destination; if the destination is not available, an automatic backup request to the other bus reservation service is carried out, and if that service is also unavailable, a parallel train reservation service is initiated. An automatic hotel reservation is also initiated based on the user's mode and type of travel.
Constructing graph models for software system development and analysis
NASA Astrophysics Data System (ADS)
Pogrebnoy, Andrey V.
2017-01-01
We propose a concept for creating instrumentation that supports the rationale of functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the standpoint of applying it as a uniform platform for an adequate representation of the SS source code. We propose three levels of GM detailing: GM1 - for visual analysis of the source code and for SS version control, GM2 - for resource optimization and analysis of connections between SS components, GM3 - for analysis of the SS functioning in dynamics. The paper includes examples of constructing all levels of GM.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Automatic segmentation of relevant structures in DCE MR mammograms
NASA Astrophysics Data System (ADS)
Koenig, Matthias; Laue, Hendrik; Boehler, Tobias; Peitgen, Heinz-Otto
2007-03-01
The automatic segmentation of relevant structures such as the skin edge, chest wall, or nipple in dynamic contrast enhanced MR imaging (DCE MRI) of the breast provides additional information for computer aided diagnosis (CAD) systems. Automatic reporting using BI-RADS criteria benefits from information about the location of those structures. Lesion positions can be automatically described relative to such reference structures for reporting purposes. Furthermore, this information can assist data reduction for computationally expensive preprocessing such as registration, or for visualization of only the segments of current interest. In this paper, a novel automatic method is presented for determining the air-breast boundary (skin edge), approximating the chest wall, and locating the nipples. The method consists of several steps which are built on top of each other. Automatic threshold computation leads to the air-breast boundary, which is then analyzed to determine the location of the nipple. Finally, results of both steps are the starting point for approximating the chest wall. The proposed process was evaluated on a large data set of DCE MRI recorded by T1 sequences and yielded reasonable results in all cases.
Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T
2010-05-01
We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after onset of symptoms from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival time independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area from motion corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9+/-2.5s (mean+/-s.d.) without motion correction and 267+/-80s (mean+/-s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image quality assessment by two observers revealed that the MTT maps exhibited superior quality over the TTP maps (88% good rating of MTT as compared to 68% of TTP). Our software allowed fully automated deconvolution analysis of DSC PWI using proven efficient algorithms that can be applied to acute stroke treatment decisions. Our streamlined method also offers promise for further development of automated quantitative analysis of the ischemic penumbra. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
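One of the processing steps named above, gamma-variate curve fitting of a bolus passage, can be sketched as a nonlinear least-squares fit. The sampling interval, parameter values, bounds, and initial guesses below are invented for illustration and are not the software's actual settings.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, t0, A, alpha, beta):
    """Standard gamma-variate bolus model: zero before arrival time t0,
    A*(t-t0)^alpha * exp(-(t-t0)/beta) afterwards."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt ** alpha * np.exp(-dt / beta)

# Synthetic concentration-time curve for one voxel (assumed 1.5-s frames, 60 frames).
t = np.arange(60) * 1.5
true = gamma_variate(t, 12.0, 1.0, 3.0, 1.5)
noisy = true + np.random.normal(scale=0.05, size=t.size)

popt, _ = curve_fit(gamma_variate, t, noisy, p0=[10.0, 1.0, 2.0, 2.0],
                    bounds=([0, 0, 0.1, 0.1], [30, 10, 10, 10]))
print("estimated bolus arrival time: %.2f s" % popt[0])
```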
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parra, N A; Abramowitz, M; Pollack, A
2016-06-15
Purpose: To automatically identify and outline suspicious regions of recurrent or residual disease in the prostate bed using Dynamic Contrast Enhanced-MRI (DCE-MRI) in patients after prostatectomy. Methods: Twenty-two patients presenting for salvage radiotherapy and with an identified Gross Tumor Volume (GTV) in the prostate bed were retrospectively analyzed. The MRI data consisted of axial T2-weighted MRI (T2w) of the pelvis: resolution 1.25×1.25×2.5 mm; Field of View (FOV): 320×320 mm; slice thickness = 2.5 mm; 72 slices; and Dynamic Contrast Enhanced MRI (DCE-MRI): 12 series of T1w with identical spatial resolution to T2w and at 30–34 s temporal resolution. Unsupervised pattern recognition was used to decompose the 4D DCE data as the product W.H of weights W of k patterns H. A well-perfused pattern Hwp was identified and the weight map Wwp associated with Hwp was used to delineate suspicious volumes. Thresholds of Wwp set at mean(Wwp)+S*std(Wwp), S=1, 1.5, 2 and 2.5, defined four volumes labeled DCE1.0 to DCE2.5. These volumes were displayed on T2w and, along with GTV, were correlated with the highest pre-treatment PSA values and with pharmacokinetic analysis constants. Results: GTV was significantly correlated with DCE2.0 (ρ=0.60, p<0.003) and DCE2.5 (ρ=0.58, p=0.004). Significant correlation was found between the highest pre-treatment PSA and GTV (ρ=0.42, p<0.049), DCE2.0 (ρ=0.52, p<0.012), and DCE2.5 (ρ=0.67, p<<0.01). Kruskal-Wallis analysis showed that the median Ktrans value was statistically different between non-specific prostate bed tissue (NSPBT) and both GTV (p<<0.001) and DCE2.5 (p<<0.001), but while median Ve was statistically different between DCE2.5 and NSPBT (p=0.002), it was not statistically different between GTV and NSPBT (p=0.054), suggesting that the automatic volumes capture the area of malignancy more accurately. Conclusion: Software developed for identification and visualization of suspicious regions in DCE-MRI from post-prostatectomy patients has been validated by PSA and pharmacokinetic constants analysis, showing that it generates clinically relevant volumes.
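A rough sketch of the decomposition and thresholding described in the Methods: a non-negative factorization splits the voxel-by-time DCE matrix into W and H, one temporal pattern is taken as the well-perfused one, and its weight map is thresholded at mean + S*std. How the well-perfused pattern is actually selected is not stated in the abstract, so the late-uptake criterion below is an assumption, as are the synthetic data and the use of scikit-learn's NMF in place of the authors' pattern recognition method.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic DCE data: n_voxels time curves of length n_frames (non-negative values).
rng = np.random.default_rng(4)
n_voxels, n_frames, k = 5000, 12, 3
X = rng.random((n_voxels, n_frames))

# Unsupervised decomposition X ~ W @ H into k temporal patterns.
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)          # voxel weights for each pattern
H = model.components_               # temporal patterns

# Pick a "well-perfused" pattern, here the one with the largest late uptake (assumed
# criterion), and threshold its weight map at mean + S * std to get a suspicious volume.
wp = int(np.argmax(H[:, -3:].sum(axis=1)))
S = 2.0
suspicious = W[:, wp] > W[:, wp].mean() + S * W[:, wp].std()
print("suspicious voxels:", int(suspicious.sum()))
```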
Harms, Hendrik Johannes; Stubkjær Hansson, Nils Henrik; Tolbod, Lars Poulsen; Kim, Won Yong; Jakobsen, Steen; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiaer, Jørgen; Sörensen, Jens
2016-09-01
Dynamic cardiac PET is used to quantify molecular processes in vivo. However, measurements of left ventricular (LV) mass and volume require electrocardiogram-gated PET data. The aim of this study was to explore the feasibility of measuring LV geometry using nongated dynamic cardiac PET. Thirty-five patients with aortic-valve stenosis and 10 healthy controls underwent a 27-min (11)C-acetate PET/CT scan and cardiac MRI (CMR). The controls were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were generated from nongated dynamic data. Using software-based structure recognition, the LV wall was automatically segmented from K1 images to derive functional assessments of LV mass (mLV) and wall thickness. End-systolic and end-diastolic volumes were calculated using blood pool images and applied to obtain stroke volume and LV ejection fraction (LVEF). PET measurements were compared with CMR. High, linear correlations were found for LV mass (r = 0.95), end-systolic volume (r = 0.93), and end-diastolic volume (r = 0.90), and slightly lower correlations were found for stroke volume (r = 0.74), LVEF (r = 0.81), and thickness (r = 0.78). Bland-Altman analyses showed significant differences for mLV and thickness only and an overestimation for LVEF at lower values. Intra- and interobserver correlations were greater than 0.95 for all PET measurements. PET repeatability accuracy in the controls was comparable to CMR. LV mass and volume are accurately and automatically generated from dynamic (11)C-acetate PET without electrocardiogram gating. This method can be incorporated in a standard routine without any additional workload and can, in theory, be extended to other PET tracers. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Accuracy Analysis for Automatic Orientation of a Tumbling Oblique Viewing Sensor System
NASA Astrophysics Data System (ADS)
Stebner, K.; Wieden, A.
2014-03-01
Dynamic camera systems with moving parts are difficult to handle in the photogrammetric workflow, because it is not ensured that the dynamics are constant over the recording period. Even minimal changes of the camera's orientation greatly influence the projection of oblique images. In this publication these effects - originating from the kinematic chain of a dynamic camera system - are analysed and validated. A member of the Modular Airborne Camera System family - MACS-TumbleCam - consisting of a vertical viewing and a tumbling oblique camera was used for this investigation. The focus is on dynamic geometric modeling and the stability of the kinematic chain. To validate the experimental findings, the determined parameters are applied to the exterior orientation of an actual aerial image acquisition campaign using MACS-TumbleCam. The quality of the parameters is sufficient for direct georeferencing of oblique image data from the orientation information of a synchronously captured vertical image dataset. Relative accuracy for the oblique data set ranges from 1.5 pixels when using all images of the image block to 0.3 pixels when using only adjacent images.
SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.
Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko
2013-05-01
Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
Robust parameter design for automatically controlled systems and nanostructure synthesis
NASA Astrophysics Data System (ADS)
Dasgupta, Tirthankar
2007-12-01
This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a Multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. The second part of the research deals with the development of an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for the synthesis of nanowires. The SMED is a novel approach to generate sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
NASA Astrophysics Data System (ADS)
Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa
We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
Urban land use: Remote sensing of ground-basin permeability
NASA Technical Reports Server (NTRS)
Tinney, L. R.; Jensen, J. R.; Estes, J. E.
1975-01-01
A remote sensing analysis of the amount and type of permeable and impermeable surfaces overlying an urban recharge basin is discussed. An effective methodology for accurately generating this data as input to a safe yield study is detailed and compared to more conventional alternative approaches. The amount of area inventoried, approximately 10 sq. miles, should provide a reliable base against which automatic pattern recognition algorithms, currently under investigation for this task, can be evaluated. If successful, such approaches can significantly reduce the time and effort involved in obtaining permeability data, an important aspect of urban hydrology dynamics.
NASA Technical Reports Server (NTRS)
Stone, H. W.; Powell, R. W.
1977-01-01
A six-degree-of-freedom simulation analysis was conducted to examine the effects of the lateral-directional static aerodynamic stability and control uncertainties on the performance of the automatic (no manual inputs) entry-guidance and control systems of the space shuttle orbiter. To establish the acceptable boundaries of the uncertainties, the static aerodynamic characteristics were varied either by applying a multiplier to the aerodynamic parameter or by adding an increment. Control-system modifications were identified that decrease the sensitivity to off-nominal aerodynamics. With these modifications, the acceptable aerodynamic boundaries were determined.
Analysis and Processing the 3D-Range-Image-Data for Robot Monitoring
NASA Astrophysics Data System (ADS)
Kohoutek, Tobias
2008-09-01
Industrial robots are commonly used for physically stressful jobs in complex environments. In any case, collisions with heavy and highly dynamic machines need to be prevented. For this reason the operational range has to be monitored precisely, reliably and meticulously. The advantage of the SwissRanger® SR-3000 is that it delivers intensity images and 3D information of the same scene simultaneously, which conveniently allows 3D monitoring. Due to that fact, automatic real-time collision prevention within the robot's working space is possible by working with 3D coordinates.
Thermal Analysis and Design of an Advanced Space Suit
NASA Technical Reports Server (NTRS)
Lin, Chin H.; Campbell, Anthony B.; French, Jonathan D.; French, D.; Nair, Satish S.; Miles, John B.
2000-01-01
The thermal dynamics and design of an Advanced Space Suit are considered. A transient model of the Advanced Space Suit has been developed and implemented using MATLAB/Simulink to help with sizing, with design evaluation, and with the development of an automatic thermal comfort control strategy. The model is described and the thermal characteristics of the Advanced Space Suit are investigated, including various parametric design studies. The steady state performance envelope for the Advanced Space Suit is defined in terms of the thermal environment and human metabolic rate, and the transient response of the human-suit-MPLSS system is analyzed.
Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin
NASA Astrophysics Data System (ADS)
He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu
2017-06-01
This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in the laser shock-loaded dynamic fragmentation process. In the shock experiments, the ejecta from a tin sample with a V-shaped groove etched in its free surface are collected by a soft-recovery technique. Subsequently, the produced fragments are automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison shows that the proposed model provides a far more reasonable fit for the laser shock-loaded tin.
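As a rough illustration of fitting such a linear combination of exponentials to fragment statistics, the following Python sketch (not the authors' code; the simulated sizes, initial guesses, and parameter names are hypothetical) fits a two-component exponential mixture to a complementary cumulative size distribution with scipy:

    # Minimal sketch: fit a linear combination of two exponential distributions
    # to fragment-size data, as suggested by a Poisson-mixture model of
    # heterogeneous fragmentation.
    import numpy as np
    from scipy.optimize import curve_fit

    def bi_exponential(s, w, lam1, lam2):
        """Complementary CDF of a two-component exponential mixture."""
        return w * np.exp(-lam1 * s) + (1.0 - w) * np.exp(-lam2 * s)

    # Hypothetical fragment sizes (e.g., equivalent diameters in micrometres).
    rng = np.random.default_rng(0)
    sizes = np.concatenate([rng.exponential(20.0, 400), rng.exponential(80.0, 100)])

    # Empirical complementary CDF N(>s)/N, a common way to present fragment statistics.
    s_grid = np.sort(sizes)
    ccdf = 1.0 - np.arange(1, s_grid.size + 1) / s_grid.size

    popt, _ = curve_fit(bi_exponential, s_grid, ccdf, p0=[0.5, 1 / 20, 1 / 80],
                        bounds=([0, 0, 0], [1, np.inf, np.inf]))
    print("weight, lambda_1, lambda_2:", popt)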
A time domain simulation of a beam control system
NASA Astrophysics Data System (ADS)
Mitchell, J. R.
1981-02-01
The Airborne Laser Laboratory (ALL) is being developed by the Air Force to investigate the integration and operation of high energy laser components in a dynamic airborne environment and to study the propagation of laser light from an airborne vehicle to an airborne target. The ALL is composed of several systems; among these are the Airborne Pointing and Tracking System (APT) and the Automatic Alignment System (AAS). This report presents the results of performing a time domain dynamic simulation for an integrated beam control system composed of the APT and AAS. The simulation is performed on a digital computer using the MIMIC language. It includes models of the dynamics of the system and of disturbances. Also presented in the report are the rationales and developments of these models. The data from the simulation code is summarized by several plots. In addition results from massaging the data with waveform analysis packages are presented. The results are discussed and conclusions are drawn.
Cerebral coherence between communicators marks the emergence of meaning
Stolk, Arjen; Noordzij, Matthijs L.; Verhagen, Lennart; Volman, Inge; Schoffelen, Jan-Mathijs; Oostenveld, Robert; Hagoort, Peter; Toni, Ivan
2014-01-01
How can we understand each other during communicative interactions? An influential suggestion holds that communicators are primed by each other’s behaviors, with associative mechanisms automatically coordinating the production of communicative signals and the comprehension of their meanings. An alternative suggestion posits that mutual understanding requires shared conceptualizations of a signal’s use, i.e., “conceptual pacts” that are abstracted away from specific experiences. Both accounts predict coherent neural dynamics across communicators, aligned either to the occurrence of a signal or to the dynamics of conceptual pacts. Using coherence spectral-density analysis of cerebral activity simultaneously measured in pairs of communicators, this study shows that establishing mutual understanding of novel signals synchronizes cerebral dynamics across communicators’ right temporal lobes. This interpersonal cerebral coherence occurred only within pairs with a shared communicative history, and at temporal scales independent from signals’ occurrences. These findings favor the notion that meaning emerges from shared conceptualizations of a signal’s use. PMID:25489093
NASA Astrophysics Data System (ADS)
Zhang, Jinhua; Fang, Bin; Hong, Jun; Wan, Shaoke; Zhu, Yongsheng
2017-12-01
Combined angular contact ball bearings are widely used in automation, aerospace, and machine tool applications, but few studies of combined angular contact ball bearings have been reported. It is shown that the preload and stiffness of combined bearings are mutually influenced rather than simply the superposition of multiple single bearings; the characteristics of combined bearings are therefore calculated by coupling the load and deformation analysis of the single bearings. In this paper, based on the Jones quasi-static model and an analytical stiffness model, a new iterative algorithm and model are proposed for calculating the preload and stiffness of combined bearings, with dynamic effects including centrifugal force and gyroscopic moment taken into account. It is demonstrated that the new method has general applicability. The preload factors of combined bearings are calculated according to the different design preloads, the static and dynamic stiffness for various arrangements of combined bearings are comparatively studied and analyzed, and the influences of the design preload magnitude, axial load and rotating speed are discussed in detail. Besides, the change of the dynamic contact angles of combined bearings with respect to the rotating speed is also discussed. The results show that the bearing arrangement mode, rotating speed and design preload magnitude have a significant influence on the preload and stiffness of combined bearings. The proposed formulation provides a useful tool for the dynamic analysis of complex bearing-rotor systems.
Automatic humidification system to support the assessment of food drying processes
NASA Astrophysics Data System (ADS)
Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.
2016-07-01
This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
Palm: Easing the Burden of Analytical Performance Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Hoisie, Adolfy
2014-06-01
Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are `first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.
Statistical modelling of subdiffusive dynamics in the cytoplasm of living cells: A FARIMA approach
NASA Astrophysics Data System (ADS)
Burnecki, K.; Muszkieta, M.; Sikora, G.; Weron, A.
2012-04-01
Golding and Cox (Phys. Rev. Lett., 96 (2006) 098102) tracked the motion of individual fluorescently labelled mRNA molecules inside live E. coli cells. They found that, in the set of 23 trajectories from 3 different experiments, the automatically recognized motion is subdiffusive, and they published an intriguing microscopy video. Here, we extract the corresponding time series from this video by an image segmentation method and present its detailed statistical analysis. We find that this trajectory was not included in the data set already studied and has different statistical properties. It is best fitted by a fractional autoregressive integrated moving average (FARIMA) process with normal-inverse Gaussian (NIG) noise and negative memory. In contrast to earlier studies, this shows that fractional Brownian motion is not the best model for the dynamics documented in this video.
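For readers unfamiliar with the fractional (memory) part of FARIMA, the short Python sketch below (illustrative only; the series and the memory parameter d are made up) shows how the truncated fractional differencing filter (1 - B)^d is built and applied; negative d corresponds to negative memory:

    # Illustrative sketch, not the paper's code: fractional differencing filter.
    import numpy as np

    def frac_diff_weights(d, n_weights):
        """Weights of the binomial expansion of (1 - B)^d."""
        w = [1.0]
        for k in range(1, n_weights):
            w.append(-w[-1] * (d - k + 1) / k)
        return np.array(w)

    def frac_diff(x, d):
        """Apply the truncated fractional differencing filter to a series x."""
        w = frac_diff_weights(d, len(x))
        return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

    rng = np.random.default_rng(1)
    x = rng.standard_normal(500)          # stand-in for an mRNA displacement series
    y = frac_diff(x, d=-0.3)              # hypothetical negative-memory example
    print(y[:5])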
Static and dynamic crystalline lens accommodation evaluated using quantitative 3-D OCT.
Gambra, Enrique; Ortiz, Sergio; Perez-Merino, Pablo; Gora, Michalina; Wojtkowski, Maciej; Marcos, Susana
2013-01-01
Custom high-resolution, high-speed anterior segment spectral domain Optical Coherence Tomography (OCT), provided with automatic quantification and distortion correction algorithms, was used to characterize three-dimensionally (3-D) the human crystalline lens in vivo in four subjects, for accommodative demands from 0 to 6 D in 1 D steps. Anterior and posterior lens radii of curvature decreased with accommodative demand at rates of 0.73 and 0.20 mm/D, resulting in an increase of the estimated optical power of the eye of 0.62 D per diopter of accommodative demand. Dynamic fluctuations in crystalline lens radii of curvature, anterior chamber depth and lens thickness were also estimated from dynamic 2-D OCT images (14 Hz), acquired during 5 s of steady fixation, for different accommodative demands. Estimates of the eye power from dynamic geometrical measurements revealed an increase of the fluctuations of the accommodative response from 0.07 D to 0.47 D between 0 and 6 D (0.044 D per D of accommodative demand). A sensitivity analysis showed that the fluctuations of accommodation were driven by dynamic changes in the lens surfaces, particularly in the posterior lens surface.
Yavuzer, Yasemin; Karataş, Zeynep
2013-01-01
This study aimed to examine the mediating role of anger in the relationship between automatic thoughts and physical aggression in adolescents. The study included 224 adolescents in the 9th grade of 3 different high schools in central Burdur during the 2011-2012 academic year. Participants completed the Aggression Questionnaire and the Automatic Thoughts Scale in their classrooms during counseling sessions. Data were analyzed using simple and multiple linear regression analysis. There were positive correlations between the adolescents' automatic thoughts, physical aggression, and anger. According to the regression analysis, automatic thoughts effectively predicted the level of physical aggression (b = 0.233, P < 0.001) and anger (b = 0.325, P < 0.001). Analysis of the mediating role of anger showed that anger fully mediated the relationship between automatic thoughts and physical aggression (Sobel z = 5.646, P < 0.001). Providing adolescents with anger management skills training is very important for the prevention of physical aggression. Such training programs should include components related to the development of an awareness of dysfunctional and anger-triggering automatic thoughts, and how to change them. As the study group included adolescents from Burdur, the findings can only be generalized to groups with similar characteristics.
A Dynamic Integration Method for Borderland Database using OSM data
NASA Astrophysics Data System (ADS)
Zhou, X.-G.; Jiang, Y.; Zhou, K.-X.; Zeng, L.
2013-11-01
Spatial data are fundamental to borderland analysis of geography, natural resources, demography, politics, economy, and culture. As the spatial region used in borderland research usually covers the border regions of several neighboring countries, the data are difficult for a single research institution or government to obtain. Volunteered geographic information (VGI) has been proven to be a very successful means of acquiring timely and detailed global spatial data at very low cost, so VGI is a reasonable source of borderland spatial data. OpenStreetMap (OSM) is known as the most successful VGI resource. However, the OSM data model is far different from traditional authoritative geographic information, so the OSM data need to be converted to the researcher's customized data model. Because the real world changes fast, the converted data also need to be updated. Therefore, a dynamic integration method for borderland data is presented in this paper. In this method, a machine learning mechanism is used to convert the OSM data model to the user data model; a method for selecting the objects in the study area that changed over a given period from the OSM whole-world daily diff file is presented, and a change-only information file in a designed format is produced automatically. Based on the rules and algorithms mentioned above, we enabled the automatic (or semi-automatic) integration and updating of the borderland database by programming. The developed system was intensively tested.
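A minimal Python sketch of the change-extraction step is given below; it assumes an OsmChange (.osc) diff file and a hypothetical bounding box, and it is only an illustration of the idea, not the authors' implementation:

    # Sketch: extract changed nodes inside a study-area bounding box
    # from an OsmChange (.osc) daily diff file.
    import xml.etree.ElementTree as ET

    BBOX = (20.0, 40.0, 25.0, 45.0)   # hypothetical (min_lon, min_lat, max_lon, max_lat)

    def changed_nodes_in_bbox(osc_path, bbox):
        min_lon, min_lat, max_lon, max_lat = bbox
        changes = []
        tree = ET.parse(osc_path)
        for action in tree.getroot():          # <create>, <modify>, <delete> blocks
            for elem in action:
                if elem.tag != "node":
                    continue
                lon = float(elem.get("lon", "nan"))
                lat = float(elem.get("lat", "nan"))
                if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
                    changes.append((action.tag, elem.get("id"), lon, lat))
        return changes

    # Example usage with a hypothetical file name:
    # print(changed_nodes_in_bbox("daily_diff.osc", BBOX)[:10])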
A Computer Model for Teaching the Dynamic Behavior of AC Contactors
ERIC Educational Resources Information Center
Ruiz, J.-R. R.; Espinosa, A. G.; Romeral, L.
2010-01-01
Ac-powered contactors are extensively used in industry in applications such as automatic electrical devices, motor starters, and heaters. In this work, a practical session that allows students to model and simulate the dynamic behavior of ac-powered electromechanical contactors is presented. Simulation is carried out using a rigorous parametric…
Data Intensive Systems (DIS) Benchmark Performance Summary
2003-08-01
models assumed by today's conventional architectures. Such applications include model-based Automatic Target Recognition (ATR), synthetic aperture...radar (SAR) codes, large scale dynamic databases/battlefield integration, dynamic sensor-based processing, high-speed cryptanalysis, high speed...distributed interactive and data intensive simulations, data-oriented problems characterized by pointer-based and other highly irregular data structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Qifang; Wang, Fei; Hodge, Bri-Mathias
A real-time price (RTP)-based automatic demand response (ADR) strategy for a PV-assisted electric vehicle (EV) charging station (PVCS) without vehicle-to-grid is proposed. The charging process is modeled as a dynamic linear program, instead of the usual day-ahead and real-time regulation strategy, to capture the advantages of both global and real-time optimization. Different from conventional price forecasting algorithms, a dynamic price vector formation model is proposed based on a clustering algorithm to form an RTP vector for a particular day. A dynamic feasible energy demand region (DFEDR) model considering grid voltage profiles is designed to calculate the lower and upper bounds. A deduction method is proposed to deal with the unknown information of future intervals, such as the actual stochastic arrival and departure times of EVs, which makes the DFEDR model suitable for global optimization. Finally, comparative cases articulate the advantages of the developed methods, and the validity of the proposed strategy in reducing electricity costs, mitigating peak charging demand, and improving PV self-consumption is verified through simulation scenarios.
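The following toy Python sketch (not the paper's model; the prices, power limit, and energy requirement are invented) shows the flavor of scheduling charging power against a real-time price vector as a linear program with scipy:

    # Toy sketch: schedule EV charging power over T intervals to minimize cost
    # under a real-time price vector, subject to a total-energy requirement and
    # per-interval power limits (a stand-in for the feasible demand region).
    import numpy as np
    from scipy.optimize import linprog

    T = 24                                                         # hourly intervals
    price = 0.10 + 0.05 * np.sin(np.linspace(0, 2 * np.pi, T))     # hypothetical RTP ($/kWh)
    p_max = 7.0                                                    # charger limit (kW)
    energy_needed = 40.0                                           # kWh before departure

    # Decision variables: charging power p_t >= 0.  Minimize sum(price_t * p_t * 1 h)
    # subject to sum(p_t) * 1 h = energy_needed.
    res = linprog(c=price,
                  A_eq=np.ones((1, T)), b_eq=[energy_needed],
                  bounds=[(0.0, p_max)] * T, method="highs")
    print("optimal cost ($):", round(res.fun, 2))
    print("charging profile (kW):", np.round(res.x, 2))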
NASA Technical Reports Server (NTRS)
McNally, B. David (Inventor); Erzberger, Heinz (Inventor); Sheth, Kapil (Inventor)
2015-01-01
A dynamic weather route system automatically analyzes routes for in-flight aircraft flying in convective weather regions and attempts to find more time and fuel efficient reroutes around current and predicted weather cells. The dynamic weather route system continuously analyzes all flights and provides reroute advisories that are dynamically updated in real time while the aircraft are in flight. The dynamic weather route system includes a graphical user interface that allows users to visualize, evaluate, modify if necessary, and implement proposed reroutes.
Method and algorithm of automatic estimation of road surface type for variable damping control
NASA Astrophysics Data System (ADS)
Dąbrowski, K.; Ślaski, G.
2016-09-01
In this paper the authors present an idea for road surface estimation (recognition) based on statistical analysis of suspension dynamic response signals. For the preliminary analysis the cumulative distribution function (CDF) was used, leading to the observation that different road surfaces produce response values within different limit ranges for the same percentage of samples, or, for the same limits, different percentages of samples fall within the range between the limit values. This observation was the basis for the developed algorithm, which was tested using suspension response signals recorded during road tests over various surfaces. The proposed algorithm can be an essential part of an adaptive damping control algorithm for a vehicle suspension or of an adaptive control strategy for suspension damping control.
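One possible reading of the CDF-based idea is sketched below in Python; the amplitude band, reference shares, and surface classes are hypothetical stand-ins, not values from the paper:

    # Illustrative sketch: classify road surface type from the share of suspension
    # response samples falling inside a fixed amplitude band, compared against
    # reference shares per surface class.
    import numpy as np

    LIMITS = (-0.5, 0.5)                      # hypothetical acceleration band (m/s^2)
    REFERENCE_SHARES = {"smooth asphalt": 0.95, "cobblestone": 0.70, "gravel": 0.45}

    def share_within_limits(signal, limits=LIMITS):
        lo, hi = limits
        return np.mean((signal >= lo) & (signal <= hi))

    def classify_surface(signal):
        share = share_within_limits(signal)
        # pick the reference class whose expected share is closest to the observed one
        return min(REFERENCE_SHARES, key=lambda k: abs(REFERENCE_SHARES[k] - share))

    rng = np.random.default_rng(2)
    test_signal = rng.normal(0.0, 0.6, 2000)   # stand-in for sprung-mass acceleration
    print(classify_surface(test_signal))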
Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy
2016-01-01
Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.
Method and System for Air Traffic Rerouting for Airspace Constraint Resolution
NASA Technical Reports Server (NTRS)
Erzberger, Heinz (Inventor); Morando, Alexander R. (Inventor); Sheth, Kapil S. (Inventor); McNally, B. David (Inventor); Clymer, Alexis A. (Inventor); Shih, Fu-tai (Inventor)
2017-01-01
A dynamic constraint avoidance route system automatically analyzes routes of aircraft flying, or to be flown, in or near constraint regions and attempts to find more time and fuel efficient reroutes around current and predicted constraints. The dynamic constraint avoidance route system continuously analyzes all flight routes and provides reroute advisories that are dynamically updated in real time. The dynamic constraint avoidance route system includes a graphical user interface that allows users to visualize, evaluate, modify if necessary, and implement proposed reroutes.
NASA Astrophysics Data System (ADS)
Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.
2016-03-01
Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as the Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantage of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population-based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis. The proposed tool, denoted the PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results we compare them with published manual results performed by an expert.
flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.
Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis
2016-08-15
Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality reduction techniques to automatically segregate cell sub-populations in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files from unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented as an R shiny application. The general approach behind the two methods consists of three key steps to check and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers in the lower limit and margin events in the upper limit of the dynamic range. For each file analyzed, our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution seeking to improve the results not only of manual but also, and in particular, of automatic analysis of FCM data. R source code is available through Bioconductor: http://bioconductor.org/packages/flowAI/. Contacts: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
Automatic selection of arterial input function using tri-exponential models
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Chen, Jeremy; Castro, Marcelo; Thomasson, David
2009-02-01
Dynamic Contrast Enhanced MRI (DCE-MRI) is one method for drug and tumor assessment. Selecting a consistent arterial input function (AIF) is necessary to calculate tissue and tumor pharmacokinetic parameters in DCE-MRI. This paper presents an automatic and robust method to select the AIF. The first stage is artery detection and segmentation, where knowledge about artery structure and dynamic signal intensity temporal properties of DCE-MRI is employed. The second stage is AIF model fitting and selection. A tri-exponential model is fitted for every candidate AIF using the Levenberg-Marquardt method, and the best fitted AIF is selected. Our method has been applied in DCE-MRIs of four different body parts: breast, brain, liver and prostate. The success rates in artery segmentation for 19 cases are 89.6%+/-15.9%. The pharmacokinetic parameters computed from the automatically selected AIFs are highly correlated with those from manually determined AIFs (R2=0.946, P(T<=t)=0.09). Our imaging-based tri-exponential AIF model demonstrated significant improvement over a previously proposed bi-exponential model.
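The tri-exponential fitting step can be illustrated with a short Python sketch using scipy's Levenberg-Marquardt fit; the model form follows the abstract, while the time axis, noise level, parameter values, and initial guess are assumptions:

    # Minimal sketch: fit a tri-exponential arterial input function to a candidate
    # signal-time curve with Levenberg-Marquardt, and score the fit for AIF ranking.
    import numpy as np
    from scipy.optimize import curve_fit

    def tri_exp(t, a1, m1, a2, m2, a3, m3):
        return a1 * np.exp(-m1 * t) + a2 * np.exp(-m2 * t) + a3 * np.exp(-m3 * t)

    t = np.linspace(0, 5, 120)                              # minutes (hypothetical)
    true = tri_exp(t, 6.0, 3.0, 1.0, 0.5, 0.3, 0.05)
    candidate_aif = true + np.random.default_rng(3).normal(0, 0.05, t.size)

    p0 = [5, 2, 1, 0.5, 0.2, 0.05]                          # rough initial guess
    popt, pcov = curve_fit(tri_exp, t, candidate_aif, p0=p0, method="lm", maxfev=10000)
    rss = np.sum((tri_exp(t, *popt) - candidate_aif) ** 2)  # goodness of fit
    print("fitted parameters:", np.round(popt, 3), "RSS:", round(rss, 4))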
Cavity contour segmentation in chest radiographs using supervised learning and dynamic programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maduskar, Pragnya, E-mail: pragnya.maduskar@radboudumc.nl; Hogeweg, Laurens; Sánchez, Clara I.
Purpose: Efficacy of tuberculosis (TB) treatment is often monitored using chest radiography. Monitoring size of cavities in pulmonary tuberculosis is important as the size predicts severity of the disease and its persistence under therapy predicts relapse. The authors present a method for automatic cavity segmentation in chest radiographs. Methods: A two stage method is proposed to segment the cavity borders, given a user defined seed point close to the center of the cavity. First, a supervised learning approach is employed to train a pixel classifier using texture and radial features to identify the border pixels of the cavity. A likelihood value of belonging to the cavity border is assigned to each pixel by the classifier. The authors experimented with four different classifiers: k-nearest neighbor (kNN), linear discriminant analysis (LDA), GentleBoost (GB), and random forest (RF). Next, the constructed likelihood map was used as an input cost image in the polar transformed image space for dynamic programming to trace the optimal maximum cost path. This constructed path corresponds to the segmented cavity contour in image space. Results: The method was evaluated on 100 chest radiographs (CXRs) containing 126 cavities. The reference segmentation was manually delineated by an experienced chest radiologist. An independent observer (a chest radiologist) also delineated all cavities to estimate interobserver variability. Jaccard overlap measure Ω was computed between the reference segmentation and the automatic segmentation, and between the reference segmentation and the independent observer's segmentation for all cavities. A median overlap Ω of 0.81 (0.76 ± 0.16), and 0.85 (0.82 ± 0.11) was achieved between the reference segmentation and the automatic segmentation, and between the segmentations by the two radiologists, respectively. The best reported mean contour distance and Hausdorff distance between the reference and the automatic segmentation were, respectively, 2.48 ± 2.19 and 8.32 ± 5.66 mm, whereas these distances were 1.66 ± 1.29 and 5.75 ± 4.88 mm between the segmentations by the reference reader and the independent observer, respectively. The automatic segmentations were also visually assessed by two trained CXR readers as “excellent,” “adequate,” or “insufficient.” The readers had good agreement in assessing the cavity outlines and 84% of the segmentations were rated as “excellent” or “adequate” by both readers. Conclusions: The proposed cavity segmentation technique produced results with a good degree of overlap with manual expert segmentations. The evaluation measures demonstrated that the results approached the results of the experienced chest radiologists, in terms of overlap measure and contour distance measures. Automatic cavity segmentation can be employed in TB clinics for treatment monitoring, especially in resource limited settings where radiologists are not available.
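A simplified Python sketch of the dynamic-programming step is shown below; it traces a maximum-cost path through a polar likelihood map column by column and is only a schematic of the idea, not the published implementation (the map, step limit, and array shapes are made up):

    # Simplified sketch: maximum-cost contour through a polar-transformed
    # likelihood map, allowing the radius to change by at most one bin per angle.
    import numpy as np

    def max_cost_polar_path(cost, max_step=1):
        """cost: (n_radii, n_angles) likelihood map; returns one radius index per angle."""
        n_r, n_a = cost.shape
        acc = np.full((n_r, n_a), -np.inf)
        back = np.zeros((n_r, n_a), dtype=int)
        acc[:, 0] = cost[:, 0]
        for a in range(1, n_a):
            for r in range(n_r):
                lo, hi = max(0, r - max_step), min(n_r, r + max_step + 1)
                prev = np.argmax(acc[lo:hi, a - 1]) + lo
                acc[r, a] = acc[prev, a - 1] + cost[r, a]
                back[r, a] = prev
        path = [int(np.argmax(acc[:, -1]))]
        for a in range(n_a - 1, 0, -1):
            path.append(back[path[-1], a])
        return np.array(path[::-1])

    rng = np.random.default_rng(4)
    likelihood = rng.random((60, 90))          # stand-in for the classifier's output
    print(max_cost_polar_path(likelihood)[:10])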
OKCARS : Oklahoma Collision Analysis and Response System.
DOT National Transportation Integrated Search
2012-10-01
By continuously monitoring traffic intersections to automatically detect that a collision or near-collision has occurred, automatically call for assistance, and automatically forewarn oncoming traffic, our OKCARS has the capability to effectively ...
Gabriel, Damien; Wong, Thian Chiew; Nicolier, Magali; Giustiniani, Julie; Mignot, Coralie; Noiret, Nicolas; Monnin, Julie; Magnin, Eloi; Pazart, Lionel; Moulin, Thierry; Haffen, Emmanuel; Vandel, Pierre
2016-07-01
The vast majority of people experience musical imagery, the sensation of reliving a song in absence of any external stimulation. Internal perception of a song can be deliberate and effortful, but also may occur involuntarily and spontaneously. Moreover, musical imagery is also involuntarily used for automatically completing missing parts of music or lyrics from a familiar song. The aim of our study was to explore the onset of musical imagery dynamics that leads to the automatic completion of missing lyrics. High-density electroencephalography was used to record the cerebral activity of twenty healthy volunteers while they were passively listening to unfamiliar songs, very familiar songs, and songs previously listened to for two weeks. Silent gaps inserted into these songs elicited a series of neural activations encompassing perceptual, attentional and cognitive mechanisms (range 100-500ms). Familiarity and learning effects emerged as early as 100ms and lasted 400ms after silence occurred. Although participants reported more easily mentally imagining lyrics in familiar rather than passively learnt songs, the onset of neural mechanisms and the power spectrum underlying musical imagery were similar for both types of songs. This study offers new insights into the musical imagery dynamics evoked by gaps of silence and on the role of familiarity and learning processes in the generation of these dynamics. The automatic and effortless method presented here is a potentially useful tool to understand failure in the familiarity and learning processes of pathological populations. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Carlo Ponzo, Felice; Ditommaso, Rocco
2015-04-01
This study presents an innovative strategy for the automatic evaluation of the variable fundamental frequency and related damping factor of nonlinear structures during strong-motion phases. Most methods for damage detection are based on the assessment of variations in the dynamic parameters characterizing the monitored structure. A crucial aspect of these methods is the automatic and accurate estimation of both structural eigenfrequencies and related damping factors, also during nonlinear behaviour. A new method, named STIRF (Short-Time Impulse Response Function), based on nonlinear interferometric analysis combined with the Fourier Transform (FT), is proposed here to allow scientists and engineers to characterize frequency and damping variations of a monitored structure. The STIRF approach helps to overcome some limitations derived from the use of techniques based on the simple Fourier Transform. These latter techniques provide good results when the response of the monitored system is stationary, but fail when the system exhibits a non-stationary, time-varying behaviour: non-stationary input, soil-foundation and/or adjacent-structure interaction phenomena can show the inadequacy of classic techniques for analysing the nonlinear and/or non-stationary behaviour of structures. In fact, using this kind of approach it is possible to improve some of the existing methods for automatic damage detection, providing stable results also during the strong-motion phase. Results are consistent with those expected if compared with other techniques. The main advantage derived from the use of the proposed approach (STIRF) for Structural Health Monitoring is the simplicity of the interpretation of the nonlinear variations of the fundamental frequency and the related equivalent viscous damping factor. The proposed methodology has been tested on both numerical and experimental models, also using data retrieved from shaking table tests. Based on the results provided in this study, the methodology seems able to evaluate fast variations (over time) of the dynamic parameters of a generic reinforced concrete framed structure. Further analyses are necessary to better calibrate the length of the moving time-window (in order to minimize spurious frequencies within each Interferometric Response Function evaluated on both weak and strong motion phases) and to verify the possibility of using the STIRF to analyse the nonlinear behaviour of general systems. Acknowledgements This study was partially funded by the Italian Civil Protection Department within the project DPC-RELUIS 2014 - RS4 ''Seismic observatory of structures and health monitoring''. References R. Ditommaso, F.C. Ponzo (2015). Automatic evaluation of the fundamental frequency variations and related damping factor of reinforced concrete framed structures using the Short Time Impulse Response Function (STIRF). Engineering Structures, 82 (2015), 104-112. http://dx.doi.org/10.1016/j.engstruct.2014.10.023.
Automatic imitation: A meta-analysis.
Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel
2018-05-01
Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Halyo, N.
1979-01-01
The development of a digital automatic control law for a small jet transport to perform a steep final approach in automatic landings is reported, along with the development of a steady-state Kalman filter used to provide smooth estimates to the control law. The control law performs the functions of localizer and glideslope capture, localizer and glideslope track, decrab, and flare. The control law uses microwave landing system position data and aircraft body-mounted accelerometer, attitude, and attitude rate information. The results obtained from a digital simulation of the aircraft dynamics, wind conditions, and sensor noise using the control law and filter developed are described.
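As a generic illustration of a steady-state Kalman filter (not the reported aircraft design; the model matrices and noise covariances below are invented), the constant gain can be obtained from the discrete algebraic Riccati equation:

    # Hedged sketch: steady-state Kalman gain for a discrete linear model via the DARE.
    import numpy as np
    from scipy.linalg import solve_discrete_are

    # Hypothetical 2-state model (e.g., position/velocity along one axis).
    dt = 0.05
    A = np.array([[1.0, dt], [0.0, 1.0]])
    C = np.array([[1.0, 0.0]])               # position measurement only
    Q = np.diag([1e-4, 1e-3])                # process noise covariance
    R = np.array([[1e-2]])                   # measurement noise covariance

    # Dual of the control Riccati equation gives the steady-state prediction covariance P.
    P = solve_discrete_are(A.T, C.T, Q, R)
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)   # constant Kalman gain
    print("steady-state gain:\n", K)

    # One filter step: predict with A, then correct with the constant gain K.
    x_est = np.zeros((2, 1))
    z = np.array([[1.0]])                     # hypothetical measurement
    x_pred = A @ x_est
    x_est = x_pred + K @ (z - C @ x_pred)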
Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis
NASA Astrophysics Data System (ADS)
Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.
2013-11-01
Nowadays, during international negotiations on separating disputed areas, manual adjustment alone is applied to the match between the delimitation line and the real terrain, which not only consumes much time and labor, but also cannot ensure high precision. Given that, this paper explores automatic matching between the two and studies a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, a cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic match model is built via Module Builder in order to share and reuse it. Finally, the result of the automatic match is analyzed from many different aspects, including delimitation laws, two-sided benefits and so on. Consequently, it is concluded that the automatic match method is feasible and effective.
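A conceptual Python sketch of the least-cost path step is given below; the cost surface, corridor, and endpoints are hypothetical, and scikit-image's route_through_array stands in for whatever GIS tool the authors used:

    # Conceptual sketch: trace a least-cost path across a raster cost layer
    # between two boundary points of a delimitation-line segment.
    import numpy as np
    from skimage.graph import route_through_array

    # Hypothetical cost layer: low values where the adjusted line may run
    # (e.g., along terrain feature lines), high values elsewhere.
    rng = np.random.default_rng(5)
    cost = rng.uniform(1.0, 10.0, size=(200, 200))
    cost[95:105, :] = 1.0                      # an easy corridor, for illustration

    start, end = (100, 0), (100, 199)          # entry and exit points of the segment
    indices, total_cost = route_through_array(cost, start, end,
                                              fully_connected=True, geometric=True)
    path = np.array(indices)                   # (row, col) cells of the new line
    print("path length (cells):", len(path), "accumulated cost:", round(total_cost, 1))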
Spyrakis, Francesca; Benedetti, Paolo; Decherchi, Sergio; Rocchia, Walter; Cavalli, Andrea; Alcaro, Stefano; Ortuso, Francesco; Baroni, Massimo; Cruciani, Gabriele
2015-10-26
The importance of taking into account protein flexibility in drug design and virtual ligand screening (VS) has been widely debated in the literature, and molecular dynamics (MD) has been recognized as one of the most powerful tools for investigating intrinsic protein dynamics. Nevertheless, deciphering the amount of information hidden in MD simulations and recognizing a significant minimal set of states to be used in virtual screening experiments can be quite complicated. Here we present an integrated MD-FLAP (molecular dynamics-fingerprints for ligand and proteins) approach, comprising a pipeline of molecular dynamics, clustering and linear discriminant analysis, for enhancing accuracy and efficacy in VS campaigns. We first extracted a limited number of representative structures from tens of nanoseconds of MD trajectories by means of the k-medoids clustering algorithm as implemented in the BiKi Life Science Suite ( http://www.bikitech.com [accessed July 21, 2015]). Then, instead of applying arbitrary selection criteria, that is, RMSD, pharmacophore properties, or enrichment performances, we allowed the linear discriminant analysis algorithm implemented in FLAP ( http://www.moldiscovery.com [accessed July 21, 2015]) to automatically choose the best performing conformational states among medoids and X-ray structures. Retrospective virtual screenings confirmed that ensemble receptor protocols outperform single rigid receptor approaches, proved that computationally generated conformations comprise the same quantity/quality of information included in X-ray structures, and pointed to the MD-FLAP approach as a valuable tool for improving VS performances.
Mapping permafrost change hot-spots with Landsat time-series
NASA Astrophysics Data System (ADS)
Grosse, G.; Nitze, I.
2016-12-01
Recent and projected future climate warming strongly affects permafrost stability over large parts of the terrestrial Arctic with local, regional and global scale consequences. The monitoring and quantification of permafrost and associated land surface changes in these areas is crucial for the analysis of hydrological and biogeochemical cycles as well as vegetation and ecosystem dynamics. However, detailed knowledge of the spatial distribution and the temporal dynamics of these processes is scarce and likely key locations of permafrost landscape dynamics may remain unnoticed. As part of the ERC funded PETA-CARB and ESA GlobPermafrost projects, we developed an automated processing chain based on data from the entire Landsat archive (excluding MSS) for the detection of permafrost change related processes and hotspots. The automated method enables us to analyze thousands of Landsat scenes, which allows for a multi-scaled spatio-temporal analysis at 30 meter spatial resolution. All necessary processing steps are carried out automatically with minimal user interaction, including data extraction, masking, reprojection, subsetting, data stacking, and calculation of multi-spectral indices. These indices, e.g. Landsat Tasseled Cap and NDVI among others, are used as proxies for land surface conditions, such as vegetation status, moisture or albedo. Finally, a robust trend analysis is applied to each multi-spectral index and each pixel over the entire observation period of up to 30 years from 1985 to 2015, depending on data availability. Large transects of around 2 million km² across different permafrost types in Siberia and North America have been processed. Permafrost related or influencing landscape dynamics were detected within the trend analysis, including thermokarst lake dynamics, fires, thaw slumps, and coastal dynamics. The produced datasets will be distributed to the community as part of the ERC PETA-CARB and ESA GlobPermafrost projects. Users are encouraged to provide feedback and ground truth data for a continuous improvement of our methodology and datasets, which will lead to a better understanding of the spatial and temporal distribution of changes within the vulnerable permafrost zone.
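The per-pixel robust trend step might look like the following Python sketch; the Theil-Sen estimator and the synthetic index stack are assumptions used for illustration, not necessarily the project's exact estimator or data:

    # Illustrative sketch: robust per-pixel trend of a multi-spectral index
    # over a multi-decadal Landsat-style stack.
    import numpy as np
    from scipy.stats import theilslopes

    years = np.arange(1985, 2016)                          # observation period
    # Hypothetical index stack (e.g., Tasseled Cap wetness), shape (time, rows, cols).
    rng = np.random.default_rng(6)
    stack = 0.002 * (years - 1985)[:, None, None] + rng.normal(0, 0.05, (years.size, 50, 50))

    slopes = np.empty(stack.shape[1:])
    for i in range(stack.shape[1]):
        for j in range(stack.shape[2]):
            slopes[i, j] = theilslopes(stack[:, i, j], years)[0]   # robust trend per pixel

    print("median trend per year:", round(float(np.median(slopes)), 4))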
A Framework for Daylighting Optimization in Whole Buildings with OpenStudio
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, andmore » provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.« less
Juan-Albarracín, Javier; Fuster-Garcia, Elies; Pérez-Girbés, Alexandre; Aparici-Robles, Fernando; Alberich-Bayarri, Ángel; Revert-Ventura, Antonio; Martí-Bonmatí, Luis; García-Gómez, Juan M
2018-06-01
Purpose To determine if preoperative vascular heterogeneity of glioblastoma is predictive of overall survival of patients undergoing standard-of-care treatment by using an unsupervised multiparametric perfusion-based habitat-discovery algorithm. Materials and Methods Preoperative magnetic resonance (MR) imaging including dynamic susceptibility-weighted contrast material-enhanced perfusion studies in 50 consecutive patients with glioblastoma were retrieved. Perfusion parameters of glioblastoma were analyzed and used to automatically draw four reproducible habitats that describe the tumor vascular heterogeneity: high-angiogenic and low-angiogenic regions of the enhancing tumor, potentially tumor-infiltrated peripheral edema, and vasogenic edema. Kaplan-Meier and Cox proportional hazard analyses were conducted to assess the prognostic potential of the hemodynamic tissue signature to predict patient survival. Results Cox regression analysis yielded a significant correlation between patients' survival and maximum relative cerebral blood volume (rCBVmax) and maximum relative cerebral blood flow (rCBFmax) in high-angiogenic and low-angiogenic habitats (P < .01, false discovery rate-corrected P < .05). Moreover, rCBFmax in the potentially tumor-infiltrated peripheral edema habitat was also significantly correlated (P < .05, false discovery rate-corrected P < .05). Kaplan-Meier analysis demonstrated significant differences between the observed survival of populations divided according to the median of the rCBVmax or rCBFmax at the high-angiogenic and low-angiogenic habitats (log-rank test P < .05, false discovery rate-corrected P < .05), with an average survival increase of 230 days. Conclusion Preoperative perfusion heterogeneity contains relevant information about overall survival in patients who undergo standard-of-care treatment. The hemodynamic tissue signature method automatically describes this heterogeneity, providing a set of vascular habitats with high prognostic capabilities. © RSNA, 2018.
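A generic survival-analysis sketch in Python (using the lifelines package; the patient data and column names are simulated, not the study's) illustrates the median-split log-rank comparison and Cox regression described above:

    # Hedged sketch: split patients by the median rCBVmax in a habitat and compare
    # survival, then fit a Cox proportional hazards model.
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(7)
    n = 50
    df = pd.DataFrame({
        "survival_days": rng.exponential(400, n).round(),   # hypothetical overall survival
        "event": rng.integers(0, 2, n),                     # 1 = death observed, 0 = censored
        "rCBVmax_HAT": rng.uniform(2, 12, n),               # high-angiogenic habitat rCBVmax
    })

    kmf = KaplanMeierFitter().fit(df["survival_days"], event_observed=df["event"])
    print("median survival (days):", kmf.median_survival_time_)

    high = df["rCBVmax_HAT"] > df["rCBVmax_HAT"].median()
    res = logrank_test(df.loc[high, "survival_days"], df.loc[~high, "survival_days"],
                       df.loc[high, "event"], df.loc[~high, "event"])
    print("log-rank p-value:", round(res.p_value, 3))

    cph = CoxPHFitter().fit(df, duration_col="survival_days", event_col="event")
    cph.print_summary()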
Ayhan, Cigdem; Bilgin, Sevil; Aksoy, Songul; Yakut, Yavuz
2016-08-10
Automatic and voluntary body position control is essential for postural stability; however, little is known about individual factors that impair the sensorimotor system associated with low back pain (LBP). To evaluate automatic and voluntary motor control impairments causing postural instability in patients with LBP. Motor control impairments associated with poor movement and balance control were analyzed prospectively in 32 patients with LBP. Numeric Rating Scale (NRS) for pain assessment, Oswestry Disability Index (ODI) for disability measurement, and computerized dynamic posturography (CDP) for analysis of postural responses were used to measure outcomes of all patients. Computerized dynamic posturography tests including Sensory organization test (SOT), limits of stability test (movement velocity, directional control, endpoint, and maximum excursion), rhythmic weight shift (rhythmic movement speed and directional control), and adaptation test (toes-up and toes-down tests) were performed and the results compared with NeuroCom normative data. The mean age of the patients was 40.50 ± 12.28 years. Lower equilibrium scores were observed in SOT (p < 0.05). There was a significant increase in reaction time and decrease in movement velocity, directional control, and endpoint excursion (p < 0.05). Speed of rhythmic movement along the anteroposterior direction decreased, while speed increased along the lateral direction (p < 0.05). Poor directional control was recorded in the anteroposterior direction (p < 0.05). Toes-down test showed an increased COG sway in patients compared with that in the controls (p < 0.05). LBP causes poor voluntary control of body positioning, a reduction in movement control, delays in movement initiation, and a difficulty to adapt to sudden surface changes.
A dynamic fault tree model of a propulsion system
NASA Technical Reports Server (NTRS)
Xu, Hong; Dugan, Joanne Bechta; Meshkat, Leila
2006-01-01
We present a dynamic fault tree model of the benchmark propulsion system, and solve it using Galileo. Dynamic fault trees (DFT) extend traditional static fault trees with special gates to model spares and other sequence dependencies. Galileo solves DFT models using a judicious combination of automatically generated Markov and Binary Decision Diagram models. Galileo easily handles the complexities exhibited by the benchmark problem. In particular, Galileo is designed to model phased mission systems.
Do resting brain dynamics predict oddball evoked-potential?
2011-01-01
Background The oddball paradigm is widely applied to the investigation of cognitive function in neuroscience and in neuropsychiatry. Whether cortical oscillation in the resting state can predict the elicited oddball event-related potential (ERP) is still not clear. This study explored the relationship between resting electroencephalography (EEG) and oddball ERPs. The regional powers of 18 electrodes across delta, theta, alpha and beta frequencies were correlated with the amplitude and latency of N1, P2, N2 and P3 components of oddball ERPs. A multivariate analysis based on partial least squares (PLS) was applied to further examine the spatial pattern revealed by multiple correlations. Results Higher synchronization in the resting state, especially at the alpha spectrum, is associated with higher neural responsiveness and faster neural propagation, as indicated by the higher amplitude change of N1/N2 and shorter latency of P2. None of the resting quantitative EEG indices predict P3 latency and amplitude. The PLS analysis confirms that the resting cortical dynamics which explains N1/N2 amplitude and P2 latency does not show regional specificity, indicating a global property of the brain. Conclusions This study differs from previous approaches by relating dynamics in the resting state to neural responsiveness in the activation state. Our analyses suggest that the neural characteristics carried by resting brain dynamics modulate the earlier/automatic stage of target detection. PMID:22114868
Dynamics of cochlear nonlinearity: Automatic gain control or instantaneous damping?
Altoè, Alessandro; Charaziak, Karolina K; Shera, Christopher A
2017-12-01
Measurements of basilar-membrane (BM) motion show that the compressive nonlinearity of cochlear mechanical responses is not an instantaneous phenomenon. For this reason, the cochlear amplifier has been thought to incorporate an automatic gain control (AGC) mechanism characterized by a finite reaction time. This paper studies the effect of instantaneous nonlinear damping on the responses of oscillatory systems. The principal results are that (i) instantaneous nonlinear damping produces a noninstantaneous gain control that differs markedly from typical AGC strategies; (ii) the kinetics of compressive nonlinearity implied by the finite reaction time of an AGC system appear inconsistent with the nonlinear dynamics measured on the gerbil basilar membrane; and (iii) conversely, those nonlinear dynamics can be reproduced using an harmonic oscillator with instantaneous nonlinear damping. Furthermore, existing cochlear models that include instantaneous gain-control mechanisms capture the principal kinetics of BM nonlinearity. Thus, an AGC system with finite reaction time appears neither necessary nor sufficient to explain nonlinear gain control in the cochlea.
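The idea of instantaneous nonlinear damping can be illustrated with a toy driven oscillator in Python; the parameter values are arbitrary normalized choices for illustration, not a cochlear model from the paper:

    # Toy sketch: a driven oscillator whose damping increases instantaneously with
    # squared displacement, compressing the response without any explicit
    # gain-control time constant.
    import numpy as np
    from scipy.integrate import solve_ivp

    w0 = 2 * np.pi                     # natural frequency (normalized units)
    gamma, beta, F = 0.1, 10.0, 1.0    # hypothetical damping, nonlinearity, drive level

    def rhs(t, y):
        x, v = y
        damping = gamma * (1.0 + beta * x * x)   # instantaneous nonlinear damping
        return [v, F * np.cos(w0 * t) - damping * v - w0 ** 2 * x]

    t_eval = np.linspace(25.0, 50.0, 5000)       # sample after transients die out
    sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0], t_eval=t_eval, rtol=1e-8, atol=1e-10)
    print("compressed steady-state amplitude:", round(float(np.max(np.abs(sol.y[0]))), 3))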
Lerner, Itamar; Bentin, Shlomo; Shriki, Oren
2012-01-01
Localist models of spreading activation (SA) and models assuming distributed representations offer very different takes on semantic priming, a widely investigated paradigm in word recognition and semantic memory research. In the present study we implemented SA in an attractor neural network model with distributed representations and created a unified framework for the two approaches. Our model assumes a synaptic depression mechanism leading to autonomous transitions between encoded memory patterns (latching dynamics), which accounts for the major characteristics of automatic semantic priming in humans. Using computer simulations we demonstrated how findings that challenged attractor-based networks in the past, such as mediated and asymmetric priming, are a natural consequence of our present model’s dynamics. Puzzling results regarding backward priming were also given a straightforward explanation. In addition, the current model addresses some of the differences between semantic and associative relatedness and explains how these differences interact with stimulus onset asynchrony in priming experiments. PMID:23094718
49 CFR 552.13 - Form of petition.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for Expedited Rulemaking To Establish Dynamic Automatic Suppression System Test Procedures for Federal... Safety Administration, 400 Seventh Street, S.W., Washington, DC 20590. (b) Be written in the English...
NASA Technical Reports Server (NTRS)
Meyer, G.; Cicolani, L.
1981-01-01
A practical method for the design of automatic flight control systems for aircraft with complex characteristics and operational requirements, such as the powered lift STOL and V/STOL configurations, is presented. The method is effective for a large class of dynamic systems requiring multi-axis control which have highly coupled nonlinearities, redundant controls, and complex multidimensional operational envelopes. It exploits the concept of inverse dynamic systems, and an algorithm for the construction of inverse is given. A hierarchic structure for the total control logic with inverses is presented. The method is illustrated with an application to the Augmentor Wing Jet STOL Research Aircraft equipped with a digital flight control system. Results of flight evaluation of the control concept on this aircraft are presented.
Kim, Won-Seok; Zeng, Pengcheng; Shi, Jian Qing; Lee, Youngjo; Paik, Nam-Jong
2017-01-01
Motion analysis of the hyoid bone via videofluoroscopic study has been used in clinical research, but the classical manual tracking method is generally labor intensive and time consuming. Although some automatic tracking methods have been developed, masked points could not be tracked and smoothing and segmentation, which are necessary for functional motion analysis prior to registration, were not provided by the previous software. We developed software to track the hyoid bone motion semi-automatically. It works even in the situation where the hyoid bone is masked by the mandible and has been validated in dysphagia patients with stroke. In addition, we added the function of semi-automatic smoothing and segmentation. A total of 30 patients' data were used to develop the software, and data collected from 17 patients were used for validation, of which the trajectories of 8 patients were partly masked. Pearson correlation coefficients between the manual and automatic tracking are high and statistically significant (0.942 to 0.991, P-value<0.0001). Relative errors between automatic tracking and manual tracking in terms of the x-axis, y-axis and 2D range of hyoid bone excursion range from 3.3% to 9.2%. We also developed an automatic method to segment each hyoid bone trajectory into four phases (elevation phase, anterior movement phase, descending phase and returning phase). The semi-automatic hyoid bone tracking from VFSS data by our software is valid compared to the conventional manual tracking method. In addition, the ability of automatic indication to switch the automatic mode to manual mode in extreme cases and calibration without attaching the radiopaque object is convenient and useful for users. Semi-automatic smoothing and segmentation provide further information for functional motion analysis which is beneficial to further statistical analysis such as functional classification and prognostication for dysphagia. Therefore, this software could provide the researchers in the field of dysphagia with a convenient, useful, and all-in-one platform for analyzing the hyoid bone motion. Further development of our method to track the other swallowing related structures or objects such as epiglottis and bolus and to carry out the 2D curve registration may be needed for a more comprehensive functional data analysis for dysphagia with big data.
New orbit recalculations of comet C/1890 F1 Brooks and its dynamical evolution
NASA Astrophysics Data System (ADS)
Królikowska, Małgorzata; Dybczyński, Piotr A.
2016-08-01
C/1890 F1 Brooks belongs to a group of 19 comets used by Jan Oort to support his famous hypothesis on the existence of a spherical cloud containing hundreds of billions of comets with orbits of semi-major axes between 50 000 and 150 000 au. Comet Brooks stands out from this group because of a long series of astrometric observations as well as a nearly 2-yr-long observational arc. Rich observational material makes this comet an ideal target for testing whether it is worthwhile to recalculate astrometric positions on the basis of original (comet-star) measurements using modern star catalogues. This paper presents the results of such a new analysis based on two different methods: (I) automatic re-reduction based on cometary positions and the (comet-star) measurements and (II) partially automatic re-reduction based on the contemporary data for the reference stars originally used. We show that both methods offer a significant reduction in the uncertainty of orbital elements. Based on the most preferred orbital solution, the dynamical evolution of comet Brooks during three consecutive perihelion passages is discussed. We conclude that C/1890 F1 is a dynamically old comet that passed the Sun at a distance below 5 au during its previous perihelion passage. Furthermore, its next perihelion passage will be a little closer than during the 1890-1892 apparition. C/1890 F1 is also interesting because it suffered extremely small planetary perturbations when it travelled through the planetary zone. Therefore, in the next passage through perihelion, it will once again be a comet from the Oort spike.
Zanlungo, Francesco; Yücel, Zeynep; Brščić, Dražen; Kanda, Takayuki; Hagita, Norihiro
2017-01-01
Being determined by human social behaviour, pedestrian group dynamics may depend on "intrinsic properties" such as the purpose of the pedestrians, their personal relation, gender, age, and body size. In this work we investigate the dynamical properties of pedestrian dyads (distance, spatial formation and velocity) by analysing a large data set of automatically tracked pedestrian trajectories in an unconstrained "ecological" setting (a shopping mall), whose apparent physical and social group properties have been analysed by three different human coders. We observed that females walk slower and closer than males, that workers walk faster, at a larger distance and more abreast than leisure oriented people, and that inter-group relation has a strong effect on group structure, with couples walking very close and abreast, colleagues walking at a larger distance, and friends walking more abreast than family members. Pedestrian height (obtained automatically through our tracking system) influences velocity and abreast distance, both growing functions of the average group height. Results regarding pedestrian age show that elderly people walk slowly, while active age adults walk at the maximum velocity. Groups with children have a strong tendency to walk in a non-abreast formation, with a large distance (despite a low abreast distance). A cross-analysis of the interplay between these intrinsic features, taking in account also the effect of an "extrinsic property" such as crowd density, confirms these major results but reveals also a richer structure. An interesting and unexpected result, for example, is that the velocity of groups with children increases with density, at least in the low-medium density range found under normal conditions in shopping malls. Children also appear to behave differently according to the gender of the parent.
A low-cost, computer-controlled robotic flower system for behavioral experiments.
Kuusela, Erno; Lämsä, Juho
2016-04-01
Human observations during behavioral studies are expensive, time-consuming, and error-prone. For this reason, automatization of experiments is highly desirable, as it reduces the risk of human errors and workload. The robotic system we developed is simple and cheap to build and handles feeding and data collection automatically. The system was built using mostly off-the-shelf components and has a novel feeding mechanism that uses servos to perform refill operations. We used the robotic system in two separate behavioral studies with bumblebees (Bombus terrestris): The system was used both for training of the bees and for the experimental data collection. The robotic system was reliable, with no flight in our studies failing due to a technical malfunction. The recorded data were easy to use for further analysis. The software and the hardware design are open source. The development of cheap open-source prototyping platforms in recent years has opened up many possibilities in the design of experiments. Automatization not only reduces workload, but also potentially allows experimental designs never done before, such as dynamic experiments, where the system responds to, for example, the learning of the animal. We present a complete system with hardware and software, and it can be used as such in various experiments requiring feeders and collection of visitation data. Use of the system is not limited to any particular experimental setup or even species.
Beam/seam alignment control for electron beam welding
Burkhardt, Jr., James H.; Henry, J. James; Davenport, Clyde M.
1980-01-01
This invention relates to a dynamic beam/seam alignment control system for electron beam welds utilizing video apparatus. The system includes automatic control of workpiece illumination, near infrared illumination of the workpiece to limit the range of illumination and camera sensitivity adjustment, curve fitting of seam position data to obtain an accurate measure of beam/seam alignment, and automatic beam detection and calculation of the threshold beam level from the peak beam level of the preceding video line to locate the beam or seam edges.
Automatic telangiectasia analysis in dermoscopy images using adaptive critic design.
Cheng, B; Stanley, R J; Stoecker, W V; Hinton, K
2012-11-01
Telangiectasia, tiny skin vessels, are important dermoscopy structures used to discriminate basal cell carcinoma (BCC) from benign skin lesions. This research builds on previously developed image analysis techniques to identify vessels automatically and discriminate benign lesions from BCCs. A biologically inspired reinforcement learning approach is investigated in an adaptive critic design framework to apply action-dependent heuristic dynamic programming (ADHDP) for discrimination based on computed features, using different skin lesion contrast variations to promote the discrimination process. Lesion discrimination results for ADHDP are compared with multilayer perceptron backpropagation artificial neural networks. This study uses a data set of 498 dermoscopy skin lesion images of 263 BCCs and 226 competitive benign images as the input sets. This data set is extended from previous research [Cheng et al., Skin Research and Technology, 2011, 17: 278]. Experimental results yielded a diagnostic accuracy as high as 84.6% using the ADHDP approach, providing an 8.03% improvement over a standard multilayer perceptron method. We have chosen BCC detection rather than vessel detection as the endpoint. Although vessel detection is inherently easier, BCC detection has potential direct clinical applications. Small BCCs are detectable early by dermoscopy and potentially detectable by the automated methods described in this research. © 2011 John Wiley & Sons A/S.
Developments in the CCP4 molecular-graphics project.
Potterton, Liz; McNicholas, Stuart; Krissinel, Eugene; Gruber, Jan; Cowtan, Kevin; Emsley, Paul; Murshudov, Garib N; Cohen, Serge; Perrakis, Anastassis; Noble, Martin
2004-12-01
Progress towards structure determination that is both high-throughput and high-value is dependent on the development of integrated and automatic tools for electron-density map interpretation and for the analysis of the resulting atomic models. Advances in map-interpretation algorithms are extending the resolution regime in which fully automatic tools can work reliably, but at present human intervention is required to interpret poor regions of macromolecular electron density, particularly where crystallographic data are only available to modest resolution [for example, I/sigma(I) < 2.0 for minimum resolution 2.5 Å]. In such cases, a set of manual and semi-manual model-building molecular-graphics tools is needed. At the same time, converting the knowledge encapsulated in a molecular structure into understanding is dependent upon visualization tools, which must be able to communicate that understanding to others by means of both static and dynamic representations. CCP4mg is a program designed to meet these needs in a way that is closely integrated with the ongoing development of CCP4 as a program suite suitable for both low- and high-intervention computational structural biology. As well as providing a carefully designed user interface to advanced algorithms of model building and analysis, CCP4mg is intended to present a graphical toolkit to developers of novel algorithms in these fields.
Zhu, Jingbo; Liu, Baoyue; Shan, Shibo; Ding, Yanl; Kou, Zinong; Xiao, Wei
2015-08-01
In order to meet the needs of efficient purification of products from natural resources, an automatic vacuum liquid chromatographic device (AUTO-VLC) was developed and applied to the component separation of petroleum ether extracts of Schisandra chinensis (Turcz) Baill. The device comprised a solvent system, a 10-position distribution valve, a 3-position change valve, dynamic axial compression chromatographic columns with three diameters, and a 10-position fraction valve. The programmable logic controller (PLC) S7-200 was adopted to realize the automatic control and monitoring of the mobile phase changing, column selection, separation time setting and fraction collection. The separation results showed that six fractions (S1-S6) of different chemical components from 100 g of Schisandra chinensis (Turcz) Baill. petroleum ether phase were obtained by the AUTO-VLC with a 150 mm diameter dynamic axial compression chromatographic column. A new method for screening the VLC separation parameters by multiple-development TLC was developed and confirmed. The initial mobile phase of the AUTO-VLC was selected by requiring the Rf values of all target compounds to range from 0 to 0.45 in the first TLC development; the gradient elution ratio was selected according to the k value (the slope of the linear relation between Rf and the number of TLC developments) and the resolution of the target compounds; the number of elutions (n) was calculated by the formula n ≈ ΔRf/k. A total of four compounds with purity above 85% and 13 other components were separated from S5 under the selected conditions in only 17 h. Therefore, the automatic VLC device and its method are significant for the automatic and systematic separation of traditional Chinese medicines.
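As a minimal illustration of the screening relations quoted above (k as the slope of Rf versus the number of TLC developments, and n ≈ ΔRf/k), the following Python sketch uses made-up Rf readings; the numbers are hypothetical and not taken from the study.

```python
import numpy as np

# Hypothetical Rf readings of one target compound after 1-4 TLC developments
developments = np.array([1, 2, 3, 4])
rf_values = np.array([0.10, 0.18, 0.27, 0.35])

k = np.polyfit(developments, rf_values, 1)[0]    # slope of the linear Rf-vs-development fit
delta_rf = 0.45 - rf_values[0]                   # Rf span still to be covered
n = round(delta_rf / k)                          # number of elutions, n ~= delta_Rf / k
print(f"k = {k:.3f}, n = {n}")
```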
A study of helicopter stability and control including blade dynamics
NASA Technical Reports Server (NTRS)
Zhao, Xin; Curtiss, H. C., Jr.
1988-01-01
A linearized model of rotorcraft dynamics has been developed through the use of symbolic automatic equation-generating techniques. The dynamic model has been formulated in a unique way such that it can be used to analyze a variety of rotor/body coupling problems including a rotor mounted on a flexible shaft with a number of modes as well as free-flight stability and control characteristics. Direct comparison of the time response to longitudinal, lateral and directional control inputs at various trim conditions shows that the linear model yields good to very good correlation with flight test. In particular it is shown that a dynamic inflow model is essential to obtain good time response correlation, especially for the hover trim condition. It also is shown that the main rotor wake interaction with the tail rotor and fixed tail surfaces is a significant contributor to the response at translational flight trim conditions. A relatively simple model for the downwash and sidewash at the tail surfaces based on flat vortex wake theory is shown to produce good agreement. Then, the influence of rotor flap and lag dynamics on automatic control system feedback gain limitations is investigated with the model. It is shown that the blade dynamics, especially lagging dynamics, can severely limit the usable values of the feedback gain for simple feedback control and that multivariable optimal control theory is a powerful tool for designing high-gain augmentation control systems. The frequency-shaped optimal control design can offer much better flight dynamic characteristics and a stability margin for the feedback system without the need to model the lagging dynamics.
Chaotic vortex filaments in a Bose–Einstein condensate and in superfluid helium
NASA Astrophysics Data System (ADS)
Nemirovskii, S. K.
2018-05-01
A statement of the quantum turbulence problem in both a Bose–Einstein condensate (BEC) and superfluid helium is formulated. In superfluid helium, use is made of the so-called vortex filament method, in which quantum vortices are represented by stringlike objects, i.e. vortex lines. The dynamics of the vortex lines is determined by deterministic equations of motion, supplemented by random reconnections. Unlike in He II, the dynamics of quantum vortices in a BEC is based on the nonlinear Schrödinger equation. This makes it possible to obtain a microscopic description of the collision of vortices, the structure of a vortex filament, etc. A comparative analysis of these complementary approaches is carried out. It is shown that some features prevent the results obtained for the BEC from being transferred automatically to vortices in He II, and vice versa.
A Dynamic Time Warping Approach to Real-Time Activity Recognition for Food Preparation
NASA Astrophysics Data System (ADS)
Pham, Cuong; Plötz, Thomas; Olivier, Patrick
We present a dynamic time warping-based activity recognition system for the analysis of low-level food preparation activities. Accelerometers embedded into kitchen utensils provide continuous sensor data streams while people are using them for cooking. The recognition framework analyzes frames of contiguous sensor readings in real-time with low latency. It thereby adapts to the idiosyncrasies of utensil use by automatically maintaining a template database. We demonstrate the effectiveness of the classification approach by a number of real-world practical experiments on a publicly available dataset. The adaptive system shows superior performance compared to a static recognizer. Furthermore, we demonstrate the generalization capabilities of the system by gradually reducing the number of training samples. The system achieves excellent classification results even if only a small number of training samples is available, which is especially relevant for real-world scenarios.
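For readers unfamiliar with the core technique, the sketch below shows a plain dynamic time warping distance and a nearest-template classifier in Python; it is a generic illustration under simplified assumptions (1-D sequences, no warping window, no template adaptation), not the recognition framework described in the abstract.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(frame, templates):
    """Assign the label of the DTW-nearest template (templates: label -> sequence)."""
    return min(templates, key=lambda label: dtw_distance(frame, templates[label]))

templates = {"stir": np.sin(np.linspace(0, 6, 50)),
             "chop": np.sign(np.sin(np.linspace(0, 20, 50)))}
print(classify(np.sin(np.linspace(0, 6, 60)) + 0.1, templates))
```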
Designing an architectural style for Pervasive Healthcare systems.
Rafe, Vahid; Hajvali, Masoumeh
2013-04-01
Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration. Therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe Architecture (pub/sub) is one of the convenient architectures to support such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required. Therefore, using a proper formal language, such as graph transformation systems, for developing these systems seems necessary. But even if software engineers use such high-level methodologies, errors may occur in the system under design. Hence, whether the model of the system satisfies all of its requirements should be investigated automatically and formally. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.
NASA Astrophysics Data System (ADS)
Iwasaki, Yuma; Kusne, A. Gilad; Takeuchi, Ichiro
2017-12-01
Machine learning techniques have proven invaluable to manage the ever growing volume of materials research data produced as developments continue in high-throughput materials simulation, fabrication, and characterization. In particular, machine learning techniques have been demonstrated for their utility in rapidly and automatically identifying potential composition-phase maps from structural data characterization of composition spread libraries, enabling rapid materials fabrication-structure-property analysis and functional materials discovery. A key issue in development of an automated phase-diagram determination method is the choice of dissimilarity measure, or kernel function. The desired measure reduces the impact of confounding structural data issues on analysis performance. The issues include peak height changes and peak shifting due to lattice constant change as a function of composition. In this work, we investigate the choice of dissimilarity measure in X-ray diffraction-based structure analysis and the choice of measure's performance impact on automatic composition-phase map determination. Nine dissimilarity measures are investigated for their impact in analyzing X-ray diffraction patterns for a Fe-Co-Ni ternary alloy composition spread. The cosine, Pearson correlation coefficient, and Jensen-Shannon divergence measures are shown to provide the best performance in the presence of peak height change and peak shifting (due to lattice constant change) when the magnitude of peak shifting is unknown. With prior knowledge of the maximum peak shifting, dynamic time warping in a normalized constrained mode provides the best performance. This work also serves to demonstrate a strategy for rapid analysis of a large number of X-ray diffraction patterns in general beyond data from combinatorial libraries.
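A minimal sketch of three of the dissimilarity measures named above, applied to two 1-D diffraction patterns, is given below; it assumes non-negative intensity arrays and uses SciPy's distance functions, and it illustrates only the measures, not the full composition-phase mapping pipeline.

```python
import numpy as np
from scipy.spatial.distance import cosine, jensenshannon
from scipy.stats import pearsonr

def dissimilarities(p, q):
    """Cosine distance, 1 - Pearson r, and Jensen-Shannon distance between
    two non-negative diffraction patterns (normalised to unit sum for JS)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return {
        "cosine": cosine(p, q),
        "pearson": 1.0 - pearsonr(p, q)[0],
        "jensen_shannon": jensenshannon(p / p.sum(), q / q.sum()),
    }

two_theta = np.linspace(20, 80, 600)
pattern_a = np.exp(-((two_theta - 44.0) / 0.3) ** 2)          # one Bragg peak
pattern_b = np.exp(-((two_theta - 44.4) / 0.3) ** 2) * 0.7    # shifted, weaker peak
print(dissimilarities(pattern_a, pattern_b))
```

When the maximum peak shift is known, a constrained dynamic time warping comparison (as recommended in the abstract) would be applied to the same intensity arrays instead.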
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
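To make the conversion step concrete, here is a toy three-node Boolean network written as polynomial update rules over GF(2) (AND -> xy, OR -> x + y + xy, NOT -> 1 + x), with a brute-force search for its fixed points; the network is invented for illustration, and ADAM's algebraic attractor computation is far more sophisticated than this enumeration.

```python
from itertools import product

# Toy Boolean network as polynomials over GF(2): all arithmetic is mod 2.
def f1(x1, x2, x3): return (x2 * x3) % 2               # x1 <- x2 AND x3
def f2(x1, x2, x3): return (x1 + x3 + x1 * x3) % 2     # x2 <- x1 OR x3
def f3(x1, x2, x3): return (x1 + x2 + x1 * x2) % 2     # x3 <- x1 OR x2

def update(state):
    return (f1(*state), f2(*state), f3(*state))

# Fixed points (steady states) of the synchronous dynamics by exhaustive search
fixed_points = [s for s in product((0, 1), repeat=3) if update(s) == s]
print(fixed_points)   # [(0, 0, 0), (1, 1, 1)]
```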
Steady-State Computation of Constant Rotational Rate Dynamic Stability Derivatives
NASA Technical Reports Server (NTRS)
Park, Michael A.; Green, Lawrence L.
2000-01-01
Dynamic stability derivatives are essential to predicting the open and closed loop performance, stability, and controllability of aircraft. Computational determination of constant-rate dynamic stability derivatives (derivatives of aircraft forces and moments with respect to constant rotational rates) is currently performed indirectly with finite differencing of multiple time-accurate computational fluid dynamics solutions. Typical time-accurate solutions require excessive amounts of computational time to complete. Formulating Navier-Stokes (N-S) equations in a rotating noninertial reference frame and applying an automatic differentiation tool to the modified code has the potential for directly computing these derivatives with a single, much faster steady-state calculation. The ability to rapidly determine static and dynamic stability derivatives by computational methods can benefit multidisciplinary design methodologies and reduce dependency on wind tunnel measurements. The CFL3D thin-layer N-S computational fluid dynamics code was modified for this study to allow calculations on complex three-dimensional configurations with constant rotation rate components in all three axes. These CFL3D modifications also have direct application to rotorcraft and turbomachinery analyses. The modified CFL3D steady-state calculation is a new capability that showed excellent agreement with results calculated by a similar formulation. The application of automatic differentiation to CFL3D allows the static stability and body-axis rate derivatives to be calculated quickly and exactly.
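The derivative-through-the-code idea can be illustrated with a few lines of forward-mode automatic differentiation using dual numbers; the pitching-moment model below is a made-up scalar function of a constant pitch rate and is unrelated to CFL3D or the differentiation tool actually applied in the study.

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation via dual numbers."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

def Cm(q):
    # Hypothetical pitching-moment coefficient as a function of constant pitch rate q
    return 0.02 + (-0.3) * q + 0.05 * sin(q)

q = Dual(0.1, 1.0)               # seed derivative dq/dq = 1
result = Cm(q)
print(result.val, result.der)    # Cm(0.1) and the exact derivative dCm/dq at q = 0.1
```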
Review of automatic detection of pig behaviours by using image analysis
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Kong, Fantao
2017-06-01
Automatic detection of lying, moving, feeding, drinking, and aggressive behaviours of pigs by means of image analysis can save observation input by staff. It would help staff make early detection of diseases or injuries of pigs during breeding and improve management efficiency of swine industry. This study describes the progress of pig behaviour detection based on image analysis and advancement in image segmentation of pig body, segmentation of pig adhesion and extraction of pig behaviour characteristic parameters. Challenges for achieving automatic detection of pig behaviours were summarized.
Automatic analysis of altered gait in arylsulphatase A-deficient mice in the open field.
Leroy, Toon; Stroobants, Stijn; Aerts, Jean-Marie; D'Hooge, Rudi; Berckmans, Daniel
2009-08-01
In current research with laboratory animals, observing their dynamic behavior or locomotion is a labor-intensive task. Automatic continuous monitoring can provide quantitative data on each animal's condition and coordination ability. The objective of the present work is to develop an automated mouse observation system integrated with a conventional open-field test for motor function evaluation. Data were acquired from 86 mice having a targeted disruption of the arylsulphatase A (ASA) gene and having lowered coordinated locomotion abilities as a symptom. The mice used were 36 heterozygotes (12 females) and 50 knockout mice (30 females) at the age of 6 months. The mice were placed one at a time into the test setup, which consisted of a Plexiglas cage (53x34.5x26 cm) and two fluorescent bulbs for proper illumination. The transparent cage allowed images to be captured from underneath the cage, so image information could be obtained about the dynamic variation of the positions of the limbs of the mice for gait reconstruction. Every mouse was recorded for 10 min. Background subtraction and color filtering were used to measure and calculate image features, which are variables that contain crucial information, such as the mouse's position, orientation, body outline, and possible locations for the mouse's paws. A set of heuristic rules was used to prune implausible paw features and label the remaining ones as front/hind and left/right. After we had pruned the implausible paw features, the paw features that were consistent over subsequent images were matched to footprints. Finally, from the measured footprint sequence, eight parameters were calculated in order to quantify the gait of the mouse. This automatic observation technique can be integrated with a regular open-field test, where the trajectory and motor function of a free-moving mouse are measured simultaneously.
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
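The "simple image segmentation procedure based on a mixture model" mentioned at the end can be pictured with the sketch below, which fits a two-component Gaussian mixture to synthetic pixel intensities; this is hand-written scikit-learn code for illustration, not output generated by AutoBayes.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic pixel intensities: dim sky background plus a brighter nebula component
pixels = np.concatenate([rng.normal(10.0, 2.0, 5000),
                         rng.normal(40.0, 5.0, 1000)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
labels = gmm.predict(pixels)            # per-pixel background / object segmentation
print(gmm.means_.ravel(), gmm.weights_)
```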
CAD-Based Aerodynamic Design of Complex Configurations using a Cartesian Method
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.
2003-01-01
A modular framework for aerodynamic optimization of complex geometries is developed. By working directly with a parametric CAD system, complex-geometry models are modified and tessellated in an automatic fashion. The use of a component-based Cartesian method significantly reduces the demands on the CAD system, and also provides for robust and efficient flowfield analysis. The optimization is controlled using either a genetic or quasi-Newton algorithm. Parallel efficiency of the framework is maintained even when subject to limited CAD resources by dynamically re-allocating the processors of the flow solver. Overall, the resulting framework can explore designs incorporating large shape modifications and changes in topology.
A real-time digital computer program for the simulation of automatic spacecraft reentries
NASA Technical Reports Server (NTRS)
Kaylor, J. T.; Powell, L. F.; Powell, R. W.
1977-01-01
The automatic reentry flight dynamics simulator, a nonlinear, six-degree-of-freedom simulation, digital computer program, has been developed. The program includes a rotating, oblate earth model for accurate navigation calculations and contains adjustable gains on the aerodynamic stability and control parameters. This program uses a real-time simulation system and is designed to examine entries of vehicles which have constant mass properties whose attitudes are controlled by both aerodynamic surfaces and reaction control thrusters, and which have automatic guidance and control systems. The program has been used to study the space shuttle orbiter entry. This report includes descriptions of the equations of motion used, the control and guidance schemes that were implemented, the program flow and operation, and the hardware involved.
NASA Astrophysics Data System (ADS)
Martel, Anne L.
2004-04-01
In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI), it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal-to-noise ratio; however, this introduces errors due to partial volume effects. We have described previously how factor analysis can be used to automatically separate arterial and venous components from DCE-MRI studies of the brain, but although that method works well for single-slice images through the brain when the blood-brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper will describe a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.
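A rough sketch of the idea, separating temporal components from a pixel-by-time matrix of synthetic time-activity curves with scikit-learn's FactorAnalysis, is shown below; the curves and mixing weights are invented, and the paper's own factor analysis variant and preprocessing differ.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

t = np.linspace(0, 60, 60)                            # seconds after injection
arterial = np.exp(-((t - 10.0) / 4.0) ** 2)           # early, sharp bolus peak
venous = np.exp(-((t - 20.0) / 8.0) ** 2)             # delayed, broader peak

rng = np.random.default_rng(1)
weights = rng.uniform(0.0, 1.0, size=(500, 2))        # per-pixel mixing of the two curves
tacs = weights @ np.vstack([arterial, venous])        # (pixels x time) enhancement curves
tacs += rng.normal(0.0, 0.02, tacs.shape)             # acquisition noise

fa = FactorAnalysis(n_components=2, random_state=0).fit(tacs)
temporal_factors = fa.components_                     # candidate arterial / venous components
```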
NASA Astrophysics Data System (ADS)
Sayapina, D. O.; Zharnikova, M. A.; Tsydypov, B. Z.; Sodnomov, B. V.; Garmaev, E. Zh
2016-11-01
Since the 1980s, the scientists of the Baikal Institute of Nature Management (BINM SB RAS) have been conducting field observations of the transformation of Transbaikalia geosystems due to changes in climate and nature management. Utmost importance is placed on the study of the negative response of the land geosystems, shown in particular through their deterioration, degradation, and desertification. Through the years of research (1985-2015) in dry areas of the north of Central Asia, the scientists of the BINM SB RAS established a network of key sites for contact monitoring of the status and dynamics of the geosystems and the negative natural-anthropogenic processes along the Baikal-Gobi meridional transect (51-44° N, 105-107° E). The monitoring of the status and dynamics of the vegetation cover of some key sites is conducted by processing and analysis of multitemporal and multispectral Landsat and MODIS Terra imagery. An automatic analysis of the time variation of NDVI and a comparison with the behaviour of the index in previous seasons are performed. The landscape indication of the key sites is made on the basis of satellite imagery and complete geobotanical descriptions. Landscape profiles and facies maps with natural boundaries are created.
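The NDVI time-variation analysis mentioned above boils down to the standard band ratio; a minimal sketch is given below, assuming hypothetical near-infrared and red reflectance stacks of shape (dates, rows, cols) for one key site.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)

# Hypothetical reflectance stacks for one key site, shape (n_dates, rows, cols)
rng = np.random.default_rng(0)
nir_stack = rng.uniform(0.2, 0.5, size=(12, 64, 64))
red_stack = rng.uniform(0.05, 0.2, size=(12, 64, 64))

site_mean_ndvi = ndvi(nir_stack, red_stack).mean(axis=(1, 2))   # one value per acquisition date
anomaly = site_mean_ndvi - site_mean_ndvi.mean()                # departure from the seasonal mean
```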
Fernández-de-Manúel, Laura; Díaz-Díaz, Covadonga; Jiménez-Carretero, Daniel; Torres, Miguel; Montoya, María C
2017-05-01
Embryonic stem cells (ESCs) can be established as permanent cell lines, and their potential to differentiate into adult tissues has led to widespread use for studying the mechanisms and dynamics of stem cell differentiation and exploring strategies for tissue repair. Imaging live ESCs during development is now feasible due to advances in optical imaging and engineering of genetically encoded fluorescent reporters; however, a major limitation is the low spatio-temporal resolution of long-term 3-D imaging required for generational and neighboring reconstructions. Here, we present the ESC-Track (ESC-T) workflow, which includes an automated cell and nuclear segmentation and tracking tool for 4-D (3-D + time) confocal image data sets as well as a manual editing tool for visual inspection and error correction. ESC-T automatically identifies cell divisions and membrane contacts for lineage tree and neighborhood reconstruction and computes quantitative features from individual cell entities, enabling analysis of fluorescence signal dynamics and tracking of cell morphology and motion. We use ESC-T to examine Myc intensity fluctuations in the context of mouse ESC (mESC) lineage and neighborhood relationships. ESC-T is a powerful tool for evaluation of the genealogical and microenvironmental cues that maintain ESC fitness.
A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES
BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.
2014-01-01
We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250
S192 multispectral scanner channel 13 electromechanical noise investigation ECP-166
NASA Technical Reports Server (NTRS)
Koumjian, H.
1975-01-01
A review is presented of all data on the multispectral scanner related to low-frequency noise. The noise is component-induced, either mechanical or electrical or a combination of both. To assist in understanding the source of the noise, several dynamic analyses, both structural and electrical, were made and are reported. A review is presented of structural resonance test data obtained with the use of an accelerometer and strain gage sensors. Results of an analysis of the natural frequencies of the Dewar leads are included along with an analysis of the S192 cooler and its supporting structure. Other topics discussed include electronic stability of the forward signal, automatic gain control, and the offset control feedback loops as well as the preamplifier, which utilized an integrator feedback circuit.
Quantitative high-throughput population dynamics in continuous-culture by automated microscopy.
Merritt, Jason; Kuehn, Seppe
2016-09-12
We present a high-throughput method to measure abundance dynamics in microbial communities sustained in continuous-culture. Our method uses custom epi-fluorescence microscopes to automatically image single cells drawn from a continuously-cultured population while precisely controlling culture conditions. For clonal populations of Escherichia coli our instrument reveals history-dependent resilience and growth rate dependent aggregation.
Dynamic load balancing of applications
Wheat, Stephen R.
1997-01-01
An application-level method for dynamically maintaining global load balance on a parallel computer, particularly on massively parallel MIMD computers. Global load balancing is achieved by overlapping neighborhoods of processors, where each neighborhood performs local load balancing. The method supports a large class of finite element and finite difference based applications and provides an automatic element management system to which applications are easily integrated.
NASA Technical Reports Server (NTRS)
Kessel, C.; Wickens, C. D.
1978-01-01
The development of the internal model as it pertains to the detection of step changes in the order of control dynamics is investigated for two modes of participation: whether the subjects are actively controlling those dynamics or are monitoring an autopilot controlling them. A transfer of training design was used to evaluate the relative contribution of proprioception and visual information to the overall accuracy of the internal model. Sixteen subjects either tracked or monitored the system dynamics as a 2-dimensional pursuit display under single task conditions and concurrently with a sub-critical tracking task at two difficulty levels. Detection performance was faster and more accurate in the manual as opposed to the autopilot mode. The concurrent tracking task produced a decrement in detection performance for all conditions though this was more marked for the manual mode. The development of an internal model in the manual mode transferred positively to the automatic mode producing enhanced detection performance. There was no transfer from the internal model developed in the automatic mode to the manual mode.
User-Assisted Store Recycling for Dynamic Task Graph Schedulers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt, Mehmet Can; Krishnamoorthy, Sriram; Agrawal, Gagan
The emergence of the multi-core era has led to increased interest in designing effective yet practical parallel programming models. Models based on task graphs that operate on single-assignment data are attractive in several ways: they can support dynamic applications and precisely represent the available concurrency. However, they also require nuanced algorithms for scheduling and memory management for efficient execution. In this paper, we consider memory-efficient dynamic scheduling of task graphs. Specifically, we present a novel approach for dynamically recycling the memory locations assigned to data items as they are produced by tasks. We develop algorithms to identify memory-efficient store recycling functions by systematically evaluating the validity of a set of (user-provided or automatically generated) alternatives. Because recycling functions can be input data-dependent, we have also developed support for continued correct execution of a task graph in the presence of a potentially incorrect store recycling function. Experimental evaluation demonstrates that our approach to automatic store recycling incurs little to no overheads, achieves memory usage comparable to the best manually derived solutions, often produces recycling functions valid across problem sizes and input parameters, and efficiently recovers from an incorrect choice of store recycling functions.
Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.
Denecke, Kerstin
2016-01-01
Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The entire potential of these sources of experiential knowledge remains often unconsidered since retrieval and analysis is difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we will describe how faceted search could offer an intuitive retrieval of critical incident reports and how text mining could support in analysing relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements towards automatic processing methods. We learned that there is a huge potential for an automatic analysis of incident reports, but there are still challenges to be solved.
Automatic analysis of microscopic images of red blood cell aggregates
NASA Astrophysics Data System (ADS)
Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.
2015-06-01
Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions, such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation were done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted as a routine method in hemorheological and clinical biochemistry laboratories because it is rapid, efficient, and economical, and at the same time independent of the user performing the analysis, ensuring repeatability of the analysis.
A Semi-Automatic Method for Image Analysis of Edge Dynamics in Living Cells
Huang, Lawrence; Helmke, Brian P.
2011-01-01
Spatial asymmetry of actin edge ruffling contributes to the process of cell polarization and directional migration, but mechanisms by which external cues control actin polymerization near cell edges remain unclear. We designed a quantitative image analysis strategy to measure the spatiotemporal distribution of actin edge ruffling. Time-lapse images of endothelial cells (ECs) expressing mRFP-actin were segmented using an active contour method. In intensity line profiles oriented normal to the cell edge, peak detection identified the angular distribution of polymerized actin within 1 µm of the cell edge, which was localized to lamellipodia and edge ruffles. Edge features associated with filopodia and peripheral stress fibers were removed. Circular statistical analysis enabled detection of cell polarity, indicated by a unimodal distribution of edge ruffles. To demonstrate the approach, we detected a rapid, nondirectional increase in edge ruffling in serum-stimulated ECs and a change in constitutive ruffling orientation in quiescent, nonpolarized ECs. Error analysis using simulated test images demonstrates the robustness of the method to variations in image noise levels, edge ruffle arc length, and edge intensity gradient. These quantitative measurements of edge ruffling dynamics enable investigation at the cellular length scale of the underlying molecular mechanisms regulating actin assembly and cell polarization. PMID:21643526
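The circular-statistics step can be illustrated with a few lines computing the mean direction and resultant length of edge-ruffle angles; this is a generic sketch with made-up angles, not the authors' unimodality test.

```python
import numpy as np

def circular_stats(angles_rad):
    """Mean direction and resultant length R of a set of angles (radians).
    R near 1 suggests a unimodal, polarized ruffle distribution; R near 0
    suggests ruffling spread roughly uniformly around the cell edge."""
    z = np.exp(1j * np.asarray(angles_rad, dtype=float))
    mean_vector = z.mean()
    return np.angle(mean_vector), np.abs(mean_vector)

ruffle_angles = np.deg2rad([12.0, 25.0, 8.0, 355.0, 17.0, 340.0])   # hypothetical detections
mean_direction, R = circular_stats(ruffle_angles)
print(np.rad2deg(mean_direction), R)
```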
A preliminary investigation of inlet unstart effects on a high-speed civil transport concept
NASA Technical Reports Server (NTRS)
Domack, Christopher S.
1991-01-01
Vehicle motions resulting from a supersonic mixed-compression inlet unstart were examined to determine if the unstart constituted a hazard severe enough to warrant rejection of mixed-compression inlets on high-speed civil transport (HSCT) concepts. A simple kinematic analysis of an inlet unstart during cruise was performed for a Mach 2.4, 250-passenger HSCT concept using data from a wind-tunnel test of a representative configuration with unstarted inlets simulated. A survey of previously published research on inlet unstart effects, including simulation and flight test data for the YF-12, XB-70, and Concorde aircraft, was conducted to validate the calculated results. It was concluded that, when countered by suitable automatic propulsion and flight control systems, the vehicle dynamics induced by an inlet unstart are not severe enough to preclude the use of mixed-compression inlets on an HSCT from a passenger safety standpoint. The ability to provide suitable automatic controls appears to be within the current state of the art. However, the passenger startle and discomfort caused by the noise, vibration, and cabin motions associated with an inlet unstart remain a concern.
Oscillatory EEG dynamics underlying automatic chunking during sentence processing.
Bonhage, Corinna E; Meyer, Lars; Gruber, Thomas; Friederici, Angela D; Mueller, Jutta L
2017-05-15
Sentences are easier to remember than random word sequences, likely because linguistic regularities facilitate chunking of words into meaningful groups. The present electroencephalography study investigated the neural oscillations modulated by this so-called sentence superiority effect during the encoding and maintenance of sentence fragments versus word lists. We hypothesized a chunking-related modulation of neural processing during the encoding and retention of sentences (i.e., sentence fragments) as compared to word lists. Time-frequency analysis revealed a two-fold oscillatory pattern for the memorization of sentences: Sentence encoding was accompanied by higher delta amplitude (4Hz), originating both from regions processing syntax as well as semantics (bilateral superior/middle temporal regions and fusiform gyrus). Subsequent sentence retention was reflected in decreased theta (6Hz) and beta/gamma (27-32Hz) amplitude instead. Notably, whether participants simply read or properly memorized the sentences did not impact chunking-related activity during encoding. Therefore, we argue that the sentence superiority effect is grounded in highly automatized language processing mechanisms, which generate meaningful memory chunks irrespective of task demands. Copyright © 2017 Elsevier Inc. All rights reserved.
Automatic Assignment of Methyl-NMR Spectra of Supramolecular Machines Using Graph Theory.
Pritišanac, Iva; Degiacomi, Matteo T; Alderson, T Reid; Carneiro, Marta G; Ab, Eiso; Siegal, Gregg; Baldwin, Andrew J
2017-07-19
Methyl groups are powerful probes for the analysis of structure, dynamics and function of supramolecular assemblies, using both solution- and solid-state NMR. Widespread application of the methodology has been limited due to the challenges associated with assigning spectral resonances to specific locations within a biomolecule. Here, we present Methyl Assignment by Graph Matching (MAGMA), for the automatic assignment of methyl resonances. A graph matching protocol examines all possibilities for each resonance in order to determine an exact assignment that includes a complete description of any ambiguity. MAGMA gives 100% accuracy in confident assignments when tested against both synthetic data, and 9 cross-validated examples using both solution- and solid-state NMR data. We show that this remarkable accuracy enables a user to distinguish between alternative protein structures. In a drug discovery application on HSP90, we show the method can rapidly and efficiently distinguish between possible ligand binding modes. By providing an exact and robust solution to methyl resonance assignment, MAGMA can facilitate significantly accelerated studies of supramolecular machines using methyl-based NMR spectroscopy.
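The flavour of the graph-matching step can be conveyed with NetworkX's subgraph isomorphism matcher on two tiny invented graphs (a resonance/NOE-contact graph and a structure-derived methyl-contact graph); this is not MAGMA's algorithm, which additionally enumerates and scores all consistent assignments with explicit ambiguity tracking.

```python
import networkx as nx
from networkx.algorithms import isomorphism

# Hypothetical data graph: nodes are methyl resonances, edges are observed NOE contacts
data = nx.Graph([("r1", "r2"), ("r2", "r3"), ("r1", "r3"), ("r3", "r4")])

# Hypothetical structure graph: nodes are methyl-bearing residues, edges are
# short methyl-methyl distances measured in the structural model
structure = nx.Graph([("L10", "V23"), ("V23", "I47"), ("L10", "I47"),
                      ("I47", "A52"), ("A52", "L60")])

matcher = isomorphism.GraphMatcher(structure, data)
# Each mapping is a candidate assignment of resonances to residues
assignments = [{resonance: residue for residue, resonance in mapping.items()}
               for mapping in matcher.subgraph_isomorphisms_iter()]
print(assignments)
```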
NASA Astrophysics Data System (ADS)
Ghoraani, Behnaz; Krishnan, Sridhar
2009-12-01
The number of people affected by speech problems is increasing as the modern world places increasing demands on the human voice via mobile telephones, voice recognition software, and interpersonal verbal communications. In this paper, we propose a novel methodology for automatic pattern classification of pathological voices. The main contribution of this paper is extraction of meaningful and unique features using Adaptive time-frequency distribution (TFD) and nonnegative matrix factorization (NMF). We construct Adaptive TFD as an effective signal analysis domain to dynamically track the nonstationarity in the speech and utilize NMF as a matrix decomposition (MD) technique to quantify the constructed TFD. The proposed method extracts meaningful and unique features from the joint TFD of the speech, and automatically identifies and measures the abnormality of the signal. Depending on the abnormality measure of each signal, we classify the signal into normal or pathological. The proposed method is applied on the Massachusetts Eye and Ear Infirmary (MEEI) voice disorders database which consists of 161 pathological and 51 normal speakers, and an overall classification accuracy of 98.6% was achieved.
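As a rough, simplified analogue of the TFD + NMF feature step, the sketch below factorizes an ordinary spectrogram (standing in for the adaptive TFD of the paper) with scikit-learn's NMF; the synthetic signal and the crude feature vector are illustrative only.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import NMF

fs = 16000
t = np.arange(0.0, 1.0, 1.0 / fs)
# Stand-in "voice": two harmonics plus noise (a real input would be a speech recording)
x = np.sin(2 * np.pi * 150 * t) + 0.3 * np.sin(2 * np.pi * 300 * t)
x += 0.05 * np.random.default_rng(0).standard_normal(x.size)

freqs, frames, S = spectrogram(x, fs=fs, nperseg=512)        # non-negative power TFD
model = NMF(n_components=4, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(S)                                   # spectral bases (freq x component)
H = model.components_                                        # temporal activations (component x time)
features = np.concatenate([W.mean(axis=0), H.mean(axis=1)])  # a simple per-recording feature vector
```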
The role of fluctuations and interactions in pedestrian dynamics
NASA Astrophysics Data System (ADS)
Corbetta, Alessandro; Meeusen, Jasper; Benzi, Roberto; Lee, Chung-Min; Toschi, Federico
Understanding quantitatively the statistical behaviour of pedestrians walking in crowds is a major scientific challenge of paramount societal relevance. Walking humans exhibit a rich (stochastic) dynamics whose small and large deviations are driven, among other factors, by their own will as well as by environmental conditions. Via 24/7 automatic pedestrian tracking from multiple overhead Microsoft Kinect depth sensors, we collected large ensembles of pedestrian trajectories (in the order of tens of millions) in different real-life scenarios. These scenarios include both narrow corridors and large urban hallways, enabling us to cover and compare a wide spectrum of typical pedestrian dynamics. We investigate the pedestrian motion by measuring the PDFs, e.g. those of position, velocity and acceleration, at unprecedentedly high statistical resolution. We consider the dependence of PDFs on flow conditions, focusing on diluted dynamics and pair-wise interactions ("collisions") for mutual avoidance. By means of Langevin-like models we describe the measured data, including typical fluctuations and rare events. This work is part of the JSTP research programme "Vision driven visitor behaviour analysis and crowd management" with Project Number 341-10-001, which is financed by the Netherlands Organisation for Scientific Research (NWO).
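A minimal sketch of a Langevin-type model of the kind referred to above, integrated with the Euler-Maruyama scheme for the walking speed of an undisturbed pedestrian, is shown below; the parameter values are invented, not the ones fitted to the Kinect data.

```python
import numpy as np

rng = np.random.default_rng(0)
u_pref, tau, sigma = 1.3, 0.5, 0.25     # preferred speed (m/s), relaxation time (s), noise level
dt, n_steps = 0.05, 20000

u = np.empty(n_steps)
u[0] = u_pref
for k in range(1, n_steps):
    drift = -(u[k - 1] - u_pref) / tau                     # relaxation towards the preferred speed
    u[k] = u[k - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# The histogram of u approximates the stationary speed PDF of the model and can be
# compared with measured single-pedestrian velocity PDFs.
hist, edges = np.histogram(u, bins=50, density=True)
```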
Spatio-temporal diffusion of dynamic PET images
NASA Astrophysics Data System (ADS)
Tauber, C.; Stute, S.; Chau, M.; Spiteri, P.; Chalon, S.; Guilloteau, D.; Buvat, I.
2011-10-01
Positron emission tomography (PET) images are corrupted by noise. This is especially true in dynamic PET imaging where short frames are required to capture the peak of activity concentration after the radiotracer injection. High noise results in a possible bias in quantification, as the compartmental models used to estimate the kinetic parameters are sensitive to noise. This paper describes a new post-reconstruction filter to increase the signal-to-noise ratio in dynamic PET imaging. It consists in a spatio-temporal robust diffusion of the 4D image based on the time activity curve (TAC) in each voxel. It reduces the noise in homogeneous areas while preserving the distinct kinetics in regions of interest corresponding to different underlying physiological processes. Neither anatomical priors nor the kinetic model are required. We propose an automatic selection of the scale parameter involved in the diffusion process based on a robust statistical analysis of the distances between TACs. The method is evaluated using Monte Carlo simulations of brain activity distributions. We demonstrate the usefulness of the method and its superior performance over two other post-reconstruction spatial and temporal filters. Our simulations suggest that the proposed method can be used to significantly increase the signal-to-noise ratio in dynamic PET imaging.
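A much-simplified sketch of the idea (TAC-distance-weighted diffusion of a dynamic image, with the scale parameter set automatically from a robust statistic of neighbour TAC distances) is given below; it uses wrap-around borders and a plain exponential edge-stopping function, and it is not the filter of the paper.

```python
import numpy as np

def robust_tac_diffusion(dyn_img, n_iter=10, step=0.15):
    """dyn_img: array of shape (T, Y, X) holding one TAC per voxel.
    Each iteration averages every TAC with its 4 spatial neighbours, weighted by
    exp(-(d/k)^2), where d is the neighbour TAC distance and k is a robust
    (MAD-based) scale estimated from all neighbour distances."""
    out = dyn_img.astype(float).copy()
    shifts = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    for _ in range(n_iter):
        diffs = [np.roll(out, s, axis=(1, 2)) - out for s in shifts]   # borders wrap for simplicity
        dists = np.stack([np.linalg.norm(d, axis=0) for d in diffs])   # TAC distances to neighbours
        k = 1.4826 * np.median(np.abs(dists - np.median(dists))) + 1e-12
        update = np.zeros_like(out)
        for d, dist in zip(diffs, dists):
            update += np.exp(-(dist / k) ** 2)[None] * d               # edge-stopping weights
        out += step * update
    return out

smoothed = robust_tac_diffusion(
    np.random.default_rng(0).poisson(5.0, (20, 32, 32)).astype(float))
```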
Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.; Bonner, Andrew C.; Arevalo, Alexander R.
2017-01-01
Hagopian, Rooker, and Zarcone (2015) evaluated a model for subtyping automatically reinforced self-injurious behavior (SIB) based on its sensitivity to changes in functional analysis conditions and the presence of self-restraint. The current study tested the generality of the model by applying it to all datasets of automatically reinforced SIB published from 1982 to 2015. We identified 49 datasets that included sufficient data to permit subtyping. Similar to the original study, Subtype-1 SIB was generally amenable to treatment using reinforcement alone, whereas Subtype-2 SIB was not. Conclusions could not be drawn about Subtype-3 SIB due to the small number of datasets. Nevertheless, the findings support the generality of the model and suggest that sensitivity of SIB to disruption by alternative reinforcement is an important dimension of automatically reinforced SIB. Findings also suggest that automatically reinforced SIB should no longer be considered a single category and that additional research is needed to better understand and treat Subtype-2 SIB. PMID:28032344
Automatic Temporal Tracking of Supra-Glacial Lakes
NASA Astrophysics Data System (ADS)
Liang, Y.; Lv, Q.; Gallaher, D. W.; Fanning, D.
2010-12-01
In recent years, supra-glacial lakes in Greenland have attracted extensive global attention as they potentially play an important role in glacier movement, sea level rise, and climate change. Previous works focused on classification methods and individual cloud-free satellite images, which have limited capabilities in terms of tracking changes of lakes over time. The challenges of tracking supra-glacial lakes automatically include (1) the massive amount of satellite images with diverse qualities and frequent cloud coverage, and (2) the diversity and dynamics of the large number of supra-glacial lakes on the Greenland ice sheet. In this study, we develop an innovative method to automatically track supra-glacial lakes temporally using the Moderate Resolution Imaging Spectroradiometer (MODIS) time-series data. The method works for both cloudy and cloud-free data and is unsupervised, i.e., no manual identification is required. After selecting the highest-quality image within each time interval, our method automatically detects supra-glacial lakes in individual images, using adaptive thresholding to handle diverse image qualities. We then track lakes across time series of images as lakes appear, change in size, and disappear. Using multi-year MODIS data during the melting season, we demonstrate that this new method can detect and track supra-glacial lakes in both space and time with 95% accuracy. The attached figure shows an example of the current results; a detailed analysis of the temporal variation of the detected lakes will be presented. Figure caption: (a) one of our experimental datasets, covering a region centered at the Jakobshavn Isbrae glacier in west Greenland; (b) an enlarged view of part of the ice sheet, partially cloudy, with supra-glacial lakes visible as dark spots; (c) the current result, with detected lakes shown as red spots.
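A bare-bones version of the adaptive-thresholding idea for a single cloud-free reflectance band is sketched below; the window size, offset, and minimum lake size are hypothetical knobs, and the real pipeline additionally handles cloud screening and links detections across the time series.

```python
import numpy as np
from scipy.ndimage import uniform_filter, label

def detect_lakes(band, window=51, offset=0.05, min_pixels=4):
    """Flag pixels darker than their local neighbourhood mean (lakes appear as
    dark spots on bright ice), then keep connected components above a minimum size."""
    band = band.astype(float)
    local_mean = uniform_filter(band, size=window)      # adaptive (locally varying) threshold
    mask = band < (local_mean - offset)
    labels, _ = label(mask)
    sizes = np.bincount(labels.ravel())
    valid = np.nonzero(sizes >= min_pixels)[0]
    valid = valid[valid != 0]                           # label 0 is the non-lake background
    return np.isin(labels, valid), labels

ice = np.random.default_rng(0).uniform(0.75, 0.85, (256, 256))
ice[100:110, 120:135] = 0.45                            # a synthetic dark lake
lake_mask, lake_labels = detect_lakes(ice)
```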
Sentry: An Automated Close Approach Monitoring System for Near-Earth Objects
NASA Astrophysics Data System (ADS)
Chamberlin, A. B.; Chesley, S. R.; Chodas, P. W.; Giorgini, J. D.; Keesey, M. S.; Wimberly, R. N.; Yeomans, D. K.
2001-11-01
In response to international concern about potential asteroid impacts on Earth, NASA's Near-Earth Object (NEO) Program Office has implemented a new system called ``Sentry'' to automatically update the orbits of all NEOs on a daily basis and compute Earth close approaches up to 100 years into the future. Results are published on our web site (http://neo.jpl.nasa.gov/) and updated orbits and ephemerides made available via the JPL Horizons ephemeris service (http://ssd.jpl.nasa.gov/horizons.html). Sentry collects new and revised astrometric observations from the Minor Planet Center (MPC) via their electronic circulars (MPECs) in near real time as well as radar and optical astrometry sent directly from observers. NEO discoveries and identifications are detected in MPECs and processed appropriately. In addition to these daily updates, Sentry synchronizes with each monthly batch of MPC astrometry and automatically updates all NEO observation files. Daily and monthly processing of NEO astrometry is managed using a queuing system which allows for manual intervention of selected NEOs without interfering with the automatic system. At the heart of Sentry is a fully automatic orbit determination program which handles outlier rejection and ensures convergence in the new solution. Updated orbital elements and their covariances are published via Horizons and our NEO web site, typically within 24 hours. A new version of Horizons, in development, will allow computation of ephemeris uncertainties using covariance data. The positions of NEOs with updated orbits are numerically integrated up to 100 years into the future and each close approach to any perturbing body in our dynamic model (all planets, Moon, Ceres, Pallas, Vesta) is recorded. Significant approaches are flagged for extended analysis including Monte Carlo studies. Results, such as minimum encounter distances and future Earth impact probabilities, are published on our NEO web site.
Multivariate singular spectrum analysis and the road to phase synchronization
NASA Astrophysics Data System (ADS)
Groth, Andreas; Ghil, Michael
2010-05-01
Singular spectrum analysis (SSA) and multivariate SSA (M-SSA) are based on the classical work of Kosambi (1943), Loeve (1945) and Karhunen (1946) and are closely related to principal component analysis. They have been introduced into information theory by Bertero, Pike and co-workers (1982, 1984) and into dynamical systems analysis by Broomhead and King (1986a,b). Ghil, Vautard and associates have applied SSA and M-SSA to the temporal and spatio-temporal analysis of short and noisy time series in climate dynamics and other fields in the geosciences since the late 1980s. M-SSA provides insight into the unknown or partially known dynamics of the underlying system by decomposing the delay-coordinate phase space of a given multivariate time series into a set of data-adaptive orthonormal components. These components can be classified essentially into trends, oscillatory patterns and noise, and allow one to reconstruct a robust "skeleton" of the dynamical system's structure. For an overview we refer to Ghil et al. (Rev. Geophys., 2002). In this talk, we present M-SSA in the context of synchronization analysis and illustrate its ability to unveil information about the mechanisms behind the adjustment of rhythms in coupled dynamical systems. The focus of the talk is on the special case of phase synchronization between coupled chaotic oscillators (Rosenblum et al., PRL, 1996). Several ways of measuring phase synchronization are in use, and the robust definition of a reasonable phase for each oscillator is critical in each of them. We illustrate here the advantages of M-SSA in the automatic identification of oscillatory modes and in drawing conclusions about the transition to phase synchronization. Without using any a priori definition of a suitable phase, we show that M-SSA is able to detect phase synchronization in a chain of coupled chaotic oscillators (Osipov et al., PRE, 1996). Recently, Muller et al. (PRE, 2005) and Allefeld et al. (Intl. J. Bif. Chaos, 2007) have demonstrated the usefulness of principal component analysis in detecting phase synchronization from multivariate time series. The present talk provides a generalization of this idea and presents a robust implementation thereof via M-SSA.
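The core M-SSA decomposition referred to above can be sketched as follows: embed the multivariate series in delay coordinates, take an SVD of the trajectory matrix, and diagonally average each rank-one piece back into a (time, channel) reconstructed component. The window length M and the number of retained modes are user choices; this is a minimal textbook-style sketch, not the authors' code.

```python
# Hedged sketch of M-SSA: delay embedding + SVD + diagonal averaging.
import numpy as np

def mssa(x, M, n_modes=10):
    """x: array (N, D) of D channels; M: embedding window.
    Returns singular values and reconstructed components (mode, time, channel)."""
    N, D = x.shape
    K = N - M + 1
    # Trajectory matrix: row k stacks lags j = 0..M-1 of every channel at time k
    traj = np.column_stack([x[j:j + K, d] for d in range(D) for j in range(M)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    n_modes = min(n_modes, len(s))
    rcs = np.zeros((n_modes, N, D))
    for k in range(n_modes):
        elem = s[k] * np.outer(U[:, k], Vt[k])            # rank-1 piece of traj
        for d in range(D):
            block = elem[:, d * M:(d + 1) * M]            # (K, M) block for channel d
            for i in range(N):                            # diagonal (Hankel) averaging
                js = range(max(0, i - K + 1), min(M, i + 1))
                rcs[k, i, d] = np.mean([block[i - j, j] for j in js])
    return s, rcs
```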
Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers
NASA Astrophysics Data System (ADS)
Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly
2018-03-01
The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the tasks of producing the mixture are performed by the automatic control system of the kneading-and-mixing machinery with operational automatic control of homogeneity. The theoretical basis for controlling the mixture homogeneity, related to changes in the frequency of vibrodynamic oscillations of the mixer body, is presented. The structure of the technical means of the automatic control system for regulating the supply of water is determined depending on the change in the concrete mixture homogeneity during continuous mixing of the components. The following technical means for implementing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To characterize the quality of automatic control, a block diagram with transfer functions is provided that describes the ACS operation in the transient dynamic mode.
NASA Astrophysics Data System (ADS)
Wakamatsu, Hidetoshi; Gaohua, Lu
Various surface-cooling apparatus such as cooling caps, mufflers and blankets have been commonly used for cooling of the brain to provide hypothermic neuro-protection for patients with hypoxic-ischemic encephalopathy. The present paper addresses brain temperature regulation from the viewpoint of automatic system control, in order to help clinicians decide an optimal temperature of the cooling fluid provided for these three types of apparatus. First, a biothermal model characterized by dynamic ambient temperatures is constructed for an adult patient, taking into account the clinical practice of hypothermia and anesthesia in brain hypothermia treatment. Second, the model is represented by a state equation as a lumped-parameter linear dynamic system. The biothermal model is justified by its various responses corresponding to clinical phenomena and treatment. Finally, the optimal regulator is tentatively designed to give clinicians some suggestions on the optimal temperature regulation of the patient's brain. It suggests that the patient's brain temperature could be optimally controlled to follow the temperature course prescribed by the clinicians. This study points to a promising clinical possibility for automatic hypothermia treatment.
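For a lumped-parameter linear state-space model as described above, the "optimal regulator" step typically amounts to an LQR design. The sketch below shows that step for a placeholder two-state model; the A, B, Q, R matrices are illustrative assumptions, not the paper's biothermal model.

```python
# Hedged sketch: LQR design for a linear biothermal model dx/dt = A x + B u,
# where u is the cooling-fluid temperature deviation.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-0.05, 0.02],    # brain / body-core heat exchange (illustrative values)
              [ 0.01, -0.03]])
B = np.array([[0.04],           # effect of cooling-fluid temperature on the brain node
              [0.005]])
Q = np.diag([10.0, 1.0])        # penalize brain-temperature error most
R = np.array([[1.0]])           # penalize aggressive changes of the fluid temperature

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P) # optimal state-feedback gain: u = -K (x - x_ref)
print("LQR gain:", K)
```

With such a gain, the fluid temperature command tracks a prescribed brain-temperature course by feeding back the deviation of the model states from their reference trajectory.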
Tian, Shu; Yin, Xu-Cheng; Wang, Zhi-Bin; Zhou, Fang; Hao, Hong-Wei
2015-01-01
The phacoemulsification surgery is one of the most advanced surgeries to treat cataract. However, conventional surgeries typically involve a low level of automation and rely heavily on the surgeon's skill. Alternatively, one promising scenario is to use video processing and pattern recognition technologies to automatically detect the cataract grade and intelligently control the release of the ultrasonic energy while operating. Unlike cataract grading in diagnosis systems with static images, complicated backgrounds, unexpected noise, and varied information are always introduced in dynamic videos of the surgery. Here we develop a Video-Based Intelligent Recognition and Decision (VeBIRD) system, which breaks new ground by providing a generic framework for automatically tracking the operation process and classifying the cataract grade in microscope videos of phacoemulsification cataract surgery. VeBIRD comprises a robust eye (iris) detector with a randomized Hough transform to precisely locate the eye against the noisy background, an effective probe tracker with Tracking-Learning-Detection to track the operation probe through the dynamic process, and an intelligent decider with discriminative learning to finally recognize the cataract grade in the complicated video. Experiments with a variety of real microscope videos of phacoemulsification verify VeBIRD's effectiveness. PMID:26693249
Semi-automatic 3D lung nodule segmentation in CT using dynamic programming
NASA Astrophysics Data System (ADS)
Sargent, Dustin; Park, Sun Young
2017-02-01
We present a method for semi-automatic segmentation of lung nodules in chest CT that can be extended to general lesion segmentation in multiple modalities. Most semi-automatic algorithms for lesion segmentation or similar tasks use region-growing or edge-based contour finding methods such as level-set. However, lung nodules and other lesions are often connected to surrounding tissues, which makes these algorithms prone to growing the nodule boundary into the surrounding tissue. To solve this problem, we apply a 3D extension of the 2D edge linking method with dynamic programming to find a closed surface in a spherical representation of the nodule ROI. The algorithm requires a user to draw a maximal diameter across the nodule in the slice in which the nodule cross section is the largest. We report the lesion volume estimation accuracy of our algorithm on the FDA lung phantom dataset, and the RECIST diameter estimation accuracy on the lung nodule dataset from the SPIE 2016 lung nodule classification challenge. The phantom results in particular demonstrate that our algorithm has the potential to mitigate the disparity in measurements performed by different radiologists on the same lesions, which could improve the accuracy of disease progression tracking.
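To make the dynamic-programming edge-linking idea above concrete, the sketch below shows the 2D analogue: the nodule ROI is unwrapped into polar coordinates around a seed point, and DP selects one radius per angle that maximizes accumulated edge strength under a smoothness constraint. The 3D spherical extension follows the same recursion; the cost terms and the omitted closure constraint are assumptions, not the authors' exact formulation.

```python
# Hedged 2D sketch of dynamic-programming edge linking in polar coordinates.
import numpy as np
from scipy.ndimage import map_coordinates, sobel

def polar_dp_contour(roi, center, n_theta=180, n_r=60, max_jump=2):
    gy, gx = sobel(roi, axis=0), sobel(roi, axis=1)
    edge = np.hypot(gx, gy)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    radii = np.linspace(1, min(roi.shape) / 2 - 1, n_r)
    rr = center[0] + np.outer(radii, np.sin(thetas))   # sample grid (radius, angle)
    cc = center[1] + np.outer(radii, np.cos(thetas))
    cost = -map_coordinates(edge, [rr, cc], order=1)   # strong edges -> low cost
    # DP over angles: acc[r, t] = best path cost ending at radius r, angle t
    acc = cost.copy()
    back = np.zeros_like(cost, dtype=int)
    for t in range(1, n_theta):
        for r in range(n_r):
            lo, hi = max(0, r - max_jump), min(n_r, r + max_jump + 1)
            prev = acc[lo:hi, t - 1]
            back[r, t] = lo + int(np.argmin(prev))
            acc[r, t] = cost[r, t] + prev.min()
    # Backtrack from the best final radius (closure constraint omitted in this sketch)
    path = [int(np.argmin(acc[:, -1]))]
    for t in range(n_theta - 1, 0, -1):
        path.append(back[path[-1], t])
    return radii[np.array(path[::-1])], thetas
```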
NASA Astrophysics Data System (ADS)
Valdes-Abellan, Javier; Jiménez-Martínez, Joaquin; Candela, Lucila
2013-04-01
For monitoring the vadose zone, different strategies can be chosen depending on the objectives and the scale of observation. The effects of non-conventional water use on the vadose zone may produce impacts in the porous media that could lead to changes in soil hydraulic properties, among others. Controlling these possible effects requires an accurate monitoring strategy that tracks the volumetric water content, θ, and soil pressure, h, along the studied profile. According to the available literature, different monitoring systems have been deployed independently; however, comparative studies between different techniques have received less attention. An experimental plot of 9x5 m2 was set up with automatic and non-automatic sensors to monitor θ and h down to 1.5 m depth. The non-automatic system consisted of ten Jet Fill tensiometers at 30, 45, 60, 90 and 120 cm (Soil Moisture®) and a polycarbonate access tube of 44 mm (i.d.) for soil moisture measurements with a TRIME FM TDR portable probe (IMKO®). Vertical installation was carefully performed; measurements with this system were manual, twice a week for θ and three times per week for h. The automatic system was composed of five 5TE sensors (Decagon Devices®) installed at 20, 40, 60, 90 and 120 cm for θ measurements and one MPS1 sensor (Decagon Devices®) at 60 cm depth for h. Installation took place laterally in a 40-50 cm long hole bored in the side of an excavated trench. All automatic sensors recorded hourly and stored their readings in a data-logger. Boundary conditions were controlled with a volume-meter and a meteorological station. ET was modelled with the Penman-Monteith equation. Soil characterization included bulk density, gravimetric water content, grain size distribution, saturated hydraulic conductivity and soil water retention curves determined following laboratory standards. Soil mineralogy was determined by X-ray diffractometry. Unsaturated soil hydraulic parameters were model-fitted with the SWRC-fit code and with ROSETTA based on soil textural fractions. Simulation of water flow using automatic and non-automatic data was carried out with HYDRUS-1D independently. Good agreement between the collected automatic and non-automatic data and the modelled results was observed. The general trend was captured, except for outlier values, as expected. Slight differences were found between hydraulic properties obtained from laboratory determinations and from inverse modelling of the two approaches. Differences of up to 14% in the flux through the lower boundary were detected between the two strategies. According to the results, automatic sensors have higher resolution and are therefore more appropriate for detecting subtle changes of soil hydraulic properties. Nevertheless, if the aim of the research is to follow the general trend of water dynamics, no significant differences were observed between the two systems.
Samson, Nathalie; Praud, Jean-Paul; Quenet, Brigitte; Similowski, Thomas; Straus, Christian
2017-01-18
Sucking, swallowing and breathing are dynamic motor behaviors. Breathing displays features of chaos-like dynamics, in particular nonlinearity and complexity, which take their source in the automatic command of breathing. In contrast, buccal/gill ventilation in amphibians is one of the rare motor behaviors that do not display nonlinear complexity. This study aimed at assessing whether sucking and swallowing also follow nonlinear complex dynamics in the newborn lamb. Breathing movements were recorded before, during and after bottle-feeding. Sucking pressure and the integrated EMG of the thyroarytenoid muscle, as an index of swallowing, were recorded during bottle-feeding. Nonlinear complexity of the whole signals was assessed through calculation of the noise limit value (NL). Breathing and swallowing always exhibited chaos-like dynamics. The NL of breathing did not change significantly before, during or after bottle-feeding. On the other hand, sucking exhibited chaos-like dynamics inconsistently and significantly less frequently than breathing. Therefore, the central pattern generator (CPG) that drives sucking may be functionally different from the breathing CPG. Furthermore, the analogy between buccal/gill ventilation and sucking suggests that the latter may take its phylogenetic origin in the gill ventilation CPG of the common ancestor of extant amphibians and mammals. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis
Liu, Jingxian; Wu, Kefeng
2017-01-01
The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety, so that the capacities of navigation safety and maritime traffic monitoring can be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex than traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, Principal Component Analysis (PCA), a widely used dimensionality reduction method, is exploited to decompose the obtained distance matrix. In particular, the top k principal components with an accumulative contribution rate above 95% are extracted by PCA, which determines the number of centers k. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is implemented on the distance matrix to achieve the final AIS trajectory clustering results. To improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in a bridge-area waterway and on the Mississippi River have been carried out to compare our proposed method with traditional spectral clustering and fast affinity propagation clustering. Experimental results illustrate its superior performance in terms of quantitative and qualitative evaluations. PMID:28777353
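A minimal sketch of the multi-step pipeline described above follows: pairwise DTW distances, PCA on the distance matrix to choose k from the 95% variance rule, then a simple medoid-style grouping. The center-selection rule in the last step is an illustrative stand-in, not the paper's improved algorithm.

```python
# Hedged sketch: DTW distance matrix -> PCA-based choice of k -> medoid clustering.
import numpy as np

def dtw(a, b):
    """Classic O(len(a)*len(b)) DTW between two trajectories of 2D points."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf); D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def cluster_trajectories(trajs):
    n = len(trajs)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw(trajs[i], trajs[j])
    # PCA of the centered distance matrix: k = #components reaching 95% variance
    X = dist - dist.mean(axis=0)
    _, s, _ = np.linalg.svd(X, full_matrices=False)
    var = (s ** 2) / (s ** 2).sum()
    k = int(np.searchsorted(np.cumsum(var), 0.95) + 1)
    # Simple medoid clustering on the distance matrix with k centers
    centers = list(np.argsort(dist.sum(axis=1))[:k])   # the most "central" trajectories
    labels = np.argmin(dist[:, centers], axis=1)
    return k, labels
```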
Work zone speed reduction utilizing dynamic speed signs
DOT National Transportation Integrated Search
2011-08-30
Vast quantities of transportation data are automatically recorded by intelligent transportations infrastructure, such as inductive loop detectors, video cameras, and side-fire radar devices. Such devices are typically deployed by traffic management c...
Partanen, Eino; Leminen, Alina; de Paoli, Stine; Bundgaard, Anette; Kingo, Osman Skjold; Krøjgaard, Peter; Shtyrov, Yury
2017-07-15
Children learn new words and word forms with ease, often acquiring a new word after very few repetitions. Recent neurophysiological research on word form acquisition in adults indicates that novel words can be acquired within minutes of repetitive exposure to them, regardless of the individual's focused attention on the speech input. Although it is well-known that children surpass adults in language acquisition, the developmental aspects of such rapid and automatic neural acquisition mechanisms remain unexplored. To address this open question, we used magnetoencephalography (MEG) to scrutinise brain dynamics elicited by spoken words and word-like sounds in healthy monolingual (Danish) children throughout a 20-min repetitive passive exposure session. We found rapid neural dynamics manifested as an enhancement of early (~100ms) brain activity over the short exposure session, with distinct spatiotemporal patterns for different novel sounds. For novel Danish word forms, signs of such enhancement were seen in the left temporal regions only, suggesting reliance on pre-existing language circuits for acquisition of novel word forms with native phonology. In contrast, exposure both to novel word forms with non-native phonology and to novel non-speech sounds led to activity enhancement in both left and right hemispheres, suggesting that more wide-spread cortical networks contribute to the build-up of memory traces for non-native and non-speech sounds. Similar studies in adults have previously reported more sluggish (~15-25min, as opposed to 4min in the present study) or non-existent neural dynamics for non-native sound acquisition, which might be indicative of a higher degree of plasticity in the children's brain. Overall, the results indicate a rapid and highly plastic mechanism for a dynamic build-up of memory traces for novel acoustic information in the children's brain that operates automatically and recruits bilateral temporal cortical circuits. Copyright © 2017 Elsevier Inc. All rights reserved.
Segmentation, dynamic storage, and variable loading on CDC equipment
NASA Technical Reports Server (NTRS)
Tiffany, S. H.
1980-01-01
Techniques for varying the segmented load structure of a program and for varying the dynamic storage allocation, depending upon whether a batch type or interactive type run is desired, are explained and demonstrated. All changes are based on a single data input to the program. The techniques involve: code within the program to suppress scratch pad input/output (I/O) for a batch run or translate the in-core data storage area from blank common to the end-of-code+1 address of a particular segment for an interactive run; automatic editing of the segload directives prior to loading, based upon data input to the program, to vary the structure of the load for interactive and batch runs; and automatic editing of the load map to determine the initial addresses for in core data storage for an interactive run.
Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes
NASA Technical Reports Server (NTRS)
Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)
2000-01-01
The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. With great progress in hardware and software technologies, the performance of parallel programs using compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and to achieve good performance that exceeds that of some commercial tools.
Automatic Parking of Self-Driving CAR Based on LIDAR
NASA Astrophysics Data System (ADS)
Lee, B.; Wei, Y.; Guo, I. Y.
2017-09-01
To overcome the deficiencies of ultrasonic sensors and cameras, this paper proposes a method of autonomous parking for a self-driving car using an HDL-32E LiDAR. First, the 3-D point cloud data were preprocessed. Then we calculated the minimum size of the parking space according to the dynamics of the vehicle. Second, the rapidly-exploring random tree (RRT) algorithm was improved in two respects based on the motion characteristics of an autonomous car, and we calculated the parking path on the basis of the vehicle's dynamics and collision constraints. Besides, we used a fuzzy logic controller to control the brake and accelerator in order to keep the speed stable. Finally, experiments were conducted on an autonomous car, and the results show that the proposed automatic parking system is feasible and effective.
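For readers unfamiliar with the RRT step mentioned above, the sketch below shows the baseline algorithm on a point robot with circular obstacles; the paper's vehicle-dynamics and collision constraints are deliberately left out, and the workspace, goal bias, and step size are illustrative assumptions.

```python
# Hedged sketch: baseline rapidly-exploring random tree (RRT) toward a parking goal.
import random, math

def rrt(start, goal, obstacles, step=0.5, iters=2000, goal_tol=0.6):
    nodes = [start]; parent = {0: None}
    def collides(p):
        return any(math.dist(p, (ox, oy)) < r for ox, oy, r in obstacles)
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else (random.uniform(0, 10), random.uniform(0, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d) if d > 0 else (nx, ny)
        if collides(new):
            continue                                   # reject edges that hit obstacles
        nodes.append(new); parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:            # reached the parking spot
            path, j = [], len(nodes) - 1
            while j is not None:
                path.append(nodes[j]); j = parent[j]
            return path[::-1]
    return None

path = rrt((1, 1), (8, 8), obstacles=[(5, 5, 1.5)])    # toy usage example
```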
SAMuS: Service-Oriented Architecture for Multisensor Surveillance in Smart Homes
Van de Walle, Rik
2014-01-01
The design of a service-oriented architecture for multisensor surveillance in smart homes is presented as an integrated solution enabling automatic deployment, dynamic selection, and composition of sensors. Sensors are implemented as Web-connected devices with a uniform Web API. RESTdesc is used to describe the sensors, and a novel solution is presented to automatically compose Web APIs that can be applied with existing Semantic Web reasoners. We evaluated the solution by building a smart Kinect sensor that is able to dynamically switch between IR and RGB and to optimize person detection by incorporating feedback from pressure sensors, thus demonstrating collaboration among sensors to enhance the detection of complex events. The performance results show that the platform scales to many Web APIs, as composition time remains limited to a few hundred milliseconds in almost all cases. PMID:24778579
Automatic Adaptation to Fast Input Changes in a Time-Invariant Neural Circuit
Bharioke, Arjun; Chklovskii, Dmitri B.
2015-01-01
Neurons must faithfully encode signals that can vary over many orders of magnitude despite having only limited dynamic ranges. For a correlated signal, this dynamic range constraint can be relieved by subtracting away components of the signal that can be predicted from the past, a strategy known as predictive coding, that relies on learning the input statistics. However, the statistics of input natural signals can also vary over very short time scales e.g., following saccades across a visual scene. To maintain a reduced transmission cost to signals with rapidly varying statistics, neuronal circuits implementing predictive coding must also rapidly adapt their properties. Experimentally, in different sensory modalities, sensory neurons have shown such adaptations within 100 ms of an input change. Here, we show first that linear neurons connected in a feedback inhibitory circuit can implement predictive coding. We then show that adding a rectification nonlinearity to such a feedback inhibitory circuit allows it to automatically adapt and approximate the performance of an optimal linear predictive coding network, over a wide range of inputs, while keeping its underlying temporal and synaptic properties unchanged. We demonstrate that the resulting changes to the linearized temporal filters of this nonlinear network match the fast adaptations observed experimentally in different sensory modalities, in different vertebrate species. Therefore, the nonlinear feedback inhibitory network can provide automatic adaptation to fast varying signals, maintaining the dynamic range necessary for accurate neuronal transmission of natural inputs. PMID:26247884
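The circuit idea above (transmit the prediction error, build the inhibitory prediction from feedback, and add a rectification in the feedback path) can be illustrated with the toy model below. The leaky-integration form, the parameters, and the function name are illustrative assumptions, not the authors' model equations.

```python
# Hedged sketch: feedback-inhibition predictive coding with an optional
# rectification nonlinearity in the feedback path.
import numpy as np

def feedback_predictive_coding(x, w=0.9, tau=5.0, rectify=True):
    """x: 1D input signal. Returns the transmitted (prediction-error) signal."""
    pred, out = 0.0, np.zeros_like(x, dtype=float)
    for t, xt in enumerate(x):
        err = xt - pred                               # transmitted signal: input minus prediction
        out[t] = err
        fb = max(err, 0.0) if rectify else err        # rectified feedback inhibition
        pred += w * fb - pred / tau                   # leaky update of the inhibitory prediction
    return out

rng = np.random.default_rng(0)
x = 5.0 + np.cumsum(rng.normal(0, 0.05, 1000))        # slowly drifting (correlated) input
err = feedback_predictive_coding(x)                   # reduced-range signal actually transmitted
```

For a correlated input, the transmitted error stays within a much smaller range than the raw signal, which is the dynamic-range relief that predictive coding provides.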
Debbarma, Sanjoy; Saikia, Lalit Chandra; Sinha, Nidul
2014-03-01
The present work focuses on automatic generation control (AGC) of three unequal-area thermal systems considering reheat turbines and appropriate generation rate constraints (GRC). A fractional-order (FO) controller named the I(λ)D(µ) controller, based on the CRONE approximation, is proposed for the first time as an appropriate technique to solve the multi-area AGC problem in power systems. A recently developed metaheuristic algorithm known as the firefly algorithm (FA) is used for the simultaneous optimization of the gains and other parameters such as the order of the integrator (λ) and differentiator (μ) of the I(λ)D(µ) controller and the governor speed regulation parameters (R). The dynamic responses corresponding to the optimized I(λ)D(µ) controller gains, λ, μ, and R are compared with those of classical integer-order (IO) controllers such as I, PI and PID controllers. Simulation results show that the proposed I(λ)D(µ) controller provides more improved dynamic responses and outperforms the IO-based classical controllers. Further, sensitivity analysis confirms the robustness of the optimized I(λ)D(µ) controller to wide changes in system loading conditions and in the size and position of the SLP. The proposed controller is also found to perform well compared to IO-based controllers when SLPs take place simultaneously in any two areas or in all the areas. Robustness of the proposed I(λ)D(µ) controller is also tested against system parameter variations. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
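As a hedged note on notation, a common transfer-function form of an I(λ)D(µ) controller (not necessarily the exact structure used in the paper) makes explicit why the two fractional orders enter the optimization alongside the gains:

```latex
% Fractional-order integral-derivative controller: the integrator order \lambda
% and the differentiator order \mu are free tuning parameters in addition to
% the gains K_I and K_D (illustrative form).
C(s) \;=\; \frac{K_I}{s^{\lambda}} \;+\; K_D\, s^{\mu},
\qquad 0 < \lambda,\ \mu \le 1 .
```

In practice the fractional operators are realized over a finite frequency band, e.g. by a CRONE/Oustaloup-type rational approximation, which is the approximation the abstract refers to.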
Ceballos, Melisa Rodas; García-Tenorio, Rafael; Estela, José Manuel; Cerdà, Víctor; Ferrer, Laura
2017-12-01
Leached fractions of U and Th from different environmental solid matrices were evaluated by an automatic system enabling the on-line lixiviation and extraction/pre-concentration of these two elements prior to ICP-MS detection. UTEVA resin was used as the selective extraction material. Ten leached fractions, using artificial rainwater (pH 5.4) as the leaching agent, and a residual fraction were analyzed for each sample, allowing the study of the behavior of U and Th under dynamic lixiviation conditions. Multivariate techniques were employed for the efficient optimization of the independent variables that affect the lixiviation process. The system reached LODs of 0.1 and 0.7 ng kg⁻¹ of U and Th, respectively. The method was satisfactorily validated for three solid matrices by the analysis of a soil reference material (IAEA-375), a certified sediment reference material (BCR-320R) and a phosphogypsum reference material (MatControl CSN-CIEMAT 2008). Besides, environmental samples were analyzed, showing a similar behavior, i.e. the content of radionuclides decreases with the successive extractions. In all cases, the cumulative leached fractions of U and Th for the different solid matrices studied (soil, sediment and phosphogypsum) were extremely low, up to 0.05% and 0.005% for U and Th, respectively. However, a great variability was observed in terms of the mass concentration released, e.g. between 44 and 13,967 ng U kg⁻¹. Copyright © 2017 Elsevier B.V. All rights reserved.
Li, Pingjing; He, Man; Chen, Beibei; Hu, Bin
2015-10-09
A simple home-made automatic dynamic hollow fiber based liquid-liquid-liquid microextraction (AD-HF-LLLME) device was designed and constructed for the simultaneous extraction of organomercury and inorganic mercury species with the assistance of a programmable flow injection analyzer. With 18-crown-6 as the complexing reagent, mercury species including methyl-, ethyl-, phenyl- and inorganic mercury were extracted into the organic phase (chlorobenzene), and then back-extracted into the acceptor phase of 0.1% (m/v) 3-mercapto-1-propanesulfonic acid (MPS) aqueous solution. Compared with the automatic static (AS)-HF-LLLME system, the extraction equilibrium of the target mercury species was reached in a shorter time with higher extraction efficiency in the AD-HF-LLLME system. Based on it, a new method of AD-HF-LLLME coupled with large volume sample stacking (LVSS)-capillary electrophoresis (CE)/UV detection was developed for the simultaneous analysis of methyl-, phenyl- and inorganic mercury species in biological samples and environmental water. Under the optimized conditions, AD-HF-LLLME provided high enrichment factors (EFs) of 149-253-fold within a relatively short extraction equilibrium time (25 min) and good precision with RSDs between 3.8 and 8.1%. By combining AD-HF-LLLME with LVSS-CE/UV, EFs were magnified up to 2195-fold and the limits of detection (at S/N=3) for the target mercury species were improved to the sub-ppb level. Copyright © 2015 Elsevier B.V. All rights reserved.
Comparison of automatic control systems
NASA Technical Reports Server (NTRS)
Oppelt, W
1941-01-01
This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.
Dynamic load balancing algorithm for molecular dynamics based on Voronoi cells domain decompositions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fattebert, J.-L.; Richards, D.F.; Glosli, J.N.
2012-12-01
We present a new algorithm for automatic parallel load balancing in classical molecular dynamics. It assumes a spatial domain decomposition of particles into Voronoi cells. It is a gradient method which attempts to minimize a cost function by displacing Voronoi sites associated with each processor/sub-domain along steepest descent directions. Excellent load balance has been obtained for quasi-2D and 3D practical applications, with up to 440·10⁶ particles on 65,536 MPI tasks.
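The gradient idea above can be sketched as follows: particles are owned by the nearest Voronoi site (one site per task), the imbalance cost is the variance of per-site particle counts, and each site takes a small step along a finite-difference estimate of the steepest-descent direction. The cost function, step size, and serial finite-difference gradient are illustrative assumptions, not the paper's (parallel, analytic-gradient) implementation.

```python
# Hedged sketch: displace Voronoi sites to reduce a load-imbalance cost.
import numpy as np

def imbalance(sites, particles):
    owner = np.argmin(((particles[:, None, :] - sites[None, :, :]) ** 2).sum(-1), axis=1)
    loads = np.bincount(owner, minlength=len(sites))
    return loads.var()

def balance_step(sites, particles, eps=1e-2, lr=5e-3):
    grad = np.zeros_like(sites)
    base = imbalance(sites, particles)
    for i in range(sites.shape[0]):                 # finite-difference gradient per site coordinate
        for d in range(sites.shape[1]):
            trial = sites.copy(); trial[i, d] += eps
            grad[i, d] = (imbalance(trial, particles) - base) / eps
    return sites - lr * grad / (np.linalg.norm(grad) + 1e-12)

rng = np.random.default_rng(1)
particles = rng.random((2000, 3))                   # toy particle positions
sites = rng.random((8, 3))                          # one Voronoi site per "task"
for _ in range(20):
    sites = balance_step(sites, particles)
```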
An Analysis of Eruptions Detected by the LMSAL Eruption Patrol
NASA Astrophysics Data System (ADS)
Hurlburt, N. E.; Higgins, P. A.; Jaffey, S.
2014-12-01
Observations of the solar atmosphere reveal a wide range of real and apparent motions, from small-scale jets and spicules to global-scale coronal mass ejections. Identifying and characterizing these motions is essential to advancing our understanding of the drivers of space weather. Automated and visual identifications are used in identifying CMEs. To date, the precursors to these — eruptions near the solar surface — have been identified primarily by visual inspection. Here we report on an analysis of the eruptions detected by the Eruption Patrol, a data-mining module designed to automatically identify eruptions from data collected by the Solar Dynamics Observatory's Atmospheric Imaging Assembly (SDO/AIA). We describe the module and use it both to explore relations with other solar events recorded in the Heliophysics Event Knowledgebase and to identify and access data collected by the Interface Region Imaging Spectrograph (IRIS) and the Solar Optical Telescope (SOT) on Hinode for further analysis.
Content-based analysis of Ki-67 stained meningioma specimens for automatic hot-spot selection.
Swiderska-Chadaj, Zaneta; Markiewicz, Tomasz; Grala, Bartlomiej; Lorent, Malgorzata
2016-10-07
Hot-spot based examination of immunohistochemically stained histological specimens is one of the most important procedures in pathomorphological practice. The development of image acquisition equipment and computational units allows for the automation of this process. Moreover, many possible technical problems occur in everyday histological material, which increases the complexity of the problem. Thus, a full context-based analysis of histological specimens is also needed in the quantification of immunohistochemically stained specimens. One of the most important reactions is the Ki-67 proliferation marker in meningiomas, the most frequent intracranial tumour. The aim of our study is to propose a context-based analysis of Ki-67 stained specimens of meningiomas for automatic selection of hot-spots. The proposed solution is based on textural analysis, mathematical morphology, feature ranking and classification, as well as on the proposed hot-spot gradual extinction algorithm, to allow for the proper detection of a set of hot-spot fields. The designed whole slide image processing scheme eliminates artifacts such as hemorrhages, folds or stained vessels from the region of interest. To validate the automatic results, a set of 104 meningioma specimens was selected and twenty hot-spots inside them were identified independently by two experts. The Spearman rho correlation coefficient was used to compare the results, which were also analyzed with the help of a Bland-Altman plot. The results show that most of the cases (84) were examined properly by the automatic method, with at most two fields of view affected by a technical problem. Next, 13 cases had three such fields, and only seven specimens did not meet the requirement for automatic examination. Generally, the automatic system identifies hot-spot areas, especially their maximum points, better. The analysis confirms the very high concordance between the automatic Ki-67 examination and the experts' results, with a Spearman rho higher than 0.95. The proposed hot-spot selection algorithm, with an extended context-based analysis of whole slide images and the hot-spot gradual extinction algorithm, provides an efficient tool for simulating a manual examination. The presented results confirm that automatic examination of Ki-67 in meningiomas could be introduced in the near future.
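The "hot-spot gradual extinction" idea above can be read as iterative peak selection with soft suppression: repeatedly take the field of view with the highest positive-nucleus density, then attenuate (rather than zero) the density around it so that nearby, slightly weaker regions remain selectable. The attenuation profile and parameters below are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch: select hot-spot fields from a density map with gradual extinction.
import numpy as np

def select_hot_spots(density, n_spots=20, fov=50, damping=0.4):
    """density: 2D map of Ki-67-positive nucleus density; returns (row, col) list."""
    work = density.astype(float).copy()
    spots = []
    half = fov // 2
    for _ in range(n_spots):
        r, c = np.unravel_index(np.argmax(work), work.shape)
        spots.append((r, c))
        r0, r1 = max(0, r - half), min(work.shape[0], r + half + 1)
        c0, c1 = max(0, c - half), min(work.shape[1], c + half + 1)
        work[r0:r1, c0:c1] *= damping   # gradual extinction, not hard removal
    return spots
```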
Effects of 99mTc-TRODAT-1 drug template on image quantitative analysis
Yang, Bang-Hung; Chou, Yuan-Hwa; Wang, Shyh-Jen; Chen, Jyh-Cheng
2018-01-01
99mTc-TRODAT-1 is a type of drug that can bind to dopamine transporters in living organisms and is often used in SPECT imaging for observation of changes in the activity uptake of dopamine in the striatum. Therefore, it is currently widely used in studies on the clinical diagnosis of Parkinson's disease (PD) and movement-related disorders. In conventional 99mTc-TRODAT-1 SPECT image evaluation, visual inspection or manual selection of ROIs for semiquantitative analysis is mainly used to observe and evaluate the degree of striatal defects. However, these methods depend on the subjective opinions of observers, which leads to human error, and have shortcomings such as long duration, increased effort, and low reproducibility. To solve this problem, this study aimed to establish an automatic semiquantitative analytical method for 99mTc-TRODAT-1. This method combines three drug templates (one built-in SPECT template in the SPM software and two self-generated MRI-based and HMPAO-based TRODAT-1 templates) for the semiquantitative analysis of striatal phantom and clinical images. At the same time, the results of the automatic analysis with the three templates were compared with the results from a conventional manual analysis to examine the feasibility of automatic analysis and the effects of drug templates on the automatic semiquantitative results. After comparison, it was found that the MRI-based TRODAT-1 template generated from MRI images is the most suitable template for 99mTc-TRODAT-1 automatic semiquantitative analysis. PMID:29543874
Tavano, Alessandro; Pesarin, Anna; Murino, Vittorio; Cristani, Marco
2014-01-01
Individuals with Asperger syndrome/High Functioning Autism fail to spontaneously attribute mental states to the self and others, a life-long phenotypic characteristic known as mindblindness. We hypothesized that mindblindness would affect the dynamics of conversational interaction. Using generative models, in particular Gaussian mixture models and observed influence models, conversations were coded as interacting Markov processes, operating on novel speech/silence patterns, termed Steady Conversational Periods (SCPs). SCPs assume that whenever an agent's process changes state (e.g., from silence to speech), it causes a general transition of the entire conversational process, forcing inter-actant synchronization. SCPs fed into observed influence models, which captured the conversational dynamics of children and adolescents with Asperger syndrome/High Functioning Autism, and age-matched typically developing participants. Analyzing the parameters of the models by means of discriminative classifiers, the dialogs of patients were successfully distinguished from those of control participants. We conclude that meaning-free speech/silence sequences, reflecting inter-actant synchronization, at least partially encode typical and atypical conversational dynamics. This suggests a direct influence of theory of mind abilities onto basic speech initiative behavior.
NASA Astrophysics Data System (ADS)
Almesallmy, Mohammed
Methodologies are developed for the dynamic analysis of mechanical systems with emphasis on inertial propulsion systems. This work adopts the Lagrangian methodology, the most efficient classical computational technique, implemented in what we call the Equations of Motion Code (EOMC). The EOMC is applied to several simple dynamic mechanical systems for easier understanding of the method and to aid other investigators in developing equations of motion for any dynamic system. In addition, it is applied to a rigid multibody system, the Thomson IPS [Thomson 1986]. Furthermore, a simple symbolic algorithm is developed using Maple software, which can be used to convert any nonlinear nth-order ordinary differential equation (ODE) system into a 1st-order ODE system in a format ready to be used in Matlab software. As a side issue, but equally important, we have started corresponding with the U.S. Patent Office to persuade them that patent applications claiming gross linear motion based on inertial propulsion systems should be automatically rejected. The precedent is the rejection of patent applications involving perpetual motion machines.
Time-variant analysis of rotorcraft systems dynamics - An exploitation of vector processors
NASA Technical Reports Server (NTRS)
Amirouche, F. M. L.; Xie, M.; Shareef, N. H.
1993-01-01
In this paper a generalized algorithmic procedure is presented for handling constraints in mechanical transmissions. The latter are treated as multibody systems of interconnected rigid/flexible bodies. The constraint Jacobian matrices are generated automatically and suitably updated in time, depending on the geometrical and kinematical constraint conditions describing the interconnection between shafts or gears. The types of constraints are classified based on the interconnection of the bodies, assuming that one or more points of contact exist between them. The effects due to elastic deformation of the flexible bodies are included by allowing each body element to undergo small deformations. The procedure is based on recursively formulated Kane's dynamical equations of motion and the finite element method, including the concept of geometric stiffening effects. The method is implemented on an IBM-3090-600j vector processor with pipelining capabilities. A significant increase in the speed of execution is achieved by vectorizing the developed code in computationally intensive areas. An example consisting of two meshing disks rotating at high angular velocity is presented. Applications are intended for the study of the dynamic behavior of helicopter transmissions.
MOD-0A 200 kW wind turbine generator design and analysis report
NASA Astrophysics Data System (ADS)
Anderson, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-08-01
The design, analysis, and initial performance of the MOD-OA 200 kW wind turbine generator at Clayton, NM is documented. The MOD-OA was designed and built to obtain operation and performance data and experience in utility environments. The project requirements, approach, system description, design requirements, design, analysis, system tests, installation, safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the wind turbine are discussed. The design and analysis of the rotor, drive train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems are presented. The rotor includes the blades, hub, and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are discussed. Systems analyses on dynamic loads and fatigue are presented.
Headway Deviation Effects on Bus Passenger Loads : Analysis of Tri-Met's Archived AVL-APC Data
DOT National Transportation Integrated Search
2003-01-01
In this paper we empirically analyze the relationship between transit service headway deviations and passenger loads, using archived data from Tri-Met's automatic vehicle location and automatic passenger counter systems. The analysis employs twostage...
Automatic Topography Using High Precision Digital Moire Methods
NASA Astrophysics Data System (ADS)
Yatagai, T.; Idesawa, M.; Saito, S.
1983-07-01
Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.
Lacson, Ronilda C; Barzilay, Regina; Long, William J
2006-10-01
Spoken medical dialogue is a valuable source of information for patients and caregivers. This work presents a first step towards automatic analysis and summarization of spoken medical dialogue. We first abstract a dialogue into a sequence of semantic categories using linguistic and contextual features integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). We then describe and implement a summarizer that utilizes this automatically induced structure. Our evaluation results indicate that automatically generated summaries exhibit high resemblance to summaries written by humans. In addition, task-based evaluation shows that physicians can reasonably answer questions related to patient care by looking at the automatically generated summaries alone, in contrast to the physicians' performance when they were given summaries from a naïve summarizer (p<0.05). This work demonstrates the feasibility of automatically structuring and summarizing spoken medical dialogue.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
Dynamic load balancing of applications
Wheat, S.R.
1997-05-13
An application-level method for dynamically maintaining global load balance on a parallel computer, particularly on massively parallel MIMD computers is disclosed. Global load balancing is achieved by overlapping neighborhoods of processors, where each neighborhood performs local load balancing. The method supports a large class of finite element and finite difference based applications and provides an automatic element management system to which applications are easily integrated. 13 figs.
AMBULATORY BLOOD PRESSURE MONITORING: THE NEED OF 7-DAY RECORD
HALBERG, F.; KATINAS, G.; CORNÉLISSEN, G.; SCHWARTZKOPFF, O.; FIŠER, B.; SIEGELOVÁ, J.; DUŠEK, J.; JANČÍK, J.
2008-01-01
The need for systematic around-the-clock self-measurements of blood pressure (BP) and heart rate (HR), or preferably for automatic monitoring as the need arises and can be met by inexpensive tools, is illustrated in two case reports. Miniaturized unobtrusive, as yet unavailable instrumentation for the automatic measurement of BP and HR should be a high priority for both government and industry. Automatic ambulatorily functioning monitors already represent great progress, enabling us to introduce the concept of eventually continuous or, as yet, intermittent home ABPM. On BP and HR records, gliding spectra aligned with global spectra visualize the changing dynamics involved in health and disease, and can be part of an eventually automated system of therapy adjusted to the ever-present variability of BP. In the interim, with tools already available, chronomics on self- or automatic measurements can be considered, with analyses provided by the Halberg Chronobiology Center, as an alternative to “flying blind”, as an editor put it. Chronomics assessing variability has to be considered. PMID:19018289
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.
Precise analysis of both (S)TEM images and video is a time and labor intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
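The statistics step described above might be prototyped as below: read the per-frame numbers from an existing first-pass encoder log and flag frames whose chosen statistic jumps. Because the field names and log layout vary between encoder builds, the parser simply collects every numeric "key:value" token on each line, and the robust z-score event rule is an illustrative assumption, not the authors' detector.

```python
# Hedged sketch: parse a pass-1 encoder log and flag frames with outlying statistics.
import re
import numpy as np

def read_pass_log(path):
    """Return one dict of numeric statistics per line/frame of the pass-1 log."""
    frames = []
    with open(path) as fh:
        for line in fh:
            stats = {k: float(v) for k, v in re.findall(r"([A-Za-z][\w-]*):([-\d.]+)", line)}
            if stats:
                frames.append(stats)
    return frames

def flag_events(frames, key, z=3.0):
    """Indices of frames whose statistic `key` deviates strongly from its median."""
    vals = np.array([f.get(key, np.nan) for f in frames], dtype=float)
    med = np.nanmedian(vals)
    mad = np.nanmedian(np.abs(vals - med)) + 1e-12
    return np.where(np.abs(vals - med) > z * 1.4826 * mad)[0]
```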
An engineering approach to automatic programming
NASA Technical Reports Server (NTRS)
Rubin, Stuart H.
1990-01-01
An exploratory study was undertaken of the automatic generation and optimization of symbolic programs using DECOM, a prototypical requirement specification model implemented in pure LISP. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes, since data and program are represented in a uniform format.
A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures
NASA Astrophysics Data System (ADS)
Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.
2017-10-01
An auto-installing tool on a USB drive allows for quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.
Methods for automatically analyzing humpback song units.
Rickwood, Peter; Taylor, Andrew
2008-03-01
This paper presents mathematical techniques for automatically extracting and analyzing bioacoustic signals. Automatic techniques are described for the isolation of target signals from background noise, the extraction of features from target signals and the unsupervised classification (clustering) of the target signals based on these features. The only user-provided input, other than raw sound, is an initial set of signal processing and control parameters. Of particular note is that the number of signal categories is determined automatically. The techniques, applied to hydrophone recordings of humpback whales (Megaptera novaeangliae), produce promising initial results, suggesting that they may be of use in automated analysis of not only humpbacks, but possibly also other bioacoustic settings where automated analysis is desirable.
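One standard way to realize the "number of categories determined automatically" behavior described above is to fit Gaussian mixtures of increasing size and keep the one with the best information criterion. The sketch below does this on a few coarse spectral features; the feature set and the BIC-based model selection are illustrative assumptions, not the authors' method.

```python
# Hedged sketch: cluster isolated song units with a GMM whose size is chosen by BIC.
import numpy as np
from scipy.signal import spectrogram
from sklearn.mixture import GaussianMixture

def unit_features(unit, fs):
    """Coarse features of one isolated song unit: duration, spectral centroid, bandwidth."""
    f, t, S = spectrogram(unit, fs=fs, nperseg=512)
    p = S.mean(axis=1); p /= p.sum() + 1e-12
    centroid = (f * p).sum()
    bandwidth = np.sqrt(((f - centroid) ** 2 * p).sum())
    return [len(unit) / fs, centroid, bandwidth]

def cluster_units(units, fs, max_k=12):
    X = np.array([unit_features(u, fs) for u in units])
    models = [GaussianMixture(k, random_state=0).fit(X) for k in range(1, max_k + 1)]
    best = min(models, key=lambda m: m.bic(X))   # number of signal categories chosen by BIC
    return best.n_components, best.predict(X)
```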
ERIC Educational Resources Information Center
Cornell Univ., Ithaca, NY. Dept. of Computer Science.
Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper: "Content Analysis in Information Retrieval" by S. F. Weiss presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 3 2010-04-01 2010-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 3 2014-04-01 2014-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 3 2013-04-01 2013-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 3 2012-04-01 2012-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
19 CFR 360.103 - Automatic issuance of import licenses.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 3 2011-04-01 2011-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...
Automatic Thesaurus Generation for an Electronic Community System.
ERIC Educational Resources Information Center
Chen, Hsinchun; And Others
1995-01-01
This research reports an algorithmic approach to the automatic generation of thesauri for electronic community systems. The techniques used include term filtering, automatic indexing, and cluster analysis. The Worm Community System, used by molecular biologists studying the nematode worm C. elegans, was used as the testbed for this research.…
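A minimal sketch of the general idea (term filtering, automatic indexing, and term-similarity analysis on a term-document matrix) is given below; it assumes a plain-text corpus and is not the algorithm used for the Worm Community System.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def build_thesaurus(documents, top_n=5):
    vec = TfidfVectorizer(stop_words="english", min_df=2)  # term filtering + indexing
    term_doc = vec.fit_transform(documents).T              # terms x documents
    sim = cosine_similarity(term_doc)                      # term-term similarity
    terms = vec.get_feature_names_out()
    thesaurus = {}
    for i, term in enumerate(terms):
        related = sim[i].argsort()[::-1][1:top_n + 1]      # skip the term itself
        thesaurus[term] = [terms[j] for j in related]
    return thesaurus
```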
Automatic Error Analysis Using Intervals
ERIC Educational Resources Information Center
Rothwell, E. J.; Cloud, M. J.
2012-01-01
A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
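The following toy interval type illustrates the kind of automatic error propagation described above; INTLAB itself is a MATLAB toolbox, so this Python sketch is only an analogy, not the paper's tool.

```python
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        products = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo:.6g}, {self.hi:.6g}]"

# Example: propagate measurement uncertainty through z = x*y + x
x = Interval(1.99, 2.01)   # x = 2.00 +/- 0.01
y = Interval(2.95, 3.05)   # y = 3.00 +/- 0.05
print(x * y + x)           # enclosing interval for the result
```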
NASA Astrophysics Data System (ADS)
Andriushin, A. V.; Zverkov, V. P.; Kuzishchin, V. F.; Ryzhkov, O. S.; Sabanin, V. R.
2017-11-01
The research and tuning results are presented for a "Do itself" automatic control system (ACS) of steam pressure in the main steam collector with high-speed feedback on steam pressure in the turbine regulating stage. The ACS tuning is performed on a simulation model of the controlled object, developed for this purpose, with load-dependent static and dynamic characteristics and a nonlinear control algorithm with pulse control of the turbine main servomotor. A method for tuning the nonlinear ACS with a numerical multiparametric optimization algorithm and a procedure for separate dynamic adjustment of the control devices in a two-loop ACS are proposed and implemented. It is shown that the nonlinear ACS tuned with the proposed method, using constant regulator parameters, ensures reliable, high-quality operation without oscillations in the transient processes over the operating range of turbine loads.
Dynamic Human Body Modeling Using a Single RGB Camera.
Zhu, Haiyu; Yu, Yao; Zhou, Yu; Du, Sidan
2016-03-18
In this paper, we present a novel automatic pipeline to build personalized parametric models of dynamic people using a single RGB camera. Compared to previous approaches that use monocular RGB images, our system can model a 3D human body automatically and incrementally, taking advantage of human motion. Based on coarse 2D and 3D poses estimated from image sequences, we first perform a kinematic classification of human body parts to refine the poses and obtain reconstructed body parts. Next, a personalized parametric human model is generated by driving a general template to fit the body parts and calculating the non-rigid deformation. Experimental results show that our shape estimation method achieves comparable accuracy with reconstructed models using depth cameras, yet requires neither user interaction nor any dedicated devices, leading to the feasibility of using this method on widely available smart phones.
Dynamic Human Body Modeling Using a Single RGB Camera
Zhu, Haiyu; Yu, Yao; Zhou, Yu; Du, Sidan
2016-01-01
In this paper, we present a novel automatic pipeline to build personalized parametric models of dynamic people using a single RGB camera. Compared to previous approaches that use monocular RGB images, our system can model a 3D human body automatically and incrementally, taking advantage of human motion. Based on coarse 2D and 3D poses estimated from image sequences, we first perform a kinematic classification of human body parts to refine the poses and obtain reconstructed body parts. Next, a personalized parametric human model is generated by driving a general template to fit the body parts and calculating the non-rigid deformation. Experimental results show that our shape estimation method achieves comparable accuracy with reconstructed models using depth cameras, yet requires neither user interaction nor any dedicated devices, leading to the feasibility of using this method on widely available smart phones. PMID:26999159
A Module for Automatic Dock and Detumble (MADD) for orbital rescue operations
NASA Technical Reports Server (NTRS)
Snow, W. R.; Kunciw, B. G.; Kaplan, M. H.
1973-01-01
The module for automatic dock and detumble (MADD) is an automated device for bringing a passive, tumbling space base under control in an orbital rescue situation. The conceptual design of such a device resulted from a consideration of tumbling motion analyses and mission constraints. Specific topics of investigation include orbit and attitude dynamics and detumble profiles. Position and attitude control systems for the various phases of operation were developed. Dynamic motion of a passive vehicle with MADD attached is considered as an example application and to determine control requirements. Since time is a critical factor in rescue operations, it is essential to execute the detumbling maneuver in a minimum of time. Optimization of the MADD thrusting sequence has also been investigated. Results indicate the control torque must be directed opposite to the angular momentum vector for the assumption used here.
Automatic guidance control of an articulated all-wheel-steered vehicle
NASA Astrophysics Data System (ADS)
Kim, Young Chol; Yun, Kyong-Han; Min, Kyung-Deuk
2014-04-01
This paper presents automatic guidance control of a single-articulated all-wheel-steered vehicle being developed by the Korea Railroad Research Institute. The vehicle has an independent drive motor on each wheel except for the front axle. The guidance controller is designed so that the vehicle follows the given reference path within permissible lateral deviations. We use a three-input/three-output linearised model derived from the nonlinear dynamic model of the vehicle. For the purpose of simplifying the controller and making it tunable, we consider a decentralised control configuration. We first design a second-order decoupling compensator for the two-input/two-output system that is strongly coupled and then design a first-order controller for each decoupled feedback loop by using the characteristic ratio assignment method. The simulation results for the nonlinear dynamic model indicate that the proposed control configuration successfully achieves the design objectives.
Initial dynamics of the EKG during an electrical defibrillation of the heart
NASA Technical Reports Server (NTRS)
Bikov, I. I.; Chebotarov, Y. P.; Nikolaev, V. G.
1980-01-01
In tests on 11 mature dogs, immobilized by means of an automatic blocking and synchronization system, artefact-free EKGs were obtained beginning 0.04-0.06 sec after passage of a defibrillating current. Different modes of onset of fibrillation were noted when the defibrillating stimulus was applied in the early phase of the cardiac cycle. A swinging phenomenon of fibrillation with increasing amplitude was noted for 0.4-1.5 sec after delivery of a subthreshold stimulus. Conditions for a positive outcome of repeated defibrillation were found, and a relationship was noted between the configuration of the exciting process with respect to the lines of force of the defibrillating current and the defibrillation threshold. It was shown that the initial EKG dynamics after defibrillation are based on a gradual shift of the pacemaker from the ventricular myocardium to the sinus node, through phases of atrioventricular and atrial automatism.
Automatic Parametrization of Somatosensory Evoked Potentials With Chirp Modeling.
Vayrynen, Eero; Noponen, Kai; Vipin, Ashwati; Thow, X Y; Al-Nashash, Hasan; Kortelainen, Jukka; All, Angelo
2016-09-01
In this paper, an approach using polynomial phase chirp signals to model somatosensory evoked potentials (SEPs) is proposed. SEP waveforms are assumed as impulses undergoing group velocity dispersion while propagating along a multipath neural connection. Mathematical analysis of pulse dispersion resulting in chirp signals is performed. An automatic parameterization of SEPs is proposed using chirp models. A Particle Swarm Optimization algorithm is used to optimize the model parameters. Features describing the latencies and amplitudes of SEPs are automatically derived. A rat model is then used to evaluate the automatic parameterization of SEPs in two experimental cases, i.e., anesthesia level and spinal cord injury (SCI). Experimental results show that chirp-based model parameters and the derived SEP features are significant in describing both anesthesia level and SCI changes. The proposed automatic optimization based approach for extracting chirp parameters offers potential for detailed SEP analysis in future studies. The method implementation in Matlab technical computing language is provided online.
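A hedged sketch of the fitting idea follows: a polynomial-phase chirp with a Gaussian envelope is fitted to an evoked-potential trace by global optimization. The model form, parameter bounds, and the use of scipy's differential evolution in place of Particle Swarm Optimization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import differential_evolution

def chirp_model(t, params):
    a, phase0, b1, b2, tau, width = params
    phase = phase0 + b1 * (t - tau) + b2 * (t - tau) ** 2     # polynomial phase
    envelope = np.exp(-((t - tau) ** 2) / (2 * width ** 2))   # localized pulse
    return a * envelope * np.cos(phase)

def fit_sep(t, sep_trace):
    def cost(params):
        return np.mean((sep_trace - chirp_model(t, params)) ** 2)
    bounds = [(-5, 5), (-np.pi, np.pi), (0, 500), (-5000, 5000), (0, t[-1]), (1e-3, 0.1)]
    result = differential_evolution(cost, bounds, seed=0, maxiter=200)
    return result.x   # amplitude, phase offset, chirp rate, curvature, latency, width
```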
Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster
NASA Astrophysics Data System (ADS)
Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song
2015-02-01
The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the operation of the automatic slack adjuster. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10⁵ cycles under the effect of braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters are carried out on a fatigue test bench to verify the conclusion of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10⁵ cycles, which agrees with the results of the finite element analysis performed with the ANSYS Workbench FEA system.
Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.
Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2010-11-01
Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
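The two agreement measures quoted above can be computed as in the following minimal sketch (binary-mask Dice index and a one-sided point-to-curve distance); array names are illustrative.

```python
import numpy as np

def dice_index(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def point_to_curve_error(contour_a, contour_b):
    """Mean distance from each point of contour_a to the closest point of contour_b."""
    dists = np.linalg.norm(contour_a[:, None, :] - contour_b[None, :, :], axis=2)
    return dists.min(axis=1).mean()
```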
NASA Astrophysics Data System (ADS)
Giannini, Valentina; Vignati, Anna; Mazzetti, Simone; De Luca, Massimo; Bracco, Christian; Stasi, Michele; Russo, Filippo; Armando, Enrico; Regge, Daniele
2013-02-01
Prostate specific antigen (PSA)-based screening reduces the rate of death from prostate cancer (PCa) by 31%, but this benefit is associated with a high risk of overdiagnosis and overtreatment. As prostate transrectal ultrasound-guided biopsy, the standard procedure for prostate histological sampling, has a sensitivity of 77% with a considerable false-negative rate, more accurate methods need to be found to detect or rule out significant disease. Prostate magnetic resonance imaging has the potential to improve the specificity of PSA-based screening scenarios as a non-invasive detection tool, in particular exploiting the combination of anatomical and functional information in a multiparametric framework. The purpose of this study was to describe a computer aided diagnosis (CAD) method that automatically produces a malignancy likelihood map by combining information from dynamic contrast enhanced MR images and diffusion weighted images. The CAD system consists of multiple sequential stages, from a preliminary registration of images of different sequences, in order to correct for susceptibility deformation and/or movement artifacts, to a Bayesian classifier, which fuses all the extracted features into a probability map. The promising results (AUROC=0.87) should be validated on a larger dataset, but they suggest that voxel-wise discrimination between benign and malignant tissues is feasible with good performance. This method can help improve the diagnostic accuracy of the radiologist, reduce reader variability and speed up the reading time, automatically highlighting probable cancer-suspicious regions.
Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data
Wang, Yinxue; Shi, Guilai; Miller, David J.; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang
2017-01-01
Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuit. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large dataset via calcium imaging and the availability of sophisticated analytical tools for decoding the astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and low signal to noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP. PMID:28769780
Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data.
Wang, Yinxue; Shi, Guilai; Miller, David J; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang
2017-01-01
Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuit. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type specific genetically encoded Ca 2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large dataset via calcium imaging and the availability of sophisticated analytical tools for decoding the astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca 2+ signaling in astrocytes and low signal to noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP.
Arnemann, Philip-Helge; Hessler, Michael; Kampmeier, Tim; Morelli, Andrea; Van Aken, Hugo Karel; Westphal, Martin; Rehberg, Sebastian; Ertmer, Christian
2016-12-01
Life-threatening diseases of critically ill patients are known to derange microcirculation. Automatic analysis of microcirculation would provide a bedside diagnostic tool for microcirculatory disorders and allow immediate therapeutic decisions based upon microcirculation analysis. After induction of general anaesthesia and instrumentation for haemodynamic monitoring, haemorrhagic shock was induced in ten female sheep by stepwise blood withdrawal of 3 × 10 mL per kilogram body weight. Before and after the induction of haemorrhagic shock, haemodynamic variables, samples for blood gas analysis, and videos of conjunctival microcirculation were obtained by incident dark field illumination microscopy. Microcirculatory videos were analysed (1) manually with AVA software version 3.2 by an experienced user and (2) automatically by AVA software version 4.2 for total vessel density (TVD), perfused vessel density (PVD) and proportion of perfused vessels (PPV). Correlation between the two analysis methods was examined by intraclass correlation coefficient and Bland-Altman analysis. The induction of haemorrhagic shock decreased the mean arterial pressure (from 87 ± 11 to 40 ± 7 mmHg; p < 0.001); stroke volume index (from 38 ± 14 to 20 ± 5 mL·m⁻²; p = 0.001) and cardiac index (from 2.9 ± 0.9 to 1.8 ± 0.5 L·min⁻¹·m⁻²; p < 0.001) and increased the heart rate (from 72 ± 9 to 87 ± 11 bpm; p < 0.001) and lactate concentration (from 0.9 ± 0.3 to 2.0 ± 0.6 mmol·L⁻¹; p = 0.001). Manual analysis showed no change in TVD (17.8 ± 4.2 to 17.8 ± 3.8 mm·mm⁻²; p = 0.993), whereas PVD (from 15.6 ± 4.6 to 11.5 ± 6.5 mm·mm⁻²; p = 0.041) and PPV (from 85.9 ± 11.8 to 62.7 ± 29.6%; p = 0.017) decreased significantly. Automatic analysis was not able to identify these changes. Correlation analysis showed a poor correlation between the analysis methods and a wide spread of values in Bland-Altman analysis. As characteristic changes in microcirculation during ovine haemorrhagic shock were not detected by automatic analysis and correlation between automatic and manual analyses (current gold standard) was poor, the use of the investigated software for automatic analysis of microcirculation cannot be recommended in its current version, at least in the investigated model. Further improvements in automatic vessel detection are needed before its routine use.
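The Bland-Altman statistics reported above (mean bias and 95% limits of agreement) can be computed for paired manual/automatic measurements as in this minimal sketch; variable names are illustrative.

```python
import numpy as np

def bland_altman(manual, automatic):
    manual, automatic = np.asarray(manual, float), np.asarray(automatic, float)
    diff = automatic - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    # mean bias and 95% limits of agreement
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```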
Effectiveness of an automatic tracking software in underwater motion analysis.
Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia
2013-01-01
Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% less manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis. Key points: The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human interventions and supervision, thus allowing short processing time. When tracking underwater movements, the degree of automation of the tracking procedure is influenced by the capability of the algorithm to overcome difficulties linked to the small target size, the low image quality and the presence of background clutters. The newly developed feature-tracking algorithm has shown a good automatic tracking effectiveness in underwater motion analysis with significantly smaller percentage of required manual interventions when compared to a commercial software.
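The underlying Kanade-Lucas-Tomasi idea can be sketched with OpenCV's pyramidal Lucas-Kanade tracker as below; this is not the DVP software, and the window size and pyramid depth are assumed values.

```python
import cv2
import numpy as np

def track_markers(frames, initial_points):
    """frames: list of grayscale images; initial_points: Nx2 array of marker centers."""
    points = initial_points.reshape(-1, 1, 2).astype(np.float32)
    trajectory = [points.reshape(-1, 2).copy()]
    for prev, curr in zip(frames[:-1], frames[1:]):
        points, status, _err = cv2.calcOpticalFlowPyrLK(
            prev, curr, points, None, winSize=(21, 21), maxLevel=3)
        trajectory.append(points.reshape(-1, 2).copy())
    return trajectory   # per-frame marker coordinates, to be verified by an operator
```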
Automatic emotional expression analysis from eye area
NASA Astrophysics Data System (ADS)
Akkoç, Betül; Arslan, Ahmet
2015-02-01
Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, six universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
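A minimal sketch of the described pipeline (discrete wavelet features from the eye region feeding a neural-network classifier) follows; the wavelet family, decomposition level, and network size are assumptions, not the study's settings, and all eye images are assumed to share one size.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def eye_features(eye_image, wavelet="db2", level=2):
    """Flatten the 2-D wavelet decomposition of a grayscale eye-region image."""
    coeffs = pywt.wavedec2(eye_image, wavelet, level=level)
    parts = [coeffs[0].ravel()] + [d.ravel() for detail in coeffs[1:] for d in detail]
    return np.concatenate(parts)

def train_classifier(eye_images, labels):
    X = np.array([eye_features(img) for img in eye_images])
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
    return clf.fit(X, labels)
```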
Towards automatic music transcription: note extraction based on independent subspace analysis
NASA Astrophysics Data System (ADS)
Wellhausen, Jens; Hoynck, Michael
2005-01-01
Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper, an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music under examination.
Towards automatic music transcription: note extraction based on independent subspace analysis
NASA Astrophysics Data System (ADS)
Wellhausen, Jens; Höynck, Michael
2004-12-01
Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper, an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music under examination.
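A hedged sketch of the note-extraction idea follows: the magnitude spectrogram is decomposed into statistically independent components whose activations indicate sounding notes. The STFT settings and the use of FastICA are illustrative; this is not the authors' transcription system.

```python
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import FastICA

def extract_note_components(audio, fs, n_components=8):
    _f, _t, spec = stft(audio, fs=fs, nperseg=2048)
    magnitude = np.abs(spec)                        # frequency x time
    ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
    activations = ica.fit_transform(magnitude.T)    # time x components (note activity)
    spectra = ica.mixing_.T                         # per-component spectra
    return spectra, activations
```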
A hierarchical structure for automatic meshing and adaptive FEM analysis
NASA Technical Reports Server (NTRS)
Kela, Ajay; Saxena, Mukul; Perucchio, Renato
1987-01-01
A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Garvey, T. D.; Weyl, S. A.; Wolf, H. C.
1975-01-01
An interactive scene interpretation system (ISIS) was developed as a tool for constructing and experimenting with man-machine and automatic scene analysis methods tailored for particular image domains. A recently developed region analysis subsystem based on the paradigm of Brice and Fennema is described. Using this subsystem a series of experiments was conducted to determine good criteria for initially partitioning a scene into atomic regions and for merging these regions into a final partition of the scene along object boundaries. Semantic (problem-dependent) knowledge is essential for complete, correct partitions of complex real-world scenes. An interactive approach to semantic scene segmentation was developed and demonstrated on both landscape and indoor scenes. This approach provides a reasonable methodology for segmenting scenes that cannot be processed completely automatically, and is a promising basis for a future automatic system. A program is described that can automatically generate strategies for finding specific objects in a scene based on manually designated pictorial examples.
Design and analysis of the Gemini chain system in dual clutch transmission of automobile
NASA Astrophysics Data System (ADS)
Cheng, Yabing; Guo, Haitao; Fu, Zhenming; Wan, Nen; Li, Lei; Wang, Yang
2015-01-01
Chain drive systems are widely used under conditions of high speed, overload, variable speed and variable load. Many studies have focused on the meshing theory and wear characteristics of chain drive systems, but research on the system design, analysis and noise characteristics of such systems remains limited. The system design and noise characteristics are studied for a new type of Gemini chain in a dual-clutch automatic transmission. Based on the meshing theory of silent chains, the design parameters of the Gemini chain system are calculated and the mathematical models and dynamic analysis models of the Gemini chain system are established. The dynamic characteristics of the Gemini chain system are simulated, and the contact forces between plate and pin and between plate and sprockets, the chain tension forces, the transmission error and the stress of plates and pins are analyzed. According to the simulation results for the Gemini chain system, a noise experiment on the system is carried out. The noise values are tested at different speeds and loads and the spectral characteristics are analyzed. The simulation and experimental results show that the contact forces between plate and pin and between plate and sprockets are smaller than the allowable stress values, the chain tension force is less than the ultimate tension, and the transmission error is limited to 1.2%. The noise values meet the requirements of industrial design, and it is proved that the design and analysis method for the Gemini chain system is scientific and feasible. A design and test framework is built, from analysis to testing of the Gemini chain system. This research provides corresponding theoretical guidance for the design, dynamic characteristics and noise characteristics of chain drive systems.
Gerth, Sabrina; Klassert, Annegret; Dolk, Thomas; Fliesser, Michael; Fischer, Martin H; Nottbusch, Guido; Festman, Julia
2016-01-01
Due to their multifunctionality, tablets offer tremendous advantages for research on handwriting dynamics or for interactive use of learning apps in schools. Further, the widespread use of tablet computers has had a great impact on handwriting in the current generation. But, is it advisable to teach how to write and to assess handwriting in pre- and primary schoolchildren on tablets rather than on paper? Since handwriting is not automatized before the age of 10 years, children's handwriting movements require graphomotor and visual feedback as well as permanent control of movement execution during handwriting. Modifications in writing conditions, for instance the smoother writing surface of a tablet, might influence handwriting performance in general and in particular those of non-automatized beginning writers. In order to investigate how handwriting performance is affected by a difference in friction of the writing surface, we recruited three groups with varying levels of handwriting automaticity: 25 preschoolers, 27 second graders, and 25 adults. We administered three tasks measuring graphomotor abilities, visuomotor abilities, and handwriting performance (only second graders and adults). We evaluated two aspects of handwriting performance: the handwriting quality with a visual score and the handwriting dynamics using online handwriting measures [e.g., writing duration, writing velocity, strokes and number of inversions in velocity (NIV)]. In particular, NIVs which describe the number of velocity peaks during handwriting are directly related to the level of handwriting automaticity. In general, we found differences between writing on paper compared to the tablet. These differences were partly task-dependent. The comparison between tablet and paper revealed a faster writing velocity for all groups and all tasks on the tablet which indicates that all participants-even the experienced writers-were influenced by the lower friction of the tablet surface. Our results for the group-comparison show advancing levels in handwriting automaticity from preschoolers to second graders to adults, which confirms that our method depicts handwriting performance in groups with varying degrees of handwriting automaticity. We conclude that the smoother tablet surface requires additional control of handwriting movements and therefore might present an additional challenge for learners of handwriting.
Gerth, Sabrina; Klassert, Annegret; Dolk, Thomas; Fliesser, Michael; Fischer, Martin H.; Nottbusch, Guido; Festman, Julia
2016-01-01
Due to their multifunctionality, tablets offer tremendous advantages for research on handwriting dynamics or for interactive use of learning apps in schools. Further, the widespread use of tablet computers has had a great impact on handwriting in the current generation. But, is it advisable to teach how to write and to assess handwriting in pre- and primary schoolchildren on tablets rather than on paper? Since handwriting is not automatized before the age of 10 years, children's handwriting movements require graphomotor and visual feedback as well as permanent control of movement execution during handwriting. Modifications in writing conditions, for instance the smoother writing surface of a tablet, might influence handwriting performance in general and in particular those of non-automatized beginning writers. In order to investigate how handwriting performance is affected by a difference in friction of the writing surface, we recruited three groups with varying levels of handwriting automaticity: 25 preschoolers, 27 second graders, and 25 adults. We administered three tasks measuring graphomotor abilities, visuomotor abilities, and handwriting performance (only second graders and adults). We evaluated two aspects of handwriting performance: the handwriting quality with a visual score and the handwriting dynamics using online handwriting measures [e.g., writing duration, writing velocity, strokes and number of inversions in velocity (NIV)]. In particular, NIVs which describe the number of velocity peaks during handwriting are directly related to the level of handwriting automaticity. In general, we found differences between writing on paper compared to the tablet. These differences were partly task-dependent. The comparison between tablet and paper revealed a faster writing velocity for all groups and all tasks on the tablet which indicates that all participants—even the experienced writers—were influenced by the lower friction of the tablet surface. Our results for the group-comparison show advancing levels in handwriting automaticity from preschoolers to second graders to adults, which confirms that our method depicts handwriting performance in groups with varying degrees of handwriting automaticity. We conclude that the smoother tablet surface requires additional control of handwriting movements and therefore might present an additional challenge for learners of handwriting. PMID:27672372
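The NIV measure used in both records above (the number of velocity peaks per stroke) can be sketched from sampled pen coordinates as follows; the sampling and differentiation scheme is an assumption, not the study's software.

```python
import numpy as np

def niv(x, y, fs):
    """Number of inversions in velocity: x, y pen coordinates, fs sampling rate (Hz)."""
    vx, vy = np.gradient(x) * fs, np.gradient(y) * fs
    speed = np.hypot(vx, vy)
    slope_sign = np.sign(np.diff(speed))
    # a velocity peak is a change of the speed slope from positive to negative
    return int(np.sum((slope_sign[:-1] > 0) & (slope_sign[1:] < 0)))
```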
Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A; Gombos, Eva
2014-08-01
To accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast-enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise, and fitting algorithms. We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist's segmentation and the output of a commercial software, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared with the radiologist's segmentation and 82.1% accuracy and 100% sensitivity when compared with the CADstream output. The overlap of the algorithm output with the radiologist's segmentation and CADstream output, computed in terms of the DSC was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC = 0.95. The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. © 2013 Wiley Periodicals, Inc.
Klapsing, Philipp; Herrmann, Peter; Quintel, Michael; Moerer, Onnen
2017-12-01
Quantitative lung computed tomographic (CT) analysis yields objective data regarding lung aeration but is currently not used in clinical routine primarily because of the labor-intensive process of manual CT segmentation. Automatic lung segmentation could help to shorten processing times significantly. In this study, we assessed bias and precision of lung CT analysis using automatic segmentation compared with manual segmentation. In this monocentric clinical study, 10 mechanically ventilated patients with mild to moderate acute respiratory distress syndrome were included who had received lung CT scans at 5- and 45-mbar airway pressure during a prior study. Lung segmentations were performed both automatically using a computerized algorithm and manually. Automatic segmentation yielded similar lung volumes compared with manual segmentation with clinically minor differences both at 5 and 45 mbar. At 5 mbar, results were as follows: overdistended lung 49.58mL (manual, SD 77.37mL) and 50.41mL (automatic, SD 77.3mL), P=.028; normally aerated lung 2142.17mL (manual, SD 1131.48mL) and 2156.68mL (automatic, SD 1134.53mL), P = .1038; and poorly aerated lung 631.68mL (manual, SD 196.76mL) and 646.32mL (automatic, SD 169.63mL), P = .3794. At 45 mbar, values were as follows: overdistended lung 612.85mL (manual, SD 449.55mL) and 615.49mL (automatic, SD 451.03mL), P=.078; normally aerated lung 3890.12mL (manual, SD 1134.14mL) and 3907.65mL (automatic, SD 1133.62mL), P = .027; and poorly aerated lung 413.35mL (manual, SD 57.66mL) and 469.58mL (automatic, SD 70.14mL), P=.007. Bland-Altman analyses revealed the following mean biases and limits of agreement at 5 mbar for automatic vs manual segmentation: overdistended lung +0.848mL (±2.062mL), normally aerated +14.51mL (±49.71mL), and poorly aerated +14.64mL (±98.16mL). At 45 mbar, results were as follows: overdistended +2.639mL (±8.231mL), normally aerated 17.53mL (±41.41mL), and poorly aerated 56.23mL (±100.67mL). Automatic single CT image and whole lung segmentation were faster than manual segmentation (0.17 vs 125.35seconds [P<.0001] and 10.46 vs 7739.45seconds [P<.0001]). Automatic lung CT segmentation allows fast analysis of aerated lung regions. A reduction of processing times by more than 99% allows the use of quantitative CT at the bedside. Copyright © 2016 Elsevier Inc. All rights reserved.
Data dependent systems approach to modal analysis Part 1: Theory
NASA Astrophysics Data System (ADS)
Pandit, S. M.; Mehta, N. P.
1988-05-01
The concept of Data Dependent Systems (DDS) and its applicability in the context of modal vibration analysis is presented. The ability of the DDS difference equation models to provide a complete representation of a linear dynamic system from its sampled response data forms the basis of the approach. The models are decomposed into deterministic and stochastic components so that system characteristics are isolated from noise effects. The modelling strategy is outlined, and the method of analysis associated with modal parameter identification is described in detail. Advantages and special features of the DDS methodology are discussed. Since the correlated noise is appropriately and automatically modelled by the DDS, the modal parameters are shown to be estimated very accurately and hence no preprocessing of the data is needed. Complex mode shapes and non-classical damping are as easily analyzed as the classical normal mode analysis. These features are illustrated by using simulated data in this Part I and real data on a disc-brake rotor in Part II.
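A minimal sketch of the underlying idea, fitting a discrete autoregressive model to sampled response data and reading modal frequencies and damping ratios from its characteristic roots, is given below; it omits the stochastic (noise) part of the full DDS methodology.

```python
import numpy as np

def modal_parameters(signal, order, dt):
    """Least-squares AR(order) fit: x[n] = a1*x[n-1] + ... + ap*x[n-p]."""
    X = np.column_stack(
        [signal[order - k - 1:len(signal) - k - 1] for k in range(order)])
    y = signal[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    roots = np.roots(np.concatenate(([1.0], -a)))   # characteristic roots in z-plane
    s = np.log(roots) / dt                          # map to continuous-time poles
    freqs = np.abs(s.imag) / (2 * np.pi)            # natural frequencies (Hz)
    damping = -s.real / np.abs(s)                   # damping ratios
    return freqs, damping
```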
The N-BOD2 user's and programmer's manual
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1978-01-01
A general purpose digital computer program was developed and designed to aid in the analysis of spacecraft attitude dynamics. The program provides the analyst with the capability of automatically deriving and numerically solving the equations of motion of any system that can be modeled as a topological tree of coupled rigid bodies, flexible bodies, point masses, and symmetrical momentum wheels. Two modes of output are available. The composite system equations of motion may be outputted on a line printer in a symbolic form that may be easily translated into common vector-dyadic notation, or the composite system equations of motion may be solved numerically and any desirable set of system state variables outputted as a function of time.
Defects and anharmonicity induced electron spectra of YBa2Cu3O7-δ superconductors
NASA Astrophysics Data System (ADS)
Singh, Anu; Indu, B. D.
2018-05-01
The effects of defects and anharmonicities on the electron density of states (EDOS) have been studied in high-temperature superconductors (HTS) adopting the many-body quantum dynamical theory of electron Green's functions via a generalized Hamiltonian that includes the effects of electron-phonon interactions, anharmonicities and point impurities. The automatic emergence of pairons and the temperature dependence of the EDOS appear as special features of the theory. The results thus obtained and their numerical analysis for YBa2Cu3O7-δ superconductors clearly demonstrate that the presence of defects, anharmonicities and electron-phonon interactions modifies the behavior of the EDOS over a wide range of temperature.
Automated Extraction of Secondary Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.; Haimes, Robert
2005-01-01
The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and development of the major components used for air and space propulsion. To aid in the post-processing and analysis phase of CFD, many researchers now use automated feature extraction utilities. These tools can be used to detect the existence of such features as shocks, vortex cores and separation and re-attachment lines. The existence of secondary flow is another feature of significant importance to CFD engineers. Although the concept of secondary flow is relatively well understood, there is no commonly accepted mathematical definition for secondary flow. This paper will present a definition for secondary flow and one approach for automatically detecting and visualizing secondary flow.
Critical Speed of The Glass Glue Machine's Creep and Influence Factors Analysis
NASA Astrophysics Data System (ADS)
Yang, Jianxi; Huang, Jian; Wang, Liying; Shi, Jintai
When an automatic glass glue machine operates, two problems arise: vibration at start-up and stick-slip motion. These problems should be solved. To address them, a model of the glue machine for studying stick-slip is established. Based on the dynamic description of this model, a mathematical expression is presented. The critical creep speed expression is constructed with reference to existing research results, and a new conclusion is reached. The influence of stiffness, damping, mass, velocity, and the difference between the static and kinetic coefficients of friction is analyzed through Matlab simulation. The research shows that a reasonable choice of these parameters can mitigate the creep phenomenon. These results supply a theoretical basis for improving the machine's motion stability.
Statistical analysis and modeling of the temperature-dependent sleep behavior of drosophila
NASA Astrophysics Data System (ADS)
Shih, Chi-Tin; Lin, Hsuan-Wen; Chiang, Ann-Shyn
2011-01-01
The sleep behavior of drosophila is analyzed under different temperatures. The activity per minute of the flies is recorded automatically. Sleep for a fruit fly is defined as any period without activity lasting longer than 5 minutes. Several parameters such as total sleep time, circadian sleep profile, and quality of sleep are analyzed. The sleep behaviors are significantly different for flies at different temperatures. Interestingly, the durations of daytime sleep periods show a common scale-free power-law distribution. We propose a stochastic model to simulate the activities of the population of neurons which regulate the dynamics of the sleep-wake process, to explain the distribution of daytime sleep.
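A minimal sketch of the bout detection and a rough power-law check on the bout-duration distribution follows, assuming per-minute activity counts as described above; the log-log regression is only an informal estimate of the exponent.

```python
import numpy as np

def sleep_bouts(activity_per_minute, min_len=5):
    """Return durations (minutes) of inactive runs lasting at least min_len minutes."""
    bouts, run = [], 0
    for count in activity_per_minute:
        if count == 0:
            run += 1
        else:
            if run >= min_len:
                bouts.append(run)
            run = 0
    if run >= min_len:
        bouts.append(run)
    return np.array(bouts)

def loglog_slope(durations):
    values, counts = np.unique(durations, return_counts=True)
    slope, _intercept = np.polyfit(np.log(values), np.log(counts), 1)
    return slope   # rough power-law exponent, with the usual caveats
```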
The effect of waist twisting on walking speed of an amphibious salamander like robot
NASA Astrophysics Data System (ADS)
Yin, Xin-Yan; Jia, Li-Chao; Wang, Chen; Xie, Guang-Ming
2016-06-01
Amphibious salamanders often swing their waist to coordinate quadruped walking in order to improve their crawling speed. A robot with a swinging waist joint, like an amphibious salamander, is used to mimic this locomotion. A control method is designed to allow the robot to keep the rotational speed of its legs continuous and avoid impact between its legs and the ground. An analytical expression is established between the amplitude of the waist joint and the step length. Further, an optimal amplitude is obtained corresponding to the maximum stride. Simulation results based on automatic dynamic analysis of mechanical systems (ADAMS) and physical experiments verify the rationality and validity of this expression.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Y; Huang, H; Su, T
Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve a good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
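The ROC evaluation described above can be sketched as follows, assuming one heterogeneity index per scan and the PCI outcome as the binary label; variable names are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def evaluate_heterogeneity(heterogeneity_index, stenosis_label):
    auc = roc_auc_score(stenosis_label, heterogeneity_index)
    fpr, tpr, thresholds = roc_curve(stenosis_label, heterogeneity_index)
    best = np.argmax(tpr - fpr)                  # Youden index picks a cut-off
    sensitivity, specificity = tpr[best], 1 - fpr[best]
    return auc, thresholds[best], sensitivity, specificity
```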
Modeling multi-source flooding disaster and developing simulation framework in Delta
NASA Astrophysics Data System (ADS)
Liu, Y.; Cui, X.; Zhang, W.
2016-12-01
Most delta regions of the world are densely populated and economically advanced. However, due to the impact of multi-source flooding (upstream floods, rainstorm waterlogging, storm surge floods), delta regions are very vulnerable, and the research community attaches great importance to multi-source flooding disasters in these areas. The Pearl River Delta urban agglomeration in south China is selected as the research area. Based on analysis of the natural and environmental characteristics of the delta urban agglomeration (remote sensing data, land use data, topographic maps, etc.), hydrological monitoring data, research on the uneven distribution and process of regional rainfall, the relationship between the underlying surface and runoff parameters, and the effect of flood storage patterns, we use an automatic or semi-automatic method for dividing spatial units to reflect the runoff characteristics of the urban agglomeration, and develop a Multi-model Ensemble System for the changing environment, including an urban hydrologic model, a parallel 1D & 2D hydrodynamic model, a storm surge forecast model and other professional models. The system will support real-time setting of a variety of boundary conditions, fast real-time calculation, dynamic presentation of results, and powerful statistical analysis. The models can be optimized and improved by a variety of verification methods. This work was supported by the National Natural Science Foundation of China (41471427) and the Special Basic Research Key Fund for Central Public Scientific Research Institutes.
Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen
2017-02-21
To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All the work a user needs to do is to input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor and then yield the predicted results for the submitted query samples. All the aforementioned tedious jobs can be done automatically by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be directly run on Windows, Linux, and Unix.
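The five-step workflow can be illustrated with a generic scikit-learn sketch; the calls below are NOT the Pse-Analysis API, and the k-mer feature and SVM choices are assumptions made only to show the shape of such a pipeline.

```python
import itertools
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

def kmer_features(sequence, k=2, alphabet="ACGT"):
    """Normalized k-mer counts as a simple sequence feature vector (step 1)."""
    kmers = ["".join(p) for p in itertools.product(alphabet, repeat=k)]
    counts = {km: 0 for km in kmers}
    total = max(1, len(sequence) - k + 1)
    for i in range(total):
        kmer = sequence[i:i + k]
        if kmer in counts:
            counts[kmer] += 1
    return np.array([counts[km] / total for km in kmers])

def build_predictor(sequences, labels):
    X = np.array([kmer_features(s) for s in sequences])
    # steps 2-3: parameter selection and model training
    grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}, cv=5)
    grid.fit(X, labels)
    # steps 4-5: cross validation and evaluation of prediction quality
    scores = cross_val_score(grid.best_estimator_, X, labels, cv=5)
    return grid.best_estimator_, scores.mean()
```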
Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi
2016-01-01
Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessel. Histogram metrics are a recognized promising method of quantitative MR imaging that has been recently introduced in analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan–rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (K trans & Ve) in renal cell carcinoma, especially for Skewness and Kurtosis which showed lower intra-, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
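The histogram metrics compared above can be computed per tumor as in this minimal sketch, applied to a flattened pharmacokinetic parameter map such as Ktrans; the bin count and masking are assumptions.

```python
import numpy as np
from scipy import stats

def histogram_metrics(parameter_map):
    values = np.asarray(parameter_map, float).ravel()
    values = values[np.isfinite(values)]
    hist, edges = np.histogram(values, bins=64)
    mode = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])   # histogram mode
    return {
        "mean": values.mean(),
        "mode": mode,
        "skewness": stats.skew(values),
        "kurtosis": stats.kurtosis(values),
    }
```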
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for dynamic evolutionary process of consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in the GIS-based map. Firstly, the rationality is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it verifies that the model is a typical scale-free network and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the model construction and simulation of the system by means of combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theory bases and experience for supply chain analysis of auto companies.
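The network-level characteristics mentioned above (mean distance, mean clustering coefficient, degree distribution) can be computed from the simulated agent links as in this minimal networkx sketch; treating supplier-customer links as undirected is an assumption.

```python
import networkx as nx

def network_characteristics(edges):
    g = nx.Graph(edges)   # supplier-customer links as an undirected graph
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return {
        "mean_distance": nx.average_shortest_path_length(giant),
        "mean_clustering": nx.average_clustering(g),
        "degree_distribution": nx.degree_histogram(g),
    }
```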
NASA Astrophysics Data System (ADS)
Wahyuda; Santosa, Budi; Rusdiansyah, Ahmad
2018-04-01
Deregulation of the electricity market requires coordination between parties to synchronize optimization on the production side (power stations) and the transport side (transmission). The electricity supply chain presented in this article is designed to facilitate coordination between the parties. Generally, the production side is optimized with a price-based dynamic economic dispatch (PBDED) model, while the transmission side is optimized with a multi-echelon distribution model, and the two optimizations are done separately. This article proposes a joint model of PBDED and multi-echelon distribution for the combined optimization of production and transmission. This combined optimization is important because changes in electricity demand on the customer side cause changes on the production side that automatically also alter the transmission path. The transmission gives rise to two cost components: first, the cost of losses; second, the cost of using the transmission network (wheeling transactions). Costs due to losses are calculated based on ohmic losses, while the cost of using transmission lines is calculated with the MW-mile method. As a result, this method is able to provide the best allocation analysis for electricity transactions, as well as analysis of emission levels in power generation and costs. For the calculation of transmission costs, the Reverse MW-mile method produces a lower cost than the Absolute MW-mile method.
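The MW-mile allocation idea can be sketched as below: each transaction is charged in proportion to the MW flow it causes on every line, weighted by line length and a unit cost. This is a simplified form of the absolute variant; names and inputs are illustrative, not the article's model.

```python
def mw_mile_charge(flows_mw, line_lengths_km, cost_per_mw_km):
    """flows_mw[t][l]: flow caused by transaction t on line l (MW)."""
    charges = []
    for transaction_flows in flows_mw:
        charge = sum(abs(flow) * length * cost_per_mw_km
                     for flow, length in zip(transaction_flows, line_lengths_km))
        charges.append(charge)
    return charges   # wheeling charge per transaction
```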
DOE Office of Scientific and Technical Information (OSTI.GOV)
Page, R.; Jones, J.R.
1997-07-01
Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' Loss of offsite power fault transient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Hua, E-mail: huli@radonc.wustl.edu; Chen, Hsin
Purpose: For the first time, MRI-guided radiation therapy systems can acquire cine images to dynamically monitor in-treatment internal organ motion. However, the complex head and neck (H&N) structures and low-contrast/resolution of on-board cine MRI images make automatic motion tracking a very challenging task. In this study, the authors proposed an integrated model-driven method to automatically track the in-treatment motion of the H&N upper airway, a complex and highly deformable region wherein internal motion often occurs in an either voluntary or involuntary manner, from cine MRI images for the analysis of H&N motion patterns. Methods: Considering the complex H&N structures and ensuring automatic and robust upper airway motion tracking, the authors firstly built a set of linked statistical shapes (including face, face-jaw, and face-jaw-palate) using principal component analysis from clinically approved contours delineated on a set of training data. The linked statistical shapes integrate explicit landmarks and implicit shape representation. Then, a hierarchical model-fitting algorithm was developed to align the linked shapes on the first image frame of a to-be-tracked cine sequence and to localize the upper airway region. Finally, a multifeature level set contour propagation scheme was performed to identify the upper airway shape change, frame-by-frame, on the entire image sequence. The multifeature fitting energy, including the information of intensity variations, edge saliency, curve geometry, and temporal shape continuity, was minimized to capture the details of moving airway boundaries. Sagittal cine MR image sequences acquired from three H&N cancer patients were utilized to demonstrate the performance of the proposed motion tracking method. Results: The tracking accuracy was validated by comparing the results to the average of two manual delineations in 50 randomly selected cine image frames from each patient. The resulting average dice similarity coefficient (93.28% ± 1.46%) and margin error (0.49 ± 0.12 mm) showed good agreement between the automatic and manual results. The comparison with three other deformable model-based segmentation methods illustrated the superior shape tracking performance of the proposed method. Large interpatient variations of swallowing frequency, swallowing duration, and upper airway cross-sectional area were observed from the testing cine image sequences. Conclusions: The proposed motion tracking method can provide accurate upper airway motion tracking results, and enable automatic and quantitative identification and analysis of in-treatment H&N upper airway motion. By integrating explicit and implicit linked-shape representations within a hierarchical model-fitting process, the proposed tracking method can process complex H&N structures and low-contrast/resolution cine MRI images. Future research will focus on the improvement of method reliability, patient motion pattern analysis for providing more information on patient-specific prediction of structure displacements, and motion effects on dosimetry for better H&N motion management in radiation therapy.
Li, Hua; Chen, Hsin-Chen; Dolly, Steven; Li, Harold; Fischer-Valuck, Benjamin; Victoria, James; Dempsey, James; Ruan, Su; Anastasio, Mark; Mazur, Thomas; Gach, Michael; Kashani, Rojano; Green, Olga; Rodriguez, Vivian; Gay, Hiram; Thorstad, Wade; Mutic, Sasa
2016-08-01
For the first time, MRI-guided radiation therapy systems can acquire cine images to dynamically monitor in-treatment internal organ motion. However, the complex head and neck (H&N) structures and low-contrast/resolution of on-board cine MRI images make automatic motion tracking a very challenging task. In this study, the authors proposed an integrated model-driven method to automatically track the in-treatment motion of the H&N upper airway, a complex and highly deformable region wherein internal motion often occurs in an either voluntary or involuntary manner, from cine MRI images for the analysis of H&N motion patterns. Considering the complex H&N structures and ensuring automatic and robust upper airway motion tracking, the authors firstly built a set of linked statistical shapes (including face, face-jaw, and face-jaw-palate) using principal component analysis from clinically approved contours delineated on a set of training data. The linked statistical shapes integrate explicit landmarks and implicit shape representation. Then, a hierarchical model-fitting algorithm was developed to align the linked shapes on the first image frame of a to-be-tracked cine sequence and to localize the upper airway region. Finally, a multifeature level set contour propagation scheme was performed to identify the upper airway shape change, frame-by-frame, on the entire image sequence. The multifeature fitting energy, including the information of intensity variations, edge saliency, curve geometry, and temporal shape continuity, was minimized to capture the details of moving airway boundaries. Sagittal cine MR image sequences acquired from three H&N cancer patients were utilized to demonstrate the performance of the proposed motion tracking method. The tracking accuracy was validated by comparing the results to the average of two manual delineations in 50 randomly selected cine image frames from each patient. The resulting average dice similarity coefficient (93.28% ± 1.46%) and margin error (0.49 ± 0.12 mm) showed good agreement between the automatic and manual results. The comparison with three other deformable model-based segmentation methods illustrated the superior shape tracking performance of the proposed method. Large interpatient variations of swallowing frequency, swallowing duration, and upper airway cross-sectional area were observed from the testing cine image sequences. The proposed motion tracking method can provide accurate upper airway motion tracking results, and enable automatic and quantitative identification and analysis of in-treatment H&N upper airway motion. By integrating explicit and implicit linked-shape representations within a hierarchical model-fitting process, the proposed tracking method can process complex H&N structures and low-contrast/resolution cine MRI images. Future research will focus on the improvement of method reliability, patient motion pattern analysis for providing more information on patient-specific prediction of structure displacements, and motion effects on dosimetry for better H&N motion management in radiation therapy.
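The validation metric cited above is the Dice similarity coefficient; a minimal sketch, assuming binary masks, follows (this is not the authors' evaluation code).

```python
# Minimal sketch of the Dice similarity coefficient used to compare automatic
# and manual delineations; binary masks are assumed.
import numpy as np

def dice_coefficient(mask_a, mask_b):
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

auto = np.zeros((64, 64), bool); auto[20:40, 20:40] = True
manual = np.zeros((64, 64), bool); manual[22:42, 21:41] = True
print(f"DSC = {dice_coefficient(auto, manual):.2%}")
```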
Comparison of histomorphometrical data obtained with two different image analysis methods.
Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B
2007-08-01
A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The "time- and money-consuming" methods and techniques used are often "in-house standards". We address light microscopic investigations of bone tissue reactions on un-decalcified cut and ground sections of threaded implants. In order to screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements done by the naked eye) with a custom-made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods, but the results for the bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values with the visual method corresponded to low values with the automatic method. With similar-resolution images and further improvements of the automatic method this difference should become insignificant. A great advantage of the new automatic image analysis method is that it saves time: analysis time can be significantly reduced.
Stewart, Brandon D; Payne, B Keith
2008-10-01
The evidence for whether intentional control strategies can reduce automatic stereotyping is mixed. Therefore, the authors tested the utility of implementation intentions--specific plans linking a behavioral opportunity to a specific response--in reducing automatic bias. In three experiments, automatic stereotyping was reduced when participants made an intention to think specific counterstereotypical thoughts whenever they encountered a Black individual. The authors used two implicit tasks and process dissociation analysis, which allowed them to separate contributions of automatic and controlled thinking to task performance. Of importance, the reduction in stereotyping was driven by a change in automatic stereotyping and not controlled thinking. This benefit was acquired with little practice and generalized to novel faces. Thus, implementation intentions may be an effective and efficient means for controlling automatic aspects of thought.
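The process dissociation estimates referred to above are commonly computed from congruent-trial accuracy and incongruent-trial errors; the sketch below uses that standard Jacoby-style parameterization, which may not match the exact model reported in the paper.

```python
# Hedged sketch of a common process-dissociation parameterization: the
# controlled component C is the difference between congruent accuracy and
# incongruent errors, and the automatic component A captures errors that slip
# through when control fails. Not necessarily the exact model fit in the paper.
def process_dissociation(p_correct_congruent, p_error_incongruent):
    C = p_correct_congruent - p_error_incongruent
    A = p_error_incongruent / (1.0 - C) if C < 1.0 else float("nan")
    return C, A

C, A = process_dissociation(p_correct_congruent=0.92, p_error_incongruent=0.30)
print(f"controlled estimate C = {C:.2f}, automatic estimate A = {A:.2f}")
```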
Extraction of sandy bedforms features through geodesic morphometry
NASA Astrophysics Data System (ADS)
Debese, Nathalie; Jacq, Jean-José; Garlan, Thierry
2016-09-01
State-of-the-art echosounders reveal fine-scale details of mobile sandy bedforms, which are commonly found on continental shelves. At present, their dynamics are still far from being completely understood. These bedforms are a serious threat to navigation security, anthropic structures and activities, placing emphasis on research breakthroughs. Bedform geometries and their dynamics are closely linked; therefore, one approach is to develop semi-automatic tools aiming at extracting their structural features from bathymetric datasets. Current approaches mimic manual processes or rely on morphological simplification of bedforms. The 1D and 2D approaches cannot address the wide ranges of both types and complexities of bedforms. In contrast, this work attempts to follow a 3D global semi-automatic approach based on a bathymetric TIN. The currently extracted primitives are the salient ridge and valley lines of the sand structures, i.e., waves and mega-ripples. The main difficulty is eliminating the ripples that are found to heavily overprint any observations. To this end, an anisotropic filter that is able to discard these structures while still enhancing the wave ridges is proposed. The second part of the work addresses the semi-automatic interactive extraction and 3D augmented display of the main line structures. The proposed protocol also allows geoscientists to interactively insert topological constraints.
Event-driven management algorithm of an Engineering documents circulation system
NASA Astrophysics Data System (ADS)
Kuzenkov, V.; Zebzeev, A.; Gromakov, E.
2015-04-01
A development methodology for an engineering document circulation system in a design company is reviewed. Discrete event-driven automata models that use project management description algorithms are proposed, and the use of Petri nets for the dynamic design of projects is also proposed.
A Theory of Term Importance in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, G.; And Others
Most existing automatic content analysis and indexing techniques are based on word frequency characteristics applied largely in an ad hoc manner. Contradictory requirements arise in this connection, in that terms exhibiting high occurrence frequencies in individual documents are often useful for high recall performance (to retrieve many relevant…
FAMA: Fast Automatic MOOG Analysis
NASA Astrophysics Data System (ADS)
Magrini, Laura; Randich, Sofia; Friel, Eileen; Spina, Lorenzo; Jacobson, Heather; Cantat-Gaudin, Tristan; Donati, Paolo; Baglioni, Roberto; Maiorca, Enrico; Bragaglia, Angela; Sordo, Rosanna; Vallenari, Antonella
2014-02-01
FAMA (Fast Automatic MOOG Analysis), written in Perl, computes the atmospheric parameters and abundances of a large number of stars using measurements of equivalent widths (EWs) automatically and independently of any subjective approach. Based on the widely-used MOOG code, it simultaneously searches for three equilibria: excitation equilibrium, ionization balance, and the relationship between log n(Fe I) and the reduced EWs. FAMA also evaluates the statistical errors on individual element abundances and errors due to the uncertainties in the stellar parameters. Convergence criteria are not fixed "a priori" but instead are based on the quality of the spectra.
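A hedged sketch of the three convergence diagnostics FAMA iterates on is given below; the real program wraps MOOG and adjusts Teff, log g and microturbulence accordingly, which is not reproduced here.

```python
# Hedged sketch: the three diagnostics that should all be ~0 at convergence
# (excitation equilibrium, reduced-EW trend, Fe I / Fe II ionization balance).
import numpy as np

def equilibrium_diagnostics(exc_potential, reduced_ew, abund_feI, abund_feII):
    slope_exc = np.polyfit(exc_potential, abund_feI, 1)[0]        # excitation equilibrium
    slope_rew = np.polyfit(reduced_ew, abund_feI, 1)[0]           # reduced-EW trend
    ionization_offset = np.mean(abund_feI) - np.mean(abund_feII)  # ionization balance
    return slope_exc, slope_rew, ionization_offset
```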
Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.
Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen
2014-08-01
A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
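A minimal sketch of the kind of signal processing involved is shown below, assuming a 1-D intensity trace of the heart region is already available; it is not the published pipeline.

```python
# Minimal sketch: heart rate, beat-to-beat intervals and a simple arrhythmicity
# score from a 1-D intensity trace of the beating heart region.
import numpy as np
from scipy.signal import find_peaks

def heartbeat_parameters(intensity, fps):
    x = np.asarray(intensity, float)
    x = (x - x.mean()) / (x.std() + 1e-12)
    peaks, _ = find_peaks(x, distance=int(0.25 * fps))  # >= 0.25 s between beats
    ibi = np.diff(peaks) / fps                          # beat-to-beat intervals (s)
    rate_bpm = 60.0 / ibi.mean() if ibi.size else float("nan")
    arrhythmicity = ibi.std() / ibi.mean() if ibi.size else float("nan")
    return rate_bpm, ibi, arrhythmicity
```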
Automatic segmentation of time-lapse microscopy images depicting a live Dharma embryo.
Zacharia, Eleni; Bondesson, Maria; Riu, Anne; Ducharme, Nicole A; Gustafsson, Jan-Åke; Kakadiaris, Ioannis A
2011-01-01
Biological inferences about the toxicity of chemicals reached during experiments on the zebrafish Dharma embryo can be greatly affected by the analysis of the time-lapse microscopy images depicting the embryo. Among the stages of image analysis, automatic and accurate segmentation of the Dharma embryo is the most crucial and challenging. In this paper, an accurate and automatic segmentation approach for the segmentation of the Dharma embryo data obtained by fluorescent time-lapse microscopy is proposed. Experiments performed in four stacks of 3D images over time have shown promising results.
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
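One ingredient of such a framework, the probability of agents switching from one behavior pattern to another, can be estimated as in the sketch below; the pattern discovery and network analysis steps described above are not reproduced.

```python
# Illustrative sketch: estimate behavior-pattern transition probabilities from
# a sequence of per-agent behavior labels (integer pattern ids).
import numpy as np

def transition_matrix(labels, n_patterns):
    """labels: array of shape (n_agents, n_steps) with integer pattern ids."""
    counts = np.zeros((n_patterns, n_patterns))
    for agent in np.asarray(labels):
        for a, b in zip(agent[:-1], agent[1:]):
            counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
```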
Design and dynamic analysis of a piezoelectric linear stage for pipetting liquid samples
NASA Astrophysics Data System (ADS)
Yu-Jen, Wang; Chien, Lee; Yi-Bin, Jiang; Kuo-Chieh, Fu
2017-06-01
Piezoelectric actuators have been widely used in positioning stages because of their compact size, stepping controllability, and holding force. This study proposes a piezoelectric-driven stage composed of a bi-electrode piezoelectric slab, capacitive position sensor, and capillary filling detector for filling liquid samples into nanopipettes using capillary flow. This automatic sample-filling device is suitable for transmission electron microscopy image-based quantitative analysis of aqueous products with added nanoparticles. The step length of the actuator is adjusted by a pulse width modulation signal that depends on the stage position; the actuator stops moving once the capillary filling has been detected. A novel dynamic model of the piezoelectric-driven stage based on collision interactions between the piezoelectric actuator and the sliding clipper is presented. Unknown model parameters are derived from the steady state solution of the equivalent steady phase angle. The output force of the piezoelectric actuator is formulated using the impulse and momentum principle. Considering the applied forces and related velocity between the sliding clipper and the piezoelectric slab, the stage dynamic response is confirmed with the experimental results. Moreover, the model can be used to explain the in-phase slanted trajectories of piezoelectric slab to drive sliders, but not elliptical trajectories. The maximum velocity and minimum step length of the piezoelectric-driven stage are 130 mm s-1 and 1 μm respectively.
Automatic labeling of MR brain images through extensible learning and atlas forests.
Xu, Lijun; Liu, Hong; Song, Enmin; Yan, Meng; Jin, Renchao; Hung, Chih-Cheng
2017-12-01
The multiatlas-based method is extensively used in MR brain image segmentation because of its simplicity and robustness. This method provides excellent accuracy although it is time consuming and limited in terms of obtaining information about new atlases. In this study, an automatic labeling of MR brain images through extensible learning and atlas forests is presented to address these limitations. We propose an extensible learning model which makes the multiatlas-based framework capable of managing datasets with numerous atlases or dynamic atlas datasets while simultaneously ensuring the accuracy of automatic labeling. Two new strategies are used to reduce the time and space complexity and improve the efficiency of the automatic labeling of brain MR images. First, atlases are encoded to atlas forests through random forest technology to reduce the time consumed for cross-registration between atlases and the target image, and a scatter spatial vector is designed to eliminate errors caused by inaccurate registration. Second, an atlas selection method based on the extensible learning model is used to select atlases for a target image without traversing the entire dataset and then obtain accurate labeling. The labeling results of the proposed method were evaluated in three public datasets, namely, IBSR, LONI LPBA40, and ADNI. With the proposed method, the dice coefficient metric values on the three datasets were 84.17 ± 4.61%, 83.25 ± 4.29%, and 81.88 ± 4.53%, respectively, about 5% higher than those of the conventional method. The efficiency of the extensible learning model was evaluated against state-of-the-art methods for labeling of MR brain images. Experimental results showed that the proposed method could achieve accurate labeling for MR brain images without traversing the entire datasets. In the proposed multiatlas-based method, extensible learning and atlas forests were applied to control the automatic labeling of brain anatomies on large atlas datasets or dynamic atlas datasets and obtain accurate results. © 2017 American Association of Physicists in Medicine.
Plat, Rika; Lowie, Wander; de Bot, Kees
2017-01-01
Reaction time data have long been collected in order to gain insight into the underlying mechanisms involved in language processing. Means analyses often attempt to break down what factors relate to what portion of the total reaction time. From a dynamic systems theory perspective or an interaction dominant view of language processing, it is impossible to isolate discrete factors contributing to language processing, since these continually and interactively play a role. Non-linear analyses offer the tools to investigate the underlying process of language use in time, without having to isolate discrete factors. Patterns of variability in reaction time data may disclose the relative contribution of automatic (grapheme-to-phoneme conversion) processing and attention-demanding (semantic) processing. The presence of a fractal structure in the variability of a reaction time series indicates automaticity in the mental structures contributing to a task. A decorrelated pattern of variability will indicate a higher degree of attention-demanding processing. A focus on variability patterns allows us to examine the relative contribution of automatic and attention-demanding processing when a speaker is using the mother tongue (L1) or a second language (L2). A word naming task conducted in the L1 (Dutch) and L2 (English) shows L1 word processing to rely more on automatic spelling-to-sound conversion than L2 word processing. A word naming task with a semantic categorization subtask showed more reliance on attention-demanding semantic processing when using the L2. A comparison to L1 English data shows this was not only due to the amount of language use or language dominance, but also to the difference in orthographic depth between Dutch and English. An important implication of this finding is that when the same task is used to test and compare different languages, one cannot straightforwardly assume the same cognitive sub processes are involved to an equal degree using the same task in different languages.
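One common way to quantify the fractal versus decorrelated variability discussed above is the slope of the log-log power spectrum of the reaction-time series; the sketch below uses that approach, which may differ from the authors' exact nonlinear analyses.

```python
# Hedged sketch: fit the slope of the log-log power spectrum of a reaction-time
# series. A slope near -1 (1/f, "pink") suggests fractal scaling consistent with
# automatic processing; a slope near 0 suggests decorrelated, white-noise-like
# variability. This may differ from the authors' exact analyses.
import numpy as np

def spectral_slope(rt_series):
    x = np.asarray(rt_series, float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size)
    keep = freqs > 0
    return np.polyfit(np.log10(freqs[keep]), np.log10(power[keep]), 1)[0]
```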
Quantification of regional fat volume in rat MRI
NASA Astrophysics Data System (ADS)
Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren
2003-05-01
Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. The quality of automatic segmentation has been evaluated by comparing the results of fully automated analysis to manual analysis of the same images. The comparison shows a high degree of correlation that validates the quality of the automatic segmentation approach.
NASA Astrophysics Data System (ADS)
Cui, Xiangyang; Li, She; Feng, Hui; Li, Guangyao
2017-05-01
In this paper, a novel triangular prism solid and shell interactive mapping element is proposed to solve the coupled magnetic-mechanical formulation in the electromagnetic sheet metal forming process. A linear six-node "Triprism" element is first proposed for transient eddy current analysis in the electromagnetic field. In the present "Triprism" element, shape functions are given explicitly, and a cell-wise gradient smoothing operation is used to obtain the gradient matrices without evaluating derivatives of shape functions. In the mechanical field analysis, a shear-locking-free triangular shell element is employed in the internal force computation, and a data mapping method is developed to transfer the Lorentz force on the solid into the external forces acting on the shell structure for dynamic elasto-plastic deformation analysis. Based on the deformed triangular shell structure, a "Triprism" element generation rule is established for updated electromagnetic analysis, which means inter-transformation of meshes between the coupled fields can be performed automatically. In addition, a dynamic moving mesh is adopted for air mesh updating based on the deformation of the sheet metal. A benchmark problem is carried out to confirm the accuracy of the proposed "Triprism" element in predicting flux density in the electromagnetic field. Solutions of several EMF problems obtained in the present work are compared with experimental results and those of the traditional method, showing the excellent performance of the present interactive mapping element.
Investigation of a Technique for Measuring Dynamic Ground Effect in a Subsonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Graves, Sharon S.
1999-01-01
To better understand the ground effect encountered by slender wing supersonic transport aircraft, a test was conducted at NASA Langley Research Center's 14 x 22 foot Subsonic Wind Tunnel in October 1997. Emphasis was placed on improving the accuracy of the ground effect data by using a "dynamic" technique in which the model's vertical motion was varied automatically during wind-on testing. This report describes and evaluates different aspects of the dynamic method utilized for obtaining ground effect data in this test. The method for acquiring and processing time data from a dynamic ground effect wind tunnel test is outlined with details of the overall data acquisition system and software used for the data analysis. The removal of inertial loads due to sting motion and the support dynamics from the balance force and moment measurements of the aerodynamic forces on the model is described. An evaluation of the results identifies problem areas and provides recommendations for future experiments. Test results are validated by comparing data for an elliptical wing planform with a NACA 0012 airfoil section to results found in the current literature. Major aerodynamic forces acting on the model, in terms of lift curves for determining ground effect, are presented. Comparisons of flight and wind tunnel data for the TU-144 are presented.
Smart algorithms and adaptive methods in computational fluid dynamics
NASA Astrophysics Data System (ADS)
Tinsley Oden, J.
1989-05-01
A review is presented of the use of smart algorithms which employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and criteria may be reapplied in order to direct the analysis toward an acceptable end. Intelligent decisions are made by processing vast amounts of data that evolve unpredictably during the calculation. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed. The basic components of adaptive methods are: (1) data structures, that is what approaches are available for modifying data structures of an approximation so as to reduce errors; (2) error estimation, that is what techniques exist for estimating error evolution in a CFD calculation; and (3) solvers, what algorithms are available which can function in changing meshes. Numerical examples which demonstrate the viability of these approaches are presented.
NASA Technical Reports Server (NTRS)
Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi
1994-01-01
An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
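ADIFOR itself is a Fortran source-transformation tool; as a conceptual stand-in only, the dual-number sketch below shows what forward-mode automatic differentiation computes (values and exact derivatives propagated together through the code).

```python
# Conceptual stand-in for source-transformation AD: a dual number carries a
# value and its derivative through arithmetic, giving exact sensitivities.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def structural_response(x):         # toy response r(x) = 3x^2 + 2x
    return 3 * x * x + 2 * x

x = Dual(2.0, 1.0)                  # seed derivative dx/dx = 1
out = structural_response(x)
print(out.val, out.der)             # 16.0 and dr/dx = 6x + 2 = 14.0
```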
Vibrational energy distribution analysis (VEDA): scopes and limitations.
Jamróz, Michał H
2013-10-01
The principle of operations of the VEDA program written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform the PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from the Gaussian program output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements of each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from any other program performing PED analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations
NASA Astrophysics Data System (ADS)
Jamróz, Michał H.
2013-10-01
The principle of operations of the VEDA program written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform the PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from the Gaussian program output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements of each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from any other program performing PED analysis.
Extension of the ADjoint Approach to a Laminar Navier-Stokes Solver
NASA Astrophysics Data System (ADS)
Paige, Cody
The use of adjoint methods is common in computational fluid dynamics to reduce the cost of the sensitivity analysis in an optimization cycle. The forward mode ADjoint is a combination of an adjoint sensitivity analysis method with a forward mode automatic differentiation (AD) and is a modification of the reverse mode ADjoint method proposed by Mader et al.[1]. A colouring acceleration technique is presented to reduce the computational cost increase associated with forward mode AD. The forward mode AD facilitates the implementation of the laminar Navier-Stokes (NS) equations. The forward mode ADjoint method is applied to a three-dimensional computational fluid dynamics solver. The resulting Euler and viscous ADjoint sensitivities are compared to the reverse mode Euler ADjoint derivatives and a complex-step method to demonstrate the reduced computational cost and accuracy. Both comparisons demonstrate the benefits of the colouring method and the practicality of using a forward mode AD. [1] Mader, C.A., Martins, J.R.R.A., Alonso, J.J., and van der Weide, E. (2008) ADjoint: An approach for the rapid development of discrete adjoint solvers. AIAA Journal, 46(4):863-873. doi:10.2514/1.29123.
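The complex-step method used as a reference in such comparisons is easy to sketch; the toy function below stands in for the flow solver.

```python
# Minimal sketch of the complex-step derivative: perturb the input by a tiny
# imaginary step and read the derivative from the imaginary part, which avoids
# subtractive cancellation.
import cmath

def complex_step_derivative(f, x, h=1e-30):
    return f(complex(x, h)).imag / h

f = lambda z: z * cmath.sin(z)            # toy function standing in for the solver
print(complex_step_derivative(f, 1.3))    # compare: sin(x) + x*cos(x) at x = 1.3
```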
Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas
2013-01-01
Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, and ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA – a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equations models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions which compute dynamically the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on Windows and Linux systems, both 32- and 64-bit. PMID:24367574
NASA Technical Reports Server (NTRS)
Macala, G. A.
1983-01-01
A computer program is described that can automatically generate symbolic equations of motion for systems of hinge-connected rigid bodies with tree topologies. The dynamical formulation underlying the program is outlined, and examples are given to show how a symbolic language is used to code the formulation. The program is applied to generate the equations of motion for a four-body model of the Galileo spacecraft. The resulting equations are shown to be a factor of three faster in execution time than conventional numerical subroutines.
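The program described targets hinge-connected multibody trees in a symbolic language of its own; as a minimal illustration of symbolic equation-of-motion generation, the SymPy sketch below derives the pendulum equation from its Lagrangian.

```python
# Minimal illustration (not the paper's program): SymPy derives the equation of
# motion of a simple pendulum from its Lagrangian.
import sympy as sp

t, m, l, g = sp.symbols("t m l g", positive=True)
theta = sp.Function("theta")(t)

T = sp.Rational(1, 2) * m * (l * theta.diff(t)) ** 2   # kinetic energy
V = -m * g * l * sp.cos(theta)                         # potential energy
L = T - V

# Euler-Lagrange equation: d/dt(dL/d(theta')) - dL/d(theta) = 0
eom = sp.simplify(sp.diff(sp.diff(L, theta.diff(t)), t) - sp.diff(L, theta))
print(sp.Eq(eom, 0))   # m*l**2*theta'' + m*g*l*sin(theta) = 0
```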
Automatic inference of multicellular regulatory networks using informative priors.
Sun, Xiaoyun; Hong, Pengyu
2009-01-01
To fully understand the mechanisms governing animal development, computational models and algorithms are needed to enable quantitative studies of the underlying regulatory networks. We developed a mathematical model based on dynamic Bayesian networks to model multicellular regulatory networks that govern cell differentiation processes. A machine-learning method was developed to automatically infer such a model from heterogeneous data. We show that the model inference procedure can be greatly improved by incorporating interaction data across species. The proposed approach was applied to C. elegans vulval induction to reconstruct a model capable of simulating C. elegans vulval induction under 73 different genetic conditions.
Policy enabled information sharing system
Jorgensen, Craig R.; Nelson, Brian D.; Ratheal, Steve W.
2014-09-02
A technique for dynamically sharing information includes executing a sharing policy indicating when to share a data object responsive to the occurrence of an event. The data object is created by formatting a data file to be shared with a receiving entity. The data object includes a file data portion and a sharing metadata portion. The data object is encrypted and then automatically transmitted to the receiving entity upon occurrence of the event. The sharing metadata portion includes metadata characterizing the data file and referenced in connection with the sharing policy to determine when to automatically transmit the data object to the receiving entity.
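A hypothetical sketch of the described flow (policy check, data object with file data plus sharing metadata, encryption, transmission on an event) is given below; the names, policy structure and use of Fernet encryption are illustrative assumptions rather than the patented system, and the cryptography package is required.

```python
# Hypothetical sketch only: policy-driven sharing of an encrypted data object.
# The policy layout, field names and Fernet encryption are assumptions.
import json
from cryptography.fernet import Fernet

def build_data_object(file_bytes, metadata):
    return json.dumps({"file_data": file_bytes.decode("utf-8"),
                       "sharing_metadata": metadata}).encode()

def maybe_share(event, policy, file_bytes, metadata, send):
    if policy.get("share_on_event") == event:            # sharing policy check
        token = Fernet(policy["key"]).encrypt(build_data_object(file_bytes, metadata))
        send(token)                                       # transmit to receiving entity

policy = {"share_on_event": "sensor_alarm", "key": Fernet.generate_key()}
maybe_share("sensor_alarm", policy, b"reading=42", {"origin": "site-A"}, print)
```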
Posteriori error determination and grid adaptation for AMR and ALE computational fluid dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapenta, G. M.
2002-01-01
We discuss grid adaptation for application to AMR and ALE codes. Two new contributions are presented. First, a new method to locate the regions where the truncation error is being created due to insufficient accuracy: the operator recovery error origin (OREO) detector. The OREO detector is automatic, reliable, easy to implement and extremely inexpensive. Second, a new grid motion technique is presented for application to ALE codes. The method is based on the Brackbill-Saltzman approach but it is directly linked to the OREO detector and moves the grid automatically to minimize the error.
The design of digital-adaptive controllers for VTOL aircraft
NASA Technical Reports Server (NTRS)
Stengel, R. F.; Broussard, J. R.; Berry, P. W.
1976-01-01
Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.
Users manual for the Variable dimension Automatic Synthesis Program (VASP)
NASA Technical Reports Server (NTRS)
White, J. S.; Lee, H. Q.
1971-01-01
A dictionary and some example problems for the Variable dimension Automatic Synthesis Program (VASP) are submitted. The dictionary contains a description of each subroutine and instructions on its use. The example problems give the user a better perspective on the use of VASP for solving problems in modern control theory. These example problems include dynamic response, optimal control gain, solution of the sampled-data matrix Riccati equation, matrix decomposition, and the pseudo-inverse of a matrix. Listings of all subroutines are also included. The VASP program has been adapted to run in the conversational mode on the Ames 360/67 computer.
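VASP itself is a Fortran program; as a hedged modern equivalent of two of its example problems, the sketch below solves a discrete-time matrix Riccati equation and a matrix pseudo-inverse with SciPy/NumPy.

```python
# Modern stand-in for two VASP example problems: steady-state discrete Riccati
# solution with the corresponding optimal gain, and a matrix pseudo-inverse.
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])

P = solve_discrete_are(A, B, Q, R)                    # Riccati solution
K = np.linalg.inv(R + B.T @ P @ B) @ (B.T @ P @ A)    # optimal control gain
M_pinv = np.linalg.pinv(np.array([[1.0, 2.0], [2.0, 4.0], [0.0, 1.0]]))
print(K)
print(M_pinv)
```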
Computational model of lightness perception in high dynamic range imaging
NASA Astrophysics Data System (ADS)
Krawczyk, Grzegorz; Myszkowski, Karol; Seidel, Hans-Peter
2006-02-01
An anchoring theory of lightness perception by Gilchrist et al. [1999] explains many characteristics of human visual system such as lightness constancy and its spectacular failures which are important in the perception of images. The principal concept of this theory is the perception of complex scenes in terms of groups of consistent areas (frameworks). Such areas, following the gestalt theorists, are defined by the regions of common illumination. The key aspect of the image perception is the estimation of lightness within each framework through the anchoring to the luminance perceived as white, followed by the computation of the global lightness. In this paper we provide a computational model for automatic decomposition of HDR images into frameworks. We derive a tone mapping operator which predicts lightness perception of the real world scenes and aims at its accurate reproduction on low dynamic range displays. Furthermore, such a decomposition into frameworks opens new grounds for local image analysis in view of human perception.
Evaluation of space shuttle main engine fluid dynamic frequency response characteristics
NASA Technical Reports Server (NTRS)
Gardner, T. G.
1980-01-01
In order to determine the POGO stability characteristics of the space shuttle main engine liquid oxygen (LOX) system, the fluid dynamic frequency response functions between elements in the SSME LOX system were evaluated, both analytically and experimentally. For the experimental data evaluation, a software package was written for the Hewlett-Packard 5451C Fourier analyzer. The POGO analysis software is documented and consists of five separate segments. Each segment is stored on the 5451C disc as an individual program and performs its own unique function. Two separate data reduction methods, a signal calibration, coherence- or pulser-signal-based frequency response function blanking, and automatic plotting features are included in the program. The 5451C allows variable parameter transfer from program to program. This feature is used to advantage and requires only minimal user interface during the data reduction process. Experimental results are included and compared with the analytical predictions in order to adjust the general model and arrive at a realistic simulation of the POGO characteristics.
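The kind of frequency response function estimation described above can be sketched with the H1 estimator Pxy/Pxx plus coherence; the SciPy version below is illustrative and is not the original Fourier-analyzer software.

```python
# Hedged sketch of FRF estimation: H1 = Pxy/Pxx with coherence, on synthetic
# data standing in for the measured excitation and response signals.
import numpy as np
from scipy import signal

fs = 1024.0
t = np.arange(0, 8, 1 / fs)
x = np.random.default_rng(0).standard_normal(t.size)         # excitation (e.g. pulser)
b, a = signal.butter(2, 0.1)                                  # stand-in dynamic system
y = signal.lfilter(b, a, x) + 0.05 * np.random.default_rng(1).standard_normal(t.size)

f, Pxy = signal.csd(x, y, fs=fs, nperseg=512)
_, Pxx = signal.welch(x, fs=fs, nperseg=512)
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=512)
H1 = Pxy / Pxx                                                # frequency response estimate
print(abs(H1[:5]), Cxy[:5])
```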
Parameterized examination in econometrics
NASA Astrophysics Data System (ADS)
Malinova, Anna; Kyurkchiev, Vesselin; Spasov, Georgi
2018-01-01
The paper presents a parameterization of basic types of exam questions in Econometrics. This algorithm is used to automate and facilitate the process of examination, assessment and self-preparation of a large number of students. The proposed parameterization of testing questions reduces the time required to author tests and course assignments. It enables tutors to generate a large number of different but equivalent dynamic questions (with dynamic answers) on a certain topic, which are automatically assessed. The presented methods are implemented in DisPeL (Distributed Platform for e-Learning) and provide questions in the areas of filtering and smoothing of time-series data, forecasting, building and analysis of single-equation econometric models. Questions also cover elasticity, average and marginal characteristics, product and cost functions, measurement of monopoly power, supply, demand and equilibrium price, consumer and product surplus, etc. Several approaches are used to enable the required numerical computations in DisPeL - integration of third-party mathematical libraries, developing our own procedures from scratch, and wrapping our legacy math codes in order to modernize and reuse them.
Faugeras, Blaise; Maury, Olivier
2005-10-01
We develop an advection-diffusion size-structured fish population dynamics model and apply it to simulate the skipjack tuna population in the Indian Ocean. The model is fully spatialized, and movements are parameterized with oceanographical and biological data; thus it naturally reacts to environment changes. We first formulate an initial-boundary value problem and prove existence of a unique positive solution. We then discuss the numerical scheme chosen for the integration of the simulation model. In a second step we address the parameter estimation problem for such a model. With the help of automatic differentiation, we derive the adjoint code which is used to compute the exact gradient of a Bayesian cost function measuring the distance between the outputs of the model and catch and length frequency data. A sensitivity analysis shows that not all parameters can be estimated from the data. Finally twin experiments in which pertubated parameters are recovered from simulated data are successfully conducted.
Online, automatic, ionospheric maps: IRI-PLAS-MAP
NASA Astrophysics Data System (ADS)
Arikan, F.; Sezen, U.; Gulyaeva, T. L.; Cilibas, O.
2015-04-01
Global and regional behavior of the ionosphere is an important component of space weather. The peak height and critical frequency of the ionospheric layer of maximum ionization, namely hmF2 and foF2, and the total number of electrons on a ray path, the Total Electron Content (TEC), are the most investigated and monitored values of the ionosphere in capturing and observing ionospheric variability. Typically, ionospheric models such as the International Reference Ionosphere (IRI) can provide the electron density profile, critical parameters of ionospheric layers and ionospheric electron content for a given location, date and time. Yet, the IRI model is limited to the foF2 STORM option alone in reflecting the dynamics of ionospheric/plasmaspheric/geomagnetic storms. Global Ionospheric Maps (GIM) of the global TEC distribution estimated from ground-based GPS stations are provided by IGS analysis centers and can capture the actual dynamics of the ionosphere and plasmasphere, but this service is not available for other ionospheric observables. In this study, a unique and original space weather service, IRI-PLAS-MAP, is introduced, available from http://www.ionolab.org
Automating the parallel processing of fluid and structural dynamics calculations
NASA Technical Reports Server (NTRS)
Arpasi, Dale J.; Cole, Gary L.
1987-01-01
The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations, that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.
NASA Technical Reports Server (NTRS)
Heelis, Roderick A.
1994-01-01
The final activity period for the DE project has been particularly productive. This period has seen the final delivery of geophysical data sets to the National Space Science Data Center, the granting of three Ph.D. degrees from cumulative work on the project, the operation of automatic data access and display routines for the data, and an increased effort in research and publication of the data. As before the research activities, largely devoted to studies involving the dynamics of the ionosphere, utilize data from the IDM and the RPA and thus the work is not easily attributable to one or the other of these separately funded efforts. In this final report we provide brief descriptions of the work accomplished in the final phase of the program. The Dynamics Explorer program has provided a significant opportunity for much of the community to participate in the data analysis and interpretation. The data, now residing in the national space science data center, are a great legacy that should continue to yield important results for many years.
Diagnostic accuracy of automatic normalization of CBV in glioma grading using T1- weighted DCE-MRI.
Sahoo, Prativa; Gupta, Rakesh K; Gupta, Pradeep K; Awasthi, Ashish; Pandey, Chandra M; Gupta, Mudit; Patir, Rana; Vaishya, Sandeep; Ahlawat, Sunita; Saha, Indrajit
2017-12-01
Aim of this retrospective study was to compare diagnostic accuracy of proposed automatic normalization method to quantify the relative cerebral blood volume (rCBV) with existing contra-lateral region of interest (ROI) based CBV normalization method for glioma grading using T1-weighted dynamic contrast enhanced MRI (DCE-MRI). Sixty patients with histologically confirmed gliomas were included in this study retrospectively. CBV maps were generated using T1-weighted DCE-MRI and are normalized by contralateral ROI based method (rCBV_contra), unaffected white matter (rCBV_WM) and unaffected gray matter (rCBV_GM), the latter two of these were generated automatically. An expert radiologist with >10years of experience in DCE-MRI and a non-expert with one year experience were used independently to measure rCBVs. Cutoff values for glioma grading were decided from ROC analysis. Agreement of histology with rCBV_WM, rCBV_GM and rCBV_contra respectively was studied using Kappa statistics and intra-class correlation coefficient (ICC). The diagnostic accuracy of glioma grading using the measured rCBV_contra by expert radiologist was found to be high (sensitivity=1.00, specificity=0.96, p<0.001) compared to the non-expert user (sensitivity=0.65, specificity=0.78, p<0.001). On the other hand, both the expert and non-expert user showed similar diagnostic accuracy for automatic rCBV_WM (sensitivity=0.89, specificity=0.87, p=0.001) and rCBV_GM (sensitivity=0.81, specificity=0.78, p=0.001) measures. Further, it was also observed that, contralateral based method by expert user showed highest agreement with histological grading of tumor (kappa=0.96, agreement 98.33%, p<0.001), however; automatic normalization method showed same percentage of agreement for both expert and non-expert user. rCBV_WM showed an agreement of 88.33% (kappa=0.76,p<0.001) with histopathological grading. It was inferred from this study that, in the absence of expert user, automated normalization of CBV using the proposed method could provide better diagnostic accuracy compared to the manual contralateral based approach. Copyright © 2017 Elsevier Inc. All rights reserved.
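A hedged sketch of the two normalization routes compared in the study, plus sensitivity and specificity at a chosen rCBV cutoff, is given below; the mask names and cutoff value are illustrative, not the paper's.

```python
# Hedged sketch: rCBV as tumor CBV normalized to a reference region (either a
# contralateral ROI or automatically segmented unaffected white matter), and
# sensitivity/specificity of grading at an illustrative cutoff.
import numpy as np

def rcbv(cbv_map, tumor_mask, reference_mask):
    return cbv_map[tumor_mask].mean() / cbv_map[reference_mask].mean()

def sens_spec(rcbv_values, is_high_grade, cutoff):
    pred_high = np.asarray(rcbv_values) >= cutoff
    truth = np.asarray(is_high_grade, bool)
    sens = np.mean(pred_high[truth]) if truth.any() else float("nan")
    spec = np.mean(~pred_high[~truth]) if (~truth).any() else float("nan")
    return sens, spec
```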
Approaches to the automatic generation and control of finite element meshes
NASA Technical Reports Server (NTRS)
Shephard, Mark S.
1987-01-01
The algorithmic approaches being taken to the development of finite element mesh generators capable of automatically discretizing general domains without the need for user intervention are discussed. It is demonstrated that because of the modeling demands placed on an automatic mesh generator, all the approaches taken to date produce unstructured meshes. Consideration is also given to both a priori and a posteriori mesh control devices for automatic mesh generators as well as their integration with geometric modeling and adaptive analysis procedures.
Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis
Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.
2014-01-01
Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when symptom severity was high, participants made more facial expressions associated with contempt, smiled less, and those smiles that occurred were more likely to be accompanied by facial actions associated with contempt. These results are consistent with the “social risk hypothesis” of depression. According to this hypothesis, when symptoms are severe, depressed participants withdraw from other people in order to protect themselves from anticipated rejection, scorn, and social exclusion. As their symptoms fade, participants send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and produced the same pattern of depression effects suggests that automatic facial expression analysis may be ready for use in behavioral and clinical science. PMID:24598859
Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro
2015-01-01
With the large-scale development of intelligent vehicles, insight into human-machine interaction has become a critical topic. Biosignal analysis can provide a deeper understanding of driver behaviors and may inform the rational, practical use of automatic driving technology. This study therefore concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with vehicles set at close gap distances to reduce air resistance and thereby save energy. Using two wearable sensor systems, continuous measurement of palmar perspiration and masseter electromyography was realized, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with a gap distance of about 25 m as a reference. Mental stress increased significantly as the gap distance decreased, and an abrupt increase in drivers' mental stress was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768
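A minimal sketch of one common way to turn a raw electromyography trace into a stress-related feature: band-pass filtering, rectification, and a moving-RMS envelope. The filter band, window length, and synthetic signal are assumptions and not the paper's actual processing pipeline.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_rms_envelope(emg, fs, band=(20.0, 450.0), window_s=0.25):
        """Band-pass filter, rectify, and compute a moving-RMS envelope."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, emg)
        rectified = np.abs(filtered)
        win = max(1, int(window_s * fs))
        kernel = np.ones(win) / win
        return np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))

    # Hypothetical usage with a synthetic masseter EMG trace sampled at 1 kHz.
    fs = 1000
    emg = 0.1 * np.random.randn(10 * fs)
    envelope = emg_rms_envelope(emg, fs)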
An Approach to Dynamic Service Management in Pervasive Computing Systems
2005-01-01
standard interface to them that is easily accessible by any user. This paper outlines the design of Centaurus, an infrastructure for presenting...based on Extensible Markup Language (XML) for communication, giving the system a uniform and easily adaptable interface. Centaurus defines a...easy and automatic usage. This is the vision that guides our research on the Centaurus system. We define a SmartSpace as a dynamic environment that
Biologically inspired EM image alignment and neural reconstruction.
Knowles-Barley, Seymour; Butcher, Nancy J; Meinertzhagen, Ian A; Armstrong, J Douglas
2011-08-15
Three-dimensional reconstruction of consecutive serial-section transmission electron microscopy (ssTEM) images of neural tissue currently requires many hours of manual tracing and annotation. Several computational techniques have already been applied to ssTEM images to facilitate 3D reconstruction and ease this burden. Here, we present an alternative computational approach for ssTEM image analysis. We have used biologically inspired receptive fields as the basis for a ridge detection algorithm to identify cell membranes, synaptic contacts and mitochondria. Detected line segments are used to improve alignment between consecutive images, and we have joined small segments of membrane into cell surfaces using a dynamic programming algorithm similar to the Needleman-Wunsch and Smith-Waterman DNA sequence alignment procedures. A shortest-path-based approach has been used to close edges and achieve image segmentation. Partial reconstructions were automatically generated and used as a basis for semi-automatic reconstruction of neural tissue. The accuracy of the partial reconstructions was evaluated: 96% of membrane could be identified at the cost of 13% false positive detections. Availability: An open-source reference implementation is available in the Supplementary information. Contact: seymour.kb@ed.ac.uk; douglas.armstrong@ed.ac.uk. Supplementary data are available at Bioinformatics online.
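The membrane-joining step is described as a dynamic programming algorithm similar to Needleman-Wunsch; the sketch below shows the classic Needleman-Wunsch recursion on character sequences as a generic illustration of that class of algorithm, with placeholder scoring values rather than the paper's membrane-specific cost function.

    import numpy as np

    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
        """Global alignment score via the classic Needleman-Wunsch recursion."""
        n, m = len(a), len(b)
        score = np.zeros((n + 1, m + 1))
        score[:, 0] = np.arange(n + 1) * gap
        score[0, :] = np.arange(m + 1) * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i, j] = max(diag, score[i - 1, j] + gap, score[i, j - 1] + gap)
        return score[n, m]

    print(needleman_wunsch("GATTACA", "GCATGCU"))   # placeholder sequences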
Left ventricular endocardial surface detection based on real-time 3D echocardiographic data
NASA Technical Reports Server (NTRS)
Corsi, C.; Borsari, M.; Consegnati, F.; Sarti, A.; Lamberti, C.; Travaglini, A.; Shiota, T.; Thomas, J. D.
2001-01-01
OBJECTIVE: A new computerized semi-automatic method for left ventricular (LV) chamber segmentation is presented. METHODS: The LV is imaged by real-time three-dimensional echocardiography (RT3DE). The surface detection model, based on level set techniques, is applied to the RT3DE data for image analysis. The modified level set partial differential equation we use is solved by applying numerical methods for conservation laws. The initial conditions are established manually on some slices of the entire volume. The solution obtained for each slice is a contour line corresponding to the boundary between the LV cavity and the LV endocardium. RESULTS: The mathematical model has been applied to sequences of frames of human hearts (volume range: 34-109 ml) imaged by 2D echocardiography with off-line reconstruction and by RT3DE. Volume estimates obtained by this new semi-automatic method show an excellent correlation with those obtained by manual tracing (r = 0.992). The dynamic change of LV volume during the cardiac cycle is also obtained. CONCLUSION: The volume estimation method is accurate; edge-based segmentation, image completion and volume reconstruction can be accomplished. The visualization technique also allows the user to navigate through the reconstructed volume and to display any section of it.
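As a loose illustration of level-set-based front evolution (a generic first-order upwind scheme, not the modified PDE or the conservation-law solver used in the paper), the sketch below expands a circular zero level set outward on a 2D grid; the grid size, time step, and speed are placeholder assumptions.

    import numpy as np

    def evolve_level_set(phi, speed, dt, n_steps, dx):
        """Advance phi under a constant outward normal speed with a
        first-order upwind scheme (periodic boundaries for simplicity)."""
        for _ in range(n_steps):
            dxm = (phi - np.roll(phi, 1, axis=0)) / dx
            dxp = (np.roll(phi, -1, axis=0) - phi) / dx
            dym = (phi - np.roll(phi, 1, axis=1)) / dx
            dyp = (np.roll(phi, -1, axis=1) - phi) / dx
            # upwind gradient magnitude, valid for speed > 0
            grad_plus = np.sqrt(np.maximum(dxm, 0) ** 2 + np.minimum(dxp, 0) ** 2 +
                                np.maximum(dym, 0) ** 2 + np.minimum(dyp, 0) ** 2)
            phi = phi - dt * speed * grad_plus
        return phi

    # Hypothetical usage: a circular zero level set expanding outward.
    x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
    phi0 = np.sqrt(x ** 2 + y ** 2) - 0.3          # signed distance to a circle
    phi = evolve_level_set(phi0, speed=1.0, dt=0.005, n_steps=50, dx=2 / 127)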