C4 Software Technology Reference Guide - A Prototype.
1997-01-10
domain analysis methods include Feature-Oriented Domain Analysis (FODA) (see pg. 185), a domain analysis method based upon identifying the... Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ADA 235785). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 1990. ...domain analysis (FODA) (see pg. 185), in which a feature is a user-visible aspect or characteristic of the domain [Kang 90]. The features in a system...
DOT National Transportation Integrated Search
1995-07-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, ...
Verification and validation of a Work Domain Analysis with Turing machine task analysis.
Rechard, J; Bignon, A; Berruet, P; Morineau, T
2015-03-01
While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis," to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, which is an "intentional" domain, and the other of a ship water system, which is a "causal" domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, such as overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1995-01-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, and other text documents whose original source is people who are knowledgeable about, and participate in, the domain in question. To test the method, it is applied here to a report describing a remote sensing project within the scope of the Earth Observing System (EOS). The method has the potential to improve the designs of domain-related computer systems and software by quickly providing developers with explicit and objective models of the domain in a form which is useful for design. Results of the analysis include a network model of the domain, and an object-oriented relational analysis report which describes the nodes and relationships in the network model. Other products include a database of relationships in the domain, and an interactive concordance. The analysis method utilizes a newly developed relational metric, a proximity-weighted frequency of co-occurrence. The metric is applied to relations between the most frequently occurring terms (words or multiword entities) in the domain text, and the terms found within the contexts of these terms. Contextual scope is selectable. Because of the discriminating power of the metric, data reduction from the association matrix to the network is simple. In addition to their value for design, the models produced by the method are also useful for understanding the domains themselves. They can, for example, be interpreted as models of presence in the domain.
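A minimal sketch of how such a proximity-weighted frequency of co-occurrence could be computed for the most frequent terms of a tokenized domain text; the window size and the 1/distance weighting below are illustrative assumptions, not the parameters used in the study:

```python
from collections import Counter

def proximity_weighted_cooccurrence(tokens, top_n=50, window=10):
    """Weight each co-occurrence of two frequent terms by the inverse of
    their distance, so that nearby pairs contribute more than distant ones."""
    freq = Counter(tokens)
    frequent = {t for t, _ in freq.most_common(top_n)}
    weights = Counter()
    for i, a in enumerate(tokens):
        if a not in frequent:
            continue
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            b = tokens[j]
            if b in frequent and b != a:
                weights[(a, b)] += 1.0 / (j - i)   # proximity weighting
    return weights

# Example: weights = proximity_weighted_cooccurrence(open("report.txt").read().split())
```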
Formal Language Design in the Context of Domain Engineering
2000-03-28
Related Work: 5.1 Feature oriented domain analysis (FODA); 5.2 Organizational domain modeling (ODM); 5.3 Domain-Specific Software... However, there are only a few that are well defined and used repeatedly in practice. These include: Feature oriented domain analysis (FODA), Organizational... Feature oriented domain analysis (FODA) is a domain analysis method being researched and applied by the SEI
Examining, Documenting, and Modeling the Problem Space of a Variable Domain
2002-06-14
Feature-Oriented Domain Analysis (FODA)... development of this proposed process include: Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], Family-Oriented... configuration knowledge using generators [2]. Existing Methods of Domain Engineering: Feature-Oriented Domain Analysis (FODA). FODA is a domain
NASA Astrophysics Data System (ADS)
Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu
2018-04-01
A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) it avoids the double-loop iteration algorithm, which generally has large computational complexity, and (2) it accounts for the local concentration of nonlinear deformation observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.
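The balancing domain decomposition preconditioner is beyond a short example, but the quasi-Newton idea itself can be sketched: a Broyden rank-one update reuses a single assembled tangent instead of re-forming and re-solving it inside a nested Newton loop. The function names and the toy usage below are illustrative assumptions only, not the authors' implementation:

```python
import numpy as np

def broyden_solve(residual, u0, K0, tol=1e-8, max_iter=50):
    """Solve residual(u) = 0 with Broyden's quasi-Newton method: the tangent K
    is assembled once and then corrected by rank-one updates, avoiding the
    double loop of a conventional Newton-Raphson scheme."""
    u, K = u0.astype(float), K0.astype(float)
    r = residual(u)
    for _ in range(max_iter):
        du = np.linalg.solve(K, -r)
        u_new = u + du
        r_new = residual(u_new)
        if np.linalg.norm(r_new) < tol:
            return u_new
        K += np.outer(r_new, du) / np.dot(du, du)  # Broyden rank-one update
        u, r = u_new, r_new
    return u

# Toy usage with a hypothetical scalar residual:
# broyden_solve(lambda u: u**3 - 1.0, np.array([2.0]), np.array([[12.0]]))
```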
Shape design sensitivity analysis using domain information
NASA Technical Reports Server (NTRS)
Seong, Hwal-Gyeong; Choi, Kyung K.
1985-01-01
A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.
Domain fusion analysis by applying relational algebra to protein sequence and domain databases
Truong, Kevin; Ikura, Mitsuhiko
2003-01-01
Background Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. Results This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. Conclusion As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time. PMID:12734020
1993-12-01
proposed a domain analysis approach called Feature-Oriented Domain Analysis (FODA). The approach identifies prominent features (similarities) and... characteristics of software systems in the domain. Unlike the other domain analysis approaches we have summarized, the researchers described FODA in... Domain Analysis (FODA) Feasibility Study. Technical Report, Software Engineering Institute, Carnegie Mellon University, November 1990. 19. Lee, Kenneth
Jothi, Raja; Cherukuri, Praveen F.; Tasneem, Asba; Przytycka, Teresa M.
2006-01-01
Recent advances in functional genomics have helped generate large-scale high-throughput protein interaction data. Such networks, though extremely valuable towards molecular level understanding of cells, do not provide any direct information about the regions (domains) in the proteins that mediate the interaction. Here, we performed co-evolutionary analysis of domains in interacting proteins in order to understand the degree of co-evolution of interacting and non-interacting domains. Using a combination of sequence and structural analysis, we analyzed protein–protein interactions in F1-ATPase, Sec23p/Sec24p, DNA-directed RNA polymerase and nuclear pore complexes, and found that interacting domain pair(s) for a given interaction exhibits higher level of co-evolution than the noninteracting domain pairs. Motivated by this finding, we developed a computational method to test the generality of the observed trend, and to predict large-scale domain–domain interactions. Given a protein–protein interaction, the proposed method predicts the domain pair(s) that is most likely to mediate the protein interaction. We applied this method on the yeast interactome to predict domain–domain interactions, and used known domain–domain interactions found in PDB crystal structures to validate our predictions. Our results show that the prediction accuracy of the proposed method is statistically significant. Comparison of our prediction results with those from two other methods reveals that only a fraction of predictions are shared by all the three methods, indicating that the proposed method can detect known interactions missed by other methods. We believe that the proposed method can be used with other methods to help identify previously unrecognized domain–domain interactions on a genome scale, and could potentially help reduce the search space for identifying interaction sites. PMID:16949097
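The scoring details of the published method are not reproduced here; the sketch below only illustrates the common mirrortree-style idea of quantifying co-evolution as the correlation between the pairwise evolutionary distance matrices of two domains (assumed to be ordered by the same set of species):

```python
import numpy as np
from scipy.stats import pearsonr

def coevolution_score(dist_a, dist_b):
    """Correlate the upper triangles of two pairwise evolutionary distance
    matrices (one per domain, rows/columns ordered by the same species).
    A higher correlation suggests the two domains co-evolved."""
    iu = np.triu_indices_from(dist_a, k=1)
    r, _ = pearsonr(dist_a[iu], dist_b[iu])
    return r
```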
NASA Astrophysics Data System (ADS)
Eriksen, Vibeke R.; Hahn, Gitte H.; Greisen, Gorm
2015-03-01
The aim was to compare two conventional methods used to describe cerebral autoregulation (CA): frequency-domain analysis and time-domain analysis. We measured cerebral oxygenation (as a surrogate for cerebral blood flow) and mean arterial blood pressure (MAP) in 60 preterm infants. In the frequency domain, outcome variables were coherence and gain, whereas the cerebral oximetry index (COx) and the regression coefficient were the outcome variables in the time domain. Correlation between coherence and COx was poor. The disagreement between the two methods was due to the MAP and cerebral oxygenation signals being in counterphase in three cases. High gain and high coherence may arise spuriously when cerebral oxygenation decreases as MAP increases; hence, time-domain analysis appears to be a more robust, and simpler, method to describe CA.
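A rough sketch of the two kinds of outcome variables, assuming oxygenation and MAP signals sampled at a common rate; the window length, step, and low-frequency band below are placeholders rather than the study's settings:

```python
import numpy as np
from scipy.signal import coherence

def cox_index(abp, nirs, fs, window_s=300.0, step_s=60.0):
    """Time-domain cerebral oximetry index (COx): a moving correlation
    coefficient between arterial pressure and cerebral oxygenation."""
    w, s = int(window_s * fs), int(step_s * fs)
    return np.array([np.corrcoef(abp[i:i + w], nirs[i:i + w])[0, 1]
                     for i in range(0, len(abp) - w + 1, s)])

def lf_coherence(abp, nirs, fs, band=(0.003, 0.04), nperseg=1024):
    """Frequency-domain descriptor: mean magnitude-squared coherence between
    the two signals in a low-frequency band."""
    f, cxy = coherence(abp, nirs, fs=fs, nperseg=nperseg)
    sel = (f >= band[0]) & (f <= band[1])
    return float(cxy[sel].mean())
```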
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
Down-Bottom-Up Domain Analysis Process (1990 Version)... Figure 8. FODA's Domain Analysis Process... FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984)? 1. Introduction. Domain analysis is still immature... 2. An Overview of Some Domain Analysis Approaches... 2.4.3 Examples. The FODA report illustrates the process by using the window management
Domain fusion analysis by applying relational algebra to protein sequence and domain databases.
Truong, Kevin; Ikura, Mitsuhiko
2003-05-06
Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
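As a hedged illustration of the relational formulation (the toy schema and protein/domain names are hypothetical, not the paper's actual tables), the domain-fusion join can be expressed in standard SQL and run from Python with sqlite3:

```python
import sqlite3

# Toy table: one row per (protein, domain) assignment, e.g. from Pfam.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE assign (protein TEXT, domain TEXT);
INSERT INTO assign VALUES
  ('fusedC', 'D1'), ('fusedC', 'D2'),   -- composite ("Rosetta Stone") protein
  ('protA',  'D1'),
  ('protB',  'D2');
""")

# Predict a functional link between p1 and p2 when a third protein carries a
# domain of each (domain fusion), expressed purely as relational joins.
rows = con.execute("""
SELECT DISTINCT a1.protein AS p1, a2.protein AS p2, f1.protein AS fused
FROM assign a1
JOIN assign f1 ON f1.domain  = a1.domain  AND f1.protein <> a1.protein
JOIN assign f2 ON f2.protein = f1.protein AND f2.domain  <> f1.domain
JOIN assign a2 ON a2.domain  = f2.domain  AND a2.protein <> f2.protein
                                          AND a2.protein <> a1.protein
WHERE a1.protein < a2.protein
""").fetchall()
print(rows)   # [('protA', 'protB', 'fusedC')]
```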
An operational modal analysis method in frequency and spatial domain
NASA Astrophysics Data System (ADS)
Wang, Tong; Zhang, Lingmi; Tamura, Yukio
2005-12-01
A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
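A minimal sketch of the CMIF front end that FSDD builds on: singular value decomposition of the output cross-spectral density matrix at each frequency line. The enhanced-PSD curve-fitting step of FSDD is not reproduced, and the segment length is an arbitrary assumption:

```python
import numpy as np
from scipy.signal import csd

def cmif(outputs, fs, nperseg=1024):
    """Complex Mode Indicator Function for output-only (operational) data:
    singular values of the output cross-spectral density matrix at each
    frequency line. Peaks of the first singular value indicate modes; the
    associated left singular vectors approximate the mode shapes."""
    n_ch = outputs.shape[0]                               # channels x samples
    f, _ = csd(outputs[0], outputs[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(outputs[i], outputs[j], fs=fs, nperseg=nperseg)
    sv = np.array([np.linalg.svd(Gk, compute_uv=False) for Gk in G])
    return f, sv                                          # sv[:, 0] is the CMIF curve
```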
Turbulence excited frequency domain damping measurement and truncation effects
NASA Technical Reports Server (NTRS)
Soovere, J.
1976-01-01
Existing frequency domain modal frequency and damping analysis methods are discussed. The effects of truncation in the Laplace and Fourier transform data analysis methods are described. Methods for eliminating truncation errors from measured damping are presented. Implications of truncation effects in fast Fourier transform analysis are discussed. Limited comparison with test data is presented.
Towards an Interoperability Ontology for Software Development Tools
2003-03-01
The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties... Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software... these obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the
A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis
NASA Astrophysics Data System (ADS)
Jokhio, G. A.; Izzuddin, B. A.
2015-05-01
This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.
Pulse analysis of acoustic emission signals. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Houghton, J. R.
1976-01-01
A method for the signature analysis of pulses in the frequency domain and the time domain is presented. Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio are examined in the frequency domain analysis, and pulse shape deconvolution is developed for use in the time domain analysis. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emissions associated with: (1) crack propagation, (2) ball dropping on a plate, (3) spark discharge and (4) defective and good ball bearings.
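A generic frequency-domain (Wiener-style) deconvolution sketch of the kind referred to above, assuming the measuring-system impulse response is known; the regularization constant is an illustrative assumption, not the thesis's optimized design parameters:

```python
import numpy as np

def deconvolve_pulse(recorded, impulse_response, eps=1e-3):
    """Recover the source pulse shape from a recorded signal, assuming
    recorded = source convolved with the measuring-system impulse response.
    The spectral division is regularized so it stays stable where the system
    spectrum is close to zero."""
    n = len(recorded)
    R = np.fft.rfft(recorded, n)
    H = np.fft.rfft(impulse_response, n)
    S = R * np.conj(H) / (np.abs(H) ** 2 + eps * np.max(np.abs(H)) ** 2)
    return np.fft.irfft(S, n)
```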
Efficient multiscale magnetic-domain analysis of iron-core material under mechanical stress
NASA Astrophysics Data System (ADS)
Nishikubo, Atsushi; Ito, Shumpei; Mifune, Takeshi; Matsuo, Tetsuji; Kaido, Chikara; Takahashi, Yasuhito; Fujiwara, Koji
2018-05-01
For an efficient analysis of magnetization, a partial-implicit solution method is improved using an assembled domain structure model with six-domain mesoscopic particles exhibiting pinning-type hysteresis. The quantitative analysis of non-oriented silicon steel succeeds in predicting the stress dependence of hysteresis loss with computation times greatly reduced by using the improved partial-implicit method. The effect of cell division along the thickness direction is also evaluated.
Dual-domain point diffraction interferometer
Naulleau, Patrick P.; Goldberg, Kenneth Alan
2000-01-01
A hybrid spatial/temporal-domain point diffraction interferometer (referred to as the dual-domain PS/PDI) that is capable of suppressing the scattered-reference-light noise that hinders the conventional PS/PDI is provided. The dual-domain PS/PDI combines the separate noise-suppression capabilities of the widely-used phase-shifting and Fourier-transform fringe pattern analysis methods. The dual-domain PS/PDI relies on both a more restrictive implementation of the image plane PS/PDI mask and a new analysis method to be applied to the interferograms generated and recorded by the modified PS/PDI. The more restrictive PS/PDI mask guarantees the elimination of spatial-frequency crosstalk between the signal and the scattered-light noise arising from scattered-reference-light interfering with the test beam. The new dual-domain analysis method is then used to eliminate scattered-light noise arising from both the scattered-reference-light interfering with the test beam and the scattered-reference-light interfering with the "true" pinhole-diffracted reference light. The dual-domain analysis method has also been demonstrated to provide performance enhancement when using the non-optimized standard PS/PDI design. The dual-domain PS/PDI is essentially a three-tiered filtering system composed of lowpass spatial-filtering the test-beam electric field using the more restrictive PS/PDI mask, bandpass spatial-filtering the individual interferogram irradiance frames making up the phase-shifting series, and bandpass temporal-filtering the phase-shifting series as a whole.
Impact of Domain Analysis on Reuse Methods
1989-11-06
return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increase the risks of a... importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or... are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality
Wang, B; Switowski, K; Cojocaru, C; Roppo, V; Sheng, Y; Scalora, M; Kisielewski, J; Pawlak, D; Vilaseca, R; Akhouayri, H; Krolikowski, W; Trull, J
2018-01-22
We present an indirect, non-destructive optical method for domain statistic characterization in disordered nonlinear crystals having homogeneous refractive index and spatially random distribution of ferroelectric domains. This method relies on the analysis of the wave-dependent spatial distribution of the second harmonic, in the plane perpendicular to the optical axis in combination with numerical simulations. We apply this technique to the characterization of two different media, Calcium Barium Niobate and Strontium Barium Niobate, with drastically different statistical distributions of ferroelectric domains.
Kotani, Kiyoshi; Takamasu, Kiyoshi; Tachibana, Makoto
2007-01-01
The objectives of this paper were to present a method to extract the amplitude of RSA in the respiratory-phase domain, to compare that with subjective or objective indices of the MWL (mental workload), and to compare that with a conventional frequency analysis in terms of its accuracy during a mental arithmetic task. HRV (heart rate variability), ILV (instantaneous lung volume), and motion of the throat were measured under a mental arithmetic experiment and subjective and objective indices were also obtained. The amplitude of RSA was extracted in the respiratory-phase domain, and its correlation with the load level was compared with the results of the frequency domain analysis, which is the standard analysis of the HRV. The subjective and objective indices decreased as the load level increased, showing that the experimental protocol was appropriate. Then, the amplitude of RSA in the respiratory-phase domain also decreased with the increase in the load level. The results of the correlation analysis showed that the respiratory-phase domain analysis has higher negative correlations, -0.84 and -0.82, with the load level as determined by simple correlation and rank correlation, respectively, than does frequency analysis, for which the correlations were found to be -0.54 and -0.63, respectively. In addition, it was demonstrated that the proposed method could be applied to the short-term extraction of RSA amplitude. We proposed a simple and effective method to extract the amplitude of the respiratory sinus arrhythmia (RSA) in the respiratory-phase domain and the results show that this method can estimate cardiac vagal activity more accurately than frequency analysis.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Functional modeling techniques or object-oriented graphical representations: which are more useful to someone trying to understand the general design or high-level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Fast time- and frequency-domain finite-element methods for electromagnetic analysis
NASA Astrophysics Data System (ADS)
Lee, Woochan
Fast electromagnetic analysis in time and frequency domain is of critical importance to the design of integrated circuits (IC) and other advanced engineering products and systems. Many IC structures constitute a very large scale problem in modeling and simulation, the size of which also continuously grows with the advancement of the processing technology. This results in numerical problems beyond the reach of existing most powerful computational resources. Different from many other engineering problems, the structure of most ICs is special in the sense that its geometry is of Manhattan type and its dielectrics are layered. Hence, it is important to develop structure-aware algorithms that take advantage of the structure specialties to speed up the computation. In addition, among existing time-domain methods, explicit methods can avoid solving a matrix equation. However, their time step is traditionally restricted by the space step for ensuring the stability of a time-domain simulation. Therefore, making explicit time-domain methods unconditionally stable is important to accelerate the computation. In addition to time-domain methods, frequency-domain methods have suffered from an indefinite system that makes an iterative solution difficult to converge fast. The first contribution of this work is a fast time-domain finite-element algorithm for the analysis and design of very large-scale on-chip circuits. The structure specialty of on-chip circuits such as Manhattan geometry and layered permittivity is preserved in the proposed algorithm. As a result, the large-scale matrix solution encountered in the 3-D circuit analysis is turned into a simple scaling of the solution of a small 1-D matrix, which can be obtained in linear (optimal) complexity with negligible cost. Furthermore, the time step size is not sacrificed, and the total number of time steps to be simulated is also significantly reduced, thus achieving a total cost reduction in CPU time. The second contribution is a new method for making an explicit time-domain finite-element method (TDFEM) unconditionally stable for general electromagnetic analysis. In this method, for a given time step, we find the unstable modes that are the root cause of instability, and deduct them directly from the system matrix resulting from a TDFEM based analysis. As a result, an explicit TDFEM simulation is made stable for an arbitrarily large time step irrespective of the space step. The third contribution is a new method for full-wave applications from low to very high frequencies in a TDFEM based on matrix exponential. In this method, we directly deduct the eigenmodes having large eigenvalues from the system matrix, thus achieving a significantly increased time step in the matrix exponential based TDFEM. The fourth contribution is a new method for transforming the indefinite system matrix of a frequency-domain FEM to a symmetric positive definite one. We deduct non-positive definite component directly from the system matrix resulting from a frequency-domain FEM-based analysis. The resulting new representation of the finite-element operator ensures an iterative solution to converge in a small number of iterations. We then add back the non-positive definite component to synthesize the original solution with negligible cost.
Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hugo, Jacques
2015-05-01
This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
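For readers connecting the frequency-response analysis above to the underlying algorithm, a standard DFA implementation is sketched below (polynomial detrending of the integrated series in non-overlapping windows); the analytic frequency responses derived in the paper are not reproduced:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: integrate the series, detrend it in
    non-overlapping windows with a polynomial of the given order, and return
    the RMS fluctuation F(n) for each window length n.  The scaling exponent
    is the slope of log F(n) versus log n."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
    F = []
    for n in scales:
        t = np.arange(n)
        msq = []
        for k in range(len(y) // n):
            seg = y[k * n:(k + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, order), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

# alpha, _ = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)
```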
Ventilatory thresholds determined from HRV: comparison of 2 methods in obese adolescents.
Quinart, S; Mourot, L; Nègre, V; Simon-Rigaud, M-L; Nicolet-Guénat, M; Bertrand, A-M; Meneveau, N; Mougin, F
2014-03-01
The development of personalised training programmes is crucial in the management of obesity. We evaluated the ability of 2 heart rate variability analyses to determine ventilatory thresholds (VT) in obese adolescents. 20 adolescents (mean age 14.3±1.6 years and body mass index z-score 4.2±0.1) performed an incremental test to exhaustion before and after a 9-month multidisciplinary management programme. The first (VT1) and second (VT2) ventilatory thresholds were identified by the reference method (gas exchanges). We recorded RR intervals to estimate VT1 and VT2 from heart rate variability using time-domain analysis and time-varying spectral-domain analysis. The correlation coefficients between thresholds were higher with spectral-domain analysis than with time-domain analysis (heart rate at VT1: r=0.91 vs. r=0.66; heart rate at VT2: r=0.91 vs. r=0.66; power at VT1: r=0.91 vs. r=0.74; power at VT2: r=0.93 vs. r=0.78; spectral-domain vs. time-domain analysis, respectively). No systematic bias in heart rate at VT1 and VT2 was found, with standard deviations <6 bpm, confirming that spectral-domain analysis could replace the reference method for the detection of ventilatory thresholds. Furthermore, this technique is sensitive to rehabilitation and re-training, which underlines its utility in clinical practice. This inexpensive and non-invasive tool is promising for prescribing physical activity programs in obese adolescents. © Georg Thieme Verlag KG Stuttgart · New York.
A hybrid spatial-spectral denoising method for infrared hyperspectral images using 2DPCA
NASA Astrophysics Data System (ADS)
Huang, Jun; Ma, Yong; Mei, Xiaoguang; Fan, Fan
2016-11-01
The traditional noise reduction methods for 3-D infrared hyperspectral images typically operate independently in either the spatial or spectral domain, and such methods overlook the relationship between the two domains. To address this issue, we propose a hybrid spatial-spectral method in this paper to link both domains. First, principal component analysis and bivariate wavelet shrinkage are performed in the 2-D spatial domain. Second, 2-D principal component analysis transformation is conducted in the 1-D spectral domain to separate the basic components from detail ones. The energy distribution of noise is unaffected by orthogonal transformation; therefore, the signal-to-noise ratio of each component is used as a criterion to determine whether a component should be protected from over-denoising or denoised with certain 1-D denoising methods. This study implements the 1-D wavelet shrinking threshold method based on Stein's unbiased risk estimator, and the quantitative results on publicly available datasets demonstrate that our method can improve denoising performance more effectively than other state-of-the-art methods can.
Extrapolation of Functions of Many Variables by Means of Metric Analysis
NASA Astrophysics Data System (ADS)
Kryanev, Alexandr; Ivanov, Victor; Romanova, Anastasiya; Sevastianov, Leonid; Udumyan, David
2018-02-01
The paper considers the problem of extrapolating functions of several variables. It is assumed that the values of a function of m variables are given at a finite number of points in some domain D of the m-dimensional space, and the value of the function must be restored at points outside the domain D. The paper proposes a fundamentally new extrapolation method for functions of several variables, built on the interpolation scheme of metric analysis. The scheme consists of two stages. In the first stage, using metric analysis, the function is interpolated at the points of the domain D lying on the segment of the straight line connecting the center of the domain D with the point M at which the value of the function is to be restored. In the second stage, based on an autoregression model and metric analysis, the function values are predicted along this straight-line segment beyond the domain D up to the point M. A numerical example demonstrates the efficiency of the method under consideration.
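Only the second stage lends itself to a short sketch. Assuming the function values along the straight-line segment have already been obtained by metric-analysis interpolation, an autoregressive model can be fitted and extrapolated beyond the domain boundary; the AR order below is an arbitrary assumption:

```python
import numpy as np

def ar_extrapolate(values, steps, order=3):
    """Fit an autoregressive model to function values sampled along the
    segment inside the domain D and extrapolate it beyond the boundary."""
    values = np.asarray(values, dtype=float)
    X = np.column_stack([values[i:len(values) - order + i] for i in range(order)])
    y = values[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    out = list(values)
    for _ in range(steps):
        out.append(float(np.dot(coef, out[-order:])))
    return np.array(out[len(values):])
```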
Training Plan. Central Archive for Reusable Defense Software (CARDS)
1994-01-29
Modeling Software Reuse Technology: Feature Oriented Domain Analysis (FODA). SEI, Carnegie Mellon University, May 1992. 8. Component Provider's... events to the services of the domain. 4. Feature Oriented Domain Analysis (FODA) [COHEN92]. The FODA method produces feature models. Feature models provide... Acronyms: ...Architecture; FODA, Feature-Oriented Domain Analysis; GOTS, Government-Off-The-Shelf; MS, Master of Science; NEC
NASA Astrophysics Data System (ADS)
Chen, Jing-Bo
2014-06-01
By using low-frequency components of the damped wavefield, Laplace-Fourier-domain full waveform inversion (FWI) can recover a long-wavelength velocity model from the original undamped seismic data lacking low-frequency information. Laplace-Fourier-domain modelling is an important foundation of Laplace-Fourier-domain FWI. Based on the numerical phase velocity and the numerical attenuation propagation velocity, a method for performing Laplace-Fourier-domain numerical dispersion analysis is developed in this paper. This method is applied to an average-derivative optimal scheme. The results show that within the relative error of 1 per cent, the Laplace-Fourier-domain average-derivative optimal scheme requires seven gridpoints per smallest wavelength and smallest pseudo-wavelength for both equal and unequal directional sampling intervals. In contrast, the classical five-point scheme requires 23 gridpoints per smallest wavelength and smallest pseudo-wavelength to achieve the same accuracy. Numerical experiments demonstrate the theoretical analysis.
Multiscale Modeling of Damage Processes in fcc Aluminum: From Atoms to Grains
NASA Technical Reports Server (NTRS)
Glaessgen, E. H.; Saether, E.; Yamakov, V.
2008-01-01
Molecular dynamics (MD) methods are opening new opportunities for simulating the fundamental processes of material behavior at the atomistic level. However, current analysis is limited to small domains and increasing the size of the MD domain quickly presents intractable computational demands. A preferred approach to surmount this computational limitation has been to combine continuum mechanics-based modeling procedures, such as the finite element method (FEM), with MD analyses thereby reducing the region of atomic scale refinement. Such multiscale modeling strategies can be divided into two broad classifications: concurrent multiscale methods that directly incorporate an atomistic domain within a continuum domain and sequential multiscale methods that extract an averaged response from the atomistic simulation for later use as a constitutive model in a continuum analysis.
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 12, December 2006
2006-12-01
Feature-Oriented Domain Analysis (FODA). FODA is a domain analysis and engineering method that focuses on developing reusable assets [9]. By examining... Eliciting Security Requirements. This article describes an approach for doing trade-off analysis among requirements elicitation methods, by Dr. Nancy R... high-level requirements are addressed and met in the requirements work products. 3. Unclear requirements. Mitigation: Perform requirements analysis and
Pulse analysis of acoustic emission signals
NASA Technical Reports Server (NTRS)
Houghton, J. R.; Packman, P. F.
1977-01-01
A method for the signature analysis of pulses in the frequency domain and the time domain is presented. Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio were examined in the frequency domain analysis and pulse shape deconvolution was developed for use in the time domain analysis. Comparisons of the relative performance of each analysis technique are made for the characterization of acoustic emission pulses recorded by a measuring system. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emission associated with (a) crack propagation, (b) ball dropping on a plate, (c) spark discharge, and (d) defective and good ball bearings. Deconvolution of the first few micro-seconds of the pulse train is shown to be the region in which the significant signatures of the acoustic emission event are to be found.
Pulse analysis of acoustic emission signals
NASA Technical Reports Server (NTRS)
Houghton, J. R.; Packman, P. F.
1977-01-01
A method for the signature analysis of pulses in the frequency domain and the time domain is presented. Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio were examined in the frequency domain analysis, and pulse shape deconvolution was developed for use in the time domain analysis. Comparisons of the relative performance of each analysis technique are made for the characterization of acoustic emission pulses recorded by a measuring system. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emissions associated with: (1) crack propagation, (2) ball dropping on a plate, (3) spark discharge and (4) defective and good ball bearings. Deconvolution of the first few microseconds of the pulse train is shown to be the region in which the significant signatures of the acoustic emission event are to be found.
The Researches on Damage Detection Method for Truss Structures
NASA Astrophysics Data System (ADS)
Wang, Meng Hong; Cao, Xiao Nan
2018-06-01
This paper presents an effective method to detect damage in truss structures. Numerical simulation and experimental analysis were carried out on a damaged truss structure under instantaneous excitation. The ideal excitation point and appropriate hammering method were determined to extract time domain signals under two working conditions. The frequency response function and principal component analysis were used for data processing, and the angle between the frequency response function vectors was selected as a damage index to ascertain the location of a damaged bar in the truss structure. In the numerical simulation, the time domain signal of all nodes was extracted to determine the location of the damaged bar. In the experimental analysis, the time domain signal of a portion of the nodes was extracted on the basis of an optimal sensor placement method based on the node strain energy coefficient. The results of the numerical simulation and experimental analysis showed that the damage detection method based on the frequency response function and principal component analysis could locate the damaged bar accurately.
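A minimal sketch of the damage index described above, the angle between two frequency response function vectors; using FRF magnitudes and expressing the angle in degrees are assumptions, and the principal component analysis step is not included:

```python
import numpy as np

def frf_angle_index(frf_reference, frf_test):
    """Damage index: the angle between the frequency response function vector
    of the reference (undamaged) state and that of the test state.  A larger
    angle indicates a larger change in the measured FRF at that location."""
    a = np.abs(np.ravel(frf_reference))
    b = np.abs(np.ravel(frf_test))
    c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```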
An equivalent domain integral method in the two-dimensional analysis of mixed mode crack problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1990-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented.
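For reference, a commonly quoted two-dimensional form of the equivalent domain integral (assuming no body forces, crack-face tractions, or thermal strains, with the crack along x1) converts the contour J-integral into an area integral weighted by a smooth function s that equals 1 on the inner contour and 0 on the outer contour of the domain A:

```latex
J = \int_{A} \left( \sigma_{ij}\,\frac{\partial u_j}{\partial x_1} - W\,\delta_{1i} \right) \frac{\partial s}{\partial x_i}\,\mathrm{d}A
```

Here W is the strain energy density; the exact expression used in the report may include additional terms for more general loading.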
A Method for Populating the Knowledge Base of AFIT’s Domain-Oriented Application Composition System
1993-12-01
Analysis (FODA). The approach identifies prominent features (similarities) and distinctive features (differences) of software systems within an... analysis approaches we have summarized, the researchers described FODA in sufficient detail to use on large domain analysis projects (ones with... Software Technology Center, July 1991. 18. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report, Software
Shape design sensitivity analysis and optimal design of structural systems
NASA Technical Reports Server (NTRS)
Choi, Kyung K.
1987-01-01
The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best utilize the basic character of the finite element method that gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable using the method. The result of design sensitivity analysis is used to carry out design optimization of a built-up structure.
An asymptotic induced numerical method for the convection-diffusion-reaction equation
NASA Technical Reports Server (NTRS)
Scroggs, Jeffrey S.; Sorensen, Danny C.
1988-01-01
A parallel algorithm for the efficient solution of a time dependent reaction convection diffusion equation with small parameter on the diffusion term is presented. The method is based on a domain decomposition that is dictated by singular perturbation analysis. The analysis is used to determine regions where certain reduced equations may be solved in place of the full equation. Parallelism is evident at two levels. Domain decomposition provides parallelism at the highest level, and within each domain there is ample opportunity to exploit parallelism. Run time results demonstrate the viability of the method.
Wave Propagation, Scattering and Imaging Using Dual-domain One-way and One-return Propagators
NASA Astrophysics Data System (ADS)
Wu, R.-S.
- Dual-domain one-way propagators implement wave propagation in heterogeneous media in mixed domains (space-wavenumber domains). One-way propagators neglect wave reverberations between heterogeneities but correctly handle the forward multiple-scattering including focusing/defocusing, diffraction, refraction and interference of waves. The algorithm shuttles between space-domain and wavenumber-domain using FFT, and the operations in the two domains are self-adaptive to the complexity of the media. The method makes the best use of the operations in each domain, resulting in efficient and accurate propagators. Due to recent progress, new versions of dual-domain methods overcame some limitations of the classical dual-domain methods (phase-screen or split-step Fourier methods) and can propagate large-angle waves quite accurately in media with strong velocity contrasts. These methods can deliver superior image quality (high resolution/high fidelity) for complex subsurface structures. One-way and one-return (De Wolf approximation) propagators can be also applied to wave-field modeling and simulations for some geophysical problems. In the article, a historical review and theoretical analysis of the Born, Rytov, and De Wolf approximations are given. A review on classical phase-screen or split-step Fourier methods is also given, followed by a summary and analysis of the new dual-domain propagators. The applications of the new propagators to seismic imaging and modeling are reviewed with several examples. For seismic imaging, the advantages and limitations of the traditional Kirchhoff migration and time-space domain finite-difference migration, when applied to 3-D complicated structures, are first analyzed. Then the special features, and applications of the new dual-domain methods are presented. Three versions of GSP (generalized screen propagators), the hybrid pseudo-screen, the wide-angle Padé-screen, and the higher-order generalized screen propagators are discussed. Recent progress also makes it possible to use the dual-domain propagators for modeling elastic reflections for complex structures and long-range propagations of crustal guided waves. Examples of 2-D and 3-D imaging and modeling using GSP methods are given.
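The classical phase-screen (split-step Fourier) step that the generalized screen propagators extend can be sketched in a few lines; the variable names and the simple one-way square-root operator below are illustrative assumptions, not the wide-angle corrections discussed in the entry:

```python
import numpy as np

def phase_screen_step(u, dx, dz, k0, dn):
    """One split-step (phase-screen) propagation step for a one-way wavefield
    u(x): diffraction through a homogeneous slab of thickness dz is applied in
    the wavenumber domain, then the effect of the index/slowness perturbation
    dn(x) is applied as a phase screen in the space domain."""
    kx = 2.0 * np.pi * np.fft.fftfreq(len(u), d=dx)
    kz = np.sqrt((k0 ** 2 - kx ** 2).astype(complex))      # one-way vertical wavenumber
    u = np.fft.ifft(np.fft.fft(u) * np.exp(1j * kz * dz))  # wavenumber-domain step
    return u * np.exp(1j * k0 * dn * dz)                   # space-domain screen
```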
UXDs-Driven Transferring Method from TRIZ Solution to Domain Solution
NASA Astrophysics Data System (ADS)
Ma, Lihui; Cao, Guozhong; Chang, Yunxia; Wei, Zihui; Ma, Kai
The translation process from TRIZ solutions to domain solutions is an analogy-based process. TRIZ solutions, such as the 40 inventive principles and the related cases, are intermediate solutions for domain problems. Unexpected discoveries (UXDs) are the key factors that trigger designers to generate new ideas for domain solutions. An algorithm for UXD resolution based on Means-Ends Analysis (MEA) is studied, and a UXDs-driven method for transferring TRIZ solutions to domain solutions is formed. A case study shows the application of the process.
A spectrum fractal feature classification algorithm for agriculture crops with hyper spectrum image
NASA Astrophysics Data System (ADS)
Su, Junying
2011-11-01
A fractal dimension feature analysis method in the spectrum domain is proposed for agricultural crop classification with hyperspectral images. Firstly, a fractal dimension calculation algorithm in the spectrum domain is presented, together with a fast fractal dimension calculation algorithm using the step measurement method. Secondly, the hyperspectral image classification algorithm and flowchart based on fractal dimension feature analysis in the spectrum domain are presented. Finally, experimental results of agricultural crop classification on the FCL1 hyperspectral image set are given for the proposed method and for SAM (spectral angle mapper). The experimental results show that the proposed method obtains better classification results than the traditional SAM feature analysis, making fuller use of the spectral information of the hyperspectral image to achieve precise agricultural crop classification.
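One possible reading of the step-measurement idea, sketched for a single spectrum treated as a curve over band index; the step sizes and the implicit scaling between band index and reflectance are assumptions rather than the paper's algorithm:

```python
import numpy as np

def spectral_fractal_dimension(spectrum, steps=(1, 2, 4, 8, 16)):
    """Step (divider) measurement of the fractal dimension of a spectrum
    treated as a curve: measure the curve length at several step sizes and
    fit the slope of log(length) against log(step).  For a curve,
    L(step) ~ step**(1 - D), so D = 1 - slope."""
    spectrum = np.asarray(spectrum, dtype=float)
    lengths = []
    for s in steps:
        pts = spectrum[::s]
        lengths.append(np.sum(np.sqrt(np.diff(pts) ** 2 + float(s) ** 2)))
    slope, _ = np.polyfit(np.log(steps), np.log(lengths), 1)
    return 1.0 - slope
```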
Advance in ERG Analysis: From Peak Time and Amplitude to Frequency, Power, and Energy
Lina, Jean-Marc; Lachapelle, Pierre
2014-01-01
Purpose. To compare time domain (TD: peak time and amplitude) analysis of the human photopic electroretinogram (ERG) with measures obtained in the frequency domain (Fourier analysis: FA) and in the time-frequency domain (continuous (CWT) and discrete (DWT) wavelet transforms). Methods. Normal ERGs (n = 40) were analyzed using traditional peak time and amplitude measurements of the a- and b-waves in the TD and descriptors extracted from FA, CWT, and DWT. Selected descriptors were also compared in their ability to monitor the long-term consequences of disease process. Results. Each method extracted relevant information but had distinct limitations (i.e., temporal and frequency resolutions). The DWT offered the best compromise by allowing us to extract more relevant descriptors of the ERG signal at the cost of lesser temporal and frequency resolutions. Follow-ups of disease progression were more prolonged with the DWT (max 29 years compared to 13 with TD). Conclusions. Standardized time domain analysis of retinal function should be complemented with advanced DWT descriptors of the ERG. This method should allow more sensitive/specific quantifications of ERG responses, facilitate follow-up of disease progression, and identify diagnostically significant changes of ERG waveforms that are not resolved when the analysis is only limited to time domain measurements. PMID:25061605
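A hedged sketch of DWT-based descriptors for an ERG trace; the wavelet family, decomposition depth, and energy-per-level descriptors are assumptions, and the study's specific descriptors may differ:

```python
import numpy as np
import pywt

def dwt_energy_descriptors(erg, wavelet="haar", level=5):
    """Discrete wavelet transform descriptors of an ERG trace: the energy of
    the detail coefficients at each decomposition level, usable alongside the
    classical a-/b-wave peak time and amplitude measurements."""
    coeffs = pywt.wavedec(np.asarray(erg, dtype=float), wavelet, level=level)
    return {f"detail_level_{i}": float(np.sum(c ** 2))
            for i, c in enumerate(coeffs[1:], start=1)}
```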
Feature-level sentiment analysis by using comparative domain corpora
NASA Astrophysics Data System (ADS)
Quan, Changqin; Ren, Fuji
2016-06-01
Feature-level sentiment analysis (SA) is able to provide more fine-grained SA on certain opinion targets and has a wider range of applications on E-business. This study proposes an approach based on comparative domain corpora for feature-level SA. The proposed approach makes use of word associations for domain-specific feature extraction. First, we assign a similarity score for each candidate feature to denote its similarity extent to a domain. Then we identify domain features based on their similarity scores on different comparative domain corpora. After that, dependency grammar and a general sentiment lexicon are applied to extract and expand feature-oriented opinion words. Lastly, the semantic orientation of a domain-specific feature is determined based on the feature-oriented opinion lexicons. In evaluation, we compare the proposed method with several state-of-the-art methods (including unsupervised and semi-supervised) using a standard product review test collection. The experimental results demonstrate the effectiveness of using comparative domain corpora.
Efficient Power Network Analysis with Modeling of Inductive Effects
NASA Astrophysics Data System (ADS)
Zeng, Shan; Yu, Wenjian; Hong, Xianlong; Cheng, Chung-Kuan
In this paper, an efficient method is proposed to accurately analyze large-scale power/ground (P/G) networks, where inductive parasitics are modeled with the partial reluctance. The method is based on frequency-domain circuit analysis and the technique of vector fitting [14], and obtains the time-domain voltage response at given P/G nodes. The frequency-domain circuit equation including partial reluctances is derived, and then solved with the GMRES algorithm with rescaling, preconditioning and recycling techniques. With the merit of sparsified reluctance matrix and iterative solving techniques for the frequency-domain circuit equations, the proposed method is able to handle large-scale P/G networks with complete inductive modeling. Numerical results show that the proposed method is orders of magnitude faster than HSPICE, several times faster than INDUCTWISE [4], and capable of handling the inductive P/G structures with more than 100,000 wire segments.
Time Domain Radar Laboratory Operating System Development and Transient EM Analysis.
1981-09-01
polarization of the return, are used. Other similar methods use amplitude and phase differences or special properties of Rayleigh region scattering. All these... 1. ... Inverse Scattering; 2. "Exact" Inverse Scattering Method; 3. Other Methods... C. REVIEW OF TDRL PROGRESS AT SPS... explicit independent variable in most methods. In the past, frequency domain analysis has been the primary means of analyzing non-monochromatic EM
Analysis and interpretation of diffraction data from complex, anisotropic materials
NASA Astrophysics Data System (ADS)
Tutuncu, Goknur
Most materials are elastically anisotropic and exhibit additional anisotropy beyond elastic deformation. For instance, in ferroelectric materials the main inelastic deformation mode is via domains, which are highly anisotropic crystallographic features. To quantify this anisotropy of ferroelectrics, advanced X-ray and neutron diffraction methods were employed. Extensive sets of data were collected from tetragonal BaTiO3, PZT and other ferroelectric ceramics. Data analysis was challenging due to the complex constitutive behavior of these materials. To quantify the elastic strain and texture evolution in ferroelectrics under loading, a number of data analysis techniques such as the single peak and Rietveld methods were used and their advantages and disadvantages compared. It was observed that the single peak analysis fails at low peak intensities especially after domain switching while the Rietveld method does not account for lattice strain anisotropy although it overcomes the low intensity problem via whole pattern analysis. To better account for strain anisotropy the constant stress (Reuss) approximation was employed within the Rietveld method and new formulations to estimate lattice strain were proposed. Along the way, new approaches for handling highly anisotropic lattice strain data were also developed and applied. All of the ceramics studied exhibited significant changes in their crystallographic texture after loading indicating non-180° domain switching. For a full interpretation of domain switching the spherical harmonics method was employed in Rietveld. A procedure for simultaneous refinement of multiple data sets was established for a complete texture analysis. To further interpret diffraction data, a solid mechanics model based on the self-consistent approach was used in calculating lattice strain and texture evolution during the loading of a polycrystalline ferroelectric. The model estimates both the macroscopic average response of a specimen and its hkl-dependent lattice strains for different reflections. It also tracks the number of grains (or domains) contributing to each reflection and allows for domain switching. The agreement between the model and experimental data was found to be satisfactory.
Wavelet bases on the L-shaped domain
NASA Astrophysics Data System (ADS)
Jouini, Abdellatif; Lemarié-Rieusset, Pierre Gilles
2013-07-01
We present in this paper two elementary constructions of multiresolution analyses on the L-shaped domain D. In the first one, we shall describe a direct method to define an orthonormal multiresolution analysis. In the second one, we use the decomposition method for constructing a biorthogonal multiresolution analysis. These analyses are adapted for the study of the Sobolev spaces H^s(D) (s ∈ N).
Frequency- and Time-Domain Methods in Soil-Structure Interaction Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolisetti, Chandrakanth; Whittaker, Andrew S.; Coleman, Justin L.
2015-06-01
Soil-structure interaction (SSI) analysis in the nuclear industry is currently performed using linear codes that function in the frequency domain. There is a consensus that these frequency-domain codes give reasonably accurate results for low-intensity ground motions that result in almost linear response. For higher-intensity ground motions, which may result in nonlinear response in the soil, the structure, or the vicinity of the foundation, the adequacy of frequency-domain codes is unproven. Nonlinear analysis, which is only possible in the time domain, is theoretically more appropriate in such cases. These methods are available but are rarely used due to the large computational requirements and a lack of experience among analysts and regulators. This paper presents an assessment of the linear frequency-domain code SASSI, which is widely used in the nuclear industry, and the time-domain commercial finite-element code LS-DYNA for SSI analysis. The assessment involves benchmarking the SSI analysis procedure in LS-DYNA against SASSI for linearly elastic models. After affirming that SASSI and LS-DYNA result in almost identical responses for these models, they are used to perform nonlinear SSI analyses of two structures founded on soft soil. An examination of the results shows that, in spite of using identical material properties, the predictions of frequency- and time-domain codes are significantly different in the presence of nonlinear behavior such as gapping and sliding of the foundation.
Domain decomposition and matching for time-domain analysis of motions of ships advancing in head sea
NASA Astrophysics Data System (ADS)
Tang, Kai; Zhu, Ren-chuan; Miao, Guo-ping; Fan, Ju
2014-08-01
A domain decomposition and matching method in the time domain is outlined for simulating the motions of ships advancing in waves. The flow field is decomposed into inner and outer domains by an imaginary control surface; the Rankine source method is applied in the inner domain while the transient Green function method is used in the outer domain. The two initial boundary value problems are matched on the control surface. The corresponding numerical codes are developed, and the added masses, wave exciting forces and motions of the Series 60 ship and the S175 containership advancing in head sea are presented and verified. Good agreement is obtained when the numerical results are compared with experimental data and other references. The present method is more efficient because panel discretization is required only in the inner domain, and it exhibits good numerical stability, avoiding the divergence problems encountered for ships with flare.
A developed nearly analytic discrete method for forward modeling in the frequency domain
NASA Astrophysics Data System (ADS)
Liu, Shaolin; Lang, Chao; Yang, Hui; Wang, Wenshuai
2018-02-01
High-efficiency forward modeling methods play a fundamental role in full waveform inversion (FWI). In this paper, the developed nearly analytic discrete (DNAD) method is proposed to accelerate frequency-domain forward modeling. We first derive the discretization of the frequency-domain wave equations via numerical schemes based on the nearly analytic discrete (NAD) method to obtain a linear system. The coefficients of the numerical stencils are optimized to make the linear system easier to solve and to minimize computing time. Wavefield simulation and numerical dispersion analysis are performed to compare the numerical behavior of the DNAD method with that of the conventional NAD method, and the results demonstrate the superiority of the proposed method. Finally, the DNAD method is implemented in frequency-domain FWI, and high-resolution inversion results are obtained.
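As a point of reference only, the sketch below assembles and solves a standard second-order finite-difference discretization of the 1-D Helmholtz equation; it is not the NAD/DNAD stencil, but it illustrates the kind of sparse linear system a frequency-domain forward modeler must solve at each frequency. Grid size, velocity and frequency are illustrative, and no absorbing boundaries are applied.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

nx, dx = 1001, 5.0                      # grid points and spacing (m)
c = np.full(nx, 2000.0)                 # velocity model (m/s), homogeneous toy case
freq = 10.0                             # frequency (Hz)
w = 2.0 * np.pi * freq

main = -2.0 / dx**2 + (w / c) ** 2      # Helmholtz operator: d2/dx2 + (w/c)^2
off = np.ones(nx - 1) / dx**2
A = sp.diags([off, main, off], [-1, 0, 1], format="csc").astype(complex)

s = np.zeros(nx, dtype=complex)
s[nx // 2] = 1.0 / dx                   # point source in the middle of the model
u = spla.spsolve(A, s)                  # monochromatic wavefield u(x, w)
# No absorbing boundaries are included, so reflections from the edges are expected.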
Frequency-domain-independent vector analysis for mode-division multiplexed transmission
NASA Astrophysics Data System (ADS)
Liu, Yunhe; Hu, Guijun; Li, Jiao
2018-04-01
In this paper, we propose a demultiplexing method based on the frequency-domain independent vector analysis (FD-IVA) algorithm for mode-division multiplexing (MDM) systems. FD-IVA extends frequency-domain independent component analysis (FD-ICA) from univariate to multivariate variables and provides an efficient way to eliminate the permutation ambiguity. To verify the performance of the FD-IVA algorithm, a 6×6 MDM system is simulated. The simulation results show that the FD-IVA algorithm has essentially the same bit-error-rate (BER) performance as the FD-ICA and frequency-domain least mean squares (FD-LMS) algorithms, and that its convergence speed matches that of FD-ICA. However, compared with FD-ICA and FD-LMS, FD-IVA has a markedly lower computational complexity.
A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.
Xue, Xiaoming; Zhou, Jianzhong
2017-01-01
To further improve diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends statistical analysis with artificial intelligence techniques, is proposed in this work for rolling element bearings. To simplify the diagnosis problem, the execution of the proposed method is divided into three steps: fault preliminary detection, fault type recognition, and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment is made by a statistical analysis method based on permutation entropy. If a fault exists, the following two processes based on artificial intelligence are performed to recognize the fault type and then identify the fault degree. For these two steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault characteristics under different working conditions. As a powerful time-frequency analysis method, fast EEMD is employed to obtain the multi-scale features. Furthermore, due to information redundancy and submergence in the original feature space, a manifold learning method (modified LGPCA) is introduced to obtain low-dimensional representations of the high-dimensional feature space. Finally, two cases with 12 working conditions each are employed to evaluate the performance of the proposed method, using vibration signals measured on an experimental rolling element bearing test bench. The analysis results show the effectiveness and superiority of the proposed method, whose diagnostic reasoning is well suited to practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
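The permutation-entropy statistic used for the preliminary health check can be computed along the following lines; the embedding order and delay are illustrative choices, not the paper's settings.

import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (0 = regular, 1 = random)."""
    x = np.asarray(x)
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1        # ordinal pattern of the window
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(order)))

rng = np.random.default_rng(0)
print(permutation_entropy(np.sin(np.linspace(0, 20 * np.pi, 2000))))  # low: regular signal
print(permutation_entropy(rng.standard_normal(2000)))                 # near 1: irregular signal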
Fast-Running Aeroelastic Code Based on Unsteady Linearized Aerodynamic Solver Developed
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Keith, T., Jr.
2003-01-01
The NASA Glenn Research Center has been developing aeroelastic analyses for turbomachines for use by NASA and industry. An aeroelastic analysis consists of a structural dynamic model, an unsteady aerodynamic model, and a procedure to couple the two. The structural models are well developed, so most of the development effort has involved adapting and using unsteady aerodynamic models. Two methods are used in developing unsteady aerodynamic analysis procedures for the flutter and forced response of turbomachines: (1) the time-domain method and (2) the frequency-domain method. Codes based on time-domain methods require considerable computational time and, hence, cannot be used during the design process. Frequency-domain methods eliminate the time dependence by assuming harmonic motion and, hence, require less computational time. Early frequency-domain methods neglected the important physics of steady loading for simplicity. A fast-running unsteady aerodynamic code, LINFLUX, which includes steady loading and is based on the frequency-domain method, has been modified for flutter and response calculations. LINFLUX solves the unsteady linearized Euler equations to calculate the unsteady aerodynamic forces on the blades, starting from a steady nonlinear aerodynamic solution. First, we obtained a steady aerodynamic solution for a given flow condition using the nonlinear unsteady aerodynamic code TURBO. A blade vibration analysis was done to determine the frequencies and mode shapes of the vibrating blades, and an interface code was used to convert the steady aerodynamic solution to the form required by LINFLUX. A preprocessor was used to interpolate the mode shapes from the structural dynamic mesh onto the computational fluid dynamics mesh. Then, we used LINFLUX to calculate the unsteady aerodynamic forces for a given mode, frequency, and phase angle. A postprocessor read these unsteady pressures and calculated the generalized aerodynamic forces, eigenvalues, and response amplitudes; the eigenvalues determine the flutter frequency and damping. As a test case, the flutter of a helical fan was calculated with LINFLUX and compared with calculations from TURBO-AE, a nonlinear time-domain code, and from ASTROP2, a code based on linear unsteady aerodynamics.
1990-08-01
A deterministic formulation of the method of moments carried out in the spectral domain is extended to include the effects of two-dimensional, two-component current flow in planar transmission line discontinuities on open substrates. The method includes the effects of space
On the Analysis Methods for the Time Domain and Frequency Domain Response of Buried Objects*
NASA Astrophysics Data System (ADS)
Poljak, Dragan; Šesnić, Silvestar; Cvetković, Mario
2014-05-01
There has been a continuous interest in the analysis of ground-penetrating radar systems and related applications in civil engineering [1]. Consequently, a deeper insight into the scattering phenomena occurring in a lossy half-space, as well as the development of sophisticated numerical methods based on the Finite Difference Time Domain (FDTD) method, the Finite Element Method (FEM), the Boundary Element Method (BEM), the Method of Moments (MoM) and various hybrid methods, is required, e.g. [2], [3]. The present paper deals with techniques for time-domain and frequency-domain analysis, respectively, of buried conducting and dielectric objects. The time-domain analysis concerns the assessment of the transient response of a horizontal straight thin wire buried in a lossy half-space using a rigorous antenna theory (AT) approach. The AT approach is based on the space-time integral equation of the Pocklington type (the time-domain electric field integral equation for thin wires). The influence of the earth-air interface is taken into account via the simplified reflection coefficient arising from the Modified Image Theory (MIT). The results obtained for the transient current induced along the electrode by the transmitted plane-wave excitation are compared with numerical results calculated via an approximate transmission line (TL) approach and via the AT approach based on the space-frequency variant of the Pocklington integro-differential equation, respectively. It is worth noting that the space-frequency Pocklington equation is numerically solved via the Galerkin-Bubnov variant of the Indirect Boundary Element Method (GB-IBEM), and the corresponding transient response is obtained with the aid of the inverse fast Fourier transform (IFFT). The results calculated by the different approaches agree satisfactorily. The frequency-domain analysis concerns the assessment of the frequency-domain response of a dielectric sphere using a full-wave model based on a set of coupled electric field integral equations over surfaces. The numerical solution is carried out by means of an improved variant of the Method of Moments (MoM), providing a numerically stable and efficient procedure for the extraction of singularities arising in the integral expressions. The proposed analysis method is compared with results obtained using commercial software packages, and satisfactory agreement has been achieved. Both approaches discussed throughout this work and demonstrated on canonical geometries could also be useful for benchmarking purposes. References: [1] L. Pajewski et al., Applications of Ground Penetrating Radar in Civil Engineering - COST Action TU1208, 2013. [2] U. Oguz, L. Gurel, Frequency Responses of Ground-Penetrating Radars Operating Over Highly Lossy Grounds, IEEE Trans. Geosci. and Remote Sensing, Vol. 40, No. 6, 2002. [3] D. Poljak, Advanced Modeling in Computational Electromagnetic Compatibility, John Wiley and Sons, New York, 2007. *This work benefited from networking activities carried out within the EU-funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar."
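The frequency-to-time step mentioned above (a frequency-domain solution followed by an inverse FFT) can be illustrated with the generic sketch below; the transfer function is a stand-in, not the Pocklington/GB-IBEM solution, and the sampling parameters are placeholders.

import numpy as np

fs = 1.0e10                              # sampling rate of the desired time series (Hz)
n = 4096
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

# Stand-in frequency response (single-pole low-pass); in the analysis above this
# would be the induced-current spectrum supplied by the frequency-domain solver.
H = 1.0 / (1.0 + 1j * freqs / 5.0e8)

excitation = np.exp(-0.5 * (freqs / 1.0e9) ** 2)    # spectrum of a Gaussian pulse
transient = np.fft.irfft(H * excitation, n=n)        # time-domain (transient) response
t = np.arange(n) / fs                                # corresponding time axis (s)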
A full potential flow analysis with realistic wake influence for helicopter rotor airload prediction
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Sparks, S. Patrick
1987-01-01
A 3-D, quasi-steady, full potential flow solver was adapted to include realistic wake influence for the aerodynamic analysis of helicopter rotors. The method is based on a finite difference solution of the full potential equation, using an inner and outer domain procedure for the blade flowfield to accommodate wake effects. The nonlinear flow is computed in the inner domain region using a finite difference solution method. The wake is modeled by a vortex lattice using prescribed geometry techniques to allow for the inclusion of realistic rotor wakes. The key feature of the analysis is that vortices contained within the finite difference mesh (inner domain) were treated with a vortex embedding technique while the influence of the remaining portion of the wake (in the outer domain) is impressed as a boundary condition on the outer surface of the finite difference mesh. The solution procedure couples the wake influence with the inner domain solution in a consistent and efficient solution process. The method has been applied to both hover and forward flight conditions. Correlation with subsonic and transonic hover airload data is shown which demonstrates the merits of the approach.
Boundary Approximation Methods for Solving Elliptic Problems on Unbounded Domains
NASA Astrophysics Data System (ADS)
Li, Zi-Cai; Mathon, Rudolf
1990-08-01
Boundary approximation methods with partial solutions are presented for solving a complicated problem on an unbounded domain, with both a crack singularity and a corner singularity. Also an analysis of partial solutions near the singular points is provided. These methods are easy to apply, have good stability properties, and lead to highly accurate solutions. Hence, boundary approximation methods with partial solutions are recommended for the treatment of elliptic problems on unbounded domains provided that piecewise solution expansions, in particular, asymptotic solutions near the singularities and infinity, can be found.
Quarterly Update, January-March 1992
1992-03-01
representations to support exploiting that commonality. The project used the Feature-Oriented Domain Analysis (FODA) method, developed by the project in 1990, in ... concentrations, and product market share for 23 countries. Along with other SEI staff, members of the Rate Monotonic Analysis for Real-Time Systems (RMARTS
Computational methods for efficient structural reliability and reliability sensitivity analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.
1993-01-01
This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly larger than the failure domain, to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting the design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
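A plain (non-adaptive) importance-sampling estimate of a failure probability is sketched below to illustrate the weighting involved; the paper's AIS additionally updates the sampling domain incrementally, which is omitted here, and the limit state is a toy example.

import numpy as np

rng = np.random.default_rng(1)

def g(x):                                     # toy limit state: failure when g < 0
    return 6.0 - x[:, 0] - x[:, 1]

mu = np.array([3.0, 3.0])                     # sampling density centred near the
N = 20000                                     # assumed most-probable failure point
z = rng.standard_normal((N, 2)) + mu          # draws from the importance density

# Weights: standard-normal PDF over the shifted sampling PDF, in log form.
log_w = -0.5 * (z ** 2).sum(axis=1) + 0.5 * ((z - mu) ** 2).sum(axis=1)
pf = np.mean((g(z) < 0.0) * np.exp(log_w))
print(f"estimated failure probability: {pf:.3e}")   # exact value is about 1.1e-5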
Wang, Renjie; Normand, Christophe; Gadal, Olivier
2016-01-01
Spatial organization of the genome has important impacts on all aspects of chromosome biology, including transcription, replication, and DNA repair. Frequent interactions of some chromosome domains with specific nuclear compartments, such as the nucleolus, are now well documented using genome-scale methods. However, direct measurement of the distance and interaction frequency between loci requires microscopic observation of specific genomic domains and the nucleolus, followed by image analysis to allow quantification. The fluorescent repressor operator system (FROS) is an invaluable method for fluorescently tagging DNA sequences and investigating chromosome position and dynamics in living cells. This chapter describes a combination of methods to define the motion and region of confinement of a locus relative to the nucleolus in the cell nucleus, from fluorescence acquisition to automated image analysis using two dedicated pipelines.
Eigenvalue sensitivity analysis of planar frames with variable joint and support locations
NASA Technical Reports Server (NTRS)
Chuang, Ching H.; Hou, Gene J. W.
1991-01-01
Two sensitivity equations are derived in this study based upon the continuum approach for eigenvalue sensitivity analysis of planar frame structures with variable joint and support locations. A variational form of the eigenvalue equation is first derived in which all quantities are expressed in the local coordinate system attached to each member. The material derivative of this variational equation is then taken to account for changes in member length and orientation resulting from the perturbation of joint and support locations. Finally, eigenvalue sensitivity equations are formulated in either domain quantities (the domain method) or boundary quantities (the boundary method). It is concluded that the sensitivity equation derived by the boundary method is computationally more efficient but less accurate than that of the domain method. Nevertheless, both are superior in computational efficiency to the conventional direct differentiation method and the finite difference method.
Ahn, T; Moon, S; Youk, Y; Jung, Y; Oh, K; Kim, D
2005-05-30
A novel mode analysis method and a differential mode delay (DMD) measurement technique for a multimode optical fiber based on optical frequency domain reflectometry (OFDR) have been proposed for the first time. We have used a conventional OFDR with a tunable external cavity laser and a Michelson interferometer. A few-mode optical multimode fiber was prepared to test the proposed measurement technique. We have also compared the OFDR measurement results with those obtained using a traditional time-domain measurement method.
NASA Astrophysics Data System (ADS)
Fallahi, Arya; Oswald, Benedikt; Leidenberger, Patrick
2012-04-01
We study a 3-dimensional, dual-field, fully explicit method for the solution of Maxwell's equations in the time domain on unstructured tetrahedral grids. The algorithm uses the element-level time-domain (ELTD) discretization of the electric and magnetic vector wave equations. In particular, the suitability of the method for the numerical analysis of nanometer-structured systems in the optical region of the electromagnetic spectrum is investigated. The details of the theory and its implementation as a computer code are introduced, and its convergence behavior as well as the conditions for stable time-domain integration are examined. Here, we restrict ourselves to non-dispersive dielectric material properties, since dielectric dispersion will be treated in a subsequent paper. Analytically solvable problems are analyzed in order to benchmark the method. Finally, a dielectric microlens is considered to demonstrate the potential of the method. A flexible method of 2nd-order accuracy is obtained that is applicable to a wide range of nano-optical configurations and can be a serious competitor to more conventional finite-difference time-domain schemes, which operate only on hexahedral grids. The ELTD scheme can resolve geometries with a wide span of characteristic length scales at the appropriate level of detail, using small tetrahedra where delicate, physically relevant details must be modeled.
NASA Astrophysics Data System (ADS)
Geltner, I.; Hashimshony, D.; Zigler, A.
2002-07-01
We use a time-domain analysis method to characterize the outer layer of a multilayer structure regardless of the inner ones, thus simplifying the characterization of all the layers. We combine this method with THz reflection spectroscopy to detect nondestructively a hidden aluminum oxide layer under opaque paint and to measure its conductivity and high-frequency dielectric constant in the THz range.
Determining attenuation properties of interfering fast and slow ultrasonic waves in cancellous bone.
Nelson, Amber M; Hoffman, Joseph J; Anderson, Christian C; Holland, Mark R; Nagatani, Yoshiki; Mizuno, Katsunori; Matsukawa, Mami; Miller, James G
2011-10-01
Previous studies have shown that interference between fast waves and slow waves can lead to observed negative dispersion in cancellous bone. In this study, the effects of overlapping fast and slow waves on measurements of the apparent attenuation as a function of propagation distance are investigated along with methods of analysis used to determine the attenuation properties. Two methods are applied to simulated data that were generated based on experimentally acquired signals taken from a bovine specimen. The first method uses a time-domain approach that was dictated by constraints imposed by the partial overlap of fast and slow waves. The second method uses a frequency-domain log-spectral subtraction technique on the separated fast and slow waves. Applying the time-domain analysis to the broadband data yields apparent attenuation behavior that is larger in the early stages of propagation and decreases as the wave travels deeper. In contrast, performing frequency-domain analysis on the separated fast waves and slow waves results in attenuation coefficients that are independent of propagation distance. Results suggest that features arising from the analysis of overlapping two-mode data may represent an alternate explanation for the previously reported apparent dependence on propagation distance of the attenuation coefficient of cancellous bone. © 2011 Acoustical Society of America
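The frequency-domain log-spectral subtraction step can be illustrated as follows; the waveforms, analysis band and sample thickness are placeholders rather than the bovine data used in the study.

import numpy as np

fs = 50e6                                     # sampling rate (Hz)
d_cm = 1.0                                    # sample thickness (cm)
t = np.arange(2048) / fs

# Placeholder pulses: the through-sample pulse is delayed and attenuated.
ref = np.exp(-((t - 5e-6) ** 2) / (2 * (0.2e-6) ** 2)) * np.cos(2 * np.pi * 2.5e6 * t)
sam = 0.4 * np.exp(-((t - 6e-6) ** 2) / (2 * (0.25e-6) ** 2)) * np.cos(2 * np.pi * 2.5e6 * t)

f = np.fft.rfftfreq(len(t), d=1.0 / fs)
R, S = np.abs(np.fft.rfft(ref)), np.abs(np.fft.rfft(sam))

band = (f > 1.5e6) & (f < 3.5e6)              # analysis band with usable signal
alpha_db_per_cm = 20.0 * np.log10(R[band] / S[band]) / d_cm
print(alpha_db_per_cm.mean())                 # average attenuation coefficient in the band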
Time domain reflectometry waveform analysis with second order bounded mean oscillation
USDA-ARS?s Scientific Manuscript database
Tangent-line methods and adaptive waveform interpretation with Gaussian filtering (AWIGF) have been proposed for determining reflection positions of time domain reflectometry (TDR) waveforms. However, the accuracy of those methods is limited for short probe TDR sensors. Second order bounded mean osc...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fancher, Chris M.; Blendell, John E.; Bowman, Keith J.
2017-02-07
A method leveraging Rietveld full-pattern texture analysis to decouple induced domain texture from a preferred grain orientation is presented in this paper. The proposed method is demonstrated by determining the induced domain texture in a polar polymorph of ⟨100⟩-oriented 0.91Bi1/2Na1/2TiO3-0.07BaTiO3-0.02K0.5Na0.5NbO3. Domain textures determined using the present method are compared with results obtained via single peak fitting. Texture determined using single peak fitting estimated more domain alignment than that determined using the Rietveld-based method. These results suggest that the combination of grain texture and phase transitions can lead to single peak fitting under- or over-estimating domain texture. Finally, while demonstrated for a bulk piezoelectric, the proposed method can be applied to quantify domain textures in multi-component systems and thin films.
Imaging of surface spin textures on bulk crystals by scanning electron microscopy
NASA Astrophysics Data System (ADS)
Akamine, Hiroshi; Okumura, So; Farjami, Sahar; Murakami, Yasukazu; Nishida, Minoru
2016-11-01
Direct observation of magnetic microstructures is vital for advancing spintronics and other technologies. Here we report a method for imaging surface domain structures on bulk samples by scanning electron microscopy (SEM). Complex magnetic domains, referred to as the maze state in CoPt/FePt alloys, were observed at a spatial resolution of less than 100 nm by using an in-lens annular detector. The method allows for imaging almost all the domain walls in the mazy structure, whereas the visualisation of the domain walls with the classical SEM method was limited. Our method provides a simple way to analyse surface domain structures in the bulk state that can be used in combination with SEM functions such as orientation or composition analysis. Thus, the method extends applications of SEM-based magnetic imaging, and is promising for resolving various problems at the forefront of fields including physics, magnetics, materials science, engineering, and chemistry.
NASA Astrophysics Data System (ADS)
Merabet, Lucas; Robert, Sébastien; Prada, Claire
2018-04-01
In this paper, we present two frequency-domain algorithms for 2D imaging with plane wave emissions, namely Stolt's migration and Lu's method. The theoretical background is first presented, followed by an analysis of the algorithm complexities. The frequency-domain methods are then compared to time-domain plane wave imaging in a realistic inspection configuration where the array elements are not in contact with the specimen. Imaging of defects located far away from the array aperture is assessed, and computation times for the three methods are presented as a function of the number of pixels in the reconstructed image. We show that Lu's method provides a speed-up of up to a factor of 33 compared to the time-domain algorithm, and we demonstrate the limitations of Stolt's migration for defects far from the aperture.
Substructure coupling in the frequency domain
NASA Technical Reports Server (NTRS)
1985-01-01
Frequency-domain analysis was found to be a suitable method for determining the transient response of systems subjected to a wide variety of loads. However, since a large number of calculations are performed within the discrete frequency loop, the method loses its computational efficiency if the loads must be represented by a large number of discrete frequencies. It was also found that substructure coupling in the frequency domain works particularly well for analyzing structural systems with a small number of interface and loaded degrees of freedom. Furthermore, substructure coupling in the frequency domain can lead to an efficient method for obtaining the natural frequencies of undamped structures, and the damped natural frequencies of a system may also be determined using frequency-domain techniques.
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Random domain name and address mutation (RDAM) for thwarting reconnaissance attacks
Chen, Xi; Zhu, Yuefei
2017-01-01
Network address shuffling is a novel moving target defense (MTD) that invalidates the address information collected by the attacker by dynamically changing or remapping the host's network addresses. However, most network address shuffling methods are constrained by the finite address space and rely on the host's static domain name to map to its dynamic address; therefore, these methods cannot effectively defend against random scanning attacks, nor against an attacker who knows the target's domain name. In this paper, we propose a network defense method based on random domain name and address mutation (RDAM), which increases the attacker's scanning space through a dynamic domain name method and reduces the probability that a host will be hit by an attacker scanning IP addresses, using domain name system (DNS) query lists and time window methods. Theoretical analysis and experimental results show that RDAM can defend against scanning attacks and worm propagation more effectively than general network address shuffling methods, while introducing an acceptable operational overhead. PMID:28489910
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik
2017-05-08
Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists' familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.
Numerical simulation of water evaporation inside vertical circular tubes
NASA Astrophysics Data System (ADS)
Ocłoń, Paweł; Nowak, Marzena; Majewski, Karol
2013-10-01
In this paper, the results of a simplified numerical analysis of water evaporation in vertical circular tubes are presented. The heat transfer in the fluid domain (water or wet steam) and the solid domain (tube wall) is analyzed. For the fluid domain the temperature field is calculated by solving the energy equation using the Control Volume Method, and for the solid domain using the Finite Element Method. The heat transfer between the fluid and solid domains is conjugated using the heat transfer coefficient from the evaporating liquid to the tube wall, determined from the analytical Steiner-Taborek correlation. The pressure changes in the fluid are computed using the Friedel model.
The Philosophy of Information as an Underlying and Unifying Theory of Information Science
ERIC Educational Resources Information Center
Tomic, Taeda
2010-01-01
Introduction: Philosophical analyses of theoretical principles underlying these sub-domains reveal philosophy of information as underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…
Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.
Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali
2017-01-01
With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about particular products, policies and events. Because of the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product, and users increasingly consult online blogs and review sites before purchasing. User reviews are therefore considered an important source of information for Sentiment Analysis (SA) applications in decision making. In this work, we exploit the wealth of user reviews available through online forums to analyze the semantic orientation of words by categorizing them into positive and negative classes, identifying and classifying the emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about products. The unsupervised learning approach employed in previous studies is becoming less effective due to data sparseness and to reduced accuracy caused by ignoring emoticons, modifiers and domain-specific words, which can lead to inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving the sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered user reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods, and that sentiment analysis performance improves when emoticons, modifiers, negations and domain-specific terms are taken into account, compared to baseline methods.
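A minimal rule-based scorer in the spirit described above is sketched below; the lexicons, modifier weights and negation scope are tiny illustrative stand-ins, not the resources used in the paper.

GENERAL = {"good": 1, "great": 2, "bad": -1, "poor": -2, "love": 2, "hate": -2}
DOMAIN = {"overheats": -2, "durable": 2, "laggy": -1}      # domain-specific terms
EMOTICONS = {":)": 1, ":(": -1}
MODIFIERS = {"very": 1.5, "slightly": 0.5}
NEGATIONS = {"not", "never", "no"}

def score(review: str) -> float:
    """Sum lexicon polarities, applying modifier weights and one-word negation scope."""
    total, weight, negate = 0.0, 1.0, False
    for tok in review.lower().split():
        if tok in NEGATIONS:
            negate = True
            continue
        if tok in MODIFIERS:
            weight = MODIFIERS[tok]
            continue
        polarity = GENERAL.get(tok, DOMAIN.get(tok, EMOTICONS.get(tok, 0)))
        if polarity:
            total += -polarity * weight if negate else polarity * weight
        negate, weight = False, 1.0            # reset after each content word
    return total

print(score("battery is not good but the case is very durable :)"))   # -> 3.0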
Full waveform inversion in the frequency domain using classified time-domain residual wavefields
NASA Astrophysics Data System (ADS)
Son, Woohyun; Koo, Nam-Hyung; Kim, Byoung-Yeop; Lee, Ho-Young; Joo, Yonghwan
2017-04-01
We perform acoustic full waveform inversion in the frequency domain using residual wavefields that have been separated in the time domain. We sort the residual wavefields in the time domain by absolute amplitude and separate them into several groups. To analyze the characteristics of the residual wavefields, we compare the residual wavefields of the conventional method with those of our residual separation method. From this residual analysis, the amplitude spectrum obtained from the trace before separation has little energy in the lower frequency bands, whereas the amplitude spectrum obtained with our strategy is regularized by the separation process, meaning that the low-frequency components are emphasized. Our method therefore helps to emphasize the low-frequency components of the residual wavefields. We then generate the frequency-domain residual wavefields by taking the Fourier transform of the separated time-domain residual wavefields. With these wavefields, we perform gradient-based full waveform inversion in the frequency domain using the back-propagation technique. Through a comparison of gradient directions, we confirm that our separation method can better describe the sub-salt image than the conventional approach. The proposed method is tested on the SEG/EAGE salt-dome model. The inversion results show that our algorithm outperforms conventional gradient-based waveform inversion in the frequency domain, especially for the deeper parts of the velocity model.
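The separation step can be illustrated with the following sketch: time-domain residual samples are ranked by absolute amplitude, split into groups, and each group is Fourier-transformed. The residual trace and the number of groups are synthetic placeholders, not the paper's data.

import numpy as np

dt, nt = 0.004, 1000
t = np.arange(nt) * dt
residual = np.sin(2 * np.pi * 5 * t) + 0.2 * np.random.default_rng(0).standard_normal(nt)

n_groups = 4
order = np.argsort(np.abs(residual))[::-1]        # samples sorted by |amplitude|
groups = np.array_split(order, n_groups)

spectra = []
for idx in groups:
    part = np.zeros(nt)
    part[idx] = residual[idx]                     # keep only this group's samples
    spectra.append(np.fft.rfft(part))             # frequency-domain residual group
freqs = np.fft.rfftfreq(nt, d=dt)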
Domain adaptation via transfer component analysis.
Pan, Sinno Jialin; Tsang, Ivor W; Kwok, James T; Yang, Qiang
2011-02-01
Domain adaptation allows knowledge from a source domain to be transferred to a different but related target domain. Intuitively, discovering a good feature representation across domains is crucial. In this paper, we first propose to find such a representation through a new learning method, transfer component analysis (TCA), for domain adaptation. TCA tries to learn some transfer components across domains in a reproducing kernel Hilbert space using maximum mean discrepancy (MMD). In the subspace spanned by these transfer components, data properties are preserved and data distributions in different domains are close to each other. As a result, with the new representations in this subspace, we can apply standard machine learning methods to train classifiers or regression models in the source domain for use in the target domain. Furthermore, in order to uncover the knowledge hidden in the relations between the data labels from the source and target domains, we extend TCA in a semisupervised learning setting, which encodes label information into transfer component learning; we call this extension semisupervised TCA. The main contribution of our work is that we propose a novel dimensionality reduction framework for reducing the distance between domains in a latent space for domain adaptation. We propose both unsupervised and semisupervised feature extraction approaches, which can dramatically reduce the distance between domain distributions by projecting data onto the learned transfer components. Finally, our approach can handle large datasets and naturally leads to out-of-sample generalization. The effectiveness and efficiency of our approach are verified by experiments on five toy datasets and two real-world applications: cross-domain indoor WiFi localization and cross-domain text classification.
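The maximum mean discrepancy criterion that TCA minimizes can be estimated from samples as sketched below; the RBF kernel bandwidth and the data are illustrative.

import numpy as np

def rbf(a, b, sigma=1.0):
    """RBF (Gaussian) kernel matrix between two sample sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(Xs, Xt, sigma=1.0):
    """Biased estimate of the squared MMD between source and target samples."""
    return (rbf(Xs, Xs, sigma).mean() + rbf(Xt, Xt, sigma).mean()
            - 2.0 * rbf(Xs, Xt, sigma).mean())

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(200, 5))          # source-domain features
Xt = rng.normal(0.5, 1.2, size=(200, 5))          # shifted target-domain features
print(mmd2(Xs, Xt))                               # > 0 when the distributions differ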
Subspace-based interference removal methods for a multichannel biomagnetic sensor array.
Sekihara, Kensuke; Nagarajan, Srikantan S
2017-10-01
In biomagnetic signal processing, the theory of the signal subspace has been applied to removing interfering magnetic fields, and a representative algorithm is the signal space projection algorithm, in which the signal/interference subspace is defined in the spatial domain as the span of signal/interference-source lead field vectors. This paper extends the notion of this conventional (spatial-domain) signal subspace by introducing a new definition of signal subspace in the time domain: the time-domain signal subspace is defined as the span of row vectors that contain the source time course values. This definition leads to symmetric relationships between the time-domain and the conventional (spatial-domain) signal subspaces. As a review, this article shows that the notion of the time-domain signal subspace provides useful insights into existing interference removal methods from a unified perspective. Using the time-domain signal subspace, it is possible to interpret a number of interference removal methods as time-domain signal space projection. Such methods include adaptive noise canceling, sensor noise suppression, the common temporal subspace projection, the spatio-temporal signal space separation, and the recently proposed dual signal subspace projection. Our analysis using the notion of the time-domain signal space projection reveals the implicit assumptions these methods rely on, and shows that the differences between these methods result only from the manner of deriving the interference subspace. Numerical examples that illustrate the results of our arguments are provided.
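A generic time-domain projection of the kind discussed above can be sketched as follows: given an estimated interference time course, its temporal subspace is projected out of every sensor channel. The data and the single interference component are synthetic assumptions, not a specific algorithm from the paper.

import numpy as np

rng = np.random.default_rng(2)
n_sensors, n_times = 64, 1000
signal = rng.standard_normal((n_sensors, 1)) @ np.sin(np.linspace(0, 30, n_times))[None, :]
interf_tc = np.cos(np.linspace(0, 7, n_times))[None, :]          # interference time course
data = signal + rng.standard_normal((n_sensors, 1)) @ interf_tc  # contaminated recordings

# Orthonormal basis of the interference temporal subspace (rows of V).
V = np.linalg.qr(interf_tc.T)[0].T                               # shape (1, n_times)
cleaned = data - (data @ V.T) @ V        # project the temporal subspace out of each channel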
NASA Astrophysics Data System (ADS)
Ranjan, Suman; Mandal, Sanjoy
2017-12-01
Modeling of a triple asymmetrical optical micro ring resonator (TAOMRR) in the z-domain as a 2 × 2 input-output system, with a detailed design of its waveguide configuration using the finite-difference time-domain (FDTD) method, is presented. The transfer function in the z-domain is determined for the different input and output ports using the delay-line signal-processing technique. The frequency response analysis, together with the group delay and dispersion characteristics, is carried out in MATLAB, and the electric field analysis is done using FDTD. The approach offers a methodology for designing and drawing multiple configurations of coupled ring resonators with multiple input and output ports. Important parameters such as the coupling coefficients and the FSR are also determined.
NASA Astrophysics Data System (ADS)
Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana
2017-09-01
The problem of analyzing noise-induced extinction in multidimensional population systems is considered. To investigate the conditions for extinction caused by random disturbances, a new approach based on the stochastic sensitivity function technique and confidence domains is suggested and applied to a tritrophic population model of interacting prey, predator and top predator. This approach allows us to analyze constructively the probabilistic mechanisms of the transition to noise-induced extinction from both equilibrium and oscillatory regimes of coexistence. In this analysis, a method of principal directions is suggested for reducing the dimension of the confidence domains: in the dispersion of random states, the principal subspace is defined by the ratio of the eigenvalues of the stochastic sensitivity matrix. A detailed analysis of two scenarios of noise-induced extinction as a function of the parameters of the considered tritrophic system is carried out.
Classify epithelium-stroma in histopathological images based on deep transferable network.
Yu, X; Zheng, H; Liu, C; Huang, Y; Ding, X
2018-04-20
Recently, deep learning methods have received increasing attention in histopathological image analysis. However, traditional deep learning methods assume that training data and test data have the same distribution, which limits their use in real-world histopathological applications. Moreover, it is costly to recollect a large amount of labeled histology data to train a new neural network for each specific image acquisition procedure, even for similar tasks. In this paper, unsupervised domain adaptation is introduced into a typical deep convolutional neural network (CNN) model to reduce the need for relabeling. The unsupervised domain adaptation is implemented by adding two regularisation terms, namely feature-based adaptation and entropy minimisation, to the objective function of the widely used AlexNet CNN model. Three independent public epithelium-stroma datasets were used to verify the proposed method. The experimental results demonstrate that, in epithelium-stroma classification, the proposed method achieves better performance than commonly used deep learning methods and some existing deep domain adaptation methods. The proposed method can therefore be considered a better option for real-world applications of histopathological image analysis, because there is no requirement to recollect large-scale labeled data for every specified domain. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.
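A hedged sketch of the entropy-minimisation term is given below, attached to a generic classifier rather than to AlexNet; the network, batch shapes and the weighting factor are placeholders, and the feature-based adaptation term is omitted.

import torch
import torch.nn.functional as F

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(32 * 32 * 3, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
lambda_ent = 0.1                          # weight of the entropy regulariser (assumed)

xs = torch.randn(16, 3, 32, 32)           # labelled source-domain batch (placeholder)
ys = torch.randint(0, 2, (16,))
xt = torch.randn(16, 3, 32, 32)           # unlabelled target-domain batch (placeholder)

logits_s = model(xs)
logits_t = model(xt)
p_t = F.softmax(logits_t, dim=1)
entropy = -(p_t * torch.log(p_t + 1e-8)).sum(dim=1).mean()   # target prediction entropy

loss = F.cross_entropy(logits_s, ys) + lambda_ent * entropy  # supervised + entropy terms
opt.zero_grad()
loss.backward()
opt.step()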
NASA Astrophysics Data System (ADS)
Jin, Yang; Ciwei, Gao; Jing, Zhang; Min, Sun; Jie, Yu
2017-05-01
The selection and evaluation of priority domains in Global Energy Internet standard development helps to overcome the limits of national investment, so that priority can be given to standardizing the technical areas of highest urgency and feasibility. In this paper, a Delphi survey process based on technology foresight is put forward, an evaluation index system for priority domains is established, and the index calculation method is determined. Statistical methods are then used to evaluate the alternative domains. Finally, the top four priority domains are determined as follows: Interconnected Network Planning and Simulation Analysis, Interconnected Network Safety Control and Protection, Intelligent Power Transmission and Transformation, and Internet of Things.
Peterson, Thomas A; Nehrt, Nathan L; Park, DoHwan
2012-01-01
Background and objective With recent breakthroughs in high-throughput sequencing, identifying deleterious mutations is one of the key challenges for personalized medicine. At the gene and protein level, it has proven difficult to determine the impact of previously unknown variants. A statistical method has been developed to assess the significance of disease mutation clusters on protein domains by incorporating domain functional annotations to assist in the functional characterization of novel variants. Methods Disease mutations aggregated from multiple databases were mapped to domains, and were classified as either cancer- or non-cancer-related. The statistical method for identifying significantly disease-associated domain positions was applied to both sets of mutations and to randomly generated mutation sets for comparison. To leverage the known function of protein domain regions, the method optionally distributes significant scores to associated functional feature positions. Results Most disease mutations are localized within protein domains and display a tendency to cluster at individual domain positions. The method identified significant disease mutation hotspots in both the cancer and non-cancer datasets. The domain significance scores (DS-scores) for cancer form a bimodal distribution with hotspots in oncogenes forming a second peak at higher DS-scores than non-cancer, and hotspots in tumor suppressors have scores more similar to non-cancers. In addition, on an independent mutation benchmarking set, the DS-score method identified mutations known to alter protein function with very high precision. Conclusion By aggregating mutations with known disease association at the domain level, the method was able to discover domain positions enriched with multiple occurrences of deleterious mutations while incorporating relevant functional annotations. The method can be incorporated into translational bioinformatics tools to characterize rare and novel variants within large-scale sequencing studies. PMID:22319177
NASA Astrophysics Data System (ADS)
Soporan, V. F.; Samoilă, V.; Lehene, T. R.; Pădureţu, S.; Crişan, M. D.; Vescan, M. M.
2018-06-01
The paper presents a method for analyzing doctoral theses on castings production written in Romania between 1918 and 2016. The procedure, which follows the evolution of the analyzed problem, consists of the following steps: establishing a coding system for the domains and subdomains used in the thematic characterization of doctoral theses; identifying the doctorate-granting institutions, the doctoral specialties, the doctoral supervisors and the time frame for the analysis; selecting the doctoral theses to be included in the analysis; establishing the keywords used to characterize doctoral theses based on their titles; assigning theses to the domains and subdomains of the coding system according to the meaning of the keywords; statistical processing of the results and determination of the share of each domain and subdomain; and drawing conclusions and interpreting them in the context of economic and social developments. Although the proposed method is general, the case study is carried out for the specific field of castings production, and the scope of the analysis covers the institutions organizing doctoral studies.
Tailoring vocabularies for NLP in sub-domains: a method to detect unused word sense.
Figueroa, Rosa L; Zeng-Treitler, Qing; Goryachev, Sergey; Wiechmann, Eduardo P
2009-11-14
We developed a method to help tailor a comprehensive vocabulary system (e.g., the UMLS) to a sub-domain (e.g., clinical reports) in support of natural language processing (NLP). The method detects unused senses in a sub-domain by comparing the relational neighborhood of a word/term in the vocabulary with its semantic neighborhood in the sub-domain. The semantic neighborhood of the word/term in the sub-domain is determined using latent semantic analysis (LSA). We trained and tested the unused-sense detection on two clinical text corpora: one containing discharge summaries and the other outpatient visit notes. We were able to detect unused senses with precision from 79% to 87%, recall from 48% to 74%, and an area under the receiver operating characteristic curve (AUC) of 72% to 87%.
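The LSA step used to build a word's semantic neighbourhood in a sub-domain corpus can be sketched as follows; the toy corpus, the number of latent dimensions and the neighbourhood size are illustrative, not the study's configuration.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    "patient discharged home in stable condition",
    "discharge summary notes stable vital signs",
    "outpatient visit for follow up of hypertension",
    "blood pressure controlled on current medication",
]
vec = TfidfVectorizer()
X = vec.fit_transform(corpus)                  # documents x terms
svd = TruncatedSVD(n_components=2, random_state=0)
term_vectors = svd.fit_transform(X.T)          # terms x latent dimensions

terms = vec.get_feature_names_out()
i = list(terms).index("discharge")
sims = term_vectors @ term_vectors[i]
sims /= (np.linalg.norm(term_vectors, axis=1) * np.linalg.norm(term_vectors[i]) + 1e-12)
neighbours = [terms[j] for j in np.argsort(-sims)[:5]]
print(neighbours)                              # semantic neighbourhood of "discharge"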
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik
Scientists working in a particular domain often adhere to conventional data analysis and presentation methods, and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists in which we explored the following factors: i) relationships between scientists' familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.
2014-01-01
Background The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provide a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study has identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders, will be discussed using the TDF. Methods This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), [c] analysis methodologies (qualitative - analyzed through thematic analysis, quantitative - analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. Results The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain. Conclusions The TDF categorization of the needs assessment findings allowed recommendation of appropriate behavior change techniques for each underlying determinant, and facilitated communication and understanding of the identified issues to a broader audience. This approach provides a means for health education researchers to categorize gaps and challenges identified through educational needs assessments, and facilitates the application of these findings by educators and knowledge translators, by linking the gaps to recommended behavioral change techniques. PMID:25060235
Localization of optic disc and fovea in retinal images using intensity based line scanning analysis.
Kamble, Ravi; Kokare, Manesh; Deshmukh, Girish; Hussin, Fawnizu Azmadi; Mériaudeau, Fabrice
2017-08-01
Accurate detection of diabetic retinopathy (DR) mainly depends on identification of retinal landmarks such as the optic disc and fovea. Existing methods suffer from challenges such as limited accuracy and high computational complexity. To address this issue, this paper presents a novel approach for fast and accurate localization of the optic disc (OD) and fovea using one-dimensional scanned intensity profile analysis. The proposed method utilizes both time and frequency domain information effectively for localization of the OD. The final OD center is located using signal peak-valley detection in the time domain and discontinuity detection in the frequency domain analysis. Then, with the help of the detected OD location, the fovea center is located using signal valley analysis. Experiments were conducted on the MESSIDOR dataset, where the OD was successfully located in 1197 out of 1200 images (99.75%) and the fovea in 1196 out of 1200 images (99.66%) with an average computation time of 0.52s. A large-scale evaluation has been carried out extensively on nine publicly available databases. The proposed method is highly efficient in terms of quickly and accurately localizing the OD and fovea structures together compared with other state-of-the-art methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
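The peak-valley idea in the abstract above can be illustrated with a short sketch. This is not the authors' implementation; the helper name locate_peak_and_valley and the synthetic intensity profile are hypothetical stand-ins for a scanned retinal image profile.

```python
# Minimal sketch: locate a bright optic-disc-like peak and a dark
# fovea-like valley in a 1D scanned intensity profile.
# The synthetic profile and helper names are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def locate_peak_and_valley(profile):
    """Return indices of the strongest peak (OD candidate) and the
    deepest valley (fovea candidate) in a 1D intensity profile."""
    peaks, _ = find_peaks(profile, prominence=0.1)
    valleys, _ = find_peaks(-profile, prominence=0.1)
    od = peaks[np.argmax(profile[peaks])] if peaks.size else None
    fovea = valleys[np.argmin(profile[valleys])] if valleys.size else None
    return od, fovea

# Toy profile: a bright bump (optic disc) plus a shallow dark dip (fovea).
x = np.linspace(0, 1, 500)
profile = 0.5 + 0.4 * np.exp(-((x - 0.3) / 0.03) ** 2) \
              - 0.2 * np.exp(-((x - 0.7) / 0.05) ** 2)
print(locate_peak_and_valley(profile))
```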
Decomposition of Proteins into Dynamic Units from Atomic Cross-Correlation Functions.
Calligari, Paolo; Gerolin, Marco; Abergel, Daniel; Polimeno, Antonino
2017-01-10
In this article, we present a clustering method of atoms in proteins based on the analysis of the correlation times of interatomic distance correlation functions computed from MD simulations. The goal is to provide a coarse-grained description of the protein in terms of fewer elements that can be treated as dynamically independent subunits. Importantly, this domain decomposition method does not take into account structural properties of the protein. Instead, the clustering of protein residues in terms of networks of dynamically correlated domains is defined on the basis of the effective correlation times of the pair distance correlation functions. For these properties, our method stands as a complementary analysis to the customary protein decomposition in terms of quasi-rigid, structure-based domains. Results obtained for a prototypal protein structure illustrate the approach proposed.
Time-domain representation of frequency-dependent foundation impedance functions
Safak, E.
2006-01-01
Foundation impedance functions provide a simple means to account for soil-structure interaction (SSI) when studying seismic response of structures. Impedance functions represent the dynamic stiffness of the soil media surrounding the foundation. The fact that impedance functions are frequency dependent makes it difficult to incorporate SSI in standard time-history analysis software. This paper introduces a simple method to convert frequency-dependent impedance functions into time-domain filters. The method is based on the least-squares approximation of impedance functions by ratios of two complex polynomials. Such ratios are equivalent, in the time-domain, to discrete-time recursive filters, which are simple finite-difference equations giving the relationship between foundation forces and displacements. These filters can easily be incorporated into standard time-history analysis programs. Three examples are presented to show the applications of the method.
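The core idea, a least-squares fit of a ratio of two polynomials to frequency samples that is then run as a recursive (IIR) filter, can be sketched as follows. This is only a sketch under assumptions: the polynomial orders, the linearized (Levy-style) least-squares formulation, and the synthetic impedance are illustrative, not values or code from the paper.

```python
# Sketch: fit B(z)/A(z) to frequency-domain samples by linearized least
# squares, then apply the result as a discrete-time recursive filter.
# Orders and the synthetic "impedance" are illustrative assumptions.
import numpy as np
from scipy.signal import lfilter

def fit_rational(w, H, nb=2, na=2):
    """Fit H(e^{jw}) ~ B/A with A = 1 + a1 z^-1 + ...; returns (b, a)."""
    z = np.exp(-1j * w)                                  # z^{-1} on the unit circle
    Bcols = [z ** k for k in range(nb + 1)]              # unknowns b0..bnb
    Acols = [-H * z ** k for k in range(1, na + 1)]      # unknowns a1..ana
    M = np.column_stack(Bcols + Acols)
    A_ls = np.vstack([M.real, M.imag])                   # solve complex LS in real arithmetic
    y_ls = np.concatenate([H.real, H.imag])
    x, *_ = np.linalg.lstsq(A_ls, y_ls, rcond=None)
    b = x[:nb + 1]
    a = np.concatenate([[1.0], x[nb + 1:]])
    return b, a

# Synthetic frequency-dependent function (a damped resonance).
w = np.linspace(0.01, np.pi, 200)
H = 1.0 / (1.0 - 1.6 * np.exp(-1j * w) + 0.8 * np.exp(-2j * w))
b, a = fit_rational(w, H)

# The fitted filter is a finite-difference relation between input and
# output histories; here it is simply run on white noise as a check.
u = np.random.default_rng(0).standard_normal(1024)
y = lfilter(b, a, u)
print(b, a)
```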
NASA Astrophysics Data System (ADS)
Li, Zuhe; Fan, Yangyu; Liu, Weihua; Yu, Zeqi; Wang, Fengqin
2017-01-01
We aim to apply sparse autoencoder-based unsupervised feature learning to emotional semantic analysis for textile images. To tackle the problem of limited training data, we present a cross-domain feature learning scheme for emotional textile image classification using convolutional autoencoders. We further propose a correlation-analysis-based feature selection method for the weights learned by sparse autoencoders to reduce the number of features extracted from large size images. First, we randomly collect image patches on an unlabeled image dataset in the source domain and learn local features with a sparse autoencoder. We then conduct feature selection according to the correlation between different weight vectors corresponding to the autoencoder's hidden units. We finally adopt a convolutional neural network including a pooling layer to obtain global feature activations of textile images in the target domain and send these global feature vectors into logistic regression models for emotional image classification. The cross-domain unsupervised feature learning method achieves 65% to 78% average accuracy in the cross-validation experiments corresponding to eight emotional categories and performs better than conventional methods. Feature selection can reduce the computational cost of global feature extraction by about 50% while improving classification performance.
A Comparative Analysis of Numbers and Biology Content Domains between Turkey and the USA
ERIC Educational Resources Information Center
Incikabi, Lutfi; Ozgelen, Sinan; Tjoe, Hartono
2012-01-01
This study aimed to compare Mathematics and Science programs focusing on TIMSS content domains of Numbers and Biology that produced the largest achievement gap among students from Turkey and the USA. Specifically, it utilized the content analysis method within Turkish and New York State (NYS) frameworks. The procedures of study included matching…
NASA Astrophysics Data System (ADS)
Nastos, C. V.; Theodosiou, T. C.; Rekatsinas, C. S.; Saravanos, D. A.
2018-03-01
An efficient numerical method is developed for the simulation of dynamic response and the prediction of the wave propagation in composite plate structures. The method is termed finite wavelet domain method and takes advantage of the outstanding properties of compactly supported 2D Daubechies wavelet scaling functions for the spatial interpolation of displacements in a finite domain of a plate structure. The development of the 2D wavelet element, based on the first order shear deformation laminated plate theory is described and equivalent stiffness, mass matrices and force vectors are calculated and synthesized in the wavelet domain. The transient response is predicted using the explicit central difference time integration scheme. Numerical results for the simulation of wave propagation in isotropic, quasi-isotropic and cross-ply laminated plates are presented and demonstrate the high spatial convergence and problem size reduction obtained by the present method.
NASA Astrophysics Data System (ADS)
Zheng, Chang-Jun; Bi, Chuan-Xing; Zhang, Chuanzeng; Gao, Hai-Feng; Chen, Hai-Bo
2018-04-01
The vibration behavior of thin elastic structures can be noticeably influenced by the surrounding water, which represents a kind of heavy fluid. Since the feedback of the acoustic pressure onto the structure cannot be neglected in this case, a strong coupled scheme between the structural and fluid domains is usually required. In this work, a coupled finite element and boundary element (FE-BE) solver is developed for the free vibration analysis of structures submerged in an infinite fluid domain or a semi-infinite fluid domain with a free water surface. The structure is modeled by the finite element method (FEM). The compressibility of the fluid is taken into account, and hence the Helmholtz equation serves as the governing equation of the fluid domain. The boundary element method (BEM) is employed to model the fluid domain, and a boundary integral formulation with a half-space fundamental solution is used to satisfy the Dirichlet boundary condition on the free water surface exactly. The resulting nonlinear eigenvalue problem (NEVP) is converted into a small linear one by using a contour integral method. Adequate modifications are suggested to improve the efficiency of the contour integral method and avoid missing the eigenfrequencies of interest. The Burton-Miller method is used to filter out the fictitious eigenfrequencies of the boundary integral formulations. Numerical examples are given to demonstrate the accuracy and applicability of the developed eigensolver, and also show that the fluid-loading effect strongly depends on both the water depth and the mode shapes.
Lang, Tiange; Yin, Kangquan; Liu, Jinyu; Cao, Kunfang; Cannon, Charles H; Du, Fang K
2014-01-01
Predicting protein domains is essential for understanding a protein's function at the molecular level. However, up till now, there has been no direct and straightforward method for predicting protein domains in species without a reference genome sequence. In this study, we developed a functionality with a set of programs that can predict protein domains directly from genomic sequence data without a reference genome. Using whole genome sequence data, the programming functionality mainly comprised DNA assembly in combination with next-generation sequencing (NGS) assembly methods and traditional methods, peptide prediction and protein domain prediction. The proposed new functionality avoids problems associated with de novo assembly due to micro reads and small single repeats. Furthermore, we applied our functionality for the prediction of leucine rich repeat (LRR) domains in four species of Ficus with no reference genome, based on NGS genomic data. We found that the LRRNT_2 and LRR_8 domains are related to plant transpiration efficiency, as indicated by the stomata index, in the four species of Ficus. The programming functionality established in this study provides new insights for protein domain prediction, which is particularly timely in the current age of NGS data expansion.
Frequency domain phase-shifted confocal microscopy (FDPCM) with array detection
NASA Astrophysics Data System (ADS)
Ge, Baoliang; Huang, Yujia; Fang, Yue; Kuang, Cuifang; Xiu, Peng; Liu, Xu
2017-09-01
We proposed a novel method to reconstruct images taken by array detected confocal microscopy without prior knowledge about its detector distribution. The proposed frequency domain phase-shifted confocal microscopy (FDPCM) shifts the image from each detection channel to its corresponding place by substituting the phase information in Fourier domain. Theoretical analysis shows that our method could approach the resolution nearly twofold of wide-field microscopy. Simulation and experiment results are also shown to verify the applicability and effectiveness of our method. Compared to Airyscan, our method holds the advantage of simplicity and convenience to be applied to array detectors with different structure, which makes FDPCM have great potential in the application of biomedical observation in the future.
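The phase-substitution step can be illustrated with a generic sub-pixel image shift in the Fourier domain; shifting each detector channel's image before summation is the basic pixel-reassignment idea. The shift amounts and the toy image below are assumptions, not the authors' parameters or detector geometry.

```python
# Generic Fourier-domain shift of an image by (dy, dx) pixels via a
# linear phase ramp; each array-detector channel could be shifted this
# way before summation. The toy image and shifts are assumptions.
import numpy as np

def fourier_shift(img, dy, dx):
    ny, nx = img.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    ramp = np.exp(-2j * np.pi * (fy * dy + fx * dx))
    return np.fft.ifft2(np.fft.fft2(img) * ramp).real

img = np.zeros((64, 64))
img[32, 32] = 1.0
shifted = fourier_shift(img, 3.0, -2.5)   # move the point 3 px down, 2.5 px left
print(np.unravel_index(shifted.argmax(), shifted.shape))
```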
Multi Agent Reward Analysis for Learning in Noisy Domains
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian K.
2005-01-01
In many multi agent learning problems, it is difficult to determine, a priori, the agent reward structure that will lead to good performance. This problem is particularly pronounced in continuous, noisy domains ill-suited to simple table backup schemes commonly used in TD(lambda)/Q-learning. In this paper, we present a new reward evaluation method that allows the tradeoff between coordination among the agents and the difficulty of the learning problem each agent faces to be visualized. This method is independent of the learning algorithm and is only a function of the problem domain and the agents' reward structure. We then use this reward efficiency visualization method to determine an effective reward without performing extensive simulations. We test this method in both a static and a dynamic multi-rover learning domain where the agents have continuous state spaces and where their actions are noisy (e.g., the agents' movement decisions are not always carried out properly). Our results show that in the more difficult dynamic domain, the reward efficiency visualization method provides a two order of magnitude speedup in selecting a good reward. Most importantly, it allows one to quickly create and verify rewards tailored to the observational limitations of the domain.
Inferring Domain-Domain Interactions from Protein-Protein Interactions with Formal Concept Analysis
Khor, Susan
2014-01-01
Identifying reliable domain-domain interactions will increase our ability to predict novel protein-protein interactions, to unravel interactions in protein complexes, and thus gain more information about the function and behavior of genes. One of the challenges of identifying reliable domain-domain interactions is domain promiscuity. Promiscuous domains are domains that can occur in many domain architectures and are therefore found in many proteins. This becomes a problem for a method where the score of a domain-pair is the ratio between observed and expected frequencies because the protein-protein interaction network is sparse. As such, many protein-pairs will be non-interacting and domain-pairs with promiscuous domains will be penalized. This domain promiscuity challenge to the problem of inferring reliable domain-domain interactions from protein-protein interactions has been recognized, and a number of work-arounds have been proposed. This paper reports on an application of Formal Concept Analysis to this problem. It is found that the relationship between formal concepts provides a natural way for rare domains to elevate the rank of promiscuous domain-pairs and enrich highly ranked domain-pairs with reliable domain-domain interactions. This piggybacking of promiscuous domain-pairs onto less promiscuous domain-pairs is possible only with concept lattices whose attribute-labels are not reduced and is enhanced by the presence of proteins that comprise both promiscuous and rare domains. PMID:24586450
An exploration of function analysis and function allocation in the commercial flight domain
NASA Technical Reports Server (NTRS)
Mcguire, James C.; Zich, John A.; Goins, Richard T.; Erickson, Jeffery B.; Dwyer, John P.; Cody, William J.; Rouse, William B.
1991-01-01
The applicability of functional analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated. The two methods were compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.
Oncodomains: A protein domain-centric framework for analyzing rare variants in tumor samples
Peterson, Thomas A.; Park, Junyong
2017-01-01
The fight against cancer is hindered by its highly heterogeneous nature. Genome-wide sequencing studies have shown that individual malignancies contain many mutations that range from those commonly found in tumor genomes to rare somatic variants present only in a small fraction of lesions. Such rare somatic variants dominate the landscape of genomic mutations in cancer, yet efforts to correlate somatic mutations found in one or few individuals with functional roles have been largely unsuccessful. Traditional methods for identifying somatic variants that drive cancer are ‘gene-centric’ in that they consider only somatic variants within a particular gene and make no comparison to other similar genes in the same family that may play a similar role in cancer. In this work, we present oncodomain hotspots, a new ‘domain-centric’ method for identifying clusters of somatic mutations across entire gene families using protein domain models. Our analysis confirms that our approach creates a framework for leveraging structural and functional information encapsulated by protein domains into the analysis of somatic variants in cancer, enabling the assessment of even rare somatic variants by comparison to similar genes. Our results reveal a vast landscape of somatic variants that act at the level of domain families altering pathways known to be involved with cancer such as protein phosphorylation, signaling, gene regulation, and cell metabolism. Due to oncodomain hotspots’ unique ability to assess rare variants, we expect our method to become an important tool for the analysis of sequenced tumor genomes, complementing existing methods. PMID:28426665
Micro-heterogeneity versus clustering in binary mixtures of ethanol with water or alkanes.
Požar, Martina; Lovrinčević, Bernarda; Zoranić, Larisa; Primorać, Tomislav; Sokolić, Franjo; Perera, Aurélien
2016-08-24
Ethanol is a hydrogen bonding liquid. When mixed in small concentrations with water or alkanes, it forms aggregate structures reminiscent of, respectively, the direct and inverse micellar aggregates found in emulsions, albeit at much smaller sizes. At higher concentrations, micro-heterogeneous mixing with segregated domains is found. We examine how different statistical methods, namely correlation function analysis, structure factor analysis and cluster distribution analysis, can describe efficiently these morphological changes in these mixtures. In particular, we explain how the neat alcohol pre-peak of the structure factor evolves into the domain pre-peak under mixing conditions, and how this evolution differs whether the co-solvent is water or alkane. This study clearly establishes the heuristic superiority of the correlation function/structure factor analysis to study the micro-heterogeneity, since cluster distribution analysis is insensitive to domain segregation. Correlation functions detect the domains, with a clear structure factor pre-peak signature, while the cluster techniques detect the cluster hierarchy within domains. The main conclusion is that, in micro-segregated mixtures, the domain structure is a more fundamental statistical entity than the underlying cluster structures. These findings could help better understand comparatively the radiation scattering experiments, which are sensitive to domains, versus the spectroscopy-NMR experiments, which are sensitive to clusters.
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question is whether, when domain experts analyze their data, they can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on analyzing the relationships between familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists where we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on trustworthiness of visual analytic systems.
Comprehensive analysis of orthologous protein domains using the HOPS database.
Storm, Christian E V; Sonnhammer, Erik L L
2003-10-01
One of the most reliable methods for protein function annotation is to transfer experimentally known functions from orthologous proteins in other organisms. Most methods for identifying orthologs operate on a subset of organisms with a completely sequenced genome, and treat proteins as single-domain units. However, it is well known that proteins are often made up of several independent domains, and there is a wealth of protein sequences from genomes that are not completely sequenced. A comprehensive set of protein domain families is found in the Pfam database. We wanted to apply orthology detection to Pfam families, but first some issues needed to be addressed. First, orthology detection becomes impractical and unreliable when too many species are included. Second, shorter domains contain less information. It is therefore important to assess the quality of the orthology assignment and avoid very short domains altogether. We present a database of orthologous protein domains in Pfam called HOPS: Hierarchical grouping of Orthologous and Paralogous Sequences. Orthology is inferred in a hierarchic system of phylogenetic subgroups using ortholog bootstrapping. To avoid the frequent errors stemming from horizontally transferred genes in bacteria, the analysis is presently limited to eukaryotic genes. The results are accessible in the graphical browser NIFAS, a Java tool originally developed for analyzing phylogenetic relations within Pfam families. The method was tested on a set of curated orthologs with experimentally verified function. In comparison to tree reconciliation with a complete species tree, our approach finds significantly more orthologs in the test set. Examples for investigating gene fusions and domain recombination using HOPS are given.
Acoustics based assessment of respiratory diseases using GMM classification.
Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J
2010-01-01
The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. In order to accomplish this, applicable traditional techniques within the speech processing domain were utilized to evaluate lung sounds obtained with a digital stethoscope. Traditional methods utilized in the evaluation of asthma involve auscultation and spirometry, but utilization of the more sensitive electronic stethoscopes which are currently available, and application of quantitative signal analysis methods, offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on the frequency domain analysis of wheezing and crackles.
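A minimal sketch of GMM-based classification on frame-level spectral features, in the spirit of the approach described above; the feature choice (log band energies), the two-class setup, and all parameter values are assumptions rather than the authors' configuration.

```python
# Sketch: fit one Gaussian mixture model per class ("normal" vs "wheeze")
# on frame-level log band-energy features and classify a recording by
# comparing total log-likelihoods. All parameters are assumptions.
import numpy as np
from scipy.signal import stft
from sklearn.mixture import GaussianMixture

def band_energy_features(x, fs, nbands=16):
    f, t, Z = stft(x, fs=fs, nperseg=256)
    power = np.abs(Z) ** 2
    bands = np.array_split(power, nbands, axis=0)
    return np.log(np.stack([b.sum(axis=0) for b in bands], axis=1) + 1e-12)

rng = np.random.default_rng(1)
fs = 4000
# Toy "recordings": broadband noise vs noise plus a tonal wheeze-like component.
normal = rng.standard_normal(4 * fs)
wheeze = rng.standard_normal(4 * fs) + 2 * np.sin(2 * np.pi * 400 * np.arange(4 * fs) / fs)

gmm_normal = GaussianMixture(n_components=4, covariance_type="diag",
                             random_state=0).fit(band_energy_features(normal, fs))
gmm_wheeze = GaussianMixture(n_components=4, covariance_type="diag",
                             random_state=0).fit(band_energy_features(wheeze, fs))

test = band_energy_features(wheeze, fs)
scores = {"normal": gmm_normal.score(test), "wheeze": gmm_wheeze.score(test)}
print(max(scores, key=scores.get))     # expected: "wheeze"
```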
Analysis of automobile engine cylinder pressure and rotation speed from engine body vibration signal
NASA Astrophysics Data System (ADS)
Wang, Yuhua; Cheng, Xiang; Tan, Haishu
2016-01-01
In order to improve the vibration-signal processing method used in instruments that measure engine cylinder pressure and engine rotation speed, the engine cylinder pressure varying over the engine working cycle is regarded as the main exciting force for the forced vibration of the engine block. The forced vibration caused by the cylinder pressure appears as a low-frequency waveform that varies synchronously and steadily with the cylinder pressure in the time domain, and as low-frequency, high-energy discrete harmonic spectral lines in the frequency domain. The engine cylinder pressure and the rotation speed can be extracted from the measured engine block vibration signal by low-pass filtering analysis in the time domain or by FFT analysis in the frequency domain; the low-pass filtering analysis in the time domain is suitable not only for the engine under uniform revolution conditions but also under uneven revolution conditions. This provides a practical and convenient way to design instruments that measure engine revolution rate and cylinder pressure.
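A sketch of the two signal paths described above: a zero-phase low-pass filter to recover the slow pressure-related waveform in the time domain, and an FFT peak search to pick the dominant low-frequency line as the cycle (speed) estimate. The synthetic vibration signal, cutoff frequency, and sampling rate are assumptions.

```python
# Sketch of the two analyses described: time-domain low-pass filtering to
# recover the slow, cycle-synchronous waveform, and an FFT peak search to
# estimate the cycle frequency (hence rotation speed). Parameters assumed.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10_000                               # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
cycle_hz = 25.0                           # assumed engine cycle frequency
pressure_like = np.sin(2 * np.pi * cycle_hz * t)            # slow exciting force
vibration = pressure_like \
            + 0.5 * np.random.default_rng(0).standard_normal(t.size) \
            + 0.3 * np.sin(2 * np.pi * 900 * t)              # resonance + noise

# Time-domain path: low-pass filter keeps the pressure-related waveform.
b, a = butter(4, 60 / (fs / 2))
recovered = filtfilt(b, a, vibration)

# Frequency-domain path: dominant low-frequency spectral line gives the cycle rate.
spec = np.abs(np.fft.rfft(vibration))
freqs = np.fft.rfftfreq(vibration.size, 1 / fs)
low = freqs < 200
print("estimated cycle frequency [Hz]:", freqs[low][np.argmax(spec[low])])
```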
NASA Technical Reports Server (NTRS)
Baumeister, Kenneth J.; Baumeister, Joseph F.
1994-01-01
An analytical procedure is presented, called the modal element method, that combines numerical grid based algorithms with eigenfunction expansions developed by separation of variables. A modal element method is presented for solving potential flow in a channel with two-dimensional cylindrical-like obstacles. The infinite computational region is divided into three subdomains: the bounded finite element domain, which is characterized by the cylindrical obstacle, and the surrounding unbounded uniform channel entrance and exit domains. The velocity potential is represented approximately in the grid based domain by a finite element solution and is represented analytically by an eigenfunction expansion in the uniform semi-infinite entrance and exit domains. The calculated flow fields are in excellent agreement with exact analytical solutions. By eliminating the grid surrounding the obstacle, the modal element method reduces the numerical grid size and employs a more precise far field boundary condition, as well as giving theoretical insight into the interaction of the obstacle with the mean flow. Although the analysis focuses on a specific geometry, the formulation is general and can be applied to a variety of problems, as seen by a comparison to companion theories in aeroacoustics and electromagnetics.
ERIC Educational Resources Information Center
Pavlik, Philip I. Jr.; Cen, Hao; Koedinger, Kenneth R.
2009-01-01
This paper describes a novel method to create a quantitative model of an educational content domain of related practice item-types using learning curves. By using a pairwise test to search for the relationships between learning curves for these item-types, we show how the test results in a set of pairwise transfer relationships that can be…
Electro-quasistatic analysis of an electrostatic induction micromotor using the cell method.
Monzón-Verona, José Miguel; Santana-Martín, Francisco Jorge; García-Alonso, Santiago; Montiel-Nelson, Juan Antonio
2010-01-01
An electro-quasistatic analysis of an induction micromotor has been realized by using the Cell Method. We employed the direct Finite Formulation (FF) of the electromagnetic laws, hence avoiding a further discretization. The Cell Method (CM) is used for solving the field equations over the entire domain (2D space) of the micromotor. We have reformulated the field laws in a direct FF and analyzed physical quantities to make explicit the relationship between magnitudes and laws. We applied a primal-dual barycentric discretization of the 2D space. The electric potential has been calculated on each node of the primal mesh using the CM. For verification purposes, an analytical electric potential equation is introduced as a reference. In the frequency domain, the results demonstrate that the error in calculating the potential quantity is negligible (<3‰). In the time domain, the potential value in the transient state tends to the steady state value.
Repressing the effects of variable speed harmonic orders in operational modal analysis
NASA Astrophysics Data System (ADS)
Randall, R. B.; Coats, M. D.; Smith, W. A.
2016-10-01
Discrete frequency components such as machine shaft orders can disrupt the operation of normal Operational Modal Analysis (OMA) algorithms. With constant speed machines, they have been removed using time synchronous averaging (TSA). This paper compares two approaches for varying speed machines. In one method, signals are transformed into the order domain, and after the removal of shaft speed related components by a cepstral notching method, are transformed back to the time domain to allow normal OMA. In the other, simpler approach an exponential shortpass lifter is applied directly to the time domain cepstrum to enhance the modal information at the expense of other disturbances. For simulated gear signals with speed variations of both ±5% and ±15%, the simpler approach was found to give better results. The TSA method is shown not to work in either case. The paper compares the results with those obtained using a stationary random excitation.
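The exponential shortpass lifter idea can be sketched generically: compute the real cepstrum, multiply it by a decaying exponential so that low-quefrency (smooth, mode-related) content is kept while higher-quefrency content such as harmonic rahmonics is attenuated, and transform back. The decay constant and the test signal below are assumptions, not the paper's settings.

```python
# Sketch of an exponential short-pass lifter applied in the real cepstrum.
# Low-quefrency content is retained, higher-quefrency content is
# exponentially attenuated. The decay constant tau and the synthetic
# signal are illustrative assumptions.
import numpy as np

def exponential_shortpass_lifter(x, tau):
    spec = np.fft.rfft(x)
    log_mag = np.log(np.abs(spec) + 1e-12)
    ceps = np.fft.irfft(log_mag, n=x.size)             # real cepstrum
    q = np.arange(ceps.size)
    qs = np.minimum(q, ceps.size - q)                   # symmetric quefrency index
    lifter = np.exp(-qs / tau)                          # exponential short-pass window
    smoothed_log_mag = np.fft.rfft(ceps * lifter).real
    # Recombine the liftered log-magnitude with the original phase.
    return np.fft.irfft(np.exp(smoothed_log_mag) * np.exp(1j * np.angle(spec)), n=x.size)

fs = 1000
t = np.arange(0, 4.0, 1 / fs)
# Decaying "modal" response buried under a strong periodic component.
signal = np.exp(-3 * t) * np.sin(2 * np.pi * 37 * t) + np.sin(2 * np.pi * 10 * t)
enhanced = exponential_shortpass_lifter(signal, tau=20.0)
print(enhanced.shape)
```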
Analysis of microstrip patch antennas using finite difference time domain method
NASA Astrophysics Data System (ADS)
Reineix, Alain; Jecko, Bernard
1989-11-01
The study of microstrip patch antennas is directly treated in the time domain, using a modified finite-difference time-domain (FDTD) method. Assuming an appropriate choice of excitation, the frequency dependence of the relevant parameters can readily be found using the Fourier transform of the transient current. The FDTD method allows a rigorous treatment of one or several dielectric interfaces. Different types of excitation can be taken into consideration (coaxial, microstrip lines, etc.). Plotting the spatial distribution of the current density gives information about the resonance modes. The usual frequency-dependent parameters (input impedance, radiation pattern) are given for several examples.
Vibration of carbon nanotubes with defects: order reduction methods
NASA Astrophysics Data System (ADS)
Hudson, Robert B.; Sinha, Alok
2018-03-01
Order reduction methods are widely used to reduce computational effort when calculating the impact of defects on the vibrational properties of nearly periodic structures in engineering applications, such as a gas-turbine bladed disc. However, despite obvious similarities these techniques have not yet been adapted for use in analysing atomic structures with inevitable defects. Two order reduction techniques, modal domain analysis and modified modal domain analysis, are successfully used in this paper to examine the changes in vibrational frequencies, mode shapes and mode localization caused by defects in carbon nanotubes. The defects considered are isotope defects and Stone-Wales defects, though the methods described can be extended to other defects.
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
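A minimal illustration of time-domain autoregressive modeling is fitting AR(p) coefficients with the Yule-Walker equations; the model order and synthetic series below are assumptions and have nothing to do with the paper's quasar analysis.

```python
# Sketch: estimate AR(p) coefficients from a time series via the
# Yule-Walker equations (autocovariances + Toeplitz solve).
# Order and synthetic data are illustrative assumptions.
import numpy as np
from scipy.linalg import solve_toeplitz

def yule_walker(x, p):
    x = np.asarray(x) - np.mean(x)
    n = x.size
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(p + 1)])
    phi = solve_toeplitz(r[:p], r[1 : p + 1])            # AR coefficients
    sigma2 = r[0] - np.dot(phi, r[1 : p + 1])            # innovation variance
    return phi, sigma2

# Simulate an AR(2) process and recover its parameters.
rng = np.random.default_rng(0)
true_phi = np.array([0.75, -0.25])
x = np.zeros(5000)
for i in range(2, x.size):
    x[i] = true_phi @ x[i - 2 : i][::-1] + rng.standard_normal()
print(yule_walker(x, p=2))
```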
NASA Technical Reports Server (NTRS)
Mei, Chuh; Pates, Carl S., III
1994-01-01
A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simplistic model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.
Fancher, C. M.; Brewer, S.; Chung, C. C.; ...
2016-12-27
Here, the contribution of 180° domain wall motion to polarization and dielectric properties of ferroelectric materials has yet to be determined experimentally. In this paper, an approach for estimating the extent of (180°) domain reversal during application of electric fields is presented. We demonstrate this method by determining the contribution of domain reversal to polarization in soft lead zirconate titanate during application of strong electric fields. At the maximum applied field, domain reversal was determined to account for >80% of the measured macroscopic polarization. We also apply the method to quantify the contribution of domain reversal to the weak-field dielectric permittivity of BaTiO3. The results of this analysis determined that domain reversal accounts for up to ~70% of the macroscopic dielectric permittivity in BaTiO3. These results demonstrate the predominance of domain reversal to high and low-field dielectric response in ferroelectric polycrystalline materials.
Investigation of domain walls in PPLN by confocal raman microscopy and PCA analysis
NASA Astrophysics Data System (ADS)
Shur, Vladimir Ya.; Zelenovskiy, Pavel; Bourson, Patrice
2017-07-01
Confocal Raman microscopy (CRM) is a powerful tool for the investigation of ferroelectric domains. Mechanical stresses and electric fields existing in the vicinity of neutral and charged domain walls modify the frequency, intensity and width of spectral lines [1], thus allowing visualization of micro- and nanodomain structures both at the surface and in the bulk of the crystal [2,3]. Stresses and fields are naturally coupled in ferroelectrics due to the inverse piezoelectric effect and can hardly be separated in Raman spectra. PCA is a powerful statistical method for the analysis of large data matrices that provides a set of orthogonal variables, called principal components (PCs). PCA is widely used for the classification of experimental data, for example in crystallization experiments and for the detection of small amounts of components in solid mixtures [4,5]. In Raman spectroscopy, PCA has been applied to the analysis of phase transitions and provided the critical pressure with good accuracy [6]. In the present work, we applied the Principal Component Analysis (PCA) method for the first time to the analysis of Raman spectra measured in periodically poled lithium niobate (PPLN). We found that the principal components demonstrate different sensitivities to the mechanical stresses and electric fields in the vicinity of the domain walls. This allowed us to separately visualize the spatial distributions of mechanical stresses and electric fields at the surface and in the bulk of PPLN.
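The PCA step can be sketched with scikit-learn on a matrix of spectra (one row per probed position); the synthetic Lorentzian spectra and the size of the "near-wall" line shift are assumptions used only to make the example self-contained.

```python
# Sketch: PCA on a stack of Raman-like spectra; the leading principal
# components separate positions whose line position/width is perturbed
# (e.g. near a domain wall) from unperturbed ones. Spectra are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
shift_cm = np.linspace(500, 700, 400)                   # wavenumber axis

def lorentzian(x, x0, w):
    return 1.0 / (1.0 + ((x - x0) / w) ** 2)

# 200 spectra: half "bulk", half "near-wall" with a slightly shifted, broadened line.
bulk = np.array([lorentzian(shift_cm, 600.0, 5.0) for _ in range(100)])
wall = np.array([lorentzian(shift_cm, 600.8, 6.0) for _ in range(100)])
spectra = np.vstack([bulk, wall]) + 0.02 * rng.standard_normal((200, shift_cm.size))

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)                     # per-spectrum PC scores
print(pca.explained_variance_ratio_)
print(scores[:100, 0].mean(), scores[100:, 0].mean())   # PC1 separates the two groups
```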
NASA Astrophysics Data System (ADS)
Liao, Zhijun; Wang, Xinrui; Zeng, Yeting; Zou, Quan
2016-12-01
The Dishevelled/EGL-10/Pleckstrin (DEP) domain-containing (DEPDC) proteins have seven members. However, whether this superfamily can be distinguished from other proteins based only on the amino acid sequences remains unknown. Here, we describe a computational method to segregate DEPDCs and non-DEPDCs. First, we examined the Pfam numbers of the known DEPDCs and used the longest sequences for each Pfam to construct a phylogenetic tree. Subsequently, we extracted 188-dimensional (188D) and 20D features of DEPDCs and non-DEPDCs and classified them with a random forest classifier. We also mined the motifs of human DEPDCs to find the related domains. Finally, we designed experimental methods to verify human DEPDC expression at the mRNA level in hepatocellular carcinoma (HCC) and adjacent normal tissues. The phylogenetic analysis showed that the DEPDC superfamily can be divided into three clusters. Moreover, the 188D and 20D features can both be used to effectively distinguish the two protein types. Motif analysis revealed that the DEP and RhoGAP domains were common in human DEPDCs. Human HCC and the adjacent tissues widely expressed DEPDCs; however, their regulation was not identical. In conclusion, we successfully constructed a binary classifier for DEPDCs and experimentally verified their expression in human HCC tissues.
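The classification step can be sketched generically: fixed-length feature vectors (here a simple 20D amino-acid composition, standing in for the 188D/20D descriptors mentioned) are fed to a random forest. The sequences, labels, and the composition bias used to create a class signal are synthetic placeholders, not data from the study.

```python
# Sketch: represent protein sequences as 20D amino-acid-composition vectors
# (a simple stand-in for the 20D/188D features mentioned) and train a
# random forest to separate two classes. Sequences and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

AA = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    counts = np.array([seq.count(a) for a in AA], dtype=float)
    return counts / max(len(seq), 1)

rng = np.random.default_rng(0)
def random_seq(bias_aa=None, n=200):
    pool = AA + (bias_aa * 5 if bias_aa else "")   # bias one residue to create a class signal
    return "".join(rng.choice(list(pool), size=n))

seqs = [random_seq("P") for _ in range(50)] + [random_seq() for _ in range(50)]
X = np.array([composition(s) for s in seqs])
y = np.array([1] * 50 + [0] * 50)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```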
Identifying protein domains by global analysis of soluble fragment data.
Bulloch, Esther M M; Kingston, Richard L
2014-11-15
The production and analysis of individual structural domains is a common strategy for studying large or complex proteins, which may be experimentally intractable in their full-length form. However, identifying domain boundaries is challenging if there is little structural information concerning the protein target. One experimental procedure for mapping domains is to screen a library of random protein fragments for solubility, since truncation of a domain will typically expose hydrophobic groups, leading to poor fragment solubility. We have coupled fragment solubility screening with global data analysis to develop an effective method for identifying structural domains within a protein. A gene fragment library is generated using mechanical shearing, or by uracil doping of the gene and a uracil-specific enzymatic digest. A split green fluorescent protein (GFP) assay is used to screen the corresponding protein fragments for solubility when expressed in Escherichia coli. The soluble fragment data are then analyzed using two complementary approaches. Fragmentation "hotspots" indicate possible interdomain regions. Clustering algorithms are used to group related fragments, and concomitantly predict domain location. The effectiveness of this Domain Seeking procedure is demonstrated by application to the well-characterized human protein p85α. Copyright © 2014 Elsevier Inc. All rights reserved.
Sentiment analysis using common-sense and context information.
Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal
2015-01-01
Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific concepts which in turn produced the domain specific important features. Further, the polarities of the extracted concepts are determined using the contextual polarity lexicon which we developed by considering the context information of a word. Finally, semantic orientations of domain specific features of the review document are aggregated based on the importance of a feature with respect to the domain. The importance of the feature is determined by the depth of the feature in the ontology. Experimental results show the effectiveness of the proposed methods.
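The final aggregation step can be illustrated with a toy sketch: each extracted concept carries a contextual polarity and a depth in a domain ontology, and the document orientation is a depth-weighted sum. The concept names, depths, and polarity values below are invented for illustration only.

```python
# Toy sketch of the aggregation step: per-concept contextual polarities are
# weighted by how deep (i.e. how domain-specific) the concept is in the
# ontology, then combined into an overall orientation. All values invented.
concepts = [
    # (concept, contextual polarity in [-1, 1], depth in domain ontology)
    ("battery life", +0.8, 4),
    ("screen", +0.5, 3),
    ("price", -0.6, 2),
    ("delivery", -0.3, 1),
]

def document_orientation(concepts):
    weighted = sum(polarity * depth for _, polarity, depth in concepts)
    total_weight = sum(depth for _, _, depth in concepts)
    return weighted / total_weight if total_weight else 0.0

score = document_orientation(concepts)
print("positive" if score > 0 else "negative", round(score, 3))
```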
An equivalent domain integral for analysis of two-dimensional mixed mode problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1989-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies subjected to mixed mode loading is presented. The total and product integrals consist of the sum of an area or domain integral and line integrals on the crack faces. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all the problems analyzed.
Comparison of frequency-domain and time-domain rotorcraft vibration control methods
NASA Technical Reports Server (NTRS)
Gupta, N. K.
1984-01-01
Active control of rotor-induced vibration in rotorcraft has received significant attention recently. Two classes of techniques have been proposed. The more developed approach works with harmonic analysis of measured time histories and is called the frequency-domain approach. The more recent approach computes the control input directly using the measured time history data and is called the time-domain approach. The report summarizes the results of a theoretical investigation to compare the two approaches. Five specific areas were addressed: (1) techniques to derive models needed for control design (system identification methods), (2) robustness with respect to errors, (3) transient response, (4) susceptibility to noise, and (5) implementation difficulties. The system identification methods are more difficult for the time-domain models. The time-domain approach is more robust (e.g., has higher gain and phase margins) than the frequency-domain approach. It might thus be possible to avoid doing real-time system identification in the time-domain approach by storing models at a number of flight conditions. The most significant error source is the variation in open-loop vibrations caused by pilot inputs, maneuvers or gusts. The implementation requirements are similar except that the time-domain approach can be much simpler to implement if real-time system identification were not necessary.
Shin, Junha; Lee, Insuk
2015-01-01
Phylogenetic profiling, a network inference method based on gene inheritance profiles, has been widely used to construct functional gene networks in microbes. However, its utility for network inference in higher eukaryotes has been limited. An improved algorithm with an in-depth understanding of pathway evolution may overcome this limitation. In this study, we investigated the effects of taxonomic structures on co-inheritance analysis using 2,144 reference species in four query species: Escherichia coli, Saccharomyces cerevisiae, Arabidopsis thaliana, and Homo sapiens. We observed three clusters of reference species based on a principal component analysis of the phylogenetic profiles, which correspond to the three domains of life—Archaea, Bacteria, and Eukaryota—suggesting that pathways inherit primarily within specific domains or lower-ranked taxonomic groups during speciation. Hence, the co-inheritance pattern within a taxonomic group may be eroded by confounding inheritance patterns from irrelevant taxonomic groups. We demonstrated that co-inheritance analysis within domains substantially improved network inference not only in microbe species but also in the higher eukaryotes, including humans. Although we observed two sub-domain clusters of reference species within Eukaryota, co-inheritance analysis within these sub-domain taxonomic groups only marginally improved network inference. Therefore, we conclude that co-inheritance analysis within domains is the optimal approach to network inference with the given reference species. The construction of a series of human gene networks with increasing sample sizes of the reference species for each domain revealed that the size of the high-accuracy networks increased as additional reference species genomes were included, suggesting that within-domain co-inheritance analysis will continue to expand human gene networks as genomes of additional species are sequenced. Taken together, we propose that co-inheritance analysis within the domains of life will greatly potentiate the use of the expected onslaught of sequenced genomes in the study of molecular pathways in higher eukaryotes. PMID:26394049
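A small sketch of within-domain co-inheritance scoring: binary presence/absence profiles are restricted to the reference species of one domain of life before computing gene-gene correlations. The profile matrix, domain assignments, and the planted co-inheritance signal are synthetic assumptions.

```python
# Sketch: compute co-inheritance (correlation of binary presence/absence
# profiles) only over reference species belonging to one domain of life,
# rather than across all species. Profiles and domain labels are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_species = 6, 300
domains = np.array(["Bacteria"] * 150 + ["Archaea"] * 50 + ["Eukaryota"] * 100)

# Synthetic profiles: genes 0 and 1 co-inherit within Eukaryota only.
profiles = rng.integers(0, 2, size=(n_genes, n_species))
euk = domains == "Eukaryota"
profiles[1, euk] = profiles[0, euk]

def co_inheritance(profiles, mask=None):
    P = profiles[:, mask] if mask is not None else profiles
    return np.corrcoef(P)

print("all species    r(0,1) =", round(co_inheritance(profiles)[0, 1], 2))
print("Eukaryota only r(0,1) =", round(co_inheritance(profiles, euk)[0, 1], 2))
```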
2D automatic body-fitted structured mesh generation using advancing extraction method
NASA Astrophysics Data System (ADS)
Zhang, Yaoxin; Jia, Yafei
2018-01-01
This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsula or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain in convex polygon shape in each level can be extracted in an advancing scheme. In this paper, several examples were used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.
Sentiment classification technology based on Markov logic networks
NASA Astrophysics Data System (ADS)
He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe
2016-07-01
With diverse online media emerging, there is growing concern with the sentiment classification problem. At present, text sentiment classification mainly utilizes supervised machine learning methods, which exhibit a degree of domain dependency. On the basis of Markov logic networks (MLNs), this study proposed a cross-domain, multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred into other domains, and the precision of sentiment classification analysis in the text tendency domain was improved. The experimental results revealed the following: (1) the model based on an MLN demonstrated higher precision than the single individual learning plan model; (2) multi-task transfer learning based on Markov logic networks could acquire more knowledge than self-domain learning. The cross-domain text sentiment classification model could significantly improve the precision and efficiency of text sentiment classification.
Sumer, Huseyin; Craig, Jeffrey M.; Sibson, Mandy; Choo, K.H. Andy
2003-01-01
Human neocentromeres are fully functional centromeres that arise at previously noncentromeric regions of the genome. We have tested a rapid procedure of genomic array analysis of chromosome scaffold/matrix attachment regions (S/MARs), involving the isolation of S/MAR DNA and hybridization of this DNA to a genomic BAC/PAC array. Using this procedure, we have defined a 2.5-Mb domain of S/MAR-enriched chromatin that fully encompasses a previously mapped centromere protein-A (CENP-A)-associated domain at a human neocentromere. We have independently verified this procedure using a previously established fluorescence in situ hybridization method on salt-treated metaphase chromosomes. In silico sequence analysis of the S/MAR-enriched and surrounding regions has revealed no outstanding sequence-related predisposition. This study defines the S/MAR-enriched domain of a higher eukaryotic centromere and provides a method that has broad application for the mapping of S/MAR attachment sites over large genomic regions or throughout a genome. PMID:12840048
Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data
NASA Astrophysics Data System (ADS)
Reno, B. L.; Brown, M.; Piccoli, P. M.
2007-12-01
Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties, and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. Therefore, this method takes into account the full range of data, and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviation proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions of all populations, which cannot be accounted for with traditional statistical tools. These three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) for the core domain, 616±7 (2σ) for the intermediate domain and 601±8 (2σ) for the rim domain.
Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
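The exploratory statistics described above can be reproduced with a few lines of NumPy: interquartile ranges for the box-plot comparison of date populations, and a normalized median absolute deviation as the resistant measure of spread. The dates below are synthetic draws, not the Brasília Belt monazite data.

```python
# Sketch of the resistant statistics described: interquartile range for
# the box-plot style comparison of date populations, and the normalized
# median absolute deviation (nMAD) as a resistant spread estimate.
# The synthetic dates stand in for the monazite EPMA data.
import numpy as np

def iqr_bounds(dates):
    q1, q3 = np.percentile(dates, [25, 75])
    return q1, q3

def nmad(dates):
    med = np.median(dates)
    return 1.4826 * np.median(np.abs(dates - med))   # ~sigma for normal data

rng = np.random.default_rng(0)
core = rng.normal(631, 4, 60)          # Ma, synthetic "core domain" dates
rim = rng.normal(601, 4, 50)           # Ma, synthetic "rim domain" dates

for name, pop in [("core", core), ("rim", rim)]:
    q1, q3 = iqr_bounds(pop)
    print(f"{name}: median {np.median(pop):.0f} Ma, "
          f"IQR {q1:.0f}-{q3:.0f} Ma, nMAD {nmad(pop):.1f} Ma")

# Non-overlapping boxes (IQRs) suggest statistically separate populations.
print("boxes overlap:", not (iqr_bounds(core)[0] > iqr_bounds(rim)[1]
                             or iqr_bounds(rim)[0] > iqr_bounds(core)[1]))
```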
Evaluation of Selected Binding Domains for the Analysis of Ubiquitinated Proteomes
NASA Astrophysics Data System (ADS)
Nakayasu, Ernesto S.; Ansong, Charles; Brown, Joseph N.; Yang, Feng; Lopez-Ferrer, Daniel; Qian, Wei-Jun; Smith, Richard D.; Adkins, Joshua N.
2013-08-01
Ubiquitination is an abundant post-translational modification that consists of covalent attachment of ubiquitin to lysine residues or the N-terminus of proteins. Mono- and polyubiquitination have been shown to be involved in many critical eukaryotic cellular functions and are often disrupted by intracellular bacterial pathogens. Affinity enrichment of ubiquitinated proteins enables global analysis of this key modification. In this context, the use of ubiquitin-binding domains is a promising but relatively unexplored alternative to more broadly used immunoaffinity or tagged affinity enrichment methods. In this study, we evaluated the application of eight ubiquitin-binding domains that have differing affinities for ubiquitination states. Small-scale proteomics analysis identified ~200 ubiquitinated protein candidates per ubiquitin-binding domain pull-down experiment. Results from subsequent Western blot analyses that employed anti-ubiquitin or monoclonal antibodies against polyubiquitination at lysine 48 and 63 suggest that ubiquitin-binding domains from Dsk2 and ubiquilin-1 have the broadest specificity in that they captured most types of ubiquitination, whereas the binding domain from NBR1 was more selective to polyubiquitination. These data demonstrate that with optimized purification conditions, ubiquitin-binding domains can be an alternative tool for proteomic applications. This approach is especially promising for the analysis of tissues or cells resistant to transfection, of which the overexpression of tagged ubiquitin is a major hurdle.
Necessary and sufficient condition for the realization of the complex wavelet
NASA Astrophysics Data System (ADS)
Keita, Alpha; Qing, Qianqin; Wang, Nengchao
1997-04-01
Wavelet theory is a relatively new signal analysis theory, and its appearance has attracted experts in many different fields to study it in depth. The wavelet transform is a new kind of time-frequency analysis method with localization that can be realized in either the time domain or the frequency domain. It has many desirable characteristics that other kinds of time-frequency analysis, such as the Gabor transform or the Wigner distribution, do not share: for example, orthogonality, direction selectivity, a variable time-frequency resolution ratio, adjustable local support, and compact representation of data. All of the above make the wavelet transform a very important new tool and method in the signal analysis field. Because the calculation of the complex wavelet is very difficult, the real wavelet function is used in applications. In this paper, we present a necessary and sufficient condition under which the real wavelet function can be obtained from the complex wavelet function. This theorem has significant theoretical value. The paper prepares its technique from the Hartley transformation and then constructs the complex wavelet from it. Hartley was a signal engineering expert; his Hartley transformation had been overlooked for about 40 years, because the technical conditions of that time could not demonstrate its superiority. Only toward the end of the 1970s and the early 1980s, after the development of the fast Fourier transform algorithm and sufficient progress in hardware implementation, did this transform, whose forward and inverse transformations are essentially the same, come to be taken seriously. The W transformation, proposed by Zhongde Wang, pushed forward the study of the Hartley transformation and its fast algorithm. The kernel function of the Hartley transformation.
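Since the abstract above builds on the Hartley transform, a short sketch may help: the discrete Hartley transform uses the real kernel cas(θ) = cos θ + sin θ and can be computed from the FFT as the real part minus the imaginary part; up to a 1/N factor it is its own inverse. The test signal is arbitrary.

```python
# Sketch: discrete Hartley transform, whose kernel is
# cas(theta) = cos(theta) + sin(theta); it is real-valued and can be
# obtained from the FFT as Re(X) - Im(X). The test signal is arbitrary.
import numpy as np

def dht(x):
    X = np.fft.fft(x)
    return X.real - X.imag            # H[k] = sum_n x[n] * cas(2*pi*k*n/N)

def idht(H):
    N = H.size
    return dht(H) / N                 # the DHT is (up to 1/N) its own inverse

x = np.random.default_rng(0).standard_normal(16)
H = dht(x)
print(np.allclose(idht(H), x))        # True
```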
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)
1993-01-01
Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.
Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly
2016-01-01
This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time saving in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which are calculated as in the previous method. Generally, a smaller number of arithmetic operations, which results in a shorter simulation time, is desired. The alternating direction implicit technique can be considered a significant step forward for improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.
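For context, ADI-FDTD and LOD-FDTD are implicit variants of the conventional explicit FDTD scheme, which advances the E and H fields on a staggered (Yee) grid. A minimal one-dimensional vacuum sketch in normalized units (illustrative only; grid size, step count, and source are assumed values unrelated to the paper's sensor problem):

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=300, src_cell=50):
    """Explicit 1D FDTD (Yee) update in normalized units with Courant number 1."""
    ez = np.zeros(n_cells)        # E-field at integer grid points
    hy = np.zeros(n_cells - 1)    # H-field at half-integer grid points
    for t in range(n_steps):
        hy += ez[1:] - ez[:-1]                               # update H from the curl of E
        ez[1:-1] += hy[1:] - hy[:-1]                         # update E from the curl of H
        ez[src_cell] += np.exp(-0.5 * ((t - 60) / 15) ** 2)  # additive Gaussian pulse source
    return ez

fields = fdtd_1d()
```

Implicit schemes such as ADI- and LOD-FDTD replace these fully explicit updates with sub-steps that each require a tridiagonal solve, which is what removes the Courant stability limit at the cost of extra arithmetic per step.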
Phylogenetic analysis of the envelope protein (domain lll) of dengue 4 viruses
Mota, Javier; Ramos-Castañeda, José; Rico-Hesse, Rebeca; Ramos, Celso
2011-01-01
Objective To evaluate the genetic variability of domain III of the envelope (E) protein and to estimate phylogenetic relationships of dengue 4 (Den-4) viruses isolated in Mexico and from other endemic areas of the world. Material and Methods A phylogenetic study of domain III of the envelope (E) protein of Den-4 viruses was conducted in 1998 using virus strains from Mexico and other parts of the world, isolated in different years. Specific primers were used to amplify domain III by RT-PCR and to obtain its nucleotide sequence. Based on the nucleotide and deduced amino acid sequences, genetic variability was estimated and a phylogenetic tree was generated. To simplify the genetic analysis of the domain III region, a Restriction Fragment Length Polymorphism (RFLP) assay was performed using six restriction enzymes. Results Study results demonstrate that nucleotide and amino acid sequence analyses of domain III are similar to those reported for the complete E protein gene. Based on the RFLP analysis of domain III using the restriction enzymes Nla III, Dde I and Cfo I, the Den-4 viruses included in this study were clustered into the previously reported genotypes 1 and 2. Conclusions Study results suggest that domain III may be used as a genetic marker for phylogenetic and molecular epidemiology studies of dengue viruses. The English version of this paper is also available at: http://www.insp.mx/salud/index.html PMID:12132320
NASA Astrophysics Data System (ADS)
Ogura, Kenji; Okamura, Hideyasu
2013-10-01
Growth factor receptor-bound protein 2 (Grb2) is a small adapter protein composed of a single SH2 domain flanked by two SH3 domains. The N-terminal SH3 (nSH3) domain of Grb2 binds a proline-rich region present in the guanine nucleotide releasing factor, son of sevenless (Sos). Using NMR relaxation dispersion and chemical shift analysis methods, we investigated the conformational change of the Sos-derived proline-rich peptide during the transition between the free and Grb2 nSH3-bound states. The chemical shift analysis revealed that the peptide does not present a fully random conformation but has a relatively rigid structure. The relaxation dispersion analysis detected conformational exchange of several residues of the peptide upon binding to Grb2 nSH3.
NASA Astrophysics Data System (ADS)
Ramezanzadeh, B.; Arman, S. Y.; Mehdipour, M.; Markhali, B. P.
2014-01-01
In this study, the corrosion inhibition properties of two similar heterocyclic compounds, namely benzotriazole (BTA) and benzothiazole (BNS), as inhibitors of copper in 1.0 M H2SO4 solution were studied by electrochemical techniques as well as surface analysis. The results showed that the corrosion inhibition of copper largely depends on the molecular structure and concentration of the inhibitors. The effect of the DC trend on the interpretation of electrochemical noise (ECN) results in the time domain was evaluated by the moving average removal (MAR) method. Accordingly, the impact of square and Hanning window functions as drift removal methods in the frequency domain was studied. After DC trend removal, good agreement was observed between the electrochemical noise (ECN) data and the results obtained from EIS and potentiodynamic polarization. Furthermore, shot noise theory in the frequency domain was applied to estimate the charge of each electrochemical event (q) from the potential and current noise signals.
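The moving average removal (MAR) trend-removal step described above can be sketched as follows; the window length below is an assumed parameter, not taken from the paper:

```python
import numpy as np

def remove_dc_trend_mar(signal, window=101):
    """Remove DC drift from an electrochemical noise record by subtracting
    a centred moving average (moving average removal, MAR).

    window : number of samples in the moving average (assumed value;
             tune to the sampling rate and the drift time scale).
    """
    kernel = np.ones(window) / window
    # mode="same" keeps the record length; edges are only partially averaged
    trend = np.convolve(signal, kernel, mode="same")
    return signal - trend

# usage: detrend the potential noise before computing noise resistance or PSDs
# v_detrended = remove_dc_trend_mar(v_raw, window=201)
```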
Error analysis of multipoint flux domain decomposition methods for evolutionary diffusion problems
NASA Astrophysics Data System (ADS)
Arrarás, A.; Portero, L.; Yotov, I.
2014-01-01
We study space and time discretizations for mixed formulations of parabolic problems. The spatial approximation is based on the multipoint flux mixed finite element method, which reduces to an efficient cell-centered pressure system on general grids, including triangles, quadrilaterals, tetrahedra, and hexahedra. The time integration is performed by using a domain decomposition time-splitting technique combined with multiterm fractional step diagonally implicit Runge-Kutta methods. The resulting scheme is unconditionally stable and computationally efficient, as it reduces the global system to a collection of uncoupled subdomain problems that can be solved in parallel without the need for Schwarz-type iteration. Convergence analysis for both the semidiscrete and fully discrete schemes is presented.
A cross-domain communication resource scheduling method for grid-enabled communication networks
NASA Astrophysics Data System (ADS)
Zheng, Xiangquan; Wen, Xiang; Zhang, Yongding
2011-10-01
To support a wide range of different grid applications in environments where various heterogeneous communication networks coexist, it is important to enable advanced capabilities for on-demand, dynamic integration and efficient co-sharing of cross-domain heterogeneous communication resources, thus providing communication services that no single communication resource could afford on its own. Based on plug-and-play co-sharing and soft integration of communication resources, a grid-enabled communication network is flexibly built up to provide on-demand communication services for grid applications with various quality-of-service requirements. Based on an analysis of joint job and communication resource scheduling in grid-enabled communication networks (GECNs), this paper presents a cooperative cross-multi-domain communication resource scheduling method and describes its main processes, such as traffic requirement resolution for communication services, cross-multi-domain negotiation on communication resources, and on-demand communication resource scheduling. The presented method provides communication service capability for cross-domain traffic delivery in GECNs. Further research towards validation and implementation of the presented method is outlined at the end.
Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong
2017-12-01
Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
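As a rough illustration of DE-based wavelength selection (not the authors' exact implementation), one can let each gene weight a frequency point, threshold the weights to obtain a subset, and score the subset by the fit error of a simple linear calibration model; the data, threshold, and DE settings below are all assumptions for the sketch:

```python
import numpy as np
from scipy.optimize import differential_evolution

def make_fitness(A, c, threshold=0.5):
    """A: absorbance matrix (samples x frequency points), c: known concentrations.
    Fitness = RMS error of a linear calibration model restricted to the
    frequency points whose gene exceeds the threshold."""
    def fitness(w):
        mask = w > threshold
        if mask.sum() < 2:
            return 1e6                       # penalize degenerate selections
        X = A[:, mask]
        coef, *_ = np.linalg.lstsq(X, c, rcond=None)
        return float(np.sqrt(np.mean((c - X @ coef) ** 2)))
    return fitness

# toy calibration data standing in for THz-TDS spectra of binary mixtures
rng = np.random.default_rng(1)
A = rng.random((30, 80))
c = 0.7 * A[:, 10] + 0.3 * A[:, 55] + 0.01 * rng.standard_normal(30)

result = differential_evolution(make_fitness(A, c), bounds=[(0.0, 1.0)] * A.shape[1],
                                maxiter=40, seed=1, polish=False)
selected_points = np.where(result.x > 0.5)[0]
```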
Testing for Granger Causality in the Frequency Domain: A Phase Resampling Method.
Liu, Siwei; Molenaar, Peter
2016-01-01
This article introduces phase resampling, an existing but rarely used surrogate data method for making statistical inferences of Granger causality in frequency domain time series analysis. Granger causality testing is essential for establishing causal relations among variables in multivariate dynamic processes. However, testing for Granger causality in the frequency domain is challenging due to the nonlinear relation between frequency domain measures (e.g., partial directed coherence, generalized partial directed coherence) and time domain data. Through a simulation study, we demonstrate that phase resampling is a general and robust method for making statistical inferences even with short time series. With Gaussian data, phase resampling yields satisfactory type I and type II error rates in all but one condition we examine: when a small effect size is combined with an insufficient number of data points. Violations of normality lead to slightly higher error rates but are mostly within acceptable ranges. We illustrate the utility of phase resampling with two empirical examples involving multivariate electroencephalography (EEG) and skin conductance data.
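The core of the phase resampling procedure is to build surrogate series that preserve each series' power spectrum while randomizing its phases, which destroys any Granger-causal structure under the null hypothesis. A minimal univariate sketch (the actual test applies this to the multivariate series and recomputes the frequency-domain causality measure for each surrogate; `causality_measure` below is a placeholder):

```python
import numpy as np

def phase_resample(x, rng=None):
    """Return a surrogate with the same amplitude spectrum as x but random phases."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=X.shape)
    phases[0] = 0.0                 # keep the DC term real
    if len(x) % 2 == 0:
        phases[-1] = 0.0            # keep the Nyquist term real for even lengths
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

# null distribution of a causality statistic (causality_measure is a placeholder):
# null = [causality_measure(phase_resample(x), y) for _ in range(500)]
```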
Evaluation of Time Domain EM Coupling Techniques. Volume II.
1980-08-01
...tool for the analysis of electromagnetic coupling and shielding problems: the finite-difference, time-domain (FD-TD) solution of Maxwell's equations... The objective of the program was to evaluate the suitability of the FD-TD method to determine the amount of electromagnetic coupling through an... Specific questions were addressed during this program: 1. Can the FD-TD method accurately model electromagnetic coupling into a conducting structure for...
Huang, Yue; Zheng, Han; Liu, Chi; Ding, Xinghao; Rohde, Gustavo K
2017-11-01
Epithelium-stroma classification is a necessary preprocessing step in histopathological image analysis. Current deep learning based recognition methods for histology data require collection of large volumes of labeled data in order to train a new neural network when there are changes to the image acquisition procedure. However, it is extremely expensive for pathologists to manually label sufficient volumes of data for each pathology study in a professional manner, which limits real-world applications. A very simple but effective deep learning method, which introduces the concept of unsupervised domain adaptation into a simple convolutional neural network (CNN), is proposed in this paper. Inspired by transfer learning, our paper assumes that the training data and testing data follow different distributions, and an adaptation operation is used to more accurately estimate the kernels of the CNN during feature extraction, in order to enhance performance by transferring knowledge from labeled data in the source domain to unlabeled data in the target domain. The model has been evaluated using three independent public epithelium-stroma datasets by cross-dataset validations. The experimental results demonstrate that for epithelium-stroma classification, the proposed framework outperforms the state-of-the-art deep neural network model, and it also achieves better performance than other existing deep domain adaptation methods. The proposed model can be considered a better option for real-world applications in histopathological image analysis, since there is no longer a requirement for large-scale labeled data in each specified domain.
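The paper's adaptation operation is specific to its CNN; as a generic illustration of the underlying idea of aligning source and target feature statistics, here is a simple correlation-alignment (CORAL-style) sketch in numpy. This is a different and much simpler technique than the one proposed in the paper, shown only to convey the flavor of unsupervised domain adaptation on extracted features:

```python
import numpy as np

def coral_align(source_feats, target_feats, eps=1e-6):
    """Re-color source features so their covariance matches the target domain
    (CORAL-style alignment; Sun et al., 2016).

    source_feats, target_feats : (n_samples, n_features) arrays of extracted features.
    Returns the transformed source features.
    """
    cs = np.cov(source_feats, rowvar=False) + eps * np.eye(source_feats.shape[1])
    ct = np.cov(target_feats, rowvar=False) + eps * np.eye(target_feats.shape[1])
    # whiten with the source covariance, then re-color with the target covariance
    whiten = np.linalg.inv(np.linalg.cholesky(cs)).T
    recolor = np.linalg.cholesky(ct).T
    centered = source_feats - source_feats.mean(axis=0)
    return centered @ whiten @ recolor + target_feats.mean(axis=0)
```

A classifier trained on the aligned source features can then be applied to unlabeled target-domain features without target labels.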
[Formula: see text] regularity properties of singular parameterizations in isogeometric analysis.
Takacs, T; Jüttler, B
2012-11-01
Isogeometric analysis (IGA) is a numerical simulation method which is directly based on the NURBS-based representation of CAD models. It exploits the tensor-product structure of 2- or 3-dimensional NURBS objects to parameterize the physical domain. Hence the physical domain is parameterized with respect to a rectangle or to a cube. Consequently, singularly parameterized NURBS surfaces and NURBS volumes are needed in order to represent non-quadrangular or non-hexahedral domains without splitting, thereby producing a very compact and convenient representation. The Galerkin projection introduces finite-dimensional spaces of test functions in the weak formulation of partial differential equations. In particular, the test functions used in isogeometric analysis are obtained by composing the inverse of the domain parameterization with the NURBS basis functions. In the case of singular parameterizations, however, some of the resulting test functions do not necessarily fulfill the required regularity properties. Consequently, numerical methods for the solution of partial differential equations cannot be applied properly. We discuss the regularity properties of the test functions. For one- and two-dimensional domains we consider several important classes of singularities of NURBS parameterizations. For specific cases we derive additional conditions which guarantee the regularity of the test functions. In addition we present a modification scheme for the discretized function space in case of insufficient regularity. It is also shown how these results can be applied for computational domains in higher dimensions that can be parameterized via sweeping.
Poisson-Gaussian Noise Analysis and Estimation for Low-Dose X-ray Images in the NSCT Domain.
Lee, Sangyoon; Lee, Min Seok; Kang, Moon Gi
2018-03-29
The noise distribution of images obtained by X-ray sensors in low-dosage situations can be analyzed using the Poisson and Gaussian mixture model. Multiscale conversion is one of the most popular noise reduction methods used in recent years. Estimation of the noise distribution of each subband in the multiscale domain is the most important factor in performing noise reduction, with non-subsampled contourlet transform (NSCT) representing an effective method for scale and direction decomposition. In this study, we use artificially generated noise to analyze and estimate the Poisson-Gaussian noise of low-dose X-ray images in the NSCT domain. The noise distribution of the subband coefficients is analyzed using the noiseless low-band coefficients and the variance of the noisy subband coefficients. The noise-after-transform also follows a Poisson-Gaussian distribution, and the relationship between the noise parameters of the subband and the full-band image is identified. We then analyze noise of actual images to validate the theoretical analysis. Comparison of the proposed noise estimation method with an existing noise reduction method confirms that the proposed method outperforms traditional methods.
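The Poisson-Gaussian model underlying this analysis can be written y = a*Poisson(x/a) + N(0, sigma^2), so the conditional noise variance is affine in the signal: var(y|x) = a*x + sigma^2. A small sketch of generating such noise and recovering (a, sigma^2) by a variance-versus-mean fit (illustrative only; the paper estimates the parameters subband-wise in the NSCT domain, and the numbers below are assumptions):

```python
import numpy as np

def add_poisson_gaussian(x, a=2.0, sigma=5.0, rng=None):
    """y = a*Poisson(x/a) + N(0, sigma^2), so var(y|x) = a*x + sigma^2."""
    rng = np.random.default_rng() if rng is None else rng
    return a * rng.poisson(x / a) + rng.normal(0.0, sigma, size=x.shape)

def estimate_params(clean, noisy, n_bins=50):
    """Recover (a, sigma^2) from the affine relation between local mean and variance."""
    bins = np.quantile(clean, np.linspace(0, 1, n_bins + 1))
    idx = np.digitize(clean, bins[1:-1])
    means = np.array([clean[idx == k].mean() for k in range(n_bins)])
    variances = np.array([np.var(noisy[idx == k] - clean[idx == k]) for k in range(n_bins)])
    a_hat, sigma2_hat = np.polyfit(means, variances, 1)   # var = a*mean + sigma^2
    return a_hat, sigma2_hat

rng = np.random.default_rng(0)
clean = rng.uniform(10, 200, size=100_000)
noisy = add_poisson_gaussian(clean, rng=rng)
a_hat, sigma2_hat = estimate_params(clean, noisy)   # roughly 2.0 and 25.0
```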
C3 Domain Analysis, Lessons Learned
1993-09-30
...organize the domain. This approach is heavily based on the principles of library science and is geared toward a reuse effort with a large library-like... The method adapts many principles from library science to the organization and implementation of a reuse library.
NASA Technical Reports Server (NTRS)
Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.
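As a toy illustration of the time-domain-to-fluence-domain transformation (all numbers below are assumed values, not from the abstract): with a constant per-device upset cross-section, reliability can be expressed directly over accumulated particle fluence, and an orbit-averaged flux converts mission time into fluence.

```python
import numpy as np

sigma = 1e-12        # assumed SEU cross-section per device [cm^2]
flux = 1e4           # assumed orbit-averaged particle flux [particles / (cm^2 * s)]

def reliability_vs_fluence(phi):
    """Poisson-upset reliability in the fluence domain: R(Phi) = exp(-sigma * Phi)."""
    return np.exp(-sigma * phi)

mission_time_s = 3 * 365 * 24 * 3600          # a three-year mission
mission_fluence = flux * mission_time_s       # time domain -> fluence domain
print(reliability_vs_fluence(mission_fluence))  # probability of zero upsets over the mission
```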
Frias, A.E.; Schabel, M.C.; Roberts, V.H.J.; Tudorica, A.; Grigsby, P.L.; Oh, K.Y.; Kroenke, C. D.
2015-01-01
Purpose The maternal microvasculature of the primate placenta is organized into 10-20 perfusion domains that are functionally optimized to facilitate nutrient exchange to support fetal growth. This study describes a dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) method for identifying vascular domains and quantifying maternal blood flow in them. Methods A rhesus macaque on the 133rd day of pregnancy (G133, term=165 days) underwent Doppler ultrasound (US) procedures, DCE-MRI, and Cesarean-section delivery. Serial T1-weighted images acquired throughout intravenous injection of a contrast reagent (CR) bolus were analyzed to obtain CR arrival time maps of the placenta. Results Watershed segmentation of the arrival time map identified 16 perfusion domains. The number and location of these domains corresponded to anatomical cotyledonary units observed following delivery. Analysis of the CR wave front through each perfusion domain enabled determination of volumetric flow, which ranged from 9.03 to 44.9 mL/sec (25.2 ± 10.3 mL/sec). These estimates are supported by Doppler US results. Conclusions The DCE-MRI analysis described here provides quantitative estimates of the number of maternal perfusion domains in a primate placenta, and estimates flow within each domain. Anticipated extensions of this technique are to the study of placental function in nonhuman primate models of obstetric complications. PMID:24753177
Metabolomics method to comprehensively analyze amino acids in different domains.
Gu, Haiwei; Du, Jianhai; Carnevale Neto, Fausto; Carroll, Patrick A; Turner, Sally J; Chiorean, E Gabriela; Eisenman, Robert N; Raftery, Daniel
2015-04-21
Amino acids play essential roles in both metabolism and the proteome. Many studies have profiled free amino acids (FAAs) or proteins; however, few have connected the measurement of FAA with individual amino acids in the proteome. In this study, we developed a metabolomics method to comprehensively analyze amino acids in different domains, using two examples of different sample types and disease models. We first examined the responses of FAAs and insoluble-proteome amino acids (IPAAs) to the Myc oncogene in Tet21N human neuroblastoma cells. The metabolic and proteomic amino acid profiles were quite different, even under the same Myc condition, and their combination provided a better understanding of the biological status. In addition, amino acids were measured in 3 domains (FAAs, free and soluble-proteome amino acids (FSPAAs), and IPAAs) to study changes in serum amino acid profiles related to colon cancer. A penalized logistic regression model based on the amino acids from the three domains had better sensitivity and specificity than that from each individual domain. To the best of our knowledge, this is the first study to perform a combined analysis of amino acids in different domains, and indicates the useful biological information available from a metabolomics analysis of the protein pellet. This study lays the foundation for further quantitative tracking of the distribution of amino acids in different domains, with opportunities for better diagnosis and mechanistic studies of various diseases.
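A minimal sketch of the kind of penalized logistic regression used to combine amino acid measurements from the three domains (all arrays and labels below are hypothetical placeholders; the paper's actual model, penalty, and tuning are not reproduced):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# hypothetical measurements: rows = serum samples, columns = amino acids per domain
rng = np.random.default_rng(0)
faa, fspaa, ipaa = (rng.standard_normal((60, 20)) for _ in range(3))
X = np.hstack([faa, fspaa, ipaa])       # concatenate the three amino acid domains
y = rng.integers(0, 2, size=60)         # 1 = case, 0 = control (placeholder labels)

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear", C=0.5))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
```

The L1 penalty keeps only a sparse subset of amino acids, which is one way a combined three-domain model can improve sensitivity and specificity over any single domain.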
Fourier/Chebyshev methods for the incompressible Navier-Stokes equations in finite domains
NASA Technical Reports Server (NTRS)
Corral, Roque; Jimenez, Javier
1992-01-01
A fully spectral numerical scheme is presented for the incompressible Navier-Stokes equations in domains which are infinite or semi-infinite in one dimension. The domain is not mapped, and standard Fourier or Chebyshev expansions can be used. The handling of the infinite domain does not introduce any significant overhead. The scheme assumes that the vorticity in the flow is essentially concentrated in a finite region, which is represented numerically by standard spectral collocation methods. To accommodate the slow exponential decay of the velocities at infinity, extra expansion functions are introduced, which are handled analytically. A detailed error analysis is presented, and two applications to Direct Numerical Simulation of turbulent flows are discussed in relation to the numerical performance of the scheme.
A fictitious domain approach for the Stokes problem based on the extended finite element method
NASA Astrophysics Data System (ADS)
Court, Sébastien; Fournié, Michel; Lozinski, Alexei
2014-01-01
In the present work, we propose to extend to the Stokes problem a fictitious domain approach inspired by the eXtended Finite Element Method and studied for the Poisson problem in [Renard]. The method allows computations in domains whose boundaries do not match the mesh. A mixed finite element method is used for the fluid flow. The interface between the fluid and the structure is localized by a level-set function. Dirichlet boundary conditions are taken into account using a Lagrange multiplier. A stabilization term is introduced to improve the approximation of the normal trace of the Cauchy stress tensor at the interface and to avoid the inf-sup condition between the spaces for the velocity and the Lagrange multiplier. A convergence analysis is given and several numerical tests are performed to illustrate the capabilities of the method.
Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.
1985-01-01
Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1989-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The total and product integrals consist of the sum of an area (domain) integral and line integrals on the crack faces. The line integrals vanish only when the crack faces are traction free and the loading is either pure mode 1 or pure mode 2 or a combination of both with only the square-root singular term in the stress field. The EDI method gave accurate values of the J-integrals for two mode 1 and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all problems analyzed. When applied to a problem of an interface crack between two different materials, the EDI method showed that the mode 1 and mode 2 components are domain dependent while the total integral is not. This behavior is caused by the presence of the oscillatory part of the singularity in bimaterial crack problems. The EDI method thus shows behavior similar to the virtual crack closure method for bimaterial problems.
A frequency domain analysis of respiratory variations in the seismocardiogram signal.
Pandia, Keya; Inan, Omer T; Kovacs, Gregory T A
2013-01-01
The seismocardiogram (SCG) signal, traditionally measured using a chest-mounted accelerometer, contains low-frequency (0-100 Hz) cardiac vibrations that can be used to derive diagnostically relevant information about cardiovascular and cardiopulmonary health. This work is aimed at investigating the effects of respiration on the frequency domain characteristics of SCG signals measured from 18 healthy subjects. Toward this end, the 0-100 Hz SCG signal bandwidth of interest was sub-divided into 5 Hz and 10 Hz frequency bins to compare the spectral energy in corresponding frequency bins of the SCG signal measured during three key conditions of respiration: inspiration, expiration, and apnea. Statistically significant differences were observed between the power in ensemble averaged inspiratory and expiratory SCG beats and between ensemble averaged inspiratory and apneic beats across the 18 subjects for multiple frequency bins in the 10-40 Hz frequency range. Accordingly, the spectral analysis methods described in this paper could provide complementary and improved classification of respiratory modulations in the SCG signal over and above time-domain SCG analysis methods.
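The binned spectral comparison described above can be sketched with Welch power spectral density estimation; the sampling rate, segment length, and bin width below are assumed values for illustration:

```python
import numpy as np
from scipy.signal import welch

def band_powers(scg_beat, fs=1000.0, bin_hz=5.0, f_max=100.0):
    """Integrate the Welch PSD of an (ensemble-averaged) SCG beat into
    contiguous frequency bins of width bin_hz between 0 and f_max Hz."""
    freqs, psd = welch(scg_beat, fs=fs, nperseg=min(len(scg_beat), 512))
    df = freqs[1] - freqs[0]
    edges = np.arange(0.0, f_max + bin_hz, bin_hz)
    powers = np.array([psd[(freqs >= lo) & (freqs < hi)].sum() * df
                       for lo, hi in zip(edges[:-1], edges[1:])])
    return edges[:-1], powers

# compare the binned powers of inspiratory, expiratory, and apneic ensemble-averaged beats
```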
Kernel Manifold Alignment for Domain Adaptation.
Tuia, Devis; Camps-Valls, Gustau
2016-01-01
The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensor characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of the same building). We call these different acquisition modes domains, and refer to the adaptation problem as domain adaptation. In this paper, instead of adapting the trained models themselves, we alternatively focus on finding mappings of the data sources into a common, semantically meaningful, representation domain. This field of manifold alignment extends traditional techniques in statistics such as canonical correlation analysis (CCA) to deal with nonlinear adaptation and possibly non-corresponding data pairs between the domains. We introduce a kernel method for manifold alignment (KEMA) that can match an arbitrary number of data sources without needing corresponding pairs, just a few labeled examples in all domains. KEMA has interesting properties: 1) it generalizes other manifold alignment methods, 2) it can align manifolds of very different complexities, performing a discriminative alignment preserving each manifold's inner structure, 3) it can define a domain-specific metric to cope with multimodal specificities, 4) it can align data spaces of different dimensionality, 5) it is robust to strong nonlinear feature deformations, and 6) it is closed-form invertible, which allows transfer across domains and data synthesis. To the authors' knowledge this is the first method addressing all these important issues at once. We also present a reduced-rank version of KEMA for computational efficiency, and discuss the generalization performance of KEMA under Rademacher principles of stability. Aligning multimodal data with KEMA yields outstanding benefits when used as a data pre-conditioning step in the standard data analysis processing chain. KEMA exhibits very good performance over competing methods in synthetic controlled examples, visual object recognition, and facial expression recognition tasks. KEMA is especially well-suited to deal with high-dimensional problems, such as images and videos, and under complicated distortions, twists and warpings of the data manifolds. A fully functional toolbox is available at https://github.com/dtuia/KEMA.git.
Azari, Nadia; Soleimani, Farin; Vameghi, Roshanak; Sajedi, Firoozeh; Shahshahani, Soheila; Karimi, Hossein; Kraskian, Adis; Shahrokhi, Amin; Teymouri, Robab; Gharib, Masoud
2017-01-01
The Bayley Scales of Infant and Toddler Development is a well-known diagnostic developmental assessment tool for children aged 1-42 months. Our aim was to investigate the validity and reliability of this scale in Persian-speaking children. The method was descriptive-analytic. Translation, back-translation, and cultural adaptation were performed. Content and face validity of the translated scale were determined by experts' opinions. Overall, 403 children aged 1 to 42 months were recruited from health centers in Tehran during 2013-2014 for developmental assessment in the cognitive, communicative (receptive and expressive) and motor (fine and gross) domains. Reliability of the scale was calculated through three methods: internal consistency using Cronbach's alpha coefficient, test-retest, and inter-rater methods. Construct validity was assessed using factor analysis and comparison of mean scores. Cultural and linguistic changes were made to items in all domains, especially on the communication subscale. Content and face validity of the test were approved by experts' opinions. Cronbach's alpha coefficient was above 0.74 in all domains. Pearson correlation coefficients in the various domains were ≥0.982 for the test-retest method and ≥0.993 for the inter-rater method. Construct validity of the test was supported by factor analysis. Moreover, the mean scores for the different age groups were compared, and statistically significant differences were observed between the mean scores of different age groups, which confirms the validity of the test. The Bayley Scales of Infant and Toddler Development is a valid and reliable tool for developmental assessment of Persian-language children.
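For reference, the internal-consistency coefficient reported above (Cronbach's alpha) is computed from the item variances and the variance of the total score. A small sketch with a hypothetical item-score matrix (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_children, n_items) matrix of item scores for one domain."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

scores = np.random.default_rng(0).integers(0, 3, size=(100, 30))  # hypothetical item scores
alpha = cronbach_alpha(scores)
```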
NASA Astrophysics Data System (ADS)
Böbel, A.; Knapek, C. A.; Räth, C.
2018-05-01
Experiments of the recrystallization processes in two-dimensional complex plasmas are analyzed to rigorously test a recently developed scale-free phase transition theory. The "fractal-domain-structure" (FDS) theory is based on the kinetic theory of Frenkel. It assumes the formation of homogeneous domains, separated by defect lines, during crystallization and a fractal relationship between domain area and boundary length. For the defect number fraction and system energy a scale-free power-law relation is predicted. The long-range scaling behavior of the bond-order correlation function shows clearly that the complex plasma phase transitions are not of the Kosterlitz, Thouless, Halperin, Nelson, and Young type. Previous preliminary results obtained by counting the number of dislocations and applying a bond-order metric for structural analysis are reproduced. These findings are supplemented by extending the use of the bond-order metric to measure the defect number fraction and furthermore applying state-of-the-art analysis methods, allowing a systematic testing of the FDS theory with unprecedented scrutiny: A morphological analysis of lattice structure is performed via Minkowski tensor methods. Minkowski tensors form a complete family of additive, motion covariant and continuous morphological measures that are sensitive to nonlinear properties. The FDS theory is rigorously confirmed and predictions of the theory are reproduced extremely well. The predicted scale-free power-law relation between defect fraction number and system energy is verified for one more order of magnitude at high energies compared to the inherently discontinuous bond-order metric. It is found that the fractal relation between crystalline domain area and circumference is independent of the experiment, the particular Minkowski tensor method, and the particular choice of parameters. Thus, the fractal relationship seems to be inherent to two-dimensional phase transitions in complex plasmas. Minkowski tensor analysis turns out to be a powerful tool for investigations of crystallization processes. It is capable of revealing nonlinear local topological properties, however, still provides easily interpretable results founded on a solid mathematical framework.
Multi-scale Slip Inversion Based on Simultaneous Spatial and Temporal Domain Wavelet Transform
NASA Astrophysics Data System (ADS)
Liu, W.; Yao, H.; Yang, H. Y.
2017-12-01
Finite fault inversion is a widely used method to study earthquake rupture processes. Some previous studies have proposed different methods to implement finite fault inversion, including time-domain, frequency-domain, and wavelet-domain methods. Many previous studies have found that different frequency bands show different characteristics of the seismic rupture (e.g., Wang and Mori, 2011; Yao et al., 2011, 2013; Uchide et al., 2013; Yin et al., 2017). Generally, lower frequency waveforms correspond to larger-scale rupture characteristics while higher frequency data are representative of smaller-scale ones. Therefore, multi-scale analysis can help us understand the earthquake rupture process thoroughly from larger scale to smaller scale. By the use of the wavelet transform, the wavelet-domain methods can analyze both the time and frequency information of signals at different scales. Traditional wavelet-domain methods (e.g., Ji et al., 2002) implement finite fault inversion with both lower and higher frequency signals together to recover larger-scale and smaller-scale characteristics of the rupture process simultaneously. Here we propose an alternative strategy with a two-step procedure, i.e., firstly constraining the larger-scale characteristics with lower frequency signals, and then resolving the smaller-scale ones with higher frequency signals. We have designed some synthetic tests to test our strategy and compare it with the traditional one. We have also applied our strategy to study the 2015 Gorkha Nepal earthquake using tele-seismic waveforms. Both the traditional method and our two-step strategy only analyze the data at different temporal scales (i.e., different frequency bands), while the spatial distribution of model parameters also shows multi-scale characteristics. A more sophisticated strategy is to transfer the slip model into different spatial scales, and then analyze the smooth slip distribution (larger scales) with lower frequency data first and the more detailed slip distribution (smaller scales) with higher frequency data subsequently. We are now implementing the slip inversion using both spatial and temporal domain wavelets. This multi-scale analysis can help us better understand frequency-dependent rupture characteristics of large earthquakes.
EEG Sleep Stages Classification Based on Time Domain Features and Structural Graph Similarity.
Diykh, Mohammed; Li, Yan; Wen, Peng
2016-11-01
Electroencephalogram (EEG) signals are commonly used in diagnosing and treating sleep disorders. Many existing methods for sleep stage classification mainly depend on the analysis of EEG signals in the time or frequency domain to obtain a high classification accuracy. In this paper, statistical features in the time domain, structural graph similarity and K-means clustering (SGSKM) are combined to identify six sleep stages using single-channel EEG signals. Firstly, each EEG segment is partitioned into sub-segments; the size of a sub-segment is determined empirically. Secondly, statistical features are extracted, sorted into different sets of features and forwarded to the SGSKM to classify EEG sleep stages. We have also investigated the relationships between sleep stages and the time domain features of the EEG data used in this paper. The experimental results show that the proposed method yields better classification results than four other existing methods and the support vector machine (SVM) classifier. A 95.93% average classification accuracy is achieved using the proposed method.
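A sketch of the per-sub-segment time-domain statistical features such a pipeline typically extracts (the exact feature set and sub-segment size of the paper are not reproduced; the features below are common choices used for illustration):

```python
import numpy as np
from scipy.stats import kurtosis, skew

def time_domain_features(eeg_segment, n_sub=10):
    """Split one EEG epoch into n_sub sub-segments and compute simple
    statistical features for each: mean, std, skewness, kurtosis, peak-to-peak."""
    feats = []
    for sub in np.array_split(eeg_segment, n_sub):
        feats.extend([sub.mean(), sub.std(), skew(sub), kurtosis(sub), np.ptp(sub)])
    return np.array(feats)

# feature vectors from many epochs would then be fed to the classifier
# (structural graph similarity + K-means in the paper)
```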
Jahandideh, Samad; Srinivasasainagendra, Vinodh; Zhi, Degui
2012-11-07
RNA-protein interaction plays an important role in various cellular processes, such as protein synthesis, gene regulation, post-transcriptional gene regulation, alternative splicing, and infections by RNA viruses. In this study, using the Gene Ontology Annotated (GOA) and Structural Classification of Proteins (SCOP) databases, an automatic procedure was designed to capture structurally solved RNA-binding protein domains in different subclasses. Subsequently, we applied a tuned multi-class SVM (TMCSVM), Random Forest (RF), and multi-class ℓ1/ℓq-regularized logistic regression (MCRLR) to analyze and classify RNA-binding protein domains based on a comprehensive set of sequence and structural features. We then compared the prediction accuracy of these three state-of-the-art predictor methods. From our results, TMCSVM outperforms the other methods, suggesting its potential as a useful tool for facilitating the multi-class prediction of RNA-binding protein domains. On the other hand, MCRLR, by elucidating the importance of features and their contribution to the predictive accuracy for RNA-binding protein domain subclasses, helps provide some biological insights into the roles of sequences and structures in protein-RNA interactions.
NASA Astrophysics Data System (ADS)
Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.
2017-09-01
In recent years, many optoelectronic techniques have been developed to improve and extend devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a new medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot offer biochemical information about the tissue. These data can be obtained with hyperspectral imaging, a non-invasive, sensitive and real-time technique. In the present study we have combined Spectral-Domain Optical Coherence Tomography (SD-OCT) with hyperspectral imaging (HSI) for tissue analysis; these two methods have demonstrated significant potential in this context. Preliminary results on different tissues have highlighted the capabilities of this combined technique.
Phipps, Denham L; Tam, W Vanessa; Ashcroft, Darren M
2017-03-01
To explore the combined use of a critical incident database and work domain analysis to understand patient safety issues in a health-care setting. A retrospective review was conducted of incidents reported to the UK National Reporting and Learning System (NRLS) that involved community pharmacy between April 2005 and August 2010. A work domain analysis of community pharmacy was constructed using observational data from 5 community pharmacies, technical documentation, and a focus group with 6 pharmacists. Reports from the NRLS were mapped onto the model generated by the work domain analysis. Approximately 14,709 incident reports meeting the selection criteria were retrieved from the NRLS. Descriptive statistical analysis of these reports found that almost all of the incidents involved medication and that the most frequently occurring error types were dose/strength errors, incorrect medication, and incorrect formulation. The work domain analysis identified 4 overall purposes for community pharmacy: business viability, health promotion and clinical services, provision of medication, and use of medication. These purposes were served by lower-order characteristics of the work system (such as the functions, processes and objects). The tasks most frequently implicated in the incident reports were those involving medication storage, assembly, or patient medication records. Combining the insights from different analytical methods improves understanding of patient safety problems. Incident reporting data can be used to identify general patterns, whereas the work domain analysis can generate information about the contextual factors that surround a critical task.
Lazure, Patrice; Bartel, Robert C; Biller, Beverly M K; Molitch, Mark E; Rosenthal, Stephen M; Ross, Judith L; Bernsten, Brock D; Hayes, Sean M
2014-07-24
The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provide a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study has identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders, will be discussed using the TDF. This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), [c] analysis methodologies (qualitative - analyzed through thematic analysis, quantitative - analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain. The TDF categorization of the needs assessment findings allowed recommendation of appropriate behavior change techniques for each underlying determinant, and facilitated communication and understanding of the identified issues to a broader audience. This approach provides a means for health education researchers to categorize gaps and challenges identified through educational needs assessments, and facilitates the application of these findings by educators and knowledge translators, by linking the gaps to recommended behavioral change techniques.
Domain decomposition: A bridge between nature and parallel computers
NASA Technical Reports Server (NTRS)
Keyes, David E.
1992-01-01
Domain decomposition is an intuitive organizing principle for a partial differential equation (PDE) computation, both physically and architecturally. However, its significance extends beyond the readily apparent issues of geometry and discretization, on one hand, and of modular software and distributed hardware, on the other. Engineering and computer science aspects are bridged by an old but recently enriched mathematical theory that offers the subject not only unity, but also tools for analysis and generalization. Domain decomposition induces function-space and operator decompositions with valuable properties. Function-space bases and operator splittings that are not derived from domain decompositions generally lack one or more of these properties. The evolution of domain decomposition methods for elliptically dominated problems has linked two major algorithmic developments of the last 15 years: multilevel and Krylov methods. Domain decomposition methods may be considered descendants of both classes with an inheritance from each: they are nearly optimal and at the same time efficiently parallelizable. Many computationally driven application areas are ripe for these developments. A progression is made from a mathematically informal motivation for domain decomposition methods to a specific focus on fluid dynamics applications. To be introductory rather than comprehensive, simple examples are provided while convergence proofs and algorithmic details are left to the original references; however, an attempt is made to convey their most salient features, especially where this leads to algorithmic insight.
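To make the idea concrete, here is a minimal multiplicative (alternating) Schwarz iteration for a 1D Poisson problem on two overlapping subdomains. This is a textbook illustration of domain decomposition under assumed parameters, not code tied to any specific application discussed above:

```python
import numpy as np

def solve_poisson_dirichlet(f, x, left, right):
    """Finite-difference solve of -u'' = f on the nodes x with Dirichlet end values."""
    n = len(x)
    h = x[1] - x[0]
    A = (np.diag(2.0 * np.ones(n - 2)) - np.diag(np.ones(n - 3), 1)
         - np.diag(np.ones(n - 3), -1)) / h**2
    rhs = f[1:-1].copy()
    rhs[0] += left / h**2
    rhs[-1] += right / h**2
    u = np.empty(n)
    u[0], u[-1] = left, right
    u[1:-1] = np.linalg.solve(A, rhs)
    return u

# global grid on [0, 1] with f = 1; exact solution is u(x) = x(1 - x)/2
n = 101
x = np.linspace(0.0, 1.0, n)
f = np.ones(n)
u = np.zeros(n)
i1, i2 = 60, 40        # subdomain 1 = nodes [0, i1], subdomain 2 = nodes [i2, n-1] (overlapping)

for _ in range(20):    # alternating Schwarz sweeps, exchanging interface values
    u[:i1 + 1] = solve_poisson_dirichlet(f[:i1 + 1], x[:i1 + 1], 0.0, u[i1])
    u[i2:] = solve_poisson_dirichlet(f[i2:], x[i2:], u[i2], 0.0)

exact = x * (1 - x) / 2
assert np.max(np.abs(u - exact)) < 1e-3   # converges to the global solution
```

Each sweep solves only small local problems and communicates boundary values through the overlap, which is the property that makes these methods both nearly optimal and naturally parallelizable.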
An instructional guide for leaf color analysis using digital imaging software
Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg
2005-01-01
Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH image public domain software to quantify leaf color. This...
Reconstituting protein interaction networks using parameter-dependent domain-domain interactions
2013-01-01
Background We can describe protein-protein interactions (PPIs) as sets of distinct domain-domain interactions (DDIs) that mediate the physical interactions between proteins. Experimental data confirm that DDIs are more consistent than their corresponding PPIs, lending support to the notion that analyses of DDIs may improve our understanding of PPIs and lead to further insights into cellular function, disease, and evolution. However, currently available experimental DDI data cover only a small fraction of all existing PPIs and, in the absence of structural data, determining which particular DDI mediates any given PPI is a challenge. Results We present two contributions to the field of domain interaction analysis. First, we introduce a novel computational strategy to merge domain annotation data from multiple databases. We show that when we merged yeast domain annotations from six annotation databases we increased the average number of domains per protein from 1.05 to 2.44, bringing it closer to the estimated average value of 3. Second, we introduce a novel computational method, parameter-dependent DDI selection (PADDS), which, given a set of PPIs, extracts a small set of domain pairs that can reconstruct the original set of protein interactions, while attempting to minimize false positives. Based on a set of PPIs from multiple organisms, our method extracted 27% more experimentally detected DDIs than existing computational approaches. Conclusions We have provided a method to merge domain annotation data from multiple sources, ensuring large and consistent domain annotation for any given organism. Moreover, we provided a method to extract a small set of DDIs from the underlying set of PPIs and we showed that, in contrast to existing approaches, our method was not biased towards DDIs with low or high occurrence counts. Finally, we used these two methods to highlight the influence of the underlying annotation density on the characteristics of extracted DDIs. Although increased annotations greatly expanded the possible DDIs, the lack of knowledge of the true biological false positive interactions still prevents an unambiguous assignment of domain interactions responsible for all protein network interactions. Executable files and examples are given at: http://www.bhsai.org/downloads/padds/ PMID:23651452
Multiscale Medical Image Fusion in Wavelet Domain
Khare, Ashish
2013-01-01
Wavelet transforms have emerged as a powerful tool in image fusion. However, the study and analysis of medical image fusion is still a challenging area of research. Therefore, in this paper, we propose a multiscale fusion of multimodal medical images in wavelet domain. Fusion of medical images has been performed at multiple scales varying from minimum to maximum level using maximum selection rule which provides more flexibility and choice to select the relevant fused images. The experimental analysis of the proposed method has been performed with several sets of medical images. Fusion results have been evaluated subjectively and objectively with existing state-of-the-art fusion methods which include several pyramid- and wavelet-transform-based fusion methods and principal component analysis (PCA) fusion method. The comparative analysis of the fusion results has been performed with edge strength (Q), mutual information (MI), entropy (E), standard deviation (SD), blind structural similarity index metric (BSSIM), spatial frequency (SF), and average gradient (AG) metrics. The combined subjective and objective evaluations of the proposed fusion method at multiple scales showed the effectiveness and goodness of the proposed approach. PMID:24453868
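A compact sketch of wavelet-domain fusion with the maximum selection rule (illustrative only; the paper evaluates fusion at multiple decomposition levels and with several quality metrics, and the wavelet and level below are assumed choices):

```python
import numpy as np
import pywt

def fuse_max(img1, img2, wavelet="db2", level=3):
    """Fuse two registered, same-size grayscale images by keeping, coefficient
    by coefficient, the value with the larger magnitude in the wavelet domain."""
    c1 = pywt.wavedec2(img1, wavelet, level=level)
    c2 = pywt.wavedec2(img2, wavelet, level=level)
    # approximation band (max-abs selection here; averaging is another common choice)
    fused = [np.where(np.abs(c1[0]) >= np.abs(c2[0]), c1[0], c2[0])]
    # detail bands at every scale and orientation
    for (h1, v1, d1), (h2, v2, d2) in zip(c1[1:], c2[1:]):
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip((h1, v1, d1), (h2, v2, d2))))
    return pywt.waverec2(fused, wavelet)

# fused = fuse_max(mri_image, ct_image)   # e.g. two registered multimodal medical images
```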
NASA Astrophysics Data System (ADS)
Shen, Wei; Li, Dongsheng; Zhang, Shuaifang; Ou, Jinping
2017-07-01
This paper presents a hybrid method that combines the B-spline wavelet on the interval (BSWI) finite element method and spectral analysis based on fast Fourier transform (FFT) to study wave propagation in One-Dimensional (1D) structures. BSWI scaling functions are utilized to approximate the theoretical wave solution in the spatial domain and construct a high-accuracy dynamic stiffness matrix. Dynamic reduction on element level is applied to eliminate the interior degrees of freedom of BSWI elements and substantially reduce the size of the system matrix. The dynamic equations of the system are then transformed and solved in the frequency domain through FFT-based spectral analysis which is especially suitable for parallel computation. A comparative analysis of four different finite element methods is conducted to demonstrate the validity and efficiency of the proposed method when utilized in high-frequency wave problems. Other numerical examples are utilized to simulate the influence of crack and delamination on wave propagation in 1D rods and beams. Finally, the errors caused by FFT and their corresponding solutions are presented.
Mainali, Laxman; Camenisch, Theodore G; Hyde, James S; Subczynski, Witold K
2017-12-01
The presence of integral membrane proteins induces the formation of distinct domains in the lipid bilayer portion of biological membranes. Qualitative application of both continuous wave (CW) and saturation recovery (SR) electron paramagnetic resonance (EPR) spin-labeling methods allowed discrimination of the bulk, boundary, and trapped lipid domains. A recently developed method, which is based on the CW EPR spectra of phospholipid (PL) and cholesterol (Chol) analog spin labels, allows evaluation of the relative amount of PLs (% of total PLs) in the boundary plus trapped lipid domain and the relative amount of Chol (% of total Chol) in the trapped lipid domain [M. Raguz, L. Mainali, W. J. O'Brien, and W. K. Subczynski (2015), Exp. Eye Res., 140:179-186]. Here, a new method is presented that, based on SR EPR spin-labeling, allows quantitative evaluation of the relative amounts of PLs and Chol in the trapped lipid domain of intact membranes. This new method complements the existing one, allowing acquisition of more detailed information about the distribution of lipids between domains in intact membranes. The methodological transition of the SR EPR spin-labeling approach from qualitative to quantitative is demonstrated. The abilities of this method are illustrated for intact cortical and nuclear fiber cell plasma membranes from porcine eye lenses. Statistical analysis (Student's t-test) of the data allowed determination of the separations of mean values above which differences can be treated as statistically significant (P ≤ 0.05) and can be attributed to sources other than preparation/technique.
2013-01-01
Background It is important to quickly and efficiently identify policies that are effective at changing behavior; therefore, we must be able to quantify and evaluate the effect of those policies and of changes to those policies. The purpose of this study was to develop state-level physical education (PE) and physical activity (PA) policy domain scores at the high-school level. Policy domain scores were developed with a focus on measuring policy change. Methods Exploratory factor analysis was used to group items from the state-level School Health Policies and Programs Study (SHPPS) into policy domains. Items that related to PA or PE at the high school level were identified from the 7 SHPPS health program surveys. Data from 2000 and 2006 were used in the factor analysis. Results From the 98 items identified, 17 policy domains were extracted. Average policy domain change scores were positive for 12 policy domains, with the largest increases for “Discouraging PA as Punishment”, “Collaboration”, and “Staff Development Opportunities”. On average, states increased scores in 4.94 ± 2.76 policy domains, decreased in 3.53 ± 2.03, and had no change in 7.69 ± 2.09 policy domains. Significant correlations were found between several policy domain scores. Conclusions Quantifying policy change and its impact is integral to the policy making and revision process. Our results build on previous research offering a way to examine changes in state-level policies related to PE and PA of high-school students and the faculty and staff who serve them. This work provides methods for combining state-level policies relevant to PE or PA in youth for studies of their impact. PMID:23815860
On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters
NASA Astrophysics Data System (ADS)
Han, Fenghua; Xie, Feng
2017-07-01
In cutting processes, it is very important to monitor the working state of the tools. Based on acceleration signals acquired at constant speed, time domain and frequency domain analysis of relevant indicators is used to monitor the tool wear condition online. The analysis results show that the method can effectively judge the tool wear condition during machining, and it has practical application value.
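The time and frequency domain indicators referred to above typically include quantities such as RMS, kurtosis, peak value, and band energy of the acceleration signal; a small sketch (sampling rate and band limits are assumptions for illustration, not values from the paper):

```python
import numpy as np
from scipy.stats import kurtosis

def wear_indicators(accel, fs=10_000.0, band=(2_000.0, 4_000.0)):
    """Compute simple time- and frequency-domain wear indicators from one
    acceleration record sampled at fs Hz."""
    rms = np.sqrt(np.mean(accel ** 2))
    kurt = kurtosis(accel)
    peak = np.max(np.abs(accel))
    spectrum = np.abs(np.fft.rfft(accel)) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_energy_ratio = spectrum[in_band].sum() / spectrum.sum()
    return {"rms": rms, "kurtosis": kurt, "peak": peak, "band_energy_ratio": band_energy_ratio}

# an online monitoring loop would compare these indicators against thresholds
# calibrated from signals recorded with a sharp tool
```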
Participatory Design Methods for C2 Systems (Proceedings/Presentation)
2006-01-01
Cognitive Task Analysis (CTA)... systems to support cognitive work such as is accomplished in a network-centric environment. Cognitive task analysis (CTA) methods are used to... of cognitive task analysis methodologies exist (Schraagen et al., 2000). However, many of these methods are skeptically viewed by a domain's...
A Meta-analysis of Cerebellar Contributions to Higher Cognition from PET and fMRI studies
Keren-Happuch, E; Chen, Shen-Hsing Annabel; Ho, Moon-Ho Ringo; Desmond, John E.
2013-01-01
A growing interest in cerebellar function and its involvement in higher cognition have prompted much research in recent years. Cerebellar presence in a wide range of cognitive functions examined within an increasing body of neuroimaging literature has been observed. We applied a meta-analytic approach, which employed the activation likelihood estimate method, to consolidate results of cerebellar involvement accumulated in different cognitive tasks of interest and systematically identified similarities among the studies. The current analysis included 88 neuroimaging studies demonstrating cerebellar activations in higher cognitive domains involving emotion, executive function, language, music, timing and working memory. While largely consistent with a prior meta-analysis by Stoodley and Schmahmann (2009), our results extended their findings to include music and timing domains to provide further insights into cerebellar involvement and elucidate its role in higher cognition. In addition, we conducted inter- and intra-domain comparisons for the cognitive domains of emotion, language and working memory. We also considered task differences within the domain of verbal working memory by conducting a comparison of the Sternberg with the n-back task, as well as an analysis of the differential components within the Sternberg task. Results showed a consistent cerebellar presence in the timing domain, providing evidence for a role in time keeping. Unique clusters identified within the domain further refine the topographic organization of the cerebellum. PMID:23125108
Evaluation of selected binding domains for the analysis of ubiquitinated proteomes
Nakayasu, Ernesto S.; Ansong, Charles; Brown, Joseph N.; Yang, Feng; Lopez-Ferrer, Daniel; Qian, Wei-Jun; Smith, Richard D.; Adkins, Joshua N.
2013-01-01
Ubiquitination is an abundant post-translational modification that consists of covalent attachment of ubiquitin to lysine residues or the N-terminus of proteins. Mono and polyubiquitination have been shown to be involved in many critical eukaryotic cellular functions and are often disrupted by intracellular bacterial pathogens. Affinity enrichment of ubiquitinated proteins enables global analysis of this key modification. In this context, the use of ubiquitin-binding domains is a promising, but relatively unexplored alternative to more broadly used immunoaffinity or tagged affinity enrichment methods. In this study, we evaluated the application of eight ubiquitin-binding domains that have differing affinities for ubiquitination states. Small-scale proteomics analysis identified ∼200 ubiquitinated protein candidates per ubiquitin-binding domain pull-down experiment. Results from subsequent Western blot analyses that employed anti-ubiquitin or monoclonal antibodies against polyubiquitination at lysine 48 and 63 suggest that ubiquitin-binding domains from Dsk2 and ubiquilin-1 have the broadest specificity in that they captured most types of ubiquitination, whereas the binding domain from NBR1 was more selective to polyubiquitination. These data demonstrate that with optimized purification conditions, ubiquitin-binding domains can be an alternative tool for proteomic applications. This approach is especially promising for the analysis of tissues or cells resistant to transfection, of which the overexpression of tagged ubiquitin is a major hurdle. PMID:23649778
Peng, Wei; Wang, Jianxin; Cheng, Yingjiao; Lu, Yu; Wu, Fangxiang; Pan, Yi
2015-01-01
Prediction of essential proteins, which are crucial to an organism's survival, is important for disease analysis and drug design, as well as for the understanding of cellular life. The majority of prediction methods infer the possibility of proteins being essential by using the network topology. However, these methods are limited by the completeness of available protein-protein interaction (PPI) data and depend on the network accuracy. To overcome these limitations, some computational methods have been proposed. However, few of them solve this problem by taking protein domains into consideration. In this work, we first analyze the correlation between the essentiality of proteins and their domain features based on data of 13 species. We find that proteins containing more protein domain types that rarely occur in other proteins tend to be essential. Accordingly, we propose a new prediction method, named UDoNC, by combining the domain features of proteins with their topological properties in the PPI network. In UDoNC, the essentiality of proteins is decided by the number and the frequency of their protein domain types, as well as by the essentiality of their adjacent edges measured by the edge clustering coefficient. The experimental results on S. cerevisiae data show that UDoNC outperforms other existing methods in terms of area under the curve (AUC). Additionally, UDoNC can also perform well in predicting essential proteins on data of E. coli.
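The abstract names the two ingredients of the UDoNC score (rarity of a protein's domain types and the edge clustering coefficient of its adjacent PPI edges) but not the exact formula; the sketch below combines them with an illustrative weighting and uses a common textbook definition of the edge clustering coefficient, which may differ from the paper's.

```python
import networkx as nx
from collections import Counter

def edge_clustering_coefficient(G, u, v):
    """Triangles through edge (u, v) relative to the maximum possible;
    a common definition, not necessarily UDoNC's exact one."""
    triangles = len(list(nx.common_neighbors(G, u, v)))
    denom = min(G.degree(u), G.degree(v)) - 1
    return (triangles + 1) / denom if denom > 0 else 0.0

def udonc_like_score(G, protein_domains, alpha=0.5):
    """protein_domains: dict protein -> set of domain identifiers.
    alpha is an illustrative weight between the two ingredients."""
    # How often each domain type occurs across all proteins
    domain_counts = Counter(d for doms in protein_domains.values() for d in doms)
    scores = {}
    for p in G.nodes():
        doms = protein_domains.get(p, set())
        # Rare domain types contribute more (inverse frequency)
        domain_score = sum(1.0 / domain_counts[d] for d in doms)
        edge_score = sum(edge_clustering_coefficient(G, p, q) for q in G.neighbors(p))
        scores[p] = alpha * domain_score + (1 - alpha) * edge_score
    return scores

# Tiny usage example with made-up data
G = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")])
domains = {"A": {"PF1"}, "B": {"PF1", "PF2"}, "C": {"PF3"}, "D": {"PF1"}}
print(sorted(udonc_like_score(G, domains).items(), key=lambda kv: -kv[1]))
```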
NASA Astrophysics Data System (ADS)
Cui, Boya; Kielb, Edward; Luo, Jiajun; Tang, Yang; Grayson, Matthew
Superlattices and narrow gap semiconductors often host multiple conducting species, such as electrons and holes, requiring a mobility spectral analysis (MSA) method to separate contributions to the conductivity. Here, a least-squares MSA method is introduced: the QR-algorithm Fourier-domain MSA (FMSA). Like other MSA methods, the FMSA sorts the conductivity contributions of different carrier species from magnetotransport measurements, arriving at a best fit to the experimentally measured longitudinal and Hall conductivities σxx and σxy, respectively. This method distinguishes itself from other methods by using the so-called QR-algorithm of linear algebra to achieve rapid convergence of the mobility spectrum as the solution to an eigenvalue problem, and by alternately solving this problem in both the mobility domain and its Fourier reciprocal-space. The result accurately fits a mobility range spanning nearly four orders of magnitude (μ = 300 to 1,000,000 cm²/V·s). This method resolves the mobility spectra as well as, or better than, competing MSA methods while also achieving high computational efficiency, requiring less than 30 seconds on average to converge to a solution on a standard desktop computer. Acknowledgement: Funded by AFOSR FA9550-15-1-0377 and AFOSR FA9550-15-1-0247.
NASA Astrophysics Data System (ADS)
Liu, Wen; Zhang, Yuying; Yang, Si; Han, Donghai
2018-05-01
A new technique to identify the floral resources of honeys is needed. Terahertz time-domain attenuated total reflection spectroscopy combined with chemometrics methods was applied to discriminate different categories (Medlar honey, Vitex honey, and Acacia honey). Principal component analysis (PCA), cluster analysis (CA) and partial least squares-discriminant analysis (PLS-DA) were used to find information on the botanical origins of the honeys. The spectral range was also examined to increase the precision of the PLS-DA model. An accuracy of 88.46% for the validation set was obtained using the PLS-DA model in the 0.5-1.5 THz range. This work indicates that terahertz time-domain attenuated total reflection spectroscopy is a feasible approach to evaluate the quality of honey rapidly.
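PLS-DA of the kind described can be approximated with scikit-learn by regressing one-hot class labels with PLSRegression and assigning each sample to the class with the largest predicted response; the number of latent variables and the train/test split below are placeholder choices, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

def plsda_fit(X, labels, n_components=5):
    """PLS-DA sketch: PLS regression onto one-hot classes, prediction by argmax."""
    lb = LabelBinarizer()
    Y = lb.fit_transform(labels)                 # one column per honey class
    if Y.shape[1] == 1:                          # binary case -> expand to two columns
        Y = np.hstack([1 - Y, Y])
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=n_components).fit(X_tr, Y_tr)
    pred = np.argmax(pls.predict(X_te), axis=1)
    accuracy = np.mean(pred == np.argmax(Y_te, axis=1))
    return pls, accuracy

# X: one row per honey sample holding absorbance values in the chosen
# 0.5-1.5 THz window; labels: the botanical origin of each sample.
```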
Wavelet transformation to determine impedance spectra of lithium-ion rechargeable battery
NASA Astrophysics Data System (ADS)
Hoshi, Yoshinao; Yakabe, Natsuki; Isobe, Koichiro; Saito, Toshiki; Shitanda, Isao; Itagaki, Masayuki
2016-05-01
A new analytical method is proposed to determine the electrochemical impedance of lithium-ion rechargeable batteries (LIRB) from time domain data by wavelet transformation (WT). The WT is a waveform analysis method that can transform data in the time domain to the frequency domain while retaining time information. In this transformation, the frequency domain data are obtained by the convolution integral of a mother wavelet and original time domain data. A complex Morlet mother wavelet (CMMW) is used to obtain the complex number data in the frequency domain. The CMMW is expressed by combining a Gaussian function and sinusoidal term. The theory to select a set of suitable conditions for variables and constants related to the CMMW, i.e., band, scale, and time parameters, is established by determining impedance spectra from wavelet coefficients using input voltage to the equivalent circuit and the output current. The impedance spectrum of LIRB determined by WT agrees well with that measured using a frequency response analyzer.
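The central operation — taking the ratio of the complex wavelet coefficients of the measured voltage and current at matching frequencies — can be sketched with PyWavelets' complex Morlet wavelet. The wavelet parameters, the scale-to-frequency mapping, and the choice of evaluation time are assumptions, not the calibrated settings of the paper.

```python
import numpy as np
import pywt

def impedance_from_wavelets(voltage, current, fs, freqs):
    """Estimate Z(f) = W_v(f, t) / W_i(f, t) from time-domain v(t), i(t)
    using a complex Morlet continuous wavelet transform."""
    dt = 1.0 / fs
    wavelet = "cmor1.5-1.0"                       # bandwidth / center-frequency choice
    # Convert the desired analysis frequencies into CWT scales
    scales = pywt.central_frequency(wavelet) / (np.asarray(freqs) * dt)
    coef_v, _ = pywt.cwt(voltage, scales, wavelet, sampling_period=dt)
    coef_i, _ = pywt.cwt(current, scales, wavelet, sampling_period=dt)
    # Evaluate at the middle of the record to limit edge effects
    t_idx = voltage.size // 2
    return coef_v[:, t_idx] / coef_i[:, t_idx]

# Illustrative small-signal voltage/current records and a few frequencies
fs = 1000.0
t = np.arange(0, 20.0, 1.0 / fs)
v = 0.01 * np.sin(2 * np.pi * 1.0 * t)
i = 0.02 * np.sin(2 * np.pi * 1.0 * t - 0.3)
print(impedance_from_wavelets(v, i, fs, freqs=[0.5, 1.0, 2.0]))
```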
AZARI, Nadia; SOLEIMANI, Farin; VAMEGHI, Roshanak; SAJEDI, Firoozeh; SHAHSHAHANI, Soheila; KARIMI, Hossein; KRASKIAN, Adis; SHAHROKHI, Amin; TEYMOURI, Robab; GHARIB, Masoud
2017-01-01
Objective Bayley Scales of Infant & Toddler Development is a well-known diagnostic developmental assessment tool for children aged 1–42 months. Our aim was to investigate the validity and reliability of this scale in Persian-speaking children. Materials & Methods The method was descriptive-analytic. Translation-back translation and cultural adaptation were done. Content and face validity of the translated scale were determined by experts’ opinions. Overall, 403 children aged 1 to 42 months were recruited from health centers of Tehran during 2013–2014 for developmental assessment in the cognitive, communicative (receptive & expressive) and motor (fine & gross) domains. Reliability of the scale was calculated through three methods: internal consistency using Cronbach’s alpha coefficient, test-retest and inter-rater methods. Construct validity was calculated using factor analysis and comparison of the mean scores methods. Results Cultural and linguistic changes were made in items of all domains, especially on the communication subscale. Content and face validity of the test were approved by experts’ opinions. Cronbach’s alpha coefficient was above 0.74 in all domains. Pearson correlation coefficients in various domains were ≥ 0.982 in the test-retest method, and ≥0.993 in the inter-rater method. Construct validity of the test was approved by factor analysis. Moreover, the mean scores for the different age groups were compared and statistically significant differences were observed between mean scores of different age groups, which confirms the validity of the test. Conclusion The Bayley Scales of Infant and Toddler Development is a valid and reliable tool for child developmental assessment in Persian-speaking children. PMID:28277556
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochiai, Yoshihiro
Heat-conduction analysis under steady state without heat generation can easily be treated by the boundary element method. However, the case of heat conduction with heat generation can be solved approximately without a domain integral by an improved multiple-reciprocity boundary element method. The conventional multiple-reciprocity boundary element method is not suitable for complicated heat generation. In the improved multiple-reciprocity boundary element method, on the other hand, the domain integral in each step is divided into point, line, and area integrals. In order to solve the problem, contour lines of heat generation, which approximate the actual heat generation, are used.
Summary of Technical Operations, 1991
1992-01-01
exploit commonality. The project is using the Feature-Oriented Domain Analysis ( FODA ) method, developed by the project in 1990, to perform this...the development of new movement control software. The analysis will also serve as a means of improving the FODA method. The results of this analysis ...STARS environment. The NASA Program Office has officially decided to expand the use of Rate Monotonic Analysis (RMA), which was originally isolated to
NASA Astrophysics Data System (ADS)
Naritomi, Yusuke; Fuchigami, Sotaro
2013-12-01
We recently proposed the method of time-structure based independent component analysis (tICA) to examine the slow dynamics involved in conformational fluctuations of a protein as estimated by molecular dynamics (MD) simulation [Y. Naritomi and S. Fuchigami, J. Chem. Phys. 134, 065101 (2011)]. Our previous study focused on domain motions of the protein and examined its dynamics by using rigid-body domain analysis and tICA. However, the protein changes its conformation not only through domain motions but also by various types of motions involving its backbone and side chains. Some of these motions might occur on a slow time scale: we hypothesize that if so, we could effectively detect and characterize them using tICA. In the present study, we investigated slow dynamics of the protein backbone using MD simulation and tICA. The selected target protein was lysine-, arginine-, ornithine-binding protein (LAO), which comprises two domains and undergoes large domain motions. MD simulation of LAO in explicit water was performed for 1 μs, and the obtained trajectory of Cα atoms in the backbone was analyzed by tICA. This analysis successfully provided us with slow modes for LAO that represented either domain motions or local movements of the backbone. Further analysis elucidated the atomic details of the suggested local motions and confirmed that these motions truly occurred on the expected slow time scale.
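tICA as summarized here reduces to a generalized eigenvalue problem built from the instantaneous and symmetrized time-lagged covariance matrices of the mean-free coordinates; a bare NumPy/SciPy version is sketched below. A production analysis would add the regularization and estimator choices of the original work.

```python
import numpy as np
from scipy.linalg import eigh

def tica(X, lag):
    """X: (n_frames, n_features) aligned Cartesian coordinates (e.g. Calpha atoms).
    Returns eigenvalues and tICA modes sorted by slowness (largest first)."""
    X = X - X.mean(axis=0)                            # remove the mean structure
    A, B = X[:-lag], X[lag:]
    norm = 2 * (len(X) - lag)
    C0 = (A.T @ A + B.T @ B) / norm                   # instantaneous covariance
    Ct = (A.T @ B + B.T @ A) / norm                   # symmetrized lagged covariance
    eigvals, eigvecs = eigh(Ct, C0)                   # generalized eigenvalue problem
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order]

# Usage sketch: project the trajectory onto the slowest modes
# X = trajectory reshaped to (n_frames, 3 * n_atoms)
# vals, modes = tica(X, lag=100)                      # lag in frames
# slow_coords = (X - X.mean(axis=0)) @ modes[:, :2]
```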
Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S
2013-01-01
The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincare map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO) describing transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers due to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength degrades scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.
Frequency-Domain Identification Of Aeroelastic Modes
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Tischler, Mark B.
1991-01-01
Report describes flight measurements and frequency-domain analyses of aeroelastic vibrational modes of wings of XV-15 tilt-rotor aircraft. Begins with description of flight-test methods. Followed by brief discussion of methods of analysis, which include Fourier-transform computations using chirp z-transforms, use of coherence and other spectral functions, and methods and computer programs to obtain frequencies and damping coefficients from measurements. Includes brief description of results of flight tests and comparisons among various experimental and theoretical results. Ends with section on conclusions and recommended improvements in techniques.
A general ansatz for constructing quasi-diabatic states in electronically excited aggregated systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Wenlan; Köhn, Andreas; InnovationLab GmbH, Speyerer St. 4, D-69115 Heidelberg
2015-08-28
We present a general method for analyzing the character of singly excited states in terms of charge transfer (CT) and locally excited (LE) configurations. The analysis is formulated for configuration interaction singles (CIS) singly excited wave functions of aggregate systems. It also approximately works for the second-order approximate coupled cluster singles and doubles and the second-order algebraic-diagrammatic construction methods [CC2 and ADC(2)]. The analysis method not only generates a weight of each character for an excited state, but also allows to define the related quasi-diabatic states and corresponding coupling matrix elements. In the character analysis approach, we divide the target system into domains and use a modified Pipek-Mezey algorithm to localize the canonical MOs on each domain, respectively. The CIS wavefunction is then transformed into the localized basis, which allows us to partition the wavefunction into LE configurations within domains and CT configurations between pairs of different domains. Quasi-diabatic states are then obtained by mixing excited states subject to the condition of maximizing the weight of one single LE or CT configuration (localization in configuration space). Different aims of such a procedure are discussed, either the construction of pure LE and CT states for analysis purposes (by including a large number of excited states) or the construction of effective models for dynamics calculations (by including a restricted number of excited states). Applications are given to LE/CT mixing in π-stacked systems, charge-recombination matrix elements in a hetero-dimer, and excitonic couplings in multi-chromophoric systems.
Comparing and Contrasting Consensus versus Empirical Domains
Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Reed, Jordan; Furst, Jacob; Newton, Julia L.; Strand, Elin Bolle; Vernon, Suzanne D.
2015-01-01
Background Since the publication of the CFS case definition [1], there have been a number of other criteria proposed including the Canadian Consensus Criteria [2] and the Myalgic Encephalomyelitis: International Consensus Criteria. [3] Purpose The current study compared these domains that were developed through consensus methods to one obtained through more empirical approaches using factor analysis. Methods Using data mining, we compared and contrasted fundamental features of consensus-based criteria versus empirical latent factors. In general, these approaches found the domain of Fatigue/Post-exertional malaise as best differentiating patients from controls. Results Findings indicated that the Fukuda et al. criteria had the worst sensitivity and specificity. Conclusions These outcomes might help both theorists and researchers better determine which fundamental domains to be used for the case definition. PMID:26977374
Improving pairwise comparison of protein sequences with domain co-occurrence
Gascuel, Olivier
2018-01-01
Comparing and aligning protein sequences is an essential task in bioinformatics. More specifically, local alignment tools like BLAST are widely used for identifying conserved protein sub-sequences, which likely correspond to protein domains or functional motifs. However, to limit the number of false positives, these tools are used with stringent sequence-similarity thresholds and hence can miss several hits, especially for species that are phylogenetically distant from reference organisms. A solution to this problem is then to integrate additional contextual information to the procedure. Here, we propose to use domain co-occurrence to increase the sensitivity of pairwise sequence comparisons. Domain co-occurrence is a strong feature of proteins, since most protein domains tend to appear with a limited number of other domains on the same protein. We propose a method to take this information into account in a typical BLAST analysis and to construct new domain families on the basis of these results. We used Plasmodium falciparum as a case study to evaluate our method. The experimental findings showed an increase of 14% of the number of significant BLAST hits and an increase of 25% of the proteome area that can be covered with a domain. Our method identified 2240 new domains for which, in most cases, no model of the Pfam database could be linked. Moreover, our study of the quality of the new domains in terms of alignment and physicochemical properties show that they are close to that of standard Pfam domains. Source code of the proposed approach and supplementary data are available at: https://gite.lirmm.fr/menichelli/pairwise-comparison-with-cooccurrence PMID:29293498
Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment
NASA Technical Reports Server (NTRS)
Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.
1979-01-01
The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena including the interaction of domains and inclusions, and the influence of stress and magnetic field on domains are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results; the data predicted by analytic models which use finite element computer analysis predictions do not agree with respect to certain features. Experimental analyses obtained on rod-type fatigue specimens which show experimental magnetic measurements in relation to the crack opening displacement and volume and crack depth should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.
Wavelet Analyses of F/A-18 Aeroelastic and Aeroservoelastic Flight Test Data
NASA Technical Reports Server (NTRS)
Brenner, Martin J.
1997-01-01
Time-frequency signal representations combined with subspace identification methods were used to analyze aeroelastic flight data from the F/A-18 Systems Research Aircraft (SRA) and aeroservoelastic data from the F/A-18 High Alpha Research Vehicle (HARV). The F/A-18 SRA data were produced from a wingtip excitation system that generated linear frequency chirps and logarithmic sweeps. HARV data were acquired from digital Schroeder-phased and sinc pulse excitation signals to actuator commands. Nondilated continuous Morlet wavelets implemented as a filter bank were chosen for the time-frequency analysis to eliminate phase distortion as it occurs with sliding window discrete Fourier transform techniques. Wavelet coefficients were filtered to reduce effects of noise and nonlinear distortions identically in all inputs and outputs. Cleaned reconstructed time domain signals were used to compute improved transfer functions. Time and frequency domain subspace identification methods were applied to enhanced reconstructed time domain data and improved transfer functions, respectively. Time domain subspace performed poorly, even with the enhanced data, compared with frequency domain techniques. A frequency domain subspace method is shown to produce better results with the data processed using the Morlet time-frequency technique.
1996-01-01
architecture for a family of systems. The Feature-Oriented Domain Analysis (FODA) method looks primarily at "user-visible" aspects of a domain. The...
High frequency resolution terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Sangala, Bagvanth Reddy
2013-12-01
A new method for high frequency resolution terahertz time-domain spectroscopy is developed based on the characteristic matrix method. This method is useful for studying planar samples or stacks of planar samples. The terahertz radiation was generated by optical rectification in a ZnTe crystal and detected by another ZnTe crystal via the electro-optic sampling method. In this new characteristic matrix based method, the spectra of the sample and reference waveforms are modeled by using characteristic matrices. We applied this new method to measure the optical constants of air. The terahertz transmission through the layered systems air-Teflon-air-Quartz-air and Nitrogen gas-Teflon-Nitrogen gas-Quartz-Nitrogen gas was modeled by the characteristic matrix method. A transmission coefficient derived from these models was optimized to fit the experimental transmission coefficient in order to extract the optical constants of air. The optimization of an error function involving the experimental complex transmission coefficient and the theoretical transmission coefficient was performed using the patternsearch algorithm of MATLAB. Since this method takes account of the echo waveforms due to reflections in the layered samples, it allows analysis of longer time-domain waveforms, giving rise to very high frequency resolution in the frequency domain. We have presented the high frequency resolution terahertz time-domain spectroscopy of air and compared the results with literature values. We have also fitted the complex susceptibility of air to Lorentzian and Gaussian functions to extract the linewidths.
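For orientation, the characteristic (transfer) matrix of a stack of planar layers at normal incidence, and the resulting field transmission coefficient, can be written compactly as below; the refractive indices, thicknesses, and sign conventions are placeholders and would need to be matched to the experimental configuration before any fitting.

```python
import numpy as np

def stack_transmission(freq, layers, n_in=1.0, n_out=1.0):
    """Field transmission coefficient of a planar stack at normal incidence.
    freq in Hz; layers is a list of (refractive_index, thickness_m) tuples."""
    c = 299_792_458.0
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d * freq / c          # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_out])
    return 2 * n_in / (n_in * B + C)

# Example: a Teflon-air-quartz style stack with placeholder indices and thicknesses
layers = [(1.43, 3e-3), (1.00027, 10e-3), (1.95, 1e-3)]
for f in (0.5e12, 1.0e12):
    print(f, abs(stack_transmission(f, layers)))
```

Fitting to measured spectra would then amount to minimizing the misfit between this model transmission and the experimental complex transmission coefficient over the layer parameters.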
Outdoor Experiential Environmental Education: An Adult-Centred Intervention for the Affective Domain
ERIC Educational Resources Information Center
Okur-Berberoglu, Emel
2017-01-01
The aim of this research is to evaluate the impact of an outdoor experiential environmental education (OEEE) programme on the affective domain of adult participants--namely, in-service teachers from Turkey. Data collection methods such as; psychodrama, non-participant observation, open-ended questions and content analysis were used within a…
A spectral analysis of the domain decomposed Monte Carlo method for linear systems
Slattery, Stuart R.; Evans, Thomas M.; Wilson, Paul P. H.
2015-09-08
The domain decomposed behavior of the adjoint Neumann-Ulam Monte Carlo method for solving linear systems is analyzed using the spectral properties of the linear operator. Relationships for the average length of the adjoint random walks, a measure of convergence speed and serial performance, are made with respect to the eigenvalues of the linear operator. In addition, relationships for the effective optical thickness of a domain in the decomposition are presented based on the spectral analysis and diffusion theory. Using the effective optical thickness, the Wigner rational approximation and the mean chord approximation are applied to estimate the leakage fraction of random walks from a domain in the decomposition as a measure of parallel performance and potential communication costs. The one-speed, two-dimensional neutron diffusion equation is used as a model problem in numerical experiments to test the models for symmetric operators with spectral qualities similar to light water reactor problems. We find, in general, the derived approximations show good agreement with random walk lengths and leakage fractions computed by the numerical experiments.
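As background, a plain forward Neumann-Ulam estimator for x = Hx + b (the single-domain, non-adjoint version, not the domain-decomposed adjoint solver analyzed in the paper) looks as follows; it assumes each row of |H| sums to less than one, so the leftover mass serves as the absorption probability.

```python
import numpy as np

def neumann_ulam_entry(H, b, i, n_walks=50_000, seed=0):
    """Monte Carlo estimate of x_i for x = H x + b via forward random walks."""
    rng = np.random.default_rng(seed)
    n = H.shape[0]
    absH = np.abs(H)
    row_sums = absH.sum(axis=1)
    assert np.all(row_sums < 1.0), "sketch assumes sub-stochastic |H|"
    total = 0.0
    for _ in range(n_walks):
        state, weight, tally = i, 1.0, b[i]
        while rng.random() < row_sums[state]:          # continue the walk
            probs = absH[state] / row_sums[state]
            nxt = rng.choice(n, p=probs)
            weight *= np.sign(H[state, nxt])           # H_sj / P_sj with P_sj = |H_sj|
            state = nxt
            tally += weight * b[state]
        total += tally
    return total / n_walks

# Tiny check against a direct solve of (I - H) x = b
H = np.array([[0.1, 0.3], [0.2, 0.4]])
b = np.array([1.0, 2.0])
print(np.linalg.solve(np.eye(2) - H, b)[0], neumann_ulam_entry(H, b, i=0))
```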
NASA Astrophysics Data System (ADS)
Cai, Jianhua
2017-05-01
The time-frequency analysis method represents signal as a function of time and frequency, and it is considered a powerful tool for handling arbitrary non-stationary time series by using instantaneous frequency and instantaneous amplitude. It also provides a possible alternative to the analysis of the non-stationary magnetotelluric (MT) signal. Based on the Hilbert-Huang transform (HHT), a time-frequency analysis method is proposed to obtain stable estimates of the magnetotelluric response function. In contrast to conventional methods, the response function estimation is performed in the time-frequency domain using instantaneous spectra rather than in the frequency domain, which allows for imaging the response parameter content as a function of time and frequency. The theory of the method is presented and the mathematical model and calculation procedure, which are used to estimate response function based on HHT time-frequency spectrum, are discussed. To evaluate the results, response function estimates are compared with estimates from a standard MT data processing method based on the Fourier transform. All results show that apparent resistivities and phases, which are calculated from the HHT time-frequency method, are generally more stable and reliable than those determined from the simple Fourier analysis. The proposed method overcomes the drawbacks of the traditional Fourier methods, and the resulting parameter minimises the estimation bias caused by the non-stationary characteristics of the MT data.
Frequency-Domain Optical Mammogram
2002-10-01
have performed the proposed analysis of frequency-domain optical mammograms for a clinical population of about 150 patients. This analysis has led to...model the propagation of light in tissue [14-20] have led to new approaches to optical mammography...
NASA Technical Reports Server (NTRS)
Srivastava, R.; Reddy, T. S. R.
1996-01-01
This guide describes the input data required and the output files generated when using PROP3D for steady or unsteady aerodynamic and aeroelastic analysis of propellers. The aerodynamic forces are obtained by solving the three-dimensional unsteady, compressible Euler equations. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either a time-domain or a frequency-domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis of single and counter-rotation propellers, and for aeroelastic analysis of a single-rotation propeller.
Research on vibration signal of engine based on subband energy method
NASA Astrophysics Data System (ADS)
Wu, Chunmei; Cui, Feng; Zhao, Yong; Fu, Baohong; Ma, Junchi; Yang, Guihua
2017-04-01
Based on research into the vibration signals measured on the cylinder and cylinder head surfaces of a DA462-type engine, the measured signals are analyzed in detail in the time domain and the frequency domain, and the following conclusions are drawn: when the subband energy method is applied to the engine vibration signal, the concentrated response of each excitation band can be seen clearly. The analysis shows that the combustion excitation frequency response lies between 0 and 1 kHz, the vibration influence of the piston lateral impact force on the body is mainly concentrated in the 2-5 kHz range, and the excitation response of valve opening and closing is mainly concentrated in the 3-4 kHz range, which allows valve clearance faults to be located. This method is simple, accurate and practical for the post-processing and analysis of vibration signals.
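The subband-energy computation itself — splitting the record into frequency-ordered bands and summing squared coefficients per band — can be sketched with PyWavelets' wavelet packet decomposition; the wavelet, level, and sampling rate below are placeholders rather than the study's values.

```python
import numpy as np
import pywt

def subband_energies(signal, fs, wavelet="db4", level=5):
    """Relative energy of each wavelet-packet subband, ordered by frequency.
    Returns a list of (band_low_Hz, band_high_Hz, relative_energy)."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # frequency-ordered leaves
    energies = np.array([np.sum(node.data ** 2) for node in nodes])
    energies /= energies.sum()
    band_width = (fs / 2) / len(nodes)
    return [(k * band_width, (k + 1) * band_width, e) for k, e in enumerate(energies)]

# A valve-clearance fault would show up as excess relative energy in the
# 3-4 kHz bands of the cylinder-head signal.
fs = 20_000.0
t = np.arange(0, 1.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 3500 * t) + 0.2 * np.random.randn(t.size)
for lo, hi, e in subband_energies(sig, fs):
    if e > 0.05:
        print(f"{lo:6.0f}-{hi:6.0f} Hz : {e:.2f}")
```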
2014-01-01
Background Performance measures are often neglected during the transition period of national health insurance scheme implementation in many low and middle income countries. These measurements evaluate the extent to which various aspects of the schemes meet their key objectives. This study assesses the implementation of a health insurance scheme using optimal resource use domains and examines possible factors that influence each domain, according to providers’ perspectives. Methods A retrospective, cross-sectional survey was done between August and December 2010 in Kaduna state, and 466 health care provider personnel were interviewed. Optimal-resource-use was defined in four domains: provider payment mechanism (capitation and fee-for-service payment methods), benefit package, administrative efficiency, and active monitoring mechanism. Logistic regression analysis was used to identify provider factors that may influence each domain. Results In the provider payment mechanism domain, capitation payment method (95%) performed better than fee-for-service payment method (62%). Benefit package domain performed strongly (97%), while active monitoring mechanism performed weakly (37%). In the administrative efficiency domain, both promptness of referral system (80%) and prompt arrival of funds (93%) performed well. At the individual level, providers with fewer enrolees encountered difficulties with reimbursement. Other factors significantly influenced each of the optimal-resource-use domains. Conclusions Fee-for-service payment method and claims review, in the provider payment and active monitoring mechanisms, respectively, performed weakly according to the providers’ (at individual-level) perspectives. A short-fall on the supply-side of health insurance could lead to a direct or indirect adverse effect on the demand-side of the scheme. Capitation payment per enrolees should be revised to conform to economic circumstances. Performance indicators and providers’ characteristics and experiences associated with resource use can assist policy makers to monitor and evaluate health insurance implementation. PMID:24628889
Leske, David A.; Hatt, Sarah R.; Liebermann, Laura; Holmes, Jonathan M.
2016-01-01
Purpose We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). Methods One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as “success,” “partial success,” or “failure” based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Results Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis (P < 0.0001 for all comparisons). Conclusions The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. Translational Relevance We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software. PMID:26933524
Face recognition using slow feature analysis and contourlet transform
NASA Astrophysics Data System (ADS)
Wang, Yuehao; Peng, Lingling; Zhe, Fuchuan
2018-04-01
In this paper we propose a novel face recognition approach based on slow feature analysis (SFA) in the contourlet transform domain. The method first uses the contourlet transform to decompose the face image into low-frequency and high-frequency parts, and then takes advantage of slow feature analysis for facial feature extraction. We name the new method, which combines slow feature analysis and the contourlet transform, CT-SFA. The experimental results on international standard face databases demonstrate that the new face recognition method is effective and competitive.
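The generic SFA step (whiten the signals, then find directions whose temporal derivatives have minimal variance) can be written in a few lines of NumPy; the face-specific parts of CT-SFA, such as the contourlet decomposition and feature expansion, are omitted here.

```python
import numpy as np

def slow_feature_analysis(X, n_features=10):
    """X: (n_samples, n_dims) time-ordered signals. Returns a projection
    matrix whose columns give the slowest-varying output features."""
    X = X - X.mean(axis=0)
    # Whitening via eigendecomposition of the covariance matrix
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    keep = vals > 1e-10
    W_white = vecs[:, keep] / np.sqrt(vals[keep])
    Z = X @ W_white
    # Slowness: minimize the variance of the time derivative in whitened space
    dZ = np.diff(Z, axis=0)
    dvals, dvecs = np.linalg.eigh(np.cov(dZ, rowvar=False))
    slow_dirs = dvecs[:, :n_features]                # smallest derivative variance
    return W_white @ slow_dirs

# Usage sketch: project contourlet-subband coefficients (rows = ordered frames
# or images) onto the slow features, then feed the projections to a classifier.
```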
Identification of modal parameters including unmeasured forces and transient effects
NASA Astrophysics Data System (ADS)
Cauberghe, B.; Guillaume, P.; Verboven, P.; Parloo, E.
2003-08-01
In this paper, a frequency-domain method to estimate modal parameters from short data records with known input (measured) forces and unknown input forces is presented. The method can be used for an experimental modal analysis, an operational modal analysis (output-only data) and the combination of both. A traditional experimental and operational modal analysis in the frequency domain starts respectively, from frequency response functions and spectral density functions. To estimate these functions accurately sufficient data have to be available. The technique developed in this paper estimates the modal parameters directly from the Fourier spectra of the outputs and the known input. Instead of using Hanning windows on these short data records the transient effects are estimated simultaneously with the modal parameters. The method is illustrated, tested and validated by Monte Carlo simulations and experiments. The presented method to process short data sequences leads to unbiased estimates with a small variance in comparison to the more traditional approaches.
Adapting Human Reliability Analysis from Nuclear Power to Oil and Gas Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald Laurids
2015-09-01
Human reliability analysis (HRA), as currently used in risk assessments, largely derives its methods and guidance from application in the nuclear energy domain. While there are many similarities between nuclear energy and other safety critical domains such as oil and gas, there remain clear differences. This paper provides an overview of HRA state of the practice in nuclear energy and then describes areas where refinements to the methods may be necessary to capture the operational context of oil and gas. Many key distinctions important to nuclear energy HRA such as Level 1 vs. Level 2 analysis may prove insignificant for oil and gas applications. On the other hand, existing HRA methods may not be sensitive enough to factors like the extensive use of digital controls in oil and gas. This paper provides an overview of these considerations to assist in the adaptation of existing nuclear-centered HRA methods to the petroleum sector.
Nursing Routine Data as a Basis for Association Analysis in the Domain of Nursing Knowledge
Sellemann, Björn; Stausberg, Jürgen; Hübner, Ursula
2012-01-01
This paper describes the data mining method of association analysis within the framework of Knowledge Discovery in Databases (KDD), with the aim of identifying standard patterns of nursing care. The approach is application-oriented and is used on nursing routine data recorded with the LEP Nursing 2 method. The increasing use of information technology in hospitals, especially of nursing information systems, requires the storage of large data sets, which hitherto have not always been analyzed adequately. Three association analyses, for the days of admission, surgery and discharge, have been performed. The results of almost 1.5 million generated association rules indicate that it is valid to apply association analysis to nursing routine data. All rules are semantically trivial, since they reflect existing knowledge from the domain of nursing. This may be due either to the LEP Nursing 2 method or to the nursing activities themselves. Nonetheless, association analysis may in future become a useful analytical tool on the basis of structured nursing routine data. PMID:24199122
Radakovics, Katharina; Smith, Terry K.; Bobik, Nina; Round, Adam; Djinović-Carugo, Kristina; Usón, Isabel
2016-01-01
Vaccinia virus interferes with early events of the activation pathway of the transcription factor NF-κB by binding to numerous host TIR-domain containing adaptor proteins. We have previously determined the X-ray structure of the A46 C-terminal domain; however, the structure and function of the A46 N-terminal domain and its relationship to the C-terminal domain have remained unclear. Here, we biophysically characterize residues 1–83 of the N-terminal domain of A46 and present the X-ray structure at 1.55 Å. Crystallographic phases were obtained by a recently developed ab initio method entitled ARCIMBOLDO_BORGES that employs tertiary structure libraries extracted from the Protein Data Bank; data analysis revealed an all β-sheet structure. This is the first such structure solved by this method, which should be applicable to any protein composed entirely of β-sheets. The A46(1–83) structure itself is a β-sandwich containing a co-purified molecule of myristic acid inside a hydrophobic pocket and represents a previously unknown lipid-binding fold. Mass spectrometry analysis confirmed the presence of long-chain fatty acids in both N-terminal and full-length A46; mutation of the hydrophobic pocket reduced the lipid content. Using a combination of high resolution X-ray structures of the N- and C-terminal domains and SAXS analysis of the full-length protein A46(1–240), we present here a structural model of A46 in a tetrameric assembly. Integrating affinity measurements and structural data, we propose how A46 simultaneously interferes with several TIR-domain containing proteins to inhibit NF-κB activation and postulate that A46 employs a bipartite binding arrangement to sequester the host immune adaptors TRAM and MyD88. PMID:27973613
MEM spectral analysis for predicting influenza epidemics in Japan.
Sumi, Ayako; Kamo, Ken-ichi
2012-03-01
The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.
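The time-domain half of this procedure — a least-squares fit of a constant plus sine/cosine pairs at the MEM-identified periods — is straightforward to write down; the periods in the example are placeholders, not the modes identified in the study.

```python
import numpy as np

def periodic_lsf(t, y, periods):
    """Least-squares fit of y(t) by a constant plus sin/cos pairs at the
    given periods; returns the fitted curve and the coefficient vector."""
    columns = [np.ones_like(t)]
    for p in periods:
        columns.append(np.sin(2 * np.pi * t / p))
        columns.append(np.cos(2 * np.pi * t / p))
    A = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coeffs, coeffs

# Example: monthly incidence with an annual mode; evaluating the fitted
# curve beyond the observed range extrapolates it for prediction.
t = np.arange(0, 120, dtype=float)               # months
y = 50 + 30 * np.sin(2 * np.pi * t / 12) + np.random.randn(t.size) * 5
fit, coeffs = periodic_lsf(t, y, periods=[12.0])
```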
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
NASA Technical Reports Server (NTRS)
Srivastava, R.; Reddy, T. S. R.
1997-01-01
The program DuctE3D is used for steady or unsteady aerodynamic and aeroelastic analysis of ducted fans. This guide describes the input data required and the output files generated, in using DuctE3D. The analysis solves three dimensional unsteady, compressible Euler equations to obtain the aerodynamic forces. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either the time domain or the frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis and aeroelastic analysis of an isolated fan row.
NASA Astrophysics Data System (ADS)
He, Yangkun; Coey, J. M. D.; Schaefer, Rudolf; Jiang, Chengbao
2018-01-01
The ground state of macroscopic samples of magnetically ordered materials is a domain state because of magnetostatic energy or entropy, yet we have limited experimental means for imaging the bulk domain structure and the magnetization process directly. The common methods available reveal the domains at the surface or in electron- or x-ray transparent lamellae, not those in the bulk. The magnetization curve just reflects the vector sum of the moments of all the domains in the sample, but magnetostriction curves are more informative. They are strongly influenced by the domain structure in the unmagnetized state and its evolution during the magnetization process in an applied field. Here we report a method of determining the bulk domain structure in a cubic magnetostrictive material by combining magneto-optic Kerr microscopy with magnetostriction and magnetization measurements on single crystals as a function of applied field. We analyze the magnetostriction of Fe83Ga17 crystals in terms of a domain structure that is greatly influenced by sample shape and heat treatment. Saturation magnetostriction measurements are used to determine the fraction of domains orientated along the three 〈100〉 axes in the initial state. Domain wall motion and rotation processes have characteristic signatures in the magnetostriction curves, including those associated with the ΔE effect and domain rotation through a 〈110〉 auxetic direction.
Domain Adaptation with Conditional Transferable Components
Gong, Mingming; Zhang, Kun; Liu, Tongliang; Tao, Dacheng; Glymour, Clark; Schölkopf, Bernhard
2017-01-01
Domain adaptation arises in supervised learning when the training (source domain) and test (target domain) data have different distributions. Let X and Y denote the features and target, respectively; previous work on domain adaptation mainly considers the covariate shift situation where the distribution of the features P(X) changes across domains while the conditional distribution P(Y∣X) stays the same. To reduce domain discrepancy, recent methods try to find invariant components T(X) that have similar P(T(X)) on different domains by explicitly minimizing a distribution discrepancy measure. However, it is not clear if P(Y∣T(X)) in different domains is also similar when P(Y∣X) changes. Furthermore, transferable components do not necessarily have to be invariant. If the change in some components is identifiable, we can make use of such components for prediction in the target domain. In this paper, we focus on the case where P(X∣Y) and P(Y) both change in a causal system in which Y is the cause for X. Under appropriate assumptions, we aim to extract conditional transferable components whose conditional distribution P(T(X)∣Y) is invariant after proper location-scale (LS) transformations, and to identify how P(Y) changes between domains simultaneously. We provide theoretical analysis and empirical evaluation on both synthetic and real-world data to show the effectiveness of our method. PMID:28239433
Tennant, Alan; Tyson, Sarah F.; Nordenskiöld, Ulla; Hawkins, Ruth; Prior, Yeliz
2015-01-01
Objectives. The Evaluation of Daily Activity Questionnaire (EDAQ) includes 138 items in 14 domains identified as important by people with RA. The aim of this study was to test the validity and reliability of the English EDAQ. Methods. A total of 502 participants completed two questionnaires 3 weeks apart. The first consisted of the EDAQ, HAQ, RA Quality of Life (RAQoL) and the Medical Outcomes Scale (MOS) 36-item Short-Form Health Survey (SF-36v2), and the second consisted of the EDAQ only. The 14 EDAQ domains were tested for: unidimensionality—using confirmatory factor analysis; fit, response dependency, invariance across groups (differential item functioning)—using Rasch analysis; internal consistency [Person Separation Index (PSI)]; concurrent validity—by correlations with the HAQ, SF-36v2 and RAQoL; and test–retest reliability (Spearman’s correlations). Results. Confirmatory factor analysis of the 14 EDAQ domains indicated unidimensionality, after adjustment for local dependency in each domain. All domains achieved a root mean square error of approximation <0.10 and satisfied Rasch model expectations for local dependency. DIF by age, gender and employment status was largely absent. The PSI was consistent with individual use (PSI = 0.94 for all 14 domains). For all domains, except Caring, concurrent validity was good: HAQ (rs = 0.72–0.91), RAQoL (rs = 0.67–0.82) and SF36v2 Physical Function scale (rs = −0.60 to −0.84) and test–retest reliability was good (rs = 0.70–0.89). Conclusion. Analysis supported a 14-domain, two-component structure (Self care and Mobility) of the EDAQ, where each domain, and both components, satisfied Rasch model requirements, and have robust reliability and validity. PMID:25863045
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofschen, S.; Wolff, I.
1996-08-01
Time-domain simulation results of two-dimensional (2-D) planar waveguide finite-difference time-domain (FDTD) analysis are normally analyzed using Fourier transform. The introduced method of time series analysis to extract propagation and attenuation constants reduces the desired computation time drastically. Additionally, a nonequidistant discretization together with an adequate excitation technique is used to reduce the number of spatial grid points. Therefore, it is possible to simulate normal- and superconducting planar waveguide structures with very thin conductors and small dimensions, as they are used in MMIC technology. The simulation results are compared with measurements and show good agreement.
Feature-Oriented Domain Analysis (FODA) Feasibility Study
1990-11-01
controlling the synchronous behavior of the task. A task may wait for one or more synchronizing or message queue events. Each task is designed using the...
Beta-function B-spline smoothing on triangulations
NASA Astrophysics Data System (ADS)
Dechevsky, Lubomir T.; Zanaty, Peter
2013-03-01
In this work we investigate a novel family of Ck-smooth rational basis functions on triangulations for fitting, smoothing, and denoising geometric data. The introduced basis function is closely related to a recently introduced general method utilizing generalized expo-rational B-splines, which provides Ck-smooth convex resolutions of unity on very general disjoint partitions and overlapping covers of multidimensional domains with complex geometry. One of the major advantages of this new triangular construction is its locality with respect to the star-1 neighborhood of the vertex at which the basis provides Hermite interpolation. This locality of the basis functions can in turn be utilized in adaptive methods, where, for instance, a local refinement of the underlying triangular mesh affects only the refined domain, whereas in other methods one needs to investigate what changes occur outside of the refined domain. Both the triangular and the general smooth constructions have the potential to become a new versatile tool of Computer Aided Geometric Design (CAGD), Finite and Boundary Element Analysis (FEA/BEA) and Iso-geometric Analysis (IGA).
Analysis Of The IJCNN 2011 UTL Challenge
2012-01-13
large datasets from various application domains: handwriting recognition, image recognition, video processing, text processing, and ecology. The goal...validation and final evaluation sets consist of 4096 examples each. [Table fragment: Dataset / Domain / Features / Sparsity / Devel. / Transf. — e.g., AVICENNA, Handwriting, 120, 0%, 150205...] documents [3]. Transfer learning methods could accelerate the application of handwriting recognizers to historical manuscripts by reducing the need for
Stability analysis of nonlinear systems with slope restricted nonlinearities.
Liu, Xian; Du, Jiajia; Gao, Qing
2014-01-01
The problem of absolute stability of Lur'e systems with sector and slope restricted nonlinearities is revisited. Novel time-domain and frequency-domain criteria are established by using the Lyapunov method and the well-known Kalman-Yakubovich-Popov (KYP) lemma. The criteria strengthen some existing results. Simulations are given to illustrate the efficiency of the results.
NASA Astrophysics Data System (ADS)
Kumar, Amit; Nehra, Vikas; Kaushik, Brajesh Kumar
2017-08-01
Carbon nanotubes (CNTs), i.e., rolled-up cylindrical graphene sheets, are one of the finest emerging research areas. This paper presents an investigation of induced crosstalk in coupled on-chip multiwalled carbon nanotube (MWCNT) interconnects using finite-difference analysis in the time domain, i.e., the finite-difference time-domain (FDTD) method. The exceptional properties of versatile MWCNTs make them candidates to replace conventional on-chip copper interconnects. Time delay and crosstalk noise have been evaluated for coupled on-chip MWCNT interconnects. With a decrease in CNT length, the obtained results for an MWCNT show that transmission performance improves as the number of shells increases. It has been observed that the results obtained using the FDTD technique show a very close match with the HSPICE simulated results.
Casoni, Alessandro; Clerici, Francesca; Contini, Alessandro
2013-04-01
We describe the application of molecular dynamics followed by principal component analysis to study the inter-domain movements of the ligand binding domain (LBD) of mGluR5 in response to the binding of selected agonists or antagonists. Our results suggest that the method is an attractive alternative to current approaches to predict the agonist-induced or antagonist-blocked LBD responses. The ratio between the eigenvalues of the first and second eigenvectors (R1,2) is also proposed as a numerical descriptor for discriminating the ligand behavior as a mGluR5 agonist or antagonist. Copyright © 2013 Elsevier Inc. All rights reserved.
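Since R1,2 is defined as the ratio between the eigenvalues associated with the first and second principal components, it can be computed directly from the covariance of the aligned trajectory coordinates, as in the sketch below (frame superposition is assumed to have been done beforehand).

```python
import numpy as np

def pca_eigenvalue_ratio(coords):
    """coords: (n_frames, 3*n_atoms) aligned Cartesian coordinates of the LBD.
    Returns (R12, eigenvalues) where R12 = lambda_1 / lambda_2."""
    X = coords - coords.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]          # descending order
    return eigvals[0] / eigvals[1], eigvals

# A large R12 would mean the fluctuations are dominated by a single
# inter-domain motion; values near one suggest competing motions. The
# threshold separating agonist-like from antagonist-like behaviour is
# the paper's empirical finding, not something fixed by this sketch.
```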
OMA analysis of a launcher under operational conditions with time-varying properties
NASA Astrophysics Data System (ADS)
Eugeni, M.; Coppotelli, G.; Mastroddi, F.; Gaudenzi, P.; Muller, S.; Troclet, B.
2018-05-01
The objective of this paper is the investigation of the capability of operational modal analysis approaches to deal with time-varying systems in the low-frequency domain. Specifically, the problem of the identification of the dynamic properties of a launch vehicle, working under actual operative conditions, is studied. Two OMA methods are considered: the frequency-domain decomposition and the Hilbert transform method. It is demonstrated that both OMA approaches allow the time-tracking of modal parameters, namely natural frequencies, damping ratios, and mode shapes, from the response accelerations only, recorded during actual flight tests of a launcher characterized by a large mass variation due to the fuel burning typical of the first phase of the flight.
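The frequency-domain decomposition step, in its standard textbook form, builds the cross-power spectral density matrix of the measured accelerations and takes its singular value decomposition at each frequency line; peaks of the first singular value indicate modes and the corresponding singular vectors approximate mode shapes. The channel layout and CSD settings below are placeholders.

```python
import numpy as np
from scipy.signal import csd

def fdd(accel, fs, nperseg=1024):
    """accel: (n_channels, n_samples) response accelerations.
    Returns frequencies, first singular values, and first singular vectors."""
    n_ch = accel.shape[0]
    freqs, _ = csd(accel[0], accel[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(accel[i], accel[j], fs=fs, nperseg=nperseg)
    s1 = np.zeros(len(freqs))
    u1 = np.zeros((len(freqs), n_ch), dtype=complex)
    for k in range(len(freqs)):
        U, S, _ = np.linalg.svd(G[k])
        s1[k], u1[k] = S[0], U[:, 0]
    return freqs, s1, u1

# Peaks of s1 over frequency give candidate natural frequencies; u1 at a
# peak approximates the corresponding mode shape. Repeating the analysis
# over successive time blocks gives the time-tracking discussed above.
```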
Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Jiang, Dejun; Zhao, Shusen; Shen, Jingling
2008-03-01
A method is proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals, Benzophenone, Anthraquinone, Pyridoxine hydrochloride and L-Ascorbic acid, in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in significant agreement with the actual content, which suggested that this could be an effective method for quantitative identification of illicit drugs.
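The linear-regression step — expressing the mixture spectrum as a non-negative combination of the pure-component spectra — can be sketched with SciPy's non-negative least squares; normalizing the coefficients to fractions is an assumption about how they are converted to mass percentages.

```python
import numpy as np
from scipy.optimize import nnls

def mixture_fractions(mix_spectrum, component_spectra):
    """component_spectra: (n_components, n_freqs) pure absorption spectra.
    Returns estimated fractions that sum to one."""
    A = np.asarray(component_spectra).T              # columns = components
    coeffs, _ = nnls(A, np.asarray(mix_spectrum))
    return coeffs / coeffs.sum()

# Example with synthetic spectra for two components
freqs = np.linspace(0.2, 2.6, 200)                   # THz
comp1 = np.exp(-(freqs - 1.2) ** 2 / 0.02)
comp2 = np.exp(-(freqs - 1.8) ** 2 / 0.05)
mixture = 0.7 * comp1 + 0.3 * comp2 + 0.01 * np.random.randn(freqs.size)
print(mixture_fractions(mixture, [comp1, comp2]))    # approx [0.7, 0.3]
```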
Shatokhina, Iuliia; Obereder, Andreas; Rosensteiner, Matthias; Ramlau, Ronny
2013-04-20
We present a fast method for the wavefront reconstruction from pyramid wavefront sensor (P-WFS) measurements. The method is based on an analytical relation between pyramid and Shack-Hartmann sensor (SH-WFS) data. The algorithm consists of two steps--a transformation of the P-WFS data to SH data, followed by the application of cumulative reconstructor with domain decomposition, a wavefront reconstructor from SH-WFS measurements. The closed loop simulations confirm that our method provides the same quality as the standard matrix vector multiplication method. A complexity analysis as well as speed tests confirm that the method is very fast. Thus, the method can be used on extremely large telescopes, e.g., for eXtreme adaptive optics systems.
Ultrasonic test of resistance spot welds based on wavelet package analysis.
Liu, Jing; Xu, Guocheng; Gu, Xiaopeng; Zhou, Guanghao
2015-02-01
In this paper, ultrasonic testing of spot welds for stainless steel sheets has been studied. It is indicated that traditional ultrasonic signal analysis in either the time domain or the frequency domain remains inadequate to evaluate the nugget diameter of spot welds. However, the method based on wavelet packet analysis in the time-frequency domain can easily distinguish the nugget from the corona bond by extracting high-frequency signals at different positions of the spot weld, thereby quantitatively evaluating the nugget diameter. The results of the ultrasonic test fit the actual measured values well. The mean value of the normal distribution of the error statistics is 0.00187, and the standard deviation is 0.1392. Furthermore, the quality of the spot welds was evaluated, and it is shown that ultrasonic nondestructive testing based on wavelet packet analysis can be used to evaluate the quality of spot welds and is more reliable than a single destructive tensile test. Copyright © 2014 Elsevier B.V. All rights reserved.
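A minimal sketch of extracting band energies from a wavelet packet decomposition with PyWavelets; the wavelet, decomposition level, and the synthetic echo signal are placeholders rather than the authors' choices.

```python
import numpy as np
import pywt

def wavelet_packet_energies(signal, wavelet="db4", level=3):
    """Return the relative energy of each terminal wavelet-packet node."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # low-to-high frequency bands
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return energies / energies.sum()

# Toy ultrasonic-like echo: decaying burst plus noise
fs = 10e6
t = np.arange(0, 2e-4, 1 / fs)
echo = np.exp(-t / 2e-5) * np.sin(2 * np.pi * 4e6 * t)
signal = echo + 0.05 * np.random.default_rng(3).normal(size=t.size)
print(wavelet_packet_energies(signal).round(3))
```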
Miyabara, Renata; Berg, Karsten; Kraemer, Jan F; Baltatu, Ovidiu C; Wessel, Niels; Campos, Luciana A
2017-01-01
Objective: The aim of this study was to identify the most sensitive heart rate and blood pressure variability (HRV and BPV) parameters from a given set of well-known methods for the quantification of cardiovascular autonomic function after several autonomic blockades. Methods: Cardiovascular sympathetic and parasympathetic functions were studied in freely moving rats following peripheral muscarinic (methylatropine), β1-adrenergic (metoprolol), muscarinic + β1-adrenergic, α1-adrenergic (prazosin), and ganglionic (hexamethonium) blockades. Time domain, frequency domain and symbolic dynamics measures for each of HRV and BPV were classified through the paired Wilcoxon test for all autonomic drugs separately. In order to select those variables that have a high relevance to, and stable influence on, our target measurements (HRV, BPV) we used Fisher's method to combine the p-values of multiple tests. Results: This analysis led to the following best set of cardiovascular variability parameters: the mean normal beat-to-beat interval/value (HRV/BPV: meanNN), the coefficient of variation (cvNN = standard deviation over meanNN) and the root mean square of successive differences (RMSSD) from the time domain analysis. In the frequency domain analysis the very-low-frequency (VLF) component was selected. From symbolic dynamics, Shannon entropy of the word distribution (FWSHANNON) as well as POLVAR3, the non-linear parameter to detect intermittently decreased variability, showed the best ability to discriminate between the different autonomic blockades. Conclusion: Through a complex comparative analysis of HRV and BPV measures altered by a set of autonomic drugs, we identified the most sensitive set of informative cardiovascular variability indexes able to pick up the modifications imposed by the autonomic challenges. These indexes may help to increase our understanding of cardiovascular sympathetic and parasympathetic functions in translational studies of experimental diseases.
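The selected time-domain indexes can be computed directly from the beat-to-beat interval series; a minimal sketch follows (the interval data are simulated, not from the study).

```python
import numpy as np

def time_domain_hrv(nn_ms):
    """meanNN, cvNN and RMSSD from a series of normal beat-to-beat intervals (ms)."""
    nn = np.asarray(nn_ms, dtype=float)
    mean_nn = nn.mean()
    cv_nn = nn.std(ddof=1) / mean_nn
    rmssd = np.sqrt(np.mean(np.diff(nn) ** 2))
    return mean_nn, cv_nn, rmssd

# Simulated rat RR intervals around 150 ms with mild variability
rng = np.random.default_rng(4)
rr = 150 + 5 * rng.normal(size=300)
print("meanNN=%.1f ms  cvNN=%.3f  RMSSD=%.2f ms" % time_domain_hrv(rr))
```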
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Wen, Tingxi; Zhang, Zhongnan
2017-05-01
In this paper, genetic algorithm-based frequency-domain feature search (GAFDS) method is proposed for the electroencephalogram (EEG) analysis of epilepsy. In this method, frequency-domain features are first searched and then combined with nonlinear features. Subsequently, these features are selected and optimized to classify EEG signals. The extracted features are analyzed experimentally. The features extracted by GAFDS show remarkable independence, and they are superior to the nonlinear features in terms of the ratio of interclass distance and intraclass distance. Moreover, the proposed feature search method can search for features of instantaneous frequency in a signal after Hilbert transformation. The classification results achieved using these features are reasonable; thus, GAFDS exhibits good extensibility. Multiple classical classifiers (i.e., k-nearest neighbor, linear discriminant analysis, decision tree, AdaBoost, multilayer perceptron, and Naïve Bayes) achieve satisfactory classification accuracies by using the features generated by the GAFDS method and the optimized feature selection. The accuracies for 2-classification and 3-classification problems may reach up to 99% and 97%, respectively. Results of several cross-validation experiments illustrate that GAFDS is effective in the extraction of effective features for EEG classification. Therefore, the proposed feature selection and optimization model can improve classification accuracy.
Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems
NASA Technical Reports Server (NTRS)
Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.
2016-01-01
The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time-varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation. This technique was implemented by using the Stability Aerospace Vehicle Analysis Tool (SAVANT) computer simulation to evaluate the stability of the SLS system with the Adaptive Augmenting Control (AAC) active and inactive along its ascent trajectory. The gains for which the vehicle maintains apparent time-domain stability define the gain margins, and the time delay similarly defines the phase margin. This method of extracting the control stability margins from the time-domain simulation is relatively straightforward and the resultant margins can be compared to the linearized system results. The sections herein describe the techniques employed to extract the time-domain margins, compare the results between these nonlinear and the linear methods, and provide explanations for observed discrepancies. The SLS ascent trajectory was simulated with SAVANT and the classical linear stability margins were evaluated at one-second intervals. The linear analysis was performed with the AAC algorithm disabled to attain baseline stability margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time-domain-derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag.
Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.
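The incremental-gain idea can be illustrated on a toy unstable second-order plant with a delayed proportional-derivative controller (nothing here represents the SLS dynamics, SAVANT, or the actual flight control laws): the loop gain is scaled until the simulated response diverges, and the first unstable scale factor gives an apparent time-domain gain margin.

```python
import numpy as np

def stays_bounded(gain_scale, delay_steps, t_end=30.0, dt=0.001):
    """Simulate theta'' = a*theta + u with a delayed PD controller scaled by gain_scale."""
    a, kp, kd = 0.5, 4.0, 2.0                 # aerodynamically unstable toy plant, nominal gains
    theta, omega = 0.1, 0.0                   # small initial attitude error
    u_buf = [0.0] * (delay_steps + 1)         # transport-delay buffer for the actuator command
    for _ in range(int(t_end / dt)):
        u_buf.append(-gain_scale * (kp * theta + kd * omega))
        u = u_buf.pop(0)                      # delayed command reaches the plant
        omega += (a * theta + u) * dt
        theta += omega * dt
        if abs(theta) > 100.0:                # treat large excursions as instability
            return False
    return True

# Sweep the loop-gain multiplier upward (fixed 150 ms delay) until the response diverges
for scale in np.arange(1.0, 8.1, 0.25):
    if not stays_bounded(scale, delay_steps=150):
        print(f"apparent time-domain gain margin: about {scale:.2f}x nominal")
        break
```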
Artifact interactions retard technological improvement: An empirical study
Magee, Christopher L.
2017-01-01
Empirical research has shown that performance improvement in many different technological domains occurs exponentially but with widely varying improvement rates. What causes some technologies to improve faster than others do? Previous quantitative modeling research has identified artifact interactions, where a design change in one component influences others, as an important determinant of improvement rates. The models predict that the improvement rate for a domain is proportional to the inverse of the domain’s interaction parameter. However, no empirical research has previously studied and tested the dependence of improvement rates on artifact interactions. A challenge to testing the dependence is that any method for measuring interactions has to be applicable to a wide variety of technologies. Here we propose a novel patent-based method that is both technology-domain-agnostic and less costly than alternative methods. We use textual content from patent sets in 27 domains to find the influence of interactions on improvement rates. Qualitative analysis identified six specific keywords that signal artifact interactions. Patent sets from each domain were then examined to determine the total count of these 6 keywords in each domain, giving an estimate of artifact interactions in each domain. It is found that improvement rates are positively correlated with the inverse of the total count of keywords, with a Pearson correlation coefficient of +0.56 and a p-value of 0.002. The results agree with model predictions, and provide, for the first time, empirical evidence that artifact interactions have a retarding effect on improvement rates of technological domains. PMID:28777798
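The form of the reported correlation test can be reproduced as follows, with made-up numbers, since the per-domain keyword counts and improvement rates are not listed here.

```python
import numpy as np
from scipy.stats import pearsonr

# Placeholder data: yearly improvement rate and total interaction-keyword count
# for a handful of hypothetical technology domains (not the paper's 27 domains).
improvement_rate = np.array([0.35, 0.10, 0.22, 0.05, 0.15, 0.45])
keyword_count = np.array([120.0, 600.0, 250.0, 900.0, 400.0, 90.0])

# Correlate improvement rate with the inverse of the keyword count
r, p = pearsonr(improvement_rate, 1.0 / keyword_count)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```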
Wang, Yuan; Bao, Shan; Du, Wenjun; Ye, Zhirui; Sayer, James R
2017-11-17
This article investigated and compared frequency domain and time domain characteristics of drivers' behaviors before and after the start of distracted driving. Data from an existing naturalistic driving study were used. Fast Fourier transform (FFT) was applied for the frequency domain analysis to explore drivers' behavior pattern changes between nondistracted (prestarting of visual-manual task) and distracted (poststarting of visual-manual task) driving periods. Average relative spectral power in a low frequency range (0-0.5 Hz) and the standard deviation in a 10-s time window of vehicle control variables (i.e., lane offset, yaw rate, and acceleration) were calculated and further compared. Sensitivity analyses were also applied to examine the reliability of the time and frequency domain analyses. Results of the mixed model analyses from the time and frequency domain analyses all showed significant degradation in lateral control performance after engaging in visual-manual tasks while driving. Results of the sensitivity analyses suggested that the frequency domain analysis was less sensitive to the frequency bandwidth, whereas the time domain analysis was more sensitive to the time intervals selected for variation calculations. Different time interval selections can result in significantly different standard deviation values, whereas average spectral power analysis on yaw rate in both low and high frequency bandwidths showed consistent results, that higher variation values were observed during distracted driving when compared to nondistracted driving. This study suggests that driver state detection needs to consider the behavior changes during the prestarting periods, instead of only focusing on periods with physical presence of distraction, such as cell phone use. Lateral control measures can be a better indicator of distraction detection than longitudinal controls. In addition, frequency domain analyses proved to be a more robust and consistent method in assessing driving performance compared to time domain analyses.
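A sketch of the frequency-domain measure described above: estimate the power spectral density of a vehicle-control signal in a 10-s window and take the fraction of power below 0.5 Hz. The signal and parameters are illustrative, not the naturalistic-driving data.

```python
import numpy as np
from scipy.signal import welch

def relative_low_freq_power(x, fs, f_cut=0.5):
    """Fraction of spectral power below f_cut (Hz) in a signal window."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 256))
    return pxx[f <= f_cut].sum() / pxx.sum()

# Simulated 10-s yaw-rate window sampled at 10 Hz: slow drift plus fast jitter
fs = 10.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
yaw_rate = 0.5 * np.sin(2 * np.pi * 0.2 * t) + 0.1 * rng.normal(size=t.size)
print(f"relative power below 0.5 Hz: {relative_low_freq_power(yaw_rate, fs):.2f}")
```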
Extracting sets of chemical substructures and protein domains governing drug-target interactions.
Yamanishi, Yoshihiro; Pauwels, Edouard; Saigo, Hiroto; Stoven, Véronique
2011-05-23
The identification of rules governing molecular recognition between drug chemical substructures and protein functional sites is a challenging issue at many stages of the drug development process. In this paper we develop a novel method to extract sets of drug chemical substructures and protein domains that govern drug-target interactions on a genome-wide scale. This is made possible using sparse canonical correspondence analysis (SCCA) for analyzing drug substructure profiles and protein domain profiles simultaneously. The method does not depend on the availability of protein 3D structures. From a data set of known drug-target interactions including enzymes, ion channels, G protein-coupled receptors, and nuclear receptors, we extract a set of chemical substructures shared by drugs able to bind to a set of protein domains. These two sets of extracted chemical substructures and protein domains form components that can be further exploited in a drug discovery process. This approach successfully clusters protein domains that may be evolutionary unrelated but that bind a common set of chemical substructures. As shown in several examples, it can also be very helpful for predicting new protein-ligand interactions and addressing the problem of ligand specificity. The proposed method constitutes a contribution to the recent field of chemogenomics that aims to connect the chemical space with the biological space.
Fast and accurate fitting and filtering of noisy exponentials in Legendre space.
Bao, Guobin; Schild, Detlev
2014-01-01
The parameters of experimentally obtained exponentials are usually found by least-squares fitting methods. Essentially, this is done by minimizing the mean squares sum of the differences between the data, most often a function of time, and a parameter-defined model function. Here we delineate a novel method where the noisy data are represented and analyzed in the space of Legendre polynomials. This is advantageous in several respects. First, parameter retrieval in the Legendre domain is typically two orders of magnitude faster than direct fitting in the time domain. Second, data fitting in a low-dimensional Legendre space yields estimates for amplitudes and time constants which are, on the average, more precise compared to least-squares-fitting with equal weights in the time domain. Third, the Legendre analysis of two exponentials gives satisfactory estimates in parameter ranges where least-squares-fitting in the time domain typically fails. Finally, filtering exponentials in the domain of Legendre polynomials leads to marked noise removal without the phase shift characteristic for conventional lowpass filters.
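The Legendre-domain representation can be sketched with NumPy's Legendre utilities: the noisy trace is projected onto a low-order Legendre basis (which already removes much of the noise), and the smoothed curve can then be fitted or filtered further. This is only an illustration of the representation, not the authors' parameter-retrieval algorithm.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Noisy single exponential on [0, 1] (time rescaled to the Legendre interval [-1, 1])
t = np.linspace(0.0, 1.0, 500)
clean = 2.0 * np.exp(-t / 0.2)
rng = np.random.default_rng(6)
noisy = clean + 0.1 * rng.normal(size=t.size)

# Represent the data by a low-dimensional set of Legendre coefficients
u = 2.0 * t - 1.0                       # map time to [-1, 1]
coeffs = L.legfit(u, noisy, deg=12)     # projection onto Legendre polynomials
smoothed = L.legval(u, coeffs)          # reconstruction acts as a noise filter

rms_raw = np.sqrt(np.mean((noisy - clean) ** 2))
rms_filt = np.sqrt(np.mean((smoothed - clean) ** 2))
print(f"RMS error: raw {rms_raw:.3f} -> Legendre-filtered {rms_filt:.3f}")
```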
Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors.
Ge, Xiaoliang; Theuwissen, Albert J P
2018-02-27
This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated-double sampling (CDS) technique has been proposed to provide a way to efficiently embed a tunable conversion gain along the read-out path. Such a readout topology, however, exhibits non-stationary large-signal behavior, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models, and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and the non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristic of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip, and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models.
Stern, RJ; Fernandez, A; Jacobs, EA; Neilands, TB; Weech-Maldonado, R; Quan, J; Carle, A; Seligman, HK
2012-01-01
Background Providing culturally competent care shows promise as a mechanism to reduce healthcare inequalities. Until the recent development of the CAHPS Cultural Competency Item Set (CAHPS-CC), no measures capturing patient-level experiences with culturally competent care have been suitable for broad-scale administration. Methods We performed confirmatory factor analysis and internal consistency reliability analysis of CAHPS-CC among patients with type 2 diabetes (n=600) receiving primary care in safety-net clinics. CAHPS-CC domains were also correlated with global physician ratings. Results A 7-factor model demonstrated satisfactory fit (χ2(231)=484.34, p<.0001) with significant factor loadings at p<.05. Three domains showed excellent reliability – Doctor Communication- Positive Behaviors (α=.82), Trust (α=.77), and Doctor Communication- Health Promotion (α=.72). Four domains showed inadequate reliability either among Spanish speakers or overall (overall reliabilities listed): Doctor Communication- Negative Behaviors (α=.54), Equitable Treatment (α=.69), Doctor Communication- Alternative Medicine (α=.52), and Shared Decision-Making (α=.51). CAHPS-CC domains were positively and significantly correlated with global physician rating. Conclusions Select CAHPS-CC domains are suitable for broad-scale administration among safety-net patients. Those domains may be used to target quality-improvement efforts focused on providing culturally competent care in safety-net settings. PMID:22895231
Determination of the transmission coefficients for quantum structures using FDTD method.
Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan
2011-12-01
The purpose of this work is to develop a simple method to incorporate quantum effects in traditional finite-difference time-domain (FDTD) simulators, which could make it possible to co-simulate systems that include both quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure model has been simulated using the proposed method. The good agreement between the numerical and analytical results proves its accuracy. The effectiveness and accuracy of this approach make it a potential method for the analysis and design of hybrid systems that include quantum structures and traditional components.
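A minimal sketch of the time-domain Schrödinger FDTD idea: the real and imaginary parts of the wavefunction are updated in a leapfrog fashion (in units with hbar = m = 1), and a Gaussian packet is launched at a rectangular barrier to estimate a transmission probability. This is a generic textbook-style scheme, not the FDTD-S formulation of the paper, and the grid and barrier parameters are arbitrary.

```python
import numpy as np

# 1-D grid
nx, dx = 1000, 0.2
dt = 0.02 * dx ** 2                          # conservatively small explicit time step
x = np.arange(nx) * dx

# Rectangular potential barrier
V = np.zeros(nx)
V[(x > 120.0) & (x < 122.0)] = 1.0

# Initial Gaussian wave packet moving to the right with mean momentum k0
x0, sigma, k0 = 60.0, 5.0, 1.5
envelope = np.exp(-((x - x0) ** 2) / (2.0 * sigma ** 2))
R = envelope * np.cos(k0 * x)                # real part of psi
I = envelope * np.sin(k0 * x)                # imaginary part of psi

def lap(f):
    """Second spatial derivative with fixed (reflecting) boundaries."""
    out = np.zeros_like(f)
    out[1:-1] = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / dx ** 2
    return out

# Leapfrog update of dR/dt = -0.5*I_xx + V*I and dI/dt = 0.5*R_xx - V*R
for _ in range(70000):
    R += dt * (-0.5 * lap(I) + V * I)
    I += dt * (0.5 * lap(R) - V * R)

prob = R ** 2 + I ** 2
transmission = prob[x > 122.0].sum() / prob.sum()   # probability found beyond the barrier
print(f"estimated transmission coefficient: {transmission:.3f}")
```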
Carden, Tony; Goode, Natassia; Read, Gemma J M; Salmon, Paul M
2017-03-15
Like most work systems, the domain of adventure activities has seen a series of serious incidents and subsequent calls to improve regulation. Safety regulation systems aim to promote safety and reduce accidents. However, there is scant evidence they have led to improved safety outcomes. In fact there is some evidence that the poor integration of regulatory system components has led to adverse safety outcomes in some contexts. Despite this, there is an absence of methods for evaluating regulatory and compliance systems. This article argues that sociotechnical systems theory and methods provide a suitable framework for evaluating regulatory systems. This is demonstrated through an analysis of a recently introduced set of adventure activity regulations. Work Domain Analysis (WDA) was used to describe the regulatory system in terms of its functional purposes, values and priority measures, purpose-related functions, object-related processes and cognitive objects. This allowed judgement to be made on the nature of the new regulatory system and on the constraints that may impact its efficacy following implementation. Importantly, the analysis suggests that the new system's functional purpose of ensuring safe activities is not fully supported in terms of the functions and objects available to fulfil them. Potential improvements to the design of the system are discussed along with the implications for regulatory system design and evaluation across the safety critical domains generally. Copyright © 2017 Elsevier Ltd. All rights reserved.
Fractal dimension approach in postural control of subjects with Prader-Willi Syndrome
2011-01-01
Background Static posturography is a user-friendly technique suitable for the study of the centre of pressure (CoP) trajectory. However, the utility of static posturography in clinical practice is somewhat limited and there is a need for reliable approaches to extract physiologically meaningful information from stabilograms. The aim of this study was to quantify the postural strategy of Prader-Willi patients with the fractal dimension technique in addition to the CoP trajectory analysis in the time and frequency domains. Methods 11 adult patients affected by Prader-Willi Syndrome (PWS) and 20 age-matched individuals (Control group: CG) were included in this study. Postural acquisitions were conducted by means of a force platform and the participants were required to stand barefoot on the platform for 30 seconds with eyes open and heels at a standardized distance and position. Platform data were analysed in the time and frequency domains. The Fractal Dimension (FD) was also computed. Results The analysis of CoP vs. time showed that in PWS participants all the parameters were statistically different from CG, with greater displacements along both the antero-posterior and medio-lateral directions and longer CoP tracks. As for the frequency analysis, our data showed no significant differences between PWS and CG. FD showed that PWS individuals were characterized by a greater value in comparison with CG. Conclusions Our data showed that while the analysis in the frequency domain did not seem to explain the postural deficit in PWS, the FD method appears to provide a more informative description of it and to complement and integrate the time domain analysis. PMID:21854639
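One common way to compute a fractal dimension of a CoP time series is Higuchi's algorithm; the sketch below is a generic implementation and is not necessarily the FD estimator used in the study.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.sum(np.abs(np.diff(x[idx])))
            norm = (n - 1) / ((len(idx) - 1) * k)   # Higuchi normalisation factor
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    # Slope of log(L(k)) versus log(1/k) gives the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

# Example: white noise (FD near 2) versus a smooth sine (FD near 1)
rng = np.random.default_rng(7)
print("noise FD:", round(higuchi_fd(rng.normal(size=2000)), 2))
print("sine  FD:", round(higuchi_fd(np.sin(np.linspace(0, 20 * np.pi, 2000))), 2))
```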
Taylor, William R; Stoye, Jonathan P; Taylor, Ian A
2017-04-04
The Spumaretrovirinae (foamy viruses) and the Orthoretrovirinae (e.g. HIV) share many similarities both in genome structure and the sequences of the core viral encoded proteins, such as the aspartyl protease and reverse transcriptase. Similarity in the gag region of the genome is less obvious at the sequence level but has been illuminated by the recent solution of the foamy virus capsid (CA) structure. This revealed a clear structural similarity to the orthoretrovirus capsids but with marked differences that left uncertainty in the relationship between the two domains that comprise the structure. We have applied protein structure comparison methods in order to try and resolve this ambiguous relationship. These included both the DALI method and the SAP method, with rigorous statistical tests applied to the results of both methods. For this, we employed collections of artificial fold 'decoys' (generated from the pair of native structures being compared) to provide a customised background distribution for each comparison, thus allowing significance levels to be estimated. We have shown that the relationship of the two domains conforms to a simple linear correspondence rather than a domain transposition. These similarities suggest that the origin of both viral capsids was a common ancestor with a double domain structure. In addition, we show that there is also a significant structural similarity between the amino and carboxy domains in both the foamy and ortho viruses. These results indicate that, as well as the duplication of the double domain capsid, there may have been an even more ancient gene-duplication that preceded the double domain structure. In addition, our structure comparison methodology demonstrates a general approach to problems where the components have a high intrinsic level of similarity.
NASA Technical Reports Server (NTRS)
Oswald, J. E.; Siegel, P. H.
1994-01-01
The finite difference time domain (FDTD) method is applied to the analysis of microwave, millimeter-wave and submillimeter-wave filter circuits. In each case, the validity of this method is confirmed by comparison with measured data. In addition, the FDTD calculations are used to design a new ultra-thin coplanar-strip filter for feeding a THz planar-antenna mixer.
2D Automatic body-fitted structured mesh generation using advancing extraction method
USDA-ARS?s Scientific Manuscript database
This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...
Mulinta, Ras; Yao, Sylvia Y. M.; Ng, Amy M. L.; Cass, Carol E.; Young, James D.
2017-01-01
The human SLC28 family of concentrative nucleoside transporter (CNT) proteins has three members: hCNT1, hCNT2, and hCNT3. Na+-coupled hCNT1 and hCNT2 transport pyrimidine and purine nucleosides, respectively, whereas hCNT3 transports both pyrimidine and purine nucleosides utilizing Na+ and/or H+ electrochemical gradients. Escherichia coli CNT family member NupC resembles hCNT1 in permeant selectivity but is H+-coupled. Using heterologous expression in Xenopus oocytes and the engineered cysteine-less hCNT3 protein hCNT3(C−), substituted cysteine accessibility method analysis with the membrane-impermeant thiol reactive reagent p-chloromercuribenzene sulfonate was performed on the transport domain (interfacial helix 2, hairpin 1, putative transmembrane domain (TM) 7, and TM8), as well as TM9 of the scaffold domain of the protein. This systematic scan of the entire C-terminal half of hCNT3(C−) together with parallel studies of the transport domain of wild-type hCNT1 and the corresponding TMs of cysteine-less NupC(C−) yielded results that validate the newly developed structural homology model of CNT membrane architecture for human CNTs, revealed extended conformationally mobile regions within transport-domain TMs, identified pore-lining residues of functional importance, and provided evidence of an emerging novel elevator-type mechanism of transporter function. PMID:28385889
EHR Improvement Using Incident Reports.
Teame, Tesfay; Stålhane, Tor; Nytrø, Øystein
2017-01-01
This paper discusses reactive improvement of clinical software using methods for incident analysis. We used the "Five Whys" method because we had only descriptive data and depended on a domain expert for the analysis. The analysis showed that there are two major root causes for EHR software failure, and that they are related to human and organizational errors. A main identified improvement is allocating more resources to system maintenance and user training.
Experiences in Eliciting Security Requirements
2006-12-01
FODA is a domain analysis and engineering method that focuses on developing reusable assets [9]. By examining related software systems and...describe a trade-off analysis that we used to select a suitable requirements elicitation method and present results detailed from a case study of one...disaster planning, and how to improve Medicare. Eventually, technology-oriented problems may emerge from these soft problems, but much more analysis is
Multivariate frequency domain analysis of protein dynamics
NASA Astrophysics Data System (ADS)
Matsunaga, Yasuhiro; Fuchigami, Sotaro; Kidera, Akinori
2009-03-01
Multivariate frequency domain analysis (MFDA) is proposed to characterize collective vibrational dynamics of protein obtained by a molecular dynamics (MD) simulation. MFDA performs principal component analysis (PCA) for a bandpass filtered multivariate time series using the multitaper method of spectral estimation. By applying MFDA to MD trajectories of bovine pancreatic trypsin inhibitor, we determined the collective vibrational modes in the frequency domain, which were identified by their vibrational frequencies and eigenvectors. At near zero temperature, the vibrational modes determined by MFDA agreed well with those calculated by normal mode analysis. At 300 K, the vibrational modes exhibited characteristic features that were considerably different from the principal modes of the static distribution given by the standard PCA. The influences of aqueous environments were discussed based on two different sets of vibrational modes, one derived from a MD simulation in water and the other from a simulation in vacuum. Using the varimax rotation, an algorithm of the multivariate statistical analysis, the representative orthogonal set of eigenmodes was determined at each vibrational frequency.
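The core idea (PCA of a band-limited multivariate time series) can be sketched as follows; a zero-phase Butterworth filter is used here instead of the multitaper spectral estimation of the paper, and all signals are synthetic.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpassed_pca(data, fs, f_lo, f_hi, order=4):
    """PCA of a multivariate time series restricted to the band [f_lo, f_hi] Hz.

    data: array (n_samples, n_variables). Returns (eigenvalues, eigenvectors).
    """
    b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, data, axis=0)          # zero-phase band limiting
    filtered -= filtered.mean(axis=0)
    cov = np.cov(filtered, rowvar=False)
    w, v = np.linalg.eigh(cov)
    return w[::-1], v[:, ::-1]                       # largest modes first

# Synthetic "coordinates": a shared 15 Hz vibration plus broadband noise
fs = 1000.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(8)
mode_shape = rng.normal(size=12)
data = np.outer(np.sin(2 * np.pi * 15 * t), mode_shape) + 0.5 * rng.normal(size=(t.size, 12))
w, v = bandpassed_pca(data, fs, 10.0, 20.0)
print("leading band-limited eigenvalue:", round(float(w[0]), 3))
```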
NASA Astrophysics Data System (ADS)
Liao, Yuhe; Sun, Peng; Wang, Baoxiang; Qu, Lei
2018-05-01
The appearance of repetitive transients in a vibration signal is one typical feature of faulty rolling element bearings. However, accurate extraction of these fault-related characteristic components has always been a challenging task, especially when there is interference from large-amplitude impulsive noise. A frequency domain multipoint kurtosis (FDMK)-based fault diagnosis method is proposed in this paper. The multipoint kurtosis is redefined in the frequency domain and the computational accuracy is improved. An envelope autocorrelation function is also presented to estimate the fault characteristic frequency, which is used to set the frequency hunting zone of the FDMK. Then, the FDMK, instead of kurtosis, is utilized to generate a fast kurtogram, and only the optimal band with the maximum FDMK value is selected for envelope analysis. Negative interference from both large-amplitude impulsive noise and shaft-rotational-speed-related harmonic components is therefore greatly reduced. The analysis results of simulation and experimental data verify the capability and feasibility of this FDMK-based method.
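The final envelope-analysis step referred to above can be sketched with a Hilbert-transform envelope and its spectrum; this omits the FDMK-based band selection and uses a synthetic fault signal with an assumed fault characteristic frequency.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.fft import rfft, rfftfreq

fs = 20000.0
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 107.0                              # assumed fault characteristic frequency

# Synthetic bearing signal: repetitive transients exciting a 3 kHz resonance, plus noise
rng = np.random.default_rng(9)
signal = 0.2 * rng.normal(size=t.size)
for t_imp in np.arange(0, 1.0, 1 / fault_freq):
    mask = t >= t_imp
    signal[mask] += np.exp(-800 * (t[mask] - t_imp)) * np.sin(2 * np.pi * 3000 * (t[mask] - t_imp))

# Envelope spectrum: demodulate the signal and look for the fault frequency
envelope = np.abs(hilbert(signal))
envelope -= envelope.mean()
spectrum = np.abs(rfft(envelope))
freqs = rfftfreq(t.size, 1 / fs)
band = (freqs > 5) & (freqs < 500)
print("dominant envelope frequency:", freqs[band][np.argmax(spectrum[band])], "Hz")
```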
A Kinect based sign language recognition system using spatio-temporal features
NASA Astrophysics Data System (ADS)
Memiş, Abbas; Albayrak, Songül
2013-12-01
This paper presents a sign language recognition system that uses spatio-temporal features on RGB video images and depth maps for dynamic gestures of Turkish Sign Language. The proposed system uses a motion-difference and accumulation approach for temporal gesture analysis. The motion accumulation method, which is an effective method for temporal-domain analysis of gestures, produces an accumulated motion image by combining differences of successive video frames. Then, the 2D Discrete Cosine Transform (DCT) is applied to the accumulated motion images and the temporal-domain features are transformed into the spatial domain. These processes are performed on both RGB images and depth maps separately. DCT coefficients that represent sign gestures are picked up via zigzag scanning and feature vectors are generated. In order to recognize sign gestures, a K-Nearest Neighbor classifier with Manhattan distance is employed. Performance of the proposed sign language recognition system is evaluated on a sign database that contains 1002 isolated dynamic signs belonging to 111 words of Turkish Sign Language (TSL) in three different categories. The proposed sign language recognition system achieves promising success rates.
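A sketch of the motion-accumulation and DCT feature extraction described above, using a random synthetic frame sequence in place of RGB/depth video and a simple low-frequency coefficient block instead of the zigzag scan:

```python
import numpy as np
from scipy.fft import dctn

def accumulated_motion_image(frames):
    """Sum of absolute differences of successive frames (n_frames, height, width)."""
    return np.sum(np.abs(np.diff(frames.astype(float), axis=0)), axis=0)

def dct_features(motion_image, block=8):
    """2-D DCT of the accumulated motion image; keep the low-frequency block."""
    coeffs = dctn(motion_image, norm="ortho")
    return coeffs[:block, :block].ravel()        # stand-in for the zigzag scan

# Synthetic gesture: 30 frames of 64x64 "video" with a bright blob moving right
frames = np.zeros((30, 64, 64))
for i in range(30):
    frames[i, 20:28, i:i + 8] = 255
feature_vector = dct_features(accumulated_motion_image(frames))
print("feature vector length:", feature_vector.shape[0])
```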
Sampling in the light of Wigner distribution.
Stern, Adrian; Javidi, Bahram
2004-03-01
We propose a new method for analysis of the sampling and reconstruction conditions of real and complex signals by use of the Wigner domain. It is shown that the Wigner domain may provide a better understanding of the sampling process than the traditional Fourier domain. For example, it explains how certain non-bandlimited complex functions can be sampled and perfectly reconstructed. On the basis of observations in the Wigner domain, we derive a generalization to the Nyquist sampling criterion. By using this criterion, we demonstrate simple preprocessing operations that can adapt a signal that does not fulfill the Nyquist sampling criterion. The preprocessing operations demonstrated can be easily implemented by optical means.
Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios
2014-01-01
To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Psychometric properties of the WHOQOL-BREF in an Iranian adult sample.
Yousefy, A R; Usefy, A R; Ghassemi, Gh R; Sarrafzadegan, N; Mallik, S; Baghaei, A M; Rabiei, K
2010-04-01
To evaluate the discriminant validity, reliability, internal consistency, and dimensional structure of the World Health Organization Quality of Life-BREF (WHOQOL-BREF) in a heterogeneous Iranian population. A clustered randomized sample of 2,956 healthy and 2,936 unhealthy rural and urban inhabitants aged 30 and above from two dissimilar Iranian provinces completed the Persian version of the WHOQOL-BREF during 2006. We performed descriptive and analytical analyses including Student's t-test, correlation matrices, Cronbach's alpha, and factor analysis with the principal components method and varimax rotation, using SPSS 15. The mean age of the participants was 42.2 +/- 12.1 years and the mean years of education was 9.3 +/- 3.8. The Iranian version of the WHOQOL-BREF domain scores demonstrated good internal consistency, criterion validity, and discriminant validity. The physical health domain contributed most to overall quality of life, while the environment domain made the least contribution. Factor analysis provided evidence of construct validity for the four-factor model of the instrument. The scores of all domains discriminated between healthy persons and patients. The WHOQOL-BREF has adequate psychometric properties and is, therefore, an adequate measure for assessing quality of life at the domain level in an adult Iranian population.
A method on error analysis for large-aperture optical telescope control system
NASA Astrophysics Data System (ADS)
Su, Yanrui; Wang, Qiang; Yan, Fabao; Liu, Xiang; Huang, Yongmei
2016-10-01
For a large-aperture optical telescope, arc-second-level jitters exist in the elevation axis under different working speeds, especially the low-speed mode used in the process of acquisition, tracking and pointing, whereas the azimuth axis in the same control system performs well. The jitters are closely related to the working speed of the elevation axis, resulting in reduced accuracy and low-speed stability of the telescope. By collecting a large amount of measured data from the elevation axis, we analysed the jitters in the time domain, frequency domain and space domain respectively. The relation between the jitter points, the commanded elevation speed and the corresponding space angle shows that the jitters behave as a periodic disturbance in the space domain, with a spatial period of approximately 79.1″. We then simulated, analysed and compared the influence of candidate disturbance sources, such as PWM power-stage output disturbance, torque (acceleration) disturbance, speed-feedback disturbance and position-feedback disturbance, on the elevation axis, and found that the spatially periodic disturbance still remained in the elevation performance. This leads us to infer that the problem lies in the angle measurement unit. The telescope employs a 24-bit photoelectric encoder, and the encoder grating angular period, i.e. the space angle spanned by one period of the subdivision signal, can be calculated as 79.1016″, which is approximately equal to the spatial period of the jitters. Therefore, the working elevation of the telescope is affected by subdivision errors, and the period of the subdivision error is identical to the encoder grating angular period. Through comprehensive consideration and mathematical analysis, the DC component of the subdivision error is determined to be the cause of the jitters, which is verified in practical engineering. This approach of analysing error sources in the time domain, frequency domain and space domain respectively provides effective guidance for locating disturbance sources in large-aperture optical telescopes.
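For reference, the quoted 79.1016″ figure is consistent with the grating of the 24-bit encoder providing 2^14 lines per revolution, with the remaining bits obtained by electronic subdivision; this split is an assumption made only to show the arithmetic.

```python
ARCSEC_PER_REV = 360 * 3600            # 1,296,000 arc seconds in a full revolution
grating_lines = 2 ** 14                # assumed line count of the encoder grating
print(ARCSEC_PER_REV / grating_lines)  # 79.1015625 arc seconds per grating period
```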
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aravind, Penmatsa; Rajini, Bheemreddy; Sharma, Yogendra
The crystallization and preliminary X-ray diffraction analysis of AIM1g1, a βγ-crystallin domain of absent in melanoma (AIM1) protein from H. sapiens, is reported. AIM1g1 is a single βγ-crystallin domain from the protein absent in melanoma 1 (AIM1), which appears to play a role in the suppression of melanomas. This domain is known to bind calcium and its structure would help in identifying calcium-coordinating sites in vertebrate crystallins, which have hitherto been believed to have lost this ability during evolution. Crystallization of this domain was performed by the hanging-drop vapour-diffusion method. Crystals diffracted to a maximum resolution of 1.86 Å and were found to belong to space group P6₁ or P6₅, with unit-cell parameters a = b = 54.98, c = 59.73 Å. Solvent-content analysis indicated the presence of one monomer per asymmetric unit.
Guetterman, Timothy C; Creswell, John W; Wittink, Marsha; Barg, Fran K; Castro, Felipe G; Dahlberg, Britt; Watkins, Daphne C; Deutsch, Charles; Gallo, Joseph J
2017-01-01
Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a self-rated mixed methods skills assessment and provide validity evidence. The instrument taps six research domains: "Research question," "Design/approach," "Sampling," "Data collection," "Analysis," and "Dissemination." Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using the Cronbach alpha to assess reliability and an analysis of variance that compared a mixed methods experience index with assessment scores to assess criterion relatedness. Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (eg, published a mixed methods article) rated themselves as more skilled, which was statistically significant across the research domains. This self-rated mixed methods assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
Harmonic Balance Computations of Fan Aeroelastic Stability
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Reddy, T. S. R.
2010-01-01
A harmonic balance (HB) aeroelastic analysis, which has been recently developed, was used to determine the aeroelastic stability (flutter) characteristics of an experimental fan. To assess the numerical accuracy of this HB aeroelastic analysis, a time-domain aeroelastic analysis was also used to determine the aeroelastic stability characteristics of the same fan. Both of these three-dimensional analysis codes model the unsteady flowfield due to blade vibrations using the Reynolds-averaged Navier-Stokes (RANS) equations. In the HB analysis, the unsteady flow equations are converted to a HB form and solved using a pseudo-time marching method. In the time-domain analysis, the unsteady flow equations are solved using an implicit time-marching approach. Steady and unsteady computations for two vibration modes were carried out at two rotational speeds: 100 percent (design) and 70 percent (part-speed). The steady and unsteady results obtained from the two analysis methods compare well, thus verifying the recently developed HB aeroelastic analysis. Based on the results, the experimental fan was found to have no aeroelastic instability (flutter) at the conditions examined in this study.
A customisable framework for the assessment of therapies in the solution of therapy decision tasks.
Manjarrés Riesco, A; Martínez Tomás, R; Mira Mira, J
2000-01-01
In current medical research, a growing interest can be observed in the definition of a global therapy-evaluation framework which integrates considerations such as patients' preferences and quality-of-life results. In this article, we propose the use of the research results in this domain as a source of knowledge in the design of support systems for therapy decision analysis, in particular with a view to application in oncology. We discuss the incorporation of these considerations in the definition of the therapy-assessment methods involved in the solution of a generic therapy decision task, described in the context of AI software development methodologies such as CommonKADS. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature. The assessment methods applied are based either on data obtained from statistics or on the specific idiosyncrasies of each patient, as identified from their responses to a suite of psychological tests. In the analysis of the therapy decision task we emphasise the importance, from a methodological perspective, of using a rigorous approach to the modelling of domain ontologies and domain-specific data. To this aim we make extensive use of the semi-formal object-oriented analysis notation UML to describe the domain level.
A quasi-Lagrangian finite element method for the Navier-Stokes equations in a time-dependent domain
NASA Astrophysics Data System (ADS)
Lozovskiy, Alexander; Olshanskii, Maxim A.; Vassilevski, Yuri V.
2018-05-01
The paper develops a finite element method for the Navier-Stokes equations of incompressible viscous fluid in a time-dependent domain. The method builds on a quasi-Lagrangian formulation of the problem. The paper provides stability and convergence analysis of the fully discrete (finite-difference in time and finite-element in space) method. The analysis does not assume any CFL time-step restriction, it rather needs mild conditions of the form $\\Delta t\\le C$, where $C$ depends only on problem data, and $h^{2m_u+2}\\le c\\,\\Delta t$, $m_u$ is polynomial degree of velocity finite element space. Both conditions result from a numerical treatment of practically important non-homogeneous boundary conditions. The theoretically predicted convergence rate is confirmed by a set of numerical experiments. Further we apply the method to simulate a flow in a simplified model of the left ventricle of a human heart, where the ventricle wall dynamics is reconstructed from a sequence of contrast enhanced Computed Tomography images.
Wang, Hao; Lau, Nathan; Gerdes, Ryan M
2018-04-01
The aim of this study was to apply work domain analysis for cybersecurity assessment and design of supervisory control and data acquisition (SCADA) systems. Adoption of information and communication technology in cyberphysical systems (CPSs) for critical infrastructures enables automated and distributed control but introduces cybersecurity risk. Many CPSs employ SCADA industrial control systems that have become the target of cyberattacks, which inflict physical damage without use of force. Given that absolute security is not feasible for complex systems, cyberintrusions that introduce unanticipated events will occur; a proper response will in turn require human adaptive ability. Therefore, analysis techniques that can support security assessment and human factors engineering are invaluable for defending CPSs. We conducted work domain analysis using the abstraction hierarchy (AH) to model a generic SCADA implementation to identify the functional structures and means-ends relations. We then adopted a case study approach examining the Stuxnet cyberattack by developing and integrating AHs for the uranium enrichment process, SCADA implementation, and malware to investigate the interactions between the three aspects of cybersecurity in CPSs. The AHs for modeling a generic SCADA implementation and studying the Stuxnet cyberattack are useful for mapping attack vectors, identifying deficiencies in security processes and features, and evaluating proposed security solutions with respect to system objectives. Work domain analysis is an effective analytical method for studying cybersecurity of CPSs for critical infrastructures in a psychologically relevant manner. Work domain analysis should be applied to assess cybersecurity risk and inform engineering and user interface design.
Knowledge Representation Standards and Interchange Formats for Causal Graphs
NASA Technical Reports Server (NTRS)
Throop, David R.; Malin, Jane T.; Fleming, Land
2005-01-01
In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
NASA Astrophysics Data System (ADS)
Shih, C. Y.; Tsuei, Y. G.; Allemang, R. J.; Brown, D. L.
1988-10-01
A method of using the matrix Auto-Regressive Moving Average (ARMA) model in the Laplace domain for multiple-reference global parameter identification is presented. This method is particularly applicable to the area of modal analysis where high modal density exists. The method is also applicable when multiple reference frequency response functions are used to characterise linear systems. In order to facilitate the mathematical solution, the Forsythe orthogonal polynomial is used to reduce the ill-conditioning of the formulated equations and to decouple the normal matrix into two reduced matrix blocks. A Complex Mode Indicator Function (CMIF) is introduced, which can be used to determine the proper order of the rational polynomials.
Sumi, Ayako; Kobayashi, Nobumichi
2017-01-01
In this report, we present a short review of applications of time series analysis, which consists of spectral analysis based on the maximum entropy method in the frequency domain and the least squares method in the time domain, to the incidence data of infectious diseases. This report consists of three parts. First, we present our results obtained by collaborative research on infectious disease epidemics with Chinese, Indian, Filipino and North European research organizations. Second, we present the results obtained with the Japanese infectious disease surveillance data and the time series numerically generated from a mathematical model, called the susceptible/exposed/infectious/recovered (SEIR) model. Third, we present an application of the time series analysis to pathologic tissues to examine the usefulness of time series analysis for investigating the spatial pattern of pathologic tissue. It is anticipated that time series analysis will become a useful tool for investigating not only infectious disease surveillance data but also immunological and genetic tests.
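Maximum entropy spectral estimation amounts to fitting an autoregressive model to the series; a compact sketch using the Levinson-Durbin recursion on the sample autocorrelation is shown below. The weekly incidence data are synthetic and the model order is arbitrary, so this only illustrates the form of the analysis, not the authors' procedure.

```python
import numpy as np

def mem_psd(x, order, n_freq=512):
    """Maximum entropy (autoregressive) PSD estimate via the Levinson-Durbin recursion."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])  # biased autocorrelation
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for k in range(1, order + 1):                 # Levinson-Durbin recursion
        acc = r[k] + np.dot(a[1:k], r[k - 1:0:-1])
        ref = -acc / err
        a_prev = a.copy()
        for j in range(1, k):
            a[j] = a_prev[j] + ref * a_prev[k - j]
        a[k] = ref
        err *= (1.0 - ref ** 2)
    freqs = np.linspace(0, 0.5, n_freq)           # cycles per sample
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(order + 1)))
    psd = err / np.abs(z @ a) ** 2
    return freqs, psd

# Synthetic incidence series with an annual (52-week) cycle plus noise
weeks = np.arange(520)
series = 100 + 30 * np.sin(2 * np.pi * weeks / 52) + 10 * np.random.default_rng(10).normal(size=weeks.size)
f, p = mem_psd(series, order=20)
print("dominant period (weeks):", round(1.0 / f[np.argmax(p[1:]) + 1], 1))
```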
A multi-domain trust management model for supporting RFID applications of IoT.
Wu, Xu; Li, Feng
2017-01-01
The use of RFID technology in complex and distributed environments often leads to a multi-domain RFID system, in which trust establishment among entities from heterogeneous domains without past interaction or a prior agreed policy is a challenge. The current trust management mechanisms in the literature do not meet the specific requirements of multi-domain RFID systems. Therefore, this paper analyzes the special challenges of trust management in multi-domain RFID systems and identifies the implications and requirements these challenges place on candidate solutions. A multi-domain trust management model is proposed, which provides a hierarchical trust management framework that includes a diversity of trust evaluation and establishment approaches. Simulation results and analysis show that the proposed method deals effectively with trust relationships and offers better security and a higher accuracy rate.
Comparative Cognitive Task Analysis
2007-01-01
is to perform a task analysis to determine how people operate in a specific domain on a specific task. Cognitive Task Analysis (CTA) is a set of...accomplish a task. In this chapter, we build on CTA methods by suggesting that comparative cognitive task analysis (C2TA) can help solve the aforementioned
Measurement methods and algorithms for comparison of local and remote clocks
NASA Technical Reports Server (NTRS)
Levine, Judah
1993-01-01
Several methods for characterizing the performance of clocks, with special emphasis on using calibration information that is acquired via an unreliable or noisy channel, are discussed. Time-domain variance estimators and frequency-domain techniques such as cross-spectral analysis are discussed. Each of these methods has advantages and limitations that will be illustrated using data obtained via GPS, ACTS, and other methods. No one technique will be optimum for all of these analyses, and some of these problems cannot be completely characterized by any of the techniques discussed. The inverse problem of communicating frequency and time corrections to a real-time steered clock is also discussed. Methods were developed to mitigate the disastrous problems of data corruption and loss of computer control.
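The time-domain variance estimators referred to in clock characterization are typically Allan-type statistics. The sketch below computes the overlapping Allan deviation from time (phase) difference data; the sampling interval and averaging factor are illustrative assumptions, not values from the paper.
    import numpy as np

    def overlapping_adev(x, tau0, m):
        """Overlapping Allan deviation from phase data x (seconds), sampled every
        tau0 seconds, evaluated at averaging time tau = m * tau0."""
        x = np.asarray(x, dtype=float)
        d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]      # second differences at lag m
        tau = m * tau0
        avar = np.sum(d2 ** 2) / (2.0 * (len(x) - 2 * m) * tau ** 2)
        return np.sqrt(avar)

    # e.g. sigma_y = overlapping_adev(phase_seconds, tau0=1.0, m=10)  # assumed values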
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
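Event-related (de)synchronization is usually quantified as the relative change of band power in an activation window with respect to a reference (baseline) window. The sketch below, with an assumed sampling rate and frequency band, shows that computation for a single channel; it is a generic illustration, not the authors' SEEG pipeline.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def erd_ers(x, fs, band, ref, act):
        """Relative band-power change (%) between a reference window `ref` and an
        activation window `act`, each given as (start_s, stop_s)."""
        b, a = butter(4, band, btype="bandpass", fs=fs)
        power = filtfilt(b, a, x) ** 2
        r = power[int(ref[0] * fs):int(ref[1] * fs)].mean()
        s = power[int(act[0] * fs):int(act[1] * fs)].mean()
        return 100.0 * (s - r) / r   # negative -> ERD, positive -> ERS

    # assumed example: erd_ers(channel, fs=1000, band=(8, 12), ref=(0.0, 0.5), act=(0.6, 1.1))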
A square wave is the most efficient and reliable waveform for resonant actuation of micro switches
NASA Astrophysics Data System (ADS)
Ben Sassi, S.; Khater, M. E.; Najar, F.; Abdel-Rahman, E. M.
2018-05-01
This paper investigates efficient actuation methods of shunt MEMS switches and other parallel-plate actuators. We start by formulating a multi-physics model of the micro switch, coupling the nonlinear Euler-Bernoulli beam theory with the nonlinear Reynolds equation to describe the structural and fluidic domains, respectively. The model takes into account fringing field effects as well as mid-plane stretching and squeeze film damping nonlinearities. Static analysis is undertaken using the differential quadrature method (DQM) to obtain the pull-in voltage, which is verified by means of the finite element model and validated experimentally. We develop a reduced order model employing the Galerkin method for the structural domain and DQM for the fluidic domain. The proposed waveforms are intended to be more suitable for integrated circuit standards. The dynamic response of the micro switch to harmonic, square and triangular waveforms is evaluated and compared experimentally and analytically. Low voltage actuation is obtained using dynamic pull-in with the proposed waveforms. In addition, global stability analysis carried out for the three signals shows advantages of employing the square signal as the actuation method in enhancing the performance of the micro switch in terms of actuation voltage, switching time, and sensitivity to initial conditions.
A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain
2015-05-18
approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our... The project has two parts to it: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a
High-speed extended-term time-domain simulation for online cascading analysis of power system
NASA Astrophysics Data System (ADS)
Fu, Chuan
A high-speed extended-term (HSET) time domain simulator (TDS), intended to become part of an energy management system (EMS), has been newly developed for use in online extended-term dynamic cascading analysis of power systems. HSET-TDS includes the following attributes for providing situational awareness of high-consequence events: (i) online analysis, including n-1 and n-k events, (ii) the ability to simulate both fast and slow dynamics 1-3 hours in advance, (iii) rigorous protection-system modeling, (iv) intelligence for corrective action identification, storage, and fast retrieval, and (v) high-speed execution. Very fast online computational capability is the most desired attribute of this simulator. Based on the process of solving the differential-algebraic equations describing power system dynamics, HSET-TDS seeks to develop computational efficiency at each of the following hierarchical levels: (i) hardware, (ii) strategies, (iii) integration methods, (iv) nonlinear solvers, and (v) linear solver libraries. This thesis first describes the Hammer-Hollingsworth 4 (HH4) implicit integration method. Like the trapezoidal rule, HH4 is symmetrically A-stable, but it possesses greater high-order precision (h⁴) than the trapezoidal rule. Such precision enables larger integration steps and therefore improves simulation efficiency for variable-step-size implementations. This thesis provides the underlying theory on which we advocate the use of HH4 over other numerical integration methods for power system time-domain simulation. Second, motivated by the need to perform high-speed extended-term time domain simulation for online purposes, this thesis presents principles for designing numerical solvers of the differential-algebraic systems associated with power system time-domain simulation, including DAE construction strategies (Direct Solution Method), integration methods (HH4), nonlinear solvers (Very Dishonest Newton), and linear solvers (SuperLU). We have implemented a design appropriate for HSET-TDS, and we compare it to various solvers, including the commercial-grade PSSE program, with respect to computational efficiency and accuracy, using as examples the New England 39-bus system, the expanded 8775-bus system, and the PJM 13029-bus system. Third, we have explored a stiffness-decoupling method, intended to be part of a parallel design of time domain simulation software for supercomputers. The stiffness-decoupling method combines the advantages of implicit methods (A-stability) and explicit methods (less computation). With the new stiffness detection method proposed herein, the stiffness can be captured. The expanded 975-bus system is used to test simulation efficiency. Finally, several parallel strategies for supercomputer deployment to simulate power system dynamics are proposed and compared. Design A partitions the task by scale with the stiffness-decoupling method, waveform relaxation, and a parallel linear solver. Design B partitions the task along the time axis using a highly precise integration method, the Kuntzmann-Butcher method of order 8 (KB8). The strategy of partitioning events partitions the whole simulation along the time axis through a simulated sequence of cascading events. Of all the strategies proposed, the strategy of partitioning cascading events is recommended, since the sub-tasks for each processor are totally independent and therefore minimal communication time is needed.
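A "very dishonest" Newton scheme reuses one LU factorization of the iteration Jacobian across Newton iterations (and often across several time steps), refactorizing only when convergence degrades. The sketch below shows that idea for a single trapezoidal-rule step of x' = f(x); the tolerance and refresh rule are illustrative assumptions and this is not the HSET-TDS implementation.
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    def trapezoidal_step_vdn(f, jac, x0, h, lu=None, tol=1e-8, max_iter=20):
        """One trapezoidal step of x' = f(x) with a 'dishonest' Newton iteration:
        LU factors of (I - h/2 * J) are reused and only rebuilt if convergence stalls."""
        fx0 = f(x0)
        for attempt in range(2):                         # second attempt forces a fresh Jacobian
            if lu is None or attempt == 1:
                lu = lu_factor(np.eye(len(x0)) - 0.5 * h * jac(x0))
            x = x0.copy()
            for _ in range(max_iter):
                g = x - x0 - 0.5 * h * (fx0 + f(x))      # residual of the implicit equation
                if np.linalg.norm(g) < tol:
                    return x, lu                          # pass lu on to the next time step
                x = x - lu_solve(lu, g)
        raise RuntimeError("Newton iteration did not converge")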
Determining XV-15 aeroelastic modes from flight data with frequency-domain methods
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Tischler, Mark B.
1993-01-01
The XV-15 tilt-rotor wing has six major aeroelastic modes that are close in frequency. To precisely excite individual modes during flight test, dual flaperon exciters with automatic frequency-sweep controls were installed. The resulting structural data were analyzed in the frequency domain (Fourier transformed). All spectral data were computed using chirp z-transforms. Modal frequencies and damping were determined by fitting curves to frequency-response magnitude and phase data. The results given in this report are for the XV-15 with its original metal rotor blades. Also, frequency and damping values are compared with theoretical predictions made using two different programs, CAMRAD and ASAP. The frequency-domain data-analysis method proved to be very reliable and adequate for tracking aeroelastic modes during flight-envelope expansion. This approach required less flight-test time and yielded mode estimations that were more repeatable, compared with the exponential-decay method previously used.
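Fitting a single-degree-of-freedom frequency response model near an isolated peak is a common way to extract a modal frequency and damping ratio from magnitude and phase data of this kind. The sketch below fits H(w) = A / (wn^2 - w^2 + 2j*zeta*wn*w) to measured complex FRF samples; it is a generic illustration, not the report's analysis code, and the starting guesses are assumptions.
    import numpy as np
    from scipy.optimize import least_squares

    def fit_sdof(omega, H):
        """Estimate natural frequency wn and damping ratio zeta from complex FRF
        samples H(omega) near an isolated mode."""
        def residual(p):
            wn, zeta, re, im = p
            model = (re + 1j * im) / (wn**2 - omega**2 + 2j * zeta * wn * omega)
            return np.concatenate([(model - H).real, (model - H).imag])

        w0 = omega[np.argmax(np.abs(H))]                     # peak frequency as starting guess
        p0 = [w0, 0.02, np.abs(H).max() * 0.04 * w0**2, 0.0] # assumed initial damping of 2%
        wn, zeta, *_ = least_squares(residual, p0).x
        return wn, zeta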
Cui, Jiwen; Zhao, Shiyuan; Yang, Di; Ding, Zhenyang
2018-02-20
We use a spectrum interpolation technique to improve the distributed strain measurement accuracy in a Rayleigh-scatter-based optical frequency domain reflectometry sensing system. We demonstrate that strain accuracy is not limited by the "uncertainty principle" that exists in the time-frequency analysis. Different interpolation methods are investigated and used to improve the accuracy of peak position of the cross-correlation and, therefore, improve the accuracy of the strain. Interpolation implemented by padding zeros on one side of the windowed data in the spatial domain, before the inverse fast Fourier transform, is found to have the best accuracy. Using this method, the strain accuracy and resolution are both improved without decreasing the spatial resolution. The strain of 3 μϵ within the spatial resolution of 1 cm at the position of 21.4 m is distinguished, and the measurement uncertainty is 3.3 μϵ.
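The core trick, locating a cross-correlation peak to sub-sample precision by zero-padding in the transform domain, can be illustrated generically as below; scipy.signal.resample performs FFT-based (spectral zero-padding) interpolation. This is a simplified analogue of the spectral-shift estimation described, not the authors' OFDR processing chain, and the upsampling factor is an assumption.
    import numpy as np
    from scipy.signal import correlate, resample

    def subsample_shift(a, b, factor=16):
        """Estimate the shift of b relative to a, in fractional original samples,
        by FFT-based upsampling before cross-correlation."""
        a_up = resample(a, factor * len(a))
        b_up = resample(b, factor * len(b))
        cc = correlate(a_up, b_up, mode="full")
        lag_up = np.argmax(cc) - (len(b_up) - 1)
        return lag_up / factor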
A Comparative Analysis of Pitch Detection Methods Under the Influence of Different Noise Conditions.
Sukhostat, Lyudmila; Imamverdiyev, Yadigar
2015-07-01
Pitch is one of the most important components in various speech processing systems. The aim of this study was to evaluate different pitch detection methods under various noise conditions. Prospective study. For the evaluation of pitch detection algorithms, time-domain, frequency-domain, and hybrid methods were considered using the Keele and CSTR speech databases. Each of them has its own advantages and disadvantages. Experiments have shown that the BaNa method achieves the highest pitch detection accuracy. The development of pitch detection methods that are robust to additive noise at different signal-to-noise ratios is an important field of research with many opportunities for enhancing modern methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
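As a minimal example of the time-domain family of approaches compared in such studies, the sketch below estimates pitch from the autocorrelation peak of a short voiced frame; the sampling rate and search range are illustrative assumptions, and no claim is made about the specific algorithms evaluated in the paper.
    import numpy as np

    def autocorr_pitch(frame, fs, fmin=60.0, fmax=400.0):
        """Estimate pitch (Hz) of a voiced frame from its autocorrelation peak,
        searched between the lags corresponding to fmax and fmin."""
        frame = frame - frame.mean()
        ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        lo, hi = int(fs / fmax), int(fs / fmin)
        lag = lo + np.argmax(ac[lo:hi])
        return fs / lag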
Electromagnetic Energy Localization and Characterization of Composites
2013-01-01
polyhedrons), and [39] (spheres and a complex yet symmetric structure). With time-domain EM analysis, regular shapes, such as cubes, spheres, and regular...spheres), [40] (spheres, crosses, cylinders, and polyhedrons), and [41] (spheres and cylinders); and 3-D random mixtures using a frequency-domain finite...element method [42] (polyhedrons), and [43], [44] (spheres). Such steady-state analyses are limited as they, for example, do not capture temporal
Pazesh, Samaneh; Lazorova, Lucia; Berggren, Jonas; Alderborn, Göran; Gråsjö, Johan
2016-09-10
The main purpose of the study was to evaluate various pre-processing and quantification approaches for Raman spectra to quantify low levels of amorphous content in milled lactose powder. To improve the quantification analysis, several spectral pre-processing methods were used to correct for background effects. The effects of spectral noise on the variation of the determined amorphous content were also investigated theoretically by propagation-of-error analysis and compared to the experimentally obtained values. Additionally, the applicability of a calibration method based on physical mixtures of crystalline and amorphous domains for estimating amorphous content in milled lactose powder was discussed. Two straight-baseline pre-processing methods gave the best and almost equal performance. Among the succeeding quantification methods, PCA performed best, although classical least squares analysis (CLS) gave comparable results, while peak parameter analysis proved inferior. The standard deviations of the experimentally determined percentage amorphous content were 0.94% and 0.25% for pure crystalline and pure amorphous samples, respectively, which was very close to the standard deviations obtained from propagated spectral noise. The reasonable conformity between the milled-sample spectra and synthesized spectra indicated that physical mixtures of crystalline and amorphous domains are representative for estimating apparent amorphous content in milled lactose. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
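Classical least squares quantification expresses a measured spectrum as a non-negative combination of pure-component reference spectra, with the amorphous fraction following from the fitted coefficients. The sketch below illustrates that step with SciPy's non-negative least squares; the reference spectra and any baseline handling are assumed inputs, and this is not the study's own code.
    import numpy as np
    from scipy.optimize import nnls

    def cls_amorphous_fraction(spectrum, ref_crystalline, ref_amorphous):
        """Fit spectrum ~ c_cryst * ref_crystalline + c_amorph * ref_amorphous (c >= 0)
        and return the fitted amorphous coefficient fraction (uncalibrated)."""
        A = np.column_stack([ref_crystalline, ref_amorphous])
        coeffs, _ = nnls(A, spectrum)
        return coeffs[1] / coeffs.sum()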
Gonzalo, Jed D; Dekhtyar, Michael; Starr, Stephanie R; Borkan, Jeffrey; Brunett, Patrick; Fancher, Tonya; Green, Jennifer; Grethlein, Sara Jo; Lai, Cindy; Lawson, Luan; Monrad, Seetha; O'Sullivan, Patricia; Schwartz, Mark D; Skochelak, Susan
2017-01-01
The authors performed a review of 30 Accelerating Change in Medical Education full grant submissions and an analysis of the health systems science (HSS)-related curricula at the 11 grant recipient schools to develop a potential comprehensive HSS curricular framework with domains and subcategories. In phase 1, to identify domains, grant submissions were analyzed and coded using constant comparative analysis. In phase 2, a detailed review of all existing and planned syllabi and curriculum documents at the grantee schools was performed, and content in the core curricular domains was coded into subcategories. The lead investigators reviewed and discussed drafts of the categorization scheme, collapsed and combined domains and subcategories, and resolved disagreements via group discussion. Analysis yielded three types of domains: core, cross-cutting, and linking. Core domains included health care structures and processes; health care policy, economics, and management; clinical informatics and health information technology; population and public health; value-based care; and health system improvement. Cross-cutting domains included leadership and change agency; teamwork and interprofessional education; evidence-based medicine and practice; professionalism and ethics; and scholarship. One linking domain was identified: systems thinking. This broad framework aims to build on the traditional definition of systems-based practice and highlight the need for medical and other health professions schools to better align education programs with the anticipated needs of the systems in which students will practice. HSS will require a critical investigation into existing curricula to determine the most efficient methods for integration with the basic and clinical sciences.
A developmental screening tool for toddlers with multiple domains based on Rasch analysis.
Hwang, Ai-Wen; Chou, Yeh-Tai; Hsieh, Ching-Lin; Hsieh, Wu-Shiun; Liao, Hua-Fang; Wong, Alice May-Kuen
2015-01-01
Using multidomain developmental screening tools is a feasible method for pediatric health care professionals to identify children at risk of developmental problems in multiple domains simultaneously. The purpose of this study was to develop a Rasch-based tool for Multidimensional Screening in Child Development (MuSiC) for children aged 0-3 years. The MuSiC was developed by constructing an item bank based on three commonly used screening tools and validating it against developmental status (at risk for delay or not) in five developmental domains. Parents of a convenience sample of 632 children (aged 3-35.5 months) with and without developmental delays responded to items from the three screening tools funded by health authorities in Taiwan. The item bank was determined by item fit in Rasch analysis for each of the five developmental domains (cognitive skills, language skills, gross motor skills, fine motor skills, and socioadaptive skills). Children's performance scores in logits derived from Rasch analysis were validated against developmental status for each domain using the area under receiver operating characteristic curves. MuSiC, a 75-item developmental screening tool covering five domains, was derived. The diagnostic validity of all five domains was acceptable for all stages of development, except for the infant stage (≤11 months and 15 days). MuSiC can be applied at well-child care visits as a universal screening tool for children aged 1-3 years across multiple domains. Items with sound validity for infants need to be further developed. Copyright © 2014. Published by Elsevier B.V.
Aeroelastic Modeling of X-56A Stiff-Wing Configuration Flight Test Data
NASA Technical Reports Server (NTRS)
Grauer, Jared A.; Boucher, Matthew J.
2017-01-01
Aeroelastic stability and control derivatives for the X-56A Multi-Utility Technology Testbed (MUTT), in the stiff-wing configuration, were estimated from flight test data using the output-error method. Practical aspects of the analysis are discussed. The orthogonal phase-optimized multisine inputs provided excellent data information for aeroelastic modeling. Consistent parameter estimates were determined using output error in both the frequency and time domains. The frequency domain analysis converged faster and was less sensitive to starting values for the model parameters, which was useful for determining the aeroelastic model structure and obtaining starting values for the time domain analysis. Including a modal description of the structure from a finite element model reduced the complexity of the estimation problem and improved the modeling results. Effects of reducing the model order on the short period stability and control derivatives were investigated.
Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System.
Li, Hongqiang; Yuan, Danyang; Wang, Youxi; Cui, Dianyin; Cao, Lu
2016-10-20
Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG) recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis in nonlinear feature extraction and uses discrete wavelet transform to extract frequency domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, is constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias.
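The frequency-domain part of such a pipeline, discrete wavelet transform features feeding a support vector machine, can be sketched with PyWavelets and scikit-learn as below. The wavelet, decomposition level and feature summary are assumptions for illustration; the kernel-independent component analysis step and genetic-algorithm tuning described by the authors are omitted.
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def dwt_features(beat, wavelet="db4", level=4):
        """Summarize each DWT sub-band of one ECG beat by its energy and variance."""
        coeffs = pywt.wavedec(beat, wavelet, level=level)
        return np.array([f(c) for c in coeffs for f in (lambda v: np.sum(v**2), np.var)])

    # beats: array of equal-length ECG segments; labels: arrhythmia classes (assumed inputs)
    # X = np.array([dwt_features(b) for b in beats])
    # clf = SVC(kernel="rbf").fit(X, labels)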
Action recognition in depth video from RGB perspective: A knowledge transfer manner
NASA Astrophysics Data System (ADS)
Chen, Jun; Xiao, Yang; Cao, Zhiguo; Fang, Zhiwen
2018-03-01
Using different video modalities for human action recognition has become a highly promising trend in video analysis. In this paper, we propose a method for human action recognition that transfers knowledge from RGB video to depth video using domain adaptation, where features learned from RGB videos are used for action recognition in depth videos. More specifically, we take three steps to solve this problem. First, unlike an image, a video is more complex because it carries both spatial and temporal information; to better encode this information, the dynamic image method is used to represent each RGB or depth video as a single image, so that most image feature extraction methods can be applied to video. Second, because a video can be represented as an image, a standard CNN model can be used for training and testing on videos; in addition, the CNN model can also be used for feature extraction owing to its powerful feature representation ability. Third, because RGB videos and depth videos belong to two different domains, domain adaptation is applied to make the two feature domains more similar, so that the features learned from the RGB video model can be used directly for depth video classification. We evaluate the proposed method on a complex RGB-D action dataset (NTU RGB-D), and our method achieves an accuracy improvement of more than 2% when using domain adaptation from RGB to depth action recognition.
Statistical plant set estimation using Schroeder-phased multisinusoidal input design
NASA Technical Reports Server (NTRS)
Bayard, D. S.
1992-01-01
A frequency domain method is developed for plant set estimation. The estimation of a plant 'set' rather than a point estimate is required to support many methods of modern robust control design. The approach here is based on using a Schroeder-phased multisinusoid input design which has the special property of placing input energy only at the discrete frequency points used in the computation. A detailed analysis of the statistical properties of the frequency domain estimator is given, leading to exact expressions for the probability distribution of the estimation error, and many important properties. It is shown that, for any nominal parametric plant estimate, one can use these results to construct an overbound on the additive uncertainty to any prescribed statistical confidence. The 'soft' bound thus obtained can be used to replace 'hard' bounds presently used in many robust control analysis and synthesis methods.
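A Schroeder-phased multisine concentrates input energy at chosen harmonics while keeping the crest factor low by assigning the phases phi_k = -pi*k*(k-1)/K. The sketch below generates such a signal; the number of components, base frequency and sampling rate are illustrative assumptions, not the paper's experiment design.
    import numpy as np

    def schroeder_multisine(n_samples, fs, f0, K, amplitude=1.0):
        """Sum of K harmonics of f0 with Schroeder phases, sampled at fs."""
        t = np.arange(n_samples) / fs
        x = np.zeros(n_samples)
        for k in range(1, K + 1):
            phase = -np.pi * k * (k - 1) / K
            x += amplitude * np.cos(2 * np.pi * k * f0 * t + phase)
        return x

    # assumed example: u = schroeder_multisine(n_samples=4096, fs=100.0, f0=0.1, K=20)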
Inclusion of Structural Flexibility in Design Load Analysis for Wave Energy Converters: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Yu, Yi-Hsiang; van Rij, Jennifer A
2017-08-14
Hydroelastic interactions, caused by ocean wave loading on wave energy devices with deformable structures, are studied in the time domain. A midfidelity, hybrid modeling approach of rigid-body and flexible-body dynamics is developed and implemented in an open-source simulation tool for wave energy converters (WEC-Sim) to simulate the dynamic responses of wave energy converter component structural deformations under wave loading. A generalized coordinate system, including degrees of freedom associated with rigid bodies, structural modes, and constraints connecting multiple bodies, is utilized. A simplified method of calculating stress loads and sectional bending moments is implemented, with the purpose of sizing and designing wave energy converters. Results calculated using the method presented are verified with those of high-fidelity fluid-structure interaction simulations, as well as low-fidelity, frequency-domain, boundary element method analysis.
Quantifying NMR relaxation correlation and exchange in articular cartilage with time domain analysis
NASA Astrophysics Data System (ADS)
Mailhiot, Sarah E.; Zong, Fangrong; Maneval, James E.; June, Ronald K.; Galvosas, Petrik; Seymour, Joseph D.
2018-02-01
Measured nuclear magnetic resonance (NMR) transverse relaxation data in articular cartilage has been shown to be multi-exponential and correlated to the health of the tissue. The observed relaxation rates are dependent on experimental parameters such as solvent, data acquisition methods, data analysis methods, and alignment to the magnetic field. In this study, we show that diffusive exchange occurs in porcine articular cartilage and impacts the observed relaxation rates in T1-T2 correlation experiments. By using time domain analysis of T2-T2 exchange spectroscopy, the diffusive exchange time can be quantified by measurements that use a single mixing time. Measured characteristic times for exchange are commensurate with T1 in this material and so impacts the observed T1 behavior. The approach used here allows for reliable quantification of NMR relaxation behavior in cartilage in the presence of diffusive fluid exchange between two environments.
Functional feature embedded space mapping of fMRI data.
Hu, Jin; Tian, Jie; Yang, Lei
2006-01-01
We have proposed a new method for fMRI data analysis which is called Functional Feature Embedded Space Mapping (FFESM). Our work mainly focuses on the experimental design with periodic stimuli which can be described by a number of Fourier coefficients in the frequency domain. A nonlinear dimension reduction technique Isomap is applied to the high dimensional features obtained from frequency domain of the fMRI data for the first time. Finally, the presence of activated time series is identified by the clustering method in which the information theoretic criterion of minimum description length (MDL) is used to estimate the number of clusters. The feasibility of our algorithm is demonstrated by real human experiments. Although we focus on analyzing periodic fMRI data, the approach can be extended to analyze non-periodic fMRI data (event-related fMRI) by replacing the Fourier analysis with a wavelet analysis.
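The pipeline described, frequency-domain features reduced with Isomap and then clustered, can be sketched with scikit-learn as below. The number of retained Fourier coefficients, embedding dimension and cluster count are assumptions for illustration; the minimum description length criterion used by the authors to choose the number of clusters is not reproduced.
    import numpy as np
    from sklearn.manifold import Isomap
    from sklearn.cluster import KMeans

    def ffesm_like_clustering(ts, n_coeffs=10, n_components=3, n_clusters=2):
        """Cluster voxel time series from a periodic design using low-order Fourier
        magnitude features embedded with Isomap; ts has shape (n_voxels, n_time)."""
        F = np.abs(np.fft.rfft(ts, axis=1))[:, 1:n_coeffs + 1]   # drop the DC term
        emb = Isomap(n_components=n_components).fit_transform(F)
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(emb)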
Adolescent Asthma Self-Management: A Concept Analysis and Operational Definition.
Mammen, Jennifer; Rhee, Hyekyun
2012-12-01
BACKGROUND: Adolescents with asthma have a higher risk of morbidity and mortality than other age groups. Asthma self-management has been shown to improve outcomes; however, the concept of asthma self-management is not explicitly defined. METHODS: We use the Norris method of concept clarification to delineate what constitutes the concept of asthma self-management in adolescents. Five databases were searched to identify components of the concept of adolescent asthma self-management, and lists of relevant subconcepts were compiled and categorized. RESULTS: Analysis revealed 4 specific domains of self-management behaviors: (1) symptom prevention; (2) symptom monitoring; (3) acute symptom management; and (4) communication with important others. These domains of self-management were mediated by intrapersonal/cognitive and interpersonal/contextual factors. CONCLUSIONS: Based on the analysis, we offer a research-based operational definition for adolescent asthma self-management and a preliminary model that can serve as a conceptual base for further research.
Lin, Chung-Ying; Hwang, Jing-Shiang; Wang, Wen-Chung; Lai, Wu-Wei; Su, Wu-Chou; Wu, Tzu-Yi; Yao, Grace; Wang, Jung-Der
2018-04-13
Quality of life (QoL) is important for clinicians to evaluate how cancer survivors judge their sense of well-being, and the WHOQOL-BREF may be a good tool for clinical use. However, at least three issues remain unresolved: (1) evidence on the psychometric properties of the WHOQOL-BREF for cancer patients is insufficient; (2) the scoring method used for the WHOQOL-BREF needs to be clarified; and (3) it is unclear whether different types of cancer patients interpret the WHOQOL-BREF similarly. We recruited 1000 outpatients with head/neck cancer, 1000 with colorectal cancer, 965 with liver cancer, 1438 with lung cancer and 1299 with gynecologic cancers in a medical center. Data analyses included Rasch models, confirmatory factor analysis (CFA), and Pearson correlations. The mean WHOQOL-BREF domain scores were between 13.34 and 14.77 among all participants. CFA supported construct validity; Rasch models revealed that almost all items were embedded in their expected domains and were interpreted similarly across the five types of cancer patients; all correlation coefficients between Rasch scores and original domain scores were above 0.9. The linear relationship between Rasch scores and domain scores suggested that the current calculations for domain scores are applicable and without serious bias. Clinical practitioners may regularly collect and record the WHOQOL-BREF domain scores in electronic health records. Copyright © 2018. Published by Elsevier B.V.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-06-16
Adding guidance to traditional artillery projectiles can significantly improve attack accuracy and overall combat efficiency, making such projectiles more adaptable to the information warfare of the future. Obviously, the accurate measurement of artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of measuring spin rate with an MR sensor, the mathematical relationship between the frequency of the MR sensor output and the projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of the MR sensor output, whose frequency varies with time, this paper proposes a Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on a rolling Blackman window (BCZT), which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of a 155 mm artillery projectile. After extracting the spin rate, the impact that the launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment.
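A rolling-window chirp z-transform can be approximated with SciPy's CZT-based zoom_fft (available in recent SciPy releases): a Blackman window slides along the signal and, at each position, the spectrum is evaluated only over the band of interest, whose peak gives an instantaneous frequency estimate. The window length, hop size and band below are assumptions for illustration; this is not the authors' BCZT code.
    import numpy as np
    from scipy.signal import zoom_fft

    def sliding_peak_frequency(x, fs, f_band, win=1024, hop=256, n_bins=2048):
        """Track the dominant frequency of x inside f_band=(f1, f2) over time using
        a sliding Blackman window and a CZT-based zoom FFT."""
        w = np.blackman(win)
        freqs = np.linspace(f_band[0], f_band[1], n_bins, endpoint=False)
        peaks = []
        for start in range(0, len(x) - win, hop):
            seg = x[start:start + win] * w
            spec = np.abs(zoom_fft(seg, f_band, m=n_bins, fs=fs))
            peaks.append(freqs[np.argmax(spec)])
        return np.array(peaks)   # one frequency estimate per window position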
Exponential Family Functional data analysis via a low-rank model.
Li, Gen; Huang, Jianhua Z; Shen, Haipeng
2018-05-08
In many applications, non-Gaussian data such as binary or count are observed over a continuous domain and there exists a smooth underlying structure for describing such data. We develop a new functional data method to deal with this kind of data when the data are regularly spaced on the continuous domain. Our method, referred to as Exponential Family Functional Principal Component Analysis (EFPCA), assumes the data are generated from an exponential family distribution, and the matrix of the canonical parameters has a low-rank structure. The proposed method flexibly accommodates not only the standard one-way functional data, but also two-way (or bivariate) functional data. In addition, we introduce a new cross validation method for estimating the latent rank of a generalized data matrix. We demonstrate the efficacy of the proposed methods using a comprehensive simulation study. The proposed method is also applied to a real application of the UK mortality study, where data are binomially distributed and two-way functional across age groups and calendar years. The results offer novel insights into the underlying mortality pattern. © 2018, The International Biometric Society.
Analysis for Non-Traditional Security Challenges: Methods and Tools
2006-11-20
PMESII Modeling Challenges ... modeling or where data is not available to support the model, would aid decision ... Domain is large, nebulous, complex, and...traditional challenges. This includes enlisting the aid of the inter-agency and alliance/coalition communities. Second, we need to realize this...
Computer analysis of multicircuit shells of revolution by the field method
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1975-01-01
The field method, presented previously for the solution of even-order linear boundary value problems defined on one-dimensional open branch domains, is extended to boundary value problems defined on one-dimensional domains containing circuits. This method converts the boundary value problem into two successive numerically stable initial value problems, which may be solved by standard forward integration techniques. In addition, a new method for the treatment of singular boundary conditions is presented. This method, which amounts to a partial interchange of the roles of force and displacement variables, is problem independent with respect to both accuracy and speed of execution. This method was implemented in a computer program to calculate the static response of ring stiffened orthotropic multicircuit shells of revolution to asymmetric loads. Solutions are presented for sample problems which illustrate the accuracy and efficiency of the method.
Fast and Accurate Fitting and Filtering of Noisy Exponentials in Legendre Space
Bao, Guobin; Schild, Detlev
2014-01-01
The parameters of experimentally obtained exponentials are usually found by least-squares fitting methods. Essentially, this is done by minimizing the mean squares sum of the differences between the data, most often a function of time, and a parameter-defined model function. Here we delineate a novel method where the noisy data are represented and analyzed in the space of Legendre polynomials. This is advantageous in several respects. First, parameter retrieval in the Legendre domain is typically two orders of magnitude faster than direct fitting in the time domain. Second, data fitting in a low-dimensional Legendre space yields estimates for amplitudes and time constants which are, on the average, more precise compared to least-squares-fitting with equal weights in the time domain. Third, the Legendre analysis of two exponentials gives satisfactory estimates in parameter ranges where least-squares-fitting in the time domain typically fails. Finally, filtering exponentials in the domain of Legendre polynomials leads to marked noise removal without the phase shift characteristic for conventional lowpass filters. PMID:24603904
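The filtering aspect, representing a noisy record in a low-dimensional Legendre basis and reconstructing it, can be illustrated with NumPy's Legendre module as below. The polynomial order is an assumed tuning parameter; the mapping from Legendre coefficients to exponential amplitudes and time constants described by the authors is not reproduced here.
    import numpy as np
    from numpy.polynomial import legendre as L

    def legendre_filter(t, y, order=12):
        """Project y(t) onto Legendre polynomials up to `order` (after mapping t to
        [-1, 1]) and return the smoothed reconstruction."""
        u = 2.0 * (t - t.min()) / (t.max() - t.min()) - 1.0
        coeffs = L.legfit(u, y, deg=order)
        return L.legval(u, coeffs)

    # assumed usage: y_smooth = legendre_filter(time, noisy_decay, order=12)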
Mega-Analysis of School Psychology Blueprint for Training and Practice Domains
ERIC Educational Resources Information Center
Burns, Matthew K.; Kanive, Rebecca; Zaslofsky, Anne F.; Parker, David C.
2013-01-01
Meta-analytic research is an effective method for synthesizing existing research and for informing practice and policy. Hattie (2009) suggested that meta-analytic procedures could be applied to existing meta-analyses to create a mega-analysis. The current mega-analysis examined a sample of 47 meta-analyses according to the "School…
Low-Resolution Raman-Spectroscopy Combustion Thermometry
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2008-01-01
A method of optical thermometry, now undergoing development, involves low-resolution measurement of the spectrum of spontaneous Raman scattering (SRS) from N2 and O2 molecules. The method is especially suitable for measuring temperatures in high pressure combustion environments that contain N2, O2, or N2/O2 mixtures (including air). Methods based on SRS (in which scattered light is shifted in wavelength by amounts that depend on vibrational and rotational energy levels of laser-illuminated molecules) have been popular means of probing flames because they are almost the only methods that provide spatially and temporally resolved concentrations and temperatures of multiple molecular species in turbulent combustion. The present SRS-based method differs from prior SRS-based methods that have various drawbacks, a description of which would exceed the scope of this article. Two main differences between this and prior SRS-based methods are that it involves analysis in the frequency (equivalently, wavelength) domain, in contradistinction to analysis in the intensity domain in prior methods; and it involves low-resolution measurement of what amounts to predominantly the rotational Raman spectra of N2 and O2, in contradistinction to higher-resolution measurement of the vibrational Raman spectrum of N2 only in prior methods.
Purification, crystallization and preliminary X-ray analysis of the IgV domain of human nectin-4.
Xu, Xiang; Zhang, Xiaoai; Lu, Guangwen; Cai, Yongping
2012-08-01
Nectin-4 belongs to a family of immunoglobulin-like cell adhesion molecules and is highly expressed in cancer cells. Recently, nectin-4 was found to be a receptor of measles virus and the IgV domain sustains strong binding to measles virus H protein. In this study, the successful expression and purification of human nectin-4 V domain (nectin-4v) is reported. The purified protein was crystallized using the sitting-drop vapour-diffusion method. The crystals diffracted to 1.8 Å resolution and belonged to space group P2(1), with unit-cell parameters a = 33.1, b = 51.7, c = 56.9 Å, β = 94.7°. Preliminary analysis of the diffraction data was also performed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, Tsutomu; Ishikawa, Kazuhiko; Hagihara, Yoshihisa
The expression, purification and preliminary X-ray diffraction studies of a chitin-binding domain of the chitinase from P. furiosus are reported. The crystallization and preliminary X-ray diffraction analysis of the chitin-binding domain of chitinase from a hyperthermophilic archaeon, Pyrococcus furiosus, are reported. The recombinant protein was prepared using an Escherichia coli overexpression system and was crystallized by the hanging-drop vapour-diffusion method. An X-ray diffraction data set was collected to 1.70 Å resolution. The crystal belonged to space group P4(3)2(1)2 or P4(1)2(1)2. The unit-cell parameters were determined to be a = b = 48.8, c = 85.0 Å.
Chai, Xin; Wang, Qisong; Zhao, Yongping; Li, Yongqiang; Liu, Dan; Liu, Xin; Bai, Ou
2017-05-03
Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis for patients. However, the underlying EEG sensor signals are always non-stationary if they are sampled from different experimental sessions or subjects. This results in the deterioration of the classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy of marginal distribution. However, for EEG sensor signals, both marginal and conditional distributions may be mismatched. In addition, the existing domain adaptation strategies always require a high level of additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper in order to integrate both the marginal and conditional distributions within a unified framework (without any labeled samples from target subjects). Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term. This significantly decreases the time complexity of our domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and unlabeled target domain can be reduced, and logistic regression (LR) can be applied to the new source domain in order to train a classifier for use in the target domain, since the aligned source domain follows a distribution which is similar to that of the target domain. We compare our ASFM method with six typical approaches using a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed. The subject-to-subject offline experimental results demonstrate that our component achieves a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, as compared with a state-of-the-art method, the subspace alignment auto-encoder (SAAE), which achieves values of 77.88% and 7.33% on average, respectively. For the online analysis, the average classification accuracy and standard deviation of ASFM in the subject-to-subject evaluation for all the 15 subjects in a dataset was 75.11% and 7.65%, respectively, gaining a significant performance improvement compared to the best baseline LR which achieves 56.38% and 7.48%, respectively. The experimental results confirm the effectiveness of the proposed method relative to state-of-the-art methods. Moreover, computational efficiency of the proposed ASFM method is much better than standard domain adaptation; if the numbers of training samples and test samples are controlled within certain range, it is suitable for real-time classification. It can be concluded that ASFM is a useful and effective tool for decreasing domain discrepancy and reducing performance degradation across subjects and sessions in the field of EEG-based emotion recognition.
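The subspace-matching idea can be illustrated with a standard subspace alignment step (in the spirit of, but not identical to, the authors' ASFM): PCA bases are computed for source and target features, the source basis is aligned to the target basis with a single linear map, and a logistic regression classifier trained on the aligned source data is applied to the target. The subspace dimensionality below is an assumption for illustration.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    def subspace_align_lr(Xs, ys, Xt, dim=30):
        """Align the source PCA subspace to the target PCA subspace, then train a
        logistic regression classifier on the aligned source features."""
        Xs_c = Xs - Xs.mean(axis=0)
        Xt_c = Xt - Xt.mean(axis=0)
        Ps = PCA(n_components=dim).fit(Xs_c).components_.T   # (D, dim) source basis
        Pt = PCA(n_components=dim).fit(Xt_c).components_.T   # (D, dim) target basis
        Xs_aligned = Xs_c @ Ps @ (Ps.T @ Pt)                 # source mapped toward target subspace
        Xt_proj = Xt_c @ Pt
        clf = LogisticRegression(max_iter=1000).fit(Xs_aligned, ys)
        return clf.predict(Xt_proj)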
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected for the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
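A minimal setup for comparing the imputation strategies named above is sketched below with scikit-learn: SimpleImputer gives mean imputation and IterativeImputer is an EM-like, model-based imputer that can also approximate multiple imputation when run with posterior sampling. The input array and parameter values are assumptions; the relative-importance test procedures themselves are not reproduced.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import SimpleImputer, IterativeImputer

    def impute_versions(X):
        """Return mean-imputed and model-based (EM-like) imputed copies of X, a 2-D
        array of HRQOL domain scores with np.nan marking missing entries."""
        mean_imp = SimpleImputer(strategy="mean").fit_transform(X)
        em_like = IterativeImputer(random_state=0, max_iter=20).fit_transform(X)
        return mean_imp, em_like

    # Multiple imputation can be approximated by running IterativeImputer with
    # sample_posterior=True under several random seeds and pooling the results.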
CORAL: aligning conserved core regions across domain families.
Fong, Jessica H; Marchler-Bauer, Aron
2009-08-01
Homologous protein families share highly conserved sequence and structure regions that are frequent targets for comparative analysis of related proteins and families. Many protein families, such as the curated domain families in the Conserved Domain Database (CDD), exhibit similar structural cores. To improve accuracy in aligning such protein families, we propose a profile-profile method CORAL that aligns individual core regions as gap-free units. CORAL computes optimal local alignment of two profiles with heuristics to preserve continuity within core regions. We benchmarked its performance on curated domains in CDD, which have pre-defined core regions, against COMPASS, HHalign and PSI-BLAST, using structure superpositions and comprehensive curator-optimized alignments as standards of truth. CORAL improves alignment accuracy on core regions over general profile methods, returning a balanced score of 0.57 for over 80% of all domain families in CDD, compared with the highest balanced score of 0.45 from other methods. Further, CORAL provides E-values to aid in detecting homologous protein families and, by respecting block boundaries, produces alignments with improved 'readability' that facilitate manual refinement. CORAL will be included in future versions of the NCBI Cn3D/CDTree software, which can be downloaded at http://www.ncbi.nlm.nih.gov/Structure/cdtree/cdtree.shtml. Supplementary data are available at Bioinformatics online.
Dong, Chongmei; Vincent, Kate; Sharp, Peter
2009-12-04
TILLING (Targeting Induced Local Lesions IN Genomes) is a powerful tool for reverse genetics, combining traditional chemical mutagenesis with high-throughput PCR-based mutation detection to discover induced mutations that alter protein function. The most popular mutation detection method for TILLING is a mismatch cleavage assay using the endonuclease CelI. For this method, locus-specific PCR is essential. Most wheat genes are present as three similar sequences with high homology in exons and low homology in introns. Locus-specific primers can usually be designed in introns. However, it is sometimes difficult to design locus-specific PCR primers in a conserved region with high homology among the three homoeologous genes, or in a gene lacking introns, or if information on introns is not available. Here we describe a mutation detection method which combines High Resolution Melting (HRM) analysis of mixed PCR amplicons containing three homoeologous gene fragments and sequence analysis using Mutation Surveyor software, aimed at simultaneous detection of mutations in three homoeologous genes. We demonstrate that High Resolution Melting (HRM) analysis can be used in mutation scans in mixed PCR amplicons containing three homoeologous gene fragments. Combining HRM scanning with sequence analysis using Mutation Surveyor is sensitive enough to detect a single nucleotide mutation in the heterozygous state in a mixed PCR amplicon containing three homoeoloci. The method was tested and validated in an EMS (ethylmethane sulfonate)-treated wheat TILLING population, screening mutations in the carboxyl terminal domain of the Starch Synthase II (SSII) gene. Selected identified mutations of interest can be further analysed by cloning to confirm the mutation and determine the genomic origin of the mutation. Polyploidy is common in plants. Conserved regions of a gene often represent functional domains and have high sequence similarity between homoeologous loci. The method described here is a useful alternative to locus-specific based methods for screening mutations in conserved functional domains of homoeologous genes. This method can also be used for SNP (single nucleotide polymorphism) marker development and eco-TILLING in polyploid species.
Event extraction of bacteria biotopes: a knowledge-intensive NLP-based approach
2012-01-01
Background Bacteria biotopes cover a wide range of diverse habitats including animal and plant hosts, natural, medical and industrial environments. The high volume of publications in the microbiology domain provides a rich source of up-to-date information on bacteria biotopes. This information, as found in scientific articles, is expressed in natural language and is rarely available in a structured format, such as a database. This information is of great importance for fundamental research and microbiology applications (e.g., medicine, agronomy, food, bioenergy). The automatic extraction of this information from texts will provide a great benefit to the field. Methods We present a new method for extracting relationships between bacteria and their locations using the Alvis framework. Recognition of bacteria and their locations was achieved using a pattern-based approach and domain lexical resources. For the detection of environment locations, we propose a new approach that combines lexical information and the syntactic-semantic analysis of corpus terms to overcome the incompleteness of lexical resources. Bacteria location relations extend over sentence borders, and we developed domain-specific rules for dealing with bacteria anaphors. Results We participated in the BioNLP 2011 Bacteria Biotope (BB) task with the Alvis system. Official evaluation results show that it achieves the best performance of participating systems. New developments since then have increased the F-score by 4.1 points. Conclusions We have shown that the combination of semantic analysis and domain-adapted resources is both effective and efficient for event information extraction in the bacteria biotope domain. We plan to adapt the method to deal with a larger set of location types and a large-scale scientific article corpus to enable microbiologists to integrate and use the extracted knowledge in combination with experimental data. PMID:22759462
Bore, Thierry; Wagner, Norman; Delepine Lesoille, Sylvie; Taillade, Frederic; Six, Gonzague; Daout, Franck; Placko, Dominique
2016-01-01
Broadband electromagnetic frequency or time domain sensor techniques present high potential for quantitative water content monitoring in porous media. Prior to in situ application, the relationship between the broadband electromagnetic properties of the porous material (clay rock) and the water content, and its impact on the frequency- or time-domain sensor response, must be established. For this purpose, dielectric properties of intact clay rock samples, experimentally determined in the frequency range from 1 MHz to 10 GHz, were used as input data in 3-D numerical frequency-domain finite element field calculations to model the one-port broadband frequency- or time-domain transfer function of a three-rod sensor embedded in the clay rock. The sensor response in terms of the reflection factor was analyzed in the time domain with classical travel time analysis in combination with an empirical model according to the Topp equation, as well as the theoretical Lichtenecker and Rother model (LRM), to estimate the volumetric water content. The mixture equation, considering the appropriate porosity of the investigated material, provides a practical and efficient approach for water content estimation based on classical travel time analysis with the onset method. The inflection method is not recommended for water content estimation in electrically dispersive and absorptive materials. Moreover, the results clearly indicate that effects due to coupling of the sensor to the material cannot be neglected. Coupling problems caused by an air gap lead to dramatic effects on water content estimation, even for submillimeter gaps. Thus, the quantitative determination of the in situ water content requires careful sensor installation in order to achieve perfect probe-clay-rock coupling. PMID:27096865
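As an illustration of the classical travel-time analysis and Topp-type empirical conversion mentioned above (not the authors' code; the rod length, travel time and all numbers below are hypothetical), a minimal sketch in Python might look like:

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in vacuum (m/s)

def apparent_permittivity(travel_time_s, rod_length_m):
    """Apparent (effective) permittivity from the two-way travel time along the probe rods."""
    return (C0 * travel_time_s / (2.0 * rod_length_m)) ** 2

def topp_water_content(ka):
    """Topp et al. (1980) empirical relation: volumetric water content from apparent permittivity."""
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

# Hypothetical example: 0.15 m rods, 2.0 ns two-way travel time
ka = apparent_permittivity(2.0e-9, 0.15)
print(f"Ka = {ka:.1f}, theta = {topp_water_content(ka):.3f} m^3/m^3")
```

The Topp coefficients used here are the widely quoted values for mineral soils; for clay rock, a material-specific calibration or the porosity-based mixture model discussed in the abstract would replace them.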
Leske, David A; Hatt, Sarah R; Liebermann, Laura; Holmes, Jonathan M
2016-02-01
We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as "success," "partial success," or "failure" based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis ( P < 0.0001 for all comparisons). The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software.
Burnett, T L; Comyn, T P; Merson, E; Bell, A J; Mingard, K; Hegarty, T; Cain, M
2008-05-01
xBiFeO3-(1-x)PbTiO3 single crystals were grown via a flux method for a range of compositions. Presented here is a study of the domain configuration in the 0.5BiFeO3-0.5PbTiO3 composition using electron backscatter diffraction to demonstrate the ability of the technique to map ferroelastic domain structures at the micron and submicron scale. The micron-scale domains exhibit an angle of approximately 85 degrees between each variant, indicative of a ferroelastic domain wall in a tetragonal system with a spontaneous strain, c/a - 1, of 0.10, in excellent agreement with the lattice parameters derived from x-ray diffraction. Contrast seen in forescatter images is attributed to variations in the direction of the electrical polarization vector, providing images of ferroelectric domain patterns.
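The reported ~85 degree angle is consistent with the standard twin geometry of a tetragonal ferroelastic, in which the angle between the c-axes of adjacent variants is approximately 2*arctan(a/c). A quick numerical check, assuming c/a = 1.10 as reported:

```python
import math

c_over_a = 1.10  # spontaneous strain c/a - 1 = 0.10, as reported
angle_deg = 2.0 * math.degrees(math.atan(1.0 / c_over_a))
print(f"angle between tetragonal twin variants ~ {angle_deg:.1f} degrees")  # ~84.5
```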
DeWalt, Emma L.; Begue, Victoria J.; Ronau, Judith A.; Sullivan, Shane Z.; Das, Chittaranjan; Simpson, Garth J.
2013-01-01
Polarization-resolved second-harmonic generation (PR-SHG) microscopy is described and applied to identify the presence of multiple crystallographic domains within protein-crystal conglomerates, which was confirmed by synchrotron X-ray diffraction. Principal component analysis (PCA) of PR-SHG images resulted in principal component 2 (PC2) images with areas of contrasting negative and positive values for conglomerated crystals and PC2 images exhibiting uniformly positive or uniformly negative values for single crystals. Qualitative assessment of PC2 images allowed the identification of domains of different internal ordering within protein-crystal samples as well as differentiation between multi-domain conglomerated crystals and single crystals. PR-SHG assessments of crystalline domains were in good agreement with spatially resolved synchrotron X-ray diffraction measurements. These results have implications for improving the productive throughput of protein structure determination through early identification of multi-domain crystals. PMID:23275165
2010-06-01
Figure 2 identifies five fundamental IW operations as they relate to the maritime environment and domain. Maritime Irregular Warfare Activities...they relate to MIW. ...meter RHIB is designed for the insertion and extraction of SEAL Team personnel. It is a twin-turbocharged diesel engine, waterjet-propelled personnel
Eisosomes Are Dynamic Plasma Membrane Domains Showing Pil1-Lsp1 Heteroligomer Binding Equilibrium
Olivera-Couto, Agustina; Salzman, Valentina; Mailhos, Milagros; Digman, Michelle A.; Gratton, Enrico; Aguilar, Pablo S.
2015-01-01
Eisosomes are plasma membrane domains concentrating lipids, transporters, and signaling molecules. In the budding yeast Saccharomyces cerevisiae, these domains are structured by scaffolds composed mainly of two cytoplasmic proteins, Pil1 and Lsp1. Eisosomes are immobile domains, have relatively uniform size, and encompass thousands of units of the core proteins Pil1 and Lsp1. In this work we used fluorescence fluctuation analytical methods to determine the dynamics of eisosome core proteins at different subcellular locations. Using a combination of scanning techniques with autocorrelation analysis, we show that the Pil1 and Lsp1 cytoplasmic pools freely diffuse, whereas an eisosome-associated fraction of these proteins exhibits slow dynamics that fit a binding-unbinding equilibrium. Number and brightness analysis shows that the eisosome-associated fraction is oligomeric, while the cytoplasmic pools have lower aggregation states. Fluorescence lifetime imaging results indicate that Pil1 and Lsp1 directly interact in the cytoplasm and within the eisosomes. These results support a model where Pil1-Lsp1 heterodimers are the minimal eisosome building blocks. Moreover, individual-eisosome fluorescence fluctuation analysis shows that eisosomes in the same cell are not equal domains: while roughly half of them are mostly static, the other half is actively exchanging core protein subunits. PMID:25863055
Nickson, Adrian A.; Stoll, Kate E.; Clarke, Jane
2008-01-01
Protein-engineering methods (Φ-values) were used to investigate the folding transition state of a lysin motif (LysM) domain from Escherichia coli membrane-bound lytic murein transglycosylase D. This domain consists of just 48 structured residues in a symmetrical βααβ arrangement and is the smallest αβ protein yet investigated using these methods. An extensive mutational analysis revealed a highly robust folding pathway with no detectable transition state plasticity, indicating that LysM is an example of an ideal two-state folder. The pattern of Φ-values denotes a highly polarised transition state, with significant formation of the helices but no structure within the β-sheet. Remarkably, this transition state remains polarised after circularisation of the domain, and exhibits an identical Φ-value pattern; however, the interactions within the transition state are uniformly weaker in the circular variant. This observation is supported by results from an Eyring analysis of the folding rates of the two proteins. We propose that the folding pathway of LysM is dominated by enthalpic rather than entropic considerations, and suggest that the lower entropy cost of formation of the circular transition state is balanced, to some extent, by the lower enthalpy of contacts within this structure. PMID:18538343
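For readers unfamiliar with Φ-value analysis, the quantity compares the energetic effect of a mutation on the folding transition state with its effect on the native state: Φ = ΔΔG(TS-D) / ΔΔG(N-D), with ΔΔG(TS-D) = -RT ln(kf_mut / kf_wt). A minimal sketch with hypothetical rate constants and stability change (not data from this study):

```python
import math

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1
T = 298.0     # temperature, K

def phi_value(kf_wt, kf_mut, ddG_eq_kJ):
    """Phi = ddG(TS-D) / ddG(N-D), with ddG(TS-D) = -RT ln(kf_mut / kf_wt)."""
    ddG_ts = -R * T * math.log(kf_mut / kf_wt)
    return ddG_ts / ddG_eq_kJ

# Hypothetical mutant: folding rate drops from 100 s^-1 to 20 s^-1,
# equilibrium destabilisation ddG(N-D) = 6 kJ/mol
print(f"Phi = {phi_value(100.0, 20.0, 6.0):.2f}")  # ~0.66
```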
Preliminary frequency-domain analysis for the reconstructed spatial resolution of muon tomography
NASA Astrophysics Data System (ADS)
Yu, B.; Zhao, Z.; Wang, X.; Wang, Y.; Wu, D.; Zeng, Z.; Zeng, M.; Yi, H.; Luo, Z.; Yue, X.; Cheng, J.
2014-11-01
Muon tomography is an advanced technology to non-destructively detect high atomic number materials. It exploits the multiple Coulomb scattering information of muons to reconstruct the scattering density image of the traversed object. Because of the statistics of muon scattering, the measurement error of the system and the incompleteness of the data, the reconstruction is always accompanied by a certain level of interference, which will influence the reconstructed spatial resolution. While statistical noise can be reduced by extending the measuring time, system parameters determine the ultimate spatial resolution that a system can reach. In this paper, an effective frequency-domain model is proposed to analyze the reconstructed spatial resolution of muon tomography. The proposed method modifies the resolution analysis of conventional computed tomography (CT) to fit the different imaging mechanism of muon scattering tomography. The measured scattering information is described in the frequency domain, and a relationship between the measurements and the original image is derived in the Fourier domain, which is named the "Muon Central Slice Theorem". Furthermore, a preliminary analytical expression for the ultimate reconstructed spatial resolution is derived, and simulations are performed for validation. While the method is able to predict the ultimate spatial resolution of a given system, it can also be utilized for the optimization of system design and construction.
Mukherjee, Shalini; Yadav, Rajeev; Yung, Iris; Zajdel, Daniel P.; Oken, Barry S.
2011-01-01
Objectives To determine 1) whether heart rate variability (HRV) was a sensitive and reliable measure in mental effort tasks carried out by healthy seniors and 2) whether non-linear approaches to HRV analysis, in addition to traditional time and frequency domain approaches, were useful to study such effects. Methods Forty healthy seniors performed two visual working memory tasks requiring different levels of mental effort, while ECG was recorded. They underwent the same tasks and recordings two weeks later. Traditional and 13 non-linear indices of HRV, including Poincaré, entropy and detrended fluctuation analysis (DFA), were determined. Results Time domain (especially mean R-R interval/RRI), frequency domain and, among the nonlinear parameters, Poincaré and DFA were the most reliable indices. Mean RRI, time domain and Poincaré were also the most sensitive to different mental effort task loads and had the largest effect size. Conclusions Overall, linear measures were the most sensitive and reliable indices of mental effort. Among non-linear measures, Poincaré was the most reliable and sensitive, suggesting possible usefulness as an independent marker in cognitive function tasks in healthy seniors. Significance A large number of HRV parameters were both reliable and sensitive indices of mental effort, although the simple linear methods were the most sensitive. PMID:21459665
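As a sketch of the nonlinear Poincaré indices referred to above (illustrative only: the R-R series below is synthetic, and the formulas are the standard SD1/SD2 definitions rather than the authors' implementation):

```python
import numpy as np

def poincare_sd1_sd2(rr_ms):
    """SD1/SD2 of the Poincare plot computed from successive R-R intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sd1 = np.sqrt(0.5 * np.var(diff, ddof=1))        # short-term variability
    sdnn = np.std(rr, ddof=1)
    sd2 = np.sqrt(max(2.0 * sdnn**2 - sd1**2, 0.0))  # long-term variability
    return sd1, sd2

# Hypothetical R-R interval series (ms)
rng = np.random.default_rng(0)
rr = 1000 + 40 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 15, 300)
sd1, sd2 = poincare_sd1_sd2(rr)
print(f"mean RRI = {rr.mean():.0f} ms, SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms")
```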
An ambiguity principle for assigning protein structural domains.
Postic, Guillaume; Ghouzam, Yassine; Chebrek, Romain; Gelly, Jean-Christophe
2017-01-01
Ambiguity is the quality of being open to several interpretations. For an image, it arises when the contained elements can be delimited in two or more distinct ways, which may cause confusion. We postulate that it also applies to the analysis of protein three-dimensional structure, which consists in dividing the molecule into subunits called domains. Because different definitions of what constitutes a domain can be used to partition a given structure, the same protein may have different but equally valid domain annotations. However, knowledge and experience generally displace our ability to accept more than one way to decompose the structure of an object-in this case, a protein. This human bias in structure analysis is particularly harmful because it leads to ignoring potential avenues of research. We present an automated method capable of producing multiple alternative decompositions of protein structure (web server and source code available at www.dsimb.inserm.fr/sword/). Our innovative algorithm assigns structural domains through the hierarchical merging of protein units, which are evolutionarily preserved substructures that describe protein architecture at an intermediate level, between domain and secondary structure. To validate the use of these protein units for decomposing protein structures into domains, we set up an extensive benchmark made of expert annotations of structural domains and including state-of-the-art domain parsing algorithms. The relevance of our "multipartitioning" approach is shown through numerous examples of applications covering protein function, evolution, folding, and structure prediction. Finally, we introduce a measure for the structural ambiguity of protein molecules.
Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A
2013-09-01
The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
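A common way to implement such a time-frequency analysis, sketched here only as an illustration rather than the authors' pipeline, is to average single-trial time-frequency magnitudes so that non-phase-locked activity is retained (the sampling rate, epoch length and data below are hypothetical):

```python
import numpy as np
from scipy.signal import stft

fs = 250.0                                                          # sampling rate (Hz), assumed
rng = np.random.default_rng(1)
trials = rng.normal(size=(40, int(4 * fs)))                         # 40 hypothetical EEG epochs of 4 s

# Average the time-frequency magnitude across trials: non-phase-locked power
# that cancels in conventional time-domain averaging is preserved here.
tf_sum = None
for trial in trials:
    f, t, Z = stft(trial, fs=fs, nperseg=128, noverlap=96)
    mag = np.abs(Z)
    tf_sum = mag if tf_sum is None else tf_sum + mag
tf_avg = tf_sum / len(trials)
print(tf_avg.shape)  # (frequency bins, time bins)
```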
Analysis and Calculation of the Fluid Flow and the Temperature Field by Finite Element Modeling
NASA Astrophysics Data System (ADS)
Dhamodaran, M.; Jegadeesan, S.; Kumar, R. Praveen
2018-04-01
This paper presents a fundamental and accurate numerical approach to the analysis of fluid flow and heat transfer inside a channel. In this study, the Finite Element Method is used to analyze the channel, which is divided into small subsections. The small subsections are discretized using a higher number of domain elements and the corresponding number of nodes. MATLAB codes are developed for use in the analysis. Simulation results showed that the fluid flow and temperature analyses are influenced significantly by the changing entrance velocity. There is also an apparent effect on the temperature fields due to the presence of an energy source in the middle of the domain. In this paper, the characteristics of flow analysis and heat analysis in a channel have been investigated.
1982-09-17
The convolution of two transforms in the time domain is the inverse transform of their product in the frequency domain. A very accurate numerical method is used to compute the Fourier transforms; when the inverse transform is taken, the cosine transform is used because it converges faster than the sine transform.
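The convolution theorem stated in this fragment can be verified numerically; the following sketch (illustrative, using random sequences) checks that circular convolution in the time domain equals the inverse FFT of the product of the FFTs:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
f = rng.normal(size=N)
g = rng.normal(size=N)

# Circular convolution computed directly in the time domain ...
direct = np.array([sum(f[m] * g[(n - m) % N] for m in range(N)) for n in range(N)])
# ... equals the inverse transform of the product of the transforms.
via_fft = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)))

print(np.allclose(direct, via_fft))  # True
```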
Cognitive engineering and health informatics: Applications and intersections.
Hettinger, A Zachary; Roth, Emilie M; Bisantz, Ann M
2017-03-01
Cognitive engineering is an applied field with roots in both cognitive science and engineering that has been used to support design of information displays, decision support, human-automation interaction, and training in numerous high risk domains ranging from nuclear power plant control to transportation and defense systems. Cognitive engineering provides a set of structured, analytic methods for data collection and analysis that intersect with and complement methods of Cognitive Informatics. These methods support discovery of aspects of the work that make performance challenging, as well as the knowledge, skills, and strategies that experts use to meet those challenges. Importantly, cognitive engineering methods provide novel representations that highlight the inherent complexities of the work domain and traceable links between the results of cognitive analyses and actionable design requirements. This article provides an overview of relevant cognitive engineering methods, and illustrates how they have been applied to the design of health information technology (HIT) systems. Additionally, although cognitive engineering methods have been applied in the design of user-centered informatics systems, methods drawn from informatics are not typically incorporated into a cognitive engineering analysis. This article presents a discussion regarding ways in which data-rich methods can inform cognitive engineering. Copyright © 2017 Elsevier Inc. All rights reserved.
Mine, Shouhei; Nakamura, Tsutomu; Hirata, Kunio; Ishikawa, Kazuhiko; Hagihara, Yoshihisa; Uegaki, Koichi
2006-01-01
The crystallization and preliminary X-ray diffraction analysis of a catalytic domain of chitinase (PF1233 gene) from the hyperthermophilic archaeon Pyrococcus furiosus is reported. The recombinant protein, prepared using an Escherichia coli expression system, was crystallized by the hanging-drop vapour-diffusion method. An X-ray diffraction data set was collected at the undulator beamline BL44XU at SPring-8 to a resolution of 1.50 Å. The crystals belong to space group P212121, with unit-cell parameters a = 90.0, b = 92.8, c = 107.2 Å. PMID:16880559
Chiu, En-Chi; Lee, Yen; Lai, Kuan-Yu; Kuo, Chian-Jue; Lee, Shu-Chun; Hsieh, Ching-Lin
2015-01-01
Background The Chinese version of the Activities of Daily Living Rating Scale III (ADLRS-III), which has 10 domains, is commonly used for assessing activities of daily living (ADL) in patients with schizophrenia. However, construct validity (i.e., unidimensionality) for each domain of the ADLRS-III is unknown, limiting the interpretation of test results. Purpose The main purpose of this study was to examine the unidimensionality of each domain of the ADLRS-III. We also examined internal consistency and ceiling/floor effects in patients with schizophrenia. Methods From occupational therapy records, we obtained 304 self-report records of the ADLRS-III. Confirmatory factor analysis (CFA) was conducted to examine the 10 one-factor structures. If a domain showed an insufficient model fit, exploratory factor analysis (EFA) was performed to investigate the factor structure and choose one factor representing the original construct. Internal consistency was examined using Cronbach's alpha (α). Ceiling and floor effects were determined by the percentage of patients with the maximum and minimum scores in each domain, respectively. Results CFA analyses showed that 4 domains (i.e., leisure, picture recognition, literacy ability, and communication tool use) had sufficient model fits. These 4 domains had acceptable internal consistency (α = 0.79-0.87) and no ceiling/floor effects, except the leisure domain, which had a ceiling effect. The other 6 domains showed insufficient model fits. The EFA results showed that these 6 domains had two-factor structures. Conclusion The results supported unidimensional constructs of the leisure, picture recognition, literacy ability, and communication tool use domains. The sum scores of these 4 domains can be used to represent their respective domain-specific functions. Regarding the 6 domains with insufficient model fits, we have explained the two factors of each domain and chosen one factor to represent its original construct. Future users may use the items from the chosen factors to assess domain-specific functions in patients with schizophrenia. PMID:26121246
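Internal consistency of the kind reported above is typically computed with Cronbach's alpha; a minimal sketch of the standard formula applied to a hypothetical item-by-respondent matrix (not the study data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one domain."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-item domain scored 1-4 by 20 respondents
rng = np.random.default_rng(2)
base = rng.integers(1, 5, size=(20, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(20, 5)), 1, 4)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```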
Unsupervised discovery of information structure in biomedical documents.
Kiela, Douwe; Guo, Yufan; Stenius, Ulla; Korhonen, Anna
2015-04-01
Information structure (IS) analysis is a text mining technique, which classifies text in biomedical articles into categories that capture different types of information, such as objectives, methods, results and conclusions of research. It is a highly useful technique that can support a range of Biomedical Text Mining tasks and can help readers of biomedical literature find information of interest faster, accelerating the highly time-consuming process of literature review. Several approaches to IS analysis have been presented in the past, with promising results in real-world biomedical tasks. However, all existing approaches, even weakly supervised ones, require several hundreds of hand-annotated training sentences specific to the domain in question. Because biomedicine is subject to considerable domain variation, such annotations are expensive to obtain. This makes the application of IS analysis across biomedical domains difficult. In this article, we investigate an unsupervised approach to IS analysis and evaluate the performance of several unsupervised methods on a large corpus of biomedical abstracts collected from PubMed. Our best unsupervised algorithm (multilevel-weighted graph clustering algorithm) performs very well on the task, obtaining over 0.70 F scores for most IS categories when applied to well-known IS schemes. This level of performance is close to that of lightly supervised IS methods and has proven sufficient to aid a range of practical tasks. Thus, using an unsupervised approach, IS could be applied to support a wide range of tasks across sub-domains of biomedicine. We also demonstrate that unsupervised learning brings novel insights into IS of biomedical literature and discovers information categories that are not present in any of the existing IS schemes. The annotated corpus and software are available at http://www.cl.cam.ac.uk/∼dk427/bio14info.html. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Ares I-X In-Flight Modal Identification
NASA Technical Reports Server (NTRS)
Bartkowicz, Theodore J.; James, George H., III
2011-01-01
Operational modal analysis is a procedure that allows the extraction of modal parameters of a structure in its operating environment. It is based on the idealized premise that input to the structure is white noise. In some cases, when free decay responses are corrupted by unmeasured random disturbances, the response data can be processed into cross-correlation functions that approximate free decay responses. Modal parameters can be computed from these functions by time domain identification methods such as the Eigensystem Realization Algorithm (ERA). The extracted modal parameters have the same characteristics as impulse response functions of the original system. Operational modal analysis is performed on Ares I-X in-flight data. Since the dynamic system is not stationary due to propellant mass loss, modal identification is only possible by analyzing the system as a series of linearized models over short periods of time via a sliding time-window of short time intervals. A time-domain zooming technique was also employed to enhance the modal parameter extraction. Results of this study demonstrate that free-decay time domain modal identification methods can be successfully employed for in-flight launch vehicle modal extraction.
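A sketch of the cross-correlation step described above, in which correlation functions of measured responses are treated as free-decay-like inputs to a time-domain identifier such as ERA; the excitation, filter and channels below are synthetic stand-ins, not Ares I-X data:

```python
import numpy as np
from scipy.signal import lfilter

def correlation_free_decay(x_ref, x_resp, max_lag):
    """Positive-lag cross-correlation between two response channels.

    Under broadband (near-white) excitation these correlation functions
    approximate free-decay responses usable by time-domain identification.
    """
    x_ref = x_ref - x_ref.mean()
    x_resp = x_resp - x_resp.mean()
    n = len(x_ref)
    return np.array([np.dot(x_resp[lag:], x_ref[:n - lag]) / (n - lag)
                     for lag in range(max_lag)])

rng = np.random.default_rng(5)
excitation = rng.normal(size=20_000)
# Hypothetical lightly damped resonance shaping the measured responses
resp1 = lfilter([1.0], [1.0, -2 * 0.995 * np.cos(0.2), 0.995 ** 2], excitation)
resp2 = 0.7 * resp1 + 0.05 * rng.normal(size=resp1.size)
decay = correlation_free_decay(resp1, resp2, max_lag=500)
print(decay[:5])
```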
A domain-centric solution to functional genomics via dcGO Predictor
2013-01-01
Background Computational/manual annotations of protein functions are one of the first routes to making sense of a newly sequenced genome. Protein domain predictions form an essential part of this annotation process. This is due to the natural modularity of proteins with domains as structural, evolutionary and functional units. Sometimes two, three, or more adjacent domains (called supra-domains) are the operational unit responsible for a function, e.g. via a binding site at the interface. These supra-domains have contributed to functional diversification in higher organisms. Traditionally functional ontologies have been applied to individual proteins, rather than families of related domains and supra-domains. We expect, however, to some extent functional signals can be carried by protein domains and supra-domains, and consequently used in function prediction and functional genomics. Results Here we present a domain-centric Gene Ontology (dcGO) perspective. We generalize a framework for automatically inferring ontological terms associated with domains and supra-domains from full-length sequence annotations. This general framework has been applied specifically to primary protein-level annotations from UniProtKB-GOA, generating GO term associations with SCOP domains and supra-domains. The resulting 'dcGO Predictor', can be used to provide functional annotation to protein sequences. The functional annotation of sequences in the Critical Assessment of Function Annotation (CAFA) has been used as a valuable opportunity to validate our method and to be assessed by the community. The functional annotation of all completely sequenced genomes has demonstrated the potential for domain-centric GO enrichment analysis to yield functional insights into newly sequenced or yet-to-be-annotated genomes. This generalized framework we have presented has also been applied to other domain classifications such as InterPro and Pfam, and other ontologies such as mammalian phenotype and disease ontology. The dcGO and its predictor are available at http://supfam.org/SUPERFAMILY/dcGO including an enrichment analysis tool. Conclusions As functional units, domains offer a unique perspective on function prediction regardless of whether proteins are multi-domain or single-domain. The 'dcGO Predictor' holds great promise for contributing to a domain-centric functional understanding of genomes in the next generation sequencing era. PMID:23514627
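Domain-centric GO enrichment analysis of the kind mentioned above is commonly based on a hypergeometric test; a minimal sketch with hypothetical counts (not dcGO's actual implementation):

```python
from scipy.stats import hypergeom

def go_term_enrichment(hits_in_set, set_size, hits_in_background, background_size):
    """One-sided hypergeometric p-value for over-representation of a GO term."""
    return hypergeom.sf(hits_in_set - 1, background_size,
                        hits_in_background, set_size)

# Hypothetical numbers: 15 of 200 domains in the target genome carry the term,
# versus 300 of 20,000 domains in the background.
p = go_term_enrichment(15, 200, 300, 20_000)
print(f"p = {p:.2e}")
```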
A Space Affine Matching Approach to fMRI Time Series Analysis.
Chen, Liang; Zhang, Weishi; Liu, Hongbo; Feng, Shigang; Chen, C L Philip; Wang, Huili
2016-07-01
For fMRI time series analysis, an important challenge is to overcome the potential delay between the hemodynamic response signal and the cognitive stimuli signal, namely the same frequency but different phase (SFDP) problem. In this paper, a novel space affine matching feature is presented by introducing time domain and frequency domain features. The time domain feature is used to discern different stimuli, while the frequency domain feature eliminates the delay. We then propose a space affine matching (SAM) algorithm to match fMRI time series by our affine feature, in which a normal vector is estimated using gradient descent to explore the time series matching optimally. The experimental results illustrate that the SAM algorithm is insensitive to the delay between the hemodynamic response signal and the cognitive stimuli signal. Our approach significantly outperforms the GLM method when such a delay exists. The approach can help solve the SFDP problem in fMRI time series matching and is thus of great promise for revealing brain dynamics.
Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models
NASA Astrophysics Data System (ADS)
Altuntas, Alper; Baugh, John
2017-07-01
Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.
Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains
Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John
2015-08-18
This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.
Quantitative analyses of tartaric acid based on terahertz time domain spectroscopy
NASA Astrophysics Data System (ADS)
Cao, Binghua; Fan, Mengbao
2010-10-01
Terahertz radiation occupies the region of the electromagnetic spectrum between the microwave and infrared bands. Quantitative analysis based on terahertz spectroscopy is very important for the application of terahertz techniques, but how to realize it is still under study. L-tartaric acid is widely used as an acidulant in beverages and other foods, such as soft drinks, wine, candy, bread and some colloidal sweetmeats. In this paper, terahertz time-domain spectroscopy is applied to quantify tartaric acid. Two methods are employed to process the terahertz spectra of samples with different contents of tartaric acid. The first is linear regression combined with correlation analysis. The second is partial least squares (PLS), in which the absorption spectra in the 0.8-1.4 THz region are used to quantify the tartaric acid. To compare the performance of these two methods, their relative errors are analyzed. For this experiment, the first method performs better than the second. However, the first method is suitable for the quantitative analysis of materials which have obvious terahertz absorption peaks, while for materials which have no obvious terahertz absorption peaks, the second one is more appropriate.
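A PLS calibration of the kind described can be sketched with scikit-learn; the spectra and concentrations below are simulated stand-ins, not the paper's measurements:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
# Simulated stand-in data: 60 absorption spectra (120 frequency points, 0.8-1.4 THz)
# varying linearly with tartaric acid content, plus noise.
content = rng.uniform(0.0, 1.0, 60)
signature = rng.normal(size=120)
spectra = np.outer(content, signature) + 0.05 * rng.normal(size=(60, 120))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, content, cv=5).ravel()
rmse = np.sqrt(np.mean((predicted - content) ** 2))
print(f"cross-validated RMSE: {rmse:.3f}")
```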
Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.
Verde, Pablo E
2010-12-30
In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and they can be regarded as so far removed from the classical domain of meta-analysis, that they can provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach allows, also, to perform substantial model checking, model diagnostic and model selection. Statistical computations are implemented in the public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.
2016-06-01
This research paper presents the design of a shunt Passive Power Filter (PPF) within a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology which is superior to FFT analysis. This approach consists of the estimation, detection and classification of signals, and it is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. The work combines three methods: harmonic detection through the wavelet transform, harmonic estimation by a Kalman filter algorithm, and harmonic classification by a decision tree. Among the different mother wavelets available for the wavelet transform method, db8 is selected because of its good transient response and its oscillation behaviour in the frequency domain. In the harmonic compensation process, the detected harmonics are compensated by the Hybrid Series Active Power Filter (HSAPF) based on Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in MATLAB/Simulink as well as with an experimental setup. The obtained results confirm the superiority of the proposed methodology over FFT analysis. The newly proposed PPF makes the conventional HSAPF more robust and stable.
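A minimal sketch of db8 multi-level wavelet decomposition for harmonic detection, assuming a synthetic distorted current and the PyWavelets package (illustrative only, not the authors' implementation):

```python
import numpy as np
import pywt

fs = 6400.0                                  # sampling rate (Hz), assumed
t = np.arange(0.0, 0.2, 1.0 / fs)
# Hypothetical distorted load current: 50 Hz fundamental plus 5th and 7th harmonics
current = (np.sin(2 * np.pi * 50 * t)
           + 0.2 * np.sin(2 * np.pi * 250 * t)
           + 0.1 * np.sin(2 * np.pi * 350 * t))

# Five-level decomposition with the db8 mother wavelet; the detail bands
# carry the harmonic (non-fundamental) content used for detection.
coeffs = pywt.wavedec(current, 'db8', level=5)
for name, band in zip([f"cD{j}" for j in range(5, 0, -1)], coeffs[1:]):
    print(f"{name}: energy = {np.sum(band ** 2):.3f}")
```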
Sub-microsecond-resolution probe microscopy
Ginger, David; Giridharagopal, Rajiv; Moore, David; Rayermann, Glennis; Reid, Obadiah
2014-04-01
Methods and apparatus are provided herein for time-resolved analysis of the effect of a perturbation (e.g., a light or voltage pulse) on a sample. By operating in the time domain, the provided method enables sub-microsecond time-resolved measurement of transient, or time-varying, forces acting on a cantilever.
Kaltenbacher, Barbara; Kaltenbacher, Manfred; Sim, Imbo
2013-01-01
We consider the second order wave equation in an unbounded domain and propose an advanced perfectly matched layer (PML) technique for its efficient and reliable simulation. In doing so, we concentrate on the time domain case and use the finite-element (FE) method for the space discretization. Our un-split-PML formulation requires four auxiliary variables within the PML region in three space dimensions. For a reduced version (rPML), we present a long time stability proof based on an energy analysis. The numerical case studies and an application example demonstrate the good performance and long time stability of our formulation for treating open domain problems. PMID:23888085
Cell edge detection in JPEG2000 wavelet domain - analysis on sigmoid function edge model.
Punys, Vytenis; Maknickas, Ramunas
2011-01-01
Big virtual microscopy images (80K x 60K pixels and larger) are usually stored using the JPEG2000 image compression scheme. Diagnostic quantification based on image analysis might be faster if performed on the compressed data (approximately 20 times less than the original amount), which represent the coefficients of the wavelet transform. An analysis of possible edge detection without the reverse wavelet transform is presented in the paper. Two edge detection methods, suitable for JPEG2000 bi-orthogonal wavelets, are proposed. The methods are adjusted according to calculated parameters of a sigmoid function edge model. The results of the model analysis indicate the more suitable method for a given bi-orthogonal wavelet.
Note on the eigensolution of a homogeneous equation with semi-infinite domain
NASA Technical Reports Server (NTRS)
Wadia, A. R.
1980-01-01
The 'variation-iteration' method using Green's functions to find the eigenvalues and the corresponding eigenfunctions of a homogeneous Fredholm integral equation is employed for the stability analysis of fluid hydromechanics problems with a semi-infinite (infinite) domain of application. The objective of the study is to develop a suitable numerical approach to the solution of such equations in order to better understand the full set of equations for 'real-world' flow models. The study involves a search for a suitable value of the length of the domain which is a fair finite approximation to infinity, which makes the eigensolution an approximation dependent on the length of the interval chosen. In the examples investigated y = 1 = a seems to be the best approximation of infinity; for y greater than unity this method fails due to the polynomial nature of the Green's functions.
Executive Functions in Children with Specific Language Impairment: A Meta-Analysis
ERIC Educational Resources Information Center
Pauls, Laura J.; Archibald, Lisa M. D.
2016-01-01
Purpose: Mounting evidence demonstrates deficits in children with specific language impairment (SLI) beyond the linguistic domain. Using meta-analysis, this study examined differences in children with and without SLI on tasks measuring inhibition and cognitive flexibility. Method: Databases were searched for articles comparing children (4-14…
Adolescent Domain Screening Inventory-Short Form: Development and Initial Validation
ERIC Educational Resources Information Center
Corrigan, Matthew J.
2017-01-01
This study sought to develop a short version of the ADSI and investigate its psychometric properties. Methods: This is a secondary analysis. Cronbach's alpha was computed, correlations were used to assess concurrent criterion validity and known-instrument validity, and a logistic regression was conducted to assess predictive validity.…
Prediction of Protein Structure by Template-Based Modeling Combined with the UNRES Force Field.
Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Czaplewski, Cezary; Liwo, Adam
2015-06-22
A new approach to the prediction of protein structures that uses distance and backbone virtual-bond dihedral angle restraints derived from template-based models and simulations with the united residue (UNRES) force field is proposed. The approach combines the accuracy and reliability of template-based methods for the segments of the target sequence with high similarity to those having known structures with the ability of UNRES to pack the domains correctly. Multiplexed replica-exchange molecular dynamics with restraints derived from template-based models of a given target, in which each restraint is weighted according to the accuracy of the prediction of the corresponding section of the molecule, is used to search the conformational space, and the weighted histogram analysis method and cluster analysis are applied to determine the families of the most probable conformations, from which candidate predictions are selected. To test the capability of the method to recover template-based models from restraints, five single-domain proteins with structures that have been well-predicted by template-based methods were used; it was found that the resulting structures were of the same quality as the best of the original models. To assess whether the new approach can improve template-based predictions with incorrectly predicted domain packing, four such targets were selected from the CASP10 targets; for three of them the new approach resulted in significantly better predictions compared with the original template-based models. The new approach can be used to predict the structures of proteins for which good templates can be found for sections of the sequence or an overall good template can be found for the entire sequence but the prediction quality is remarkably weaker in putative domain-linker regions.
Stollar, Elliott J.; Lin, Hong; Davidson, Alan R.; Forman-Kay, Julie D.
2012-01-01
There is increasing evidence for the functional importance of multiple dynamically populated states within single proteins. However, peptide binding by protein-protein interaction domains, such as the SH3 domain, has generally been considered to involve full engagement of the peptide with the binding surface, with minimal dynamics, and simple methods to determine dynamics at the binding surface for multiple related complexes have not been described. We have used NMR spectroscopy combined with isothermal titration calorimetry to comprehensively examine the extent of engagement with the yeast Abp1p SH3 domain for 24 different peptides. Over one quarter of the domain residues display co-linear chemical shift perturbation (CCSP) behavior, in which the position of a given chemical shift in a complex is co-linear with the same chemical shift in the other complexes, providing evidence that each complex exists as a unique, dynamic, rapidly inter-converting ensemble. The extent to which the specificity-determining sub-surface of the AbpSH3 domain is engaged, as judged by CCSP analysis, correlates with structural and thermodynamic measurements as well as with functional data, revealing the basis for significant structural and functional diversity among the related complexes. Thus, CCSP analysis can distinguish peptide complexes that may appear identical in terms of general structure and percent peptide occupancy but have significant local binding differences across the interface, affecting their ability to transmit conformational change across the domain and resulting in functional differences. PMID:23251481
Reductive evolution and the loss of PDC/PAS domains from the genus Staphylococcus.
Shah, Neethu; Gaupp, Rosmarie; Moriyama, Hideaki; Eskridge, Kent M; Moriyama, Etsuko N; Somerville, Greg A
2013-07-31
The Per-Arnt-Sim (PAS) domain represents a ubiquitous structural fold that is involved in bacterial sensing and adaptation systems, including several virulence related functions. Although PAS domains and the subclass of PhoQ-DcuS-CitA (PDC) domains have a common structure, there is limited amino acid sequence similarity. To gain greater insight into the evolution of PDC/PAS domains present in the bacterial kingdom and staphylococci in specific, the PDC/PAS domains from the genomic sequences of 48 bacteria, representing 5 phyla, were identified using the sensitive search method based on HMM-to-HMM comparisons (HHblits). A total of 1,007 PAS domains and 686 PDC domains distributed over 1,174 proteins were identified. For 28 Gram-positive bacteria, the distribution, organization, and molecular evolution of PDC/PAS domains were analyzed in greater detail, with a special emphasis on the genus Staphylococcus. Compared to other bacteria the staphylococci have relatively fewer proteins (6-9) containing PDC/PAS domains. As a general rule, the staphylococcal genomes examined in this study contain a core group of seven PDC/PAS domain-containing proteins consisting of WalK, SrrB, PhoR, ArlS, HssS, NreB, and GdpP. The exceptions to this rule are: 1) S. saprophyticus lacks the core NreB protein; 2) S. carnosus has two additional PAS domain containing proteins; 3) S. epidermidis, S. aureus, and S. pseudintermedius have an additional protein with two PDC domains that is predicted to code for a sensor histidine kinase; 4) S. lugdunensis has an additional PDC containing protein predicted to be a sensor histidine kinase. This comprehensive analysis demonstrates that variation in PDC/PAS domains among bacteria has limited correlations to the genome size or pathogenicity; however, our analysis established that bacteria having a motile phase in their life cycle have significantly more PDC/PAS-containing proteins. In addition, our analysis revealed a tremendous amount of variation in the number of PDC/PAS-containing proteins within genera. This variation extended to the Staphylococcus genus, which had between 6 and 9 PDC/PAS proteins and some of these appear to be previously undescribed signaling proteins. This latter point is important because most staphylococcal proteins that contain PDC/PAS domains regulate virulence factor synthesis or antibiotic resistance.
Bau, Cho-Tsan; Huang, Chung-Yi
2014-01-01
Objective: To construct a clinical decision support system (CDSS), based on domain ontology and rule-based reasoning, for hospitalized diabetic patients undergoing surgery. Materials and Methods: The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé–Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. Results: The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. Conclusions: The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia. PMID:24730353
Zhou, Jianyong; Luo, Zu; Li, Chunquan; Deng, Mi
2018-01-01
When the meshless method is used to establish a mathematical-mechanical model of human soft tissues, it is necessary to define the space occupied by the tissues as the problem domain and the boundary of the domain as the surface of those tissues. Nodes should be distributed both in the problem domain and on its boundaries. Under external force, the displacement of each node is computed by the meshless method to represent the deformation of the biological soft tissue. However, computation by the meshless method consumes too much time, which affects the simulation of real-time deformation of human tissues in virtual surgery. In this article, Marquardt's algorithm is proposed to fit the nodal displacements at the problem domain's boundary and obtain the relationship between surface deformation and force. When different external forces are applied, the deformation of soft tissues can then be quickly obtained from this relationship. The analysis and discussion show that the improved model equations with Marquardt's algorithm can not only simulate the deformation in real time but also preserve the authenticity of the deformation model's physical properties. Copyright © 2017 Elsevier B.V. All rights reserved.
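Fitting a displacement-force relationship with the Levenberg-Marquardt algorithm can be sketched with SciPy; the stiffening law, data and parameter values below are hypothetical stand-ins for the precomputed meshless results:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical force-displacement law for a boundary node: u(F) = a*F / (1 + b*F),
# fitted to displacements precomputed by a meshless solver.
def displacement(force, a, b):
    return a * force / (1.0 + b * force)

forces = np.linspace(0.0, 5.0, 25)                       # applied forces (N), assumed
rng = np.random.default_rng(4)
observed = displacement(forces, 2.0e-3, 0.15) + rng.normal(0, 5e-5, forces.size)

# method='lm' selects the Levenberg-Marquardt algorithm
params, _ = curve_fit(displacement, forces, observed, p0=[1e-3, 0.1], method='lm')
print(f"a = {params[0]:.2e}, b = {params[1]:.3f}")
```

Once fitted, the surface deformation for a new force is evaluated directly from the closed-form relation instead of rerunning the full meshless computation.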
A probabilistic model for detecting rigid domains in protein structures.
Nguyen, Thach; Habeck, Michael
2016-09-01
Large-scale conformational changes in proteins are implicated in many important biological functions. These structural transitions can often be rationalized in terms of relative movements of rigid domains. There is a need for objective and automated methods that identify rigid domains in sets of protein structures showing alternative conformational states. We present a probabilistic model for detecting rigid-body movements in protein structures. Our model aims to approximate alternative conformational states by a few structural parts that are rigidly transformed under the action of a rotation and a translation. By using Bayesian inference and Markov chain Monte Carlo sampling, we estimate all parameters of the model, including a segmentation of the protein into rigid domains, the structures of the domains themselves, and the rigid transformations that generate the observed structures. We find that our Gibbs sampling algorithm can also estimate the optimal number of rigid domains with high efficiency and accuracy. We assess the power of our method on several thousand entries of the DynDom database and discuss applications to various complex biomolecular systems. The Python source code for protein ensemble analysis is available at https://github.com/thachnguyen/motion_detection. Contact: mhabeck@gwdg.de. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Phase-space topography characterization of nonlinear ultrasound waveforms.
Dehghan-Niri, Ehsan; Al-Beer, Helem
2018-03-01
Fundamental understanding of ultrasound interaction with material discontinuities having closed interfaces has many engineering applications such as nondestructive evaluation of defects like kissing bonds and cracks in critical structural and mechanical components. In this paper, to analyze the acoustic field nonlinearities due to defects with closed interfaces, the use of a common technique in nonlinear physics, based on a phase-space topography construction of the ultrasound waveform, is proposed. The central idea is to complement the "time" and "frequency" domain analyses with the "phase-space" domain analysis of nonlinear ultrasound waveforms. A nonlinear time series method known as pseudo phase-space topography construction is used to construct equivalent phase-space portraits of measured ultrasound waveforms. Several nonlinear models are considered to numerically simulate nonlinear ultrasound waveforms. The phase-space response of the simulated waveforms is shown to provide different topographic information, while the frequency domain shows similar spectral behavior. Thus, model classification can be substantially enhanced in the phase-space domain. Experimental results on high strength aluminum samples show that the phase-space transformation provides unique detection and classification capabilities. The Poincaré map of the phase-space domain is also used to better understand the nonlinear behavior of ultrasound waveforms. It is shown that the analysis of ultrasound nonlinearities is more convenient and informative in the phase-space domain than in the frequency domain. Copyright © 2017 Elsevier B.V. All rights reserved.
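Pseudo phase-space construction is, at its core, time-delay embedding of the waveform; a minimal sketch on a synthetic two-tone ultrasound signal (the embedding dimension and delay below are arbitrary illustrative choices):

```python
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Pseudo phase-space reconstruction by time-delay embedding.

    Returns an array of shape (N - (dim - 1) * tau, dim) whose rows are
    the points [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)].
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Hypothetical ultrasound waveform with a weak second-harmonic distortion
t = np.linspace(0.0, 20e-6, 2000)
wave = np.sin(2 * np.pi * 5e6 * t) + 0.1 * np.sin(2 * np.pi * 10e6 * t)
portrait = delay_embed(wave, dim=2, tau=20)
print(portrait.shape)  # points of the 2-D phase-space portrait
```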
Classification and Lineage Tracing of SH2 Domains Throughout Eukaryotes.
Liu, Bernard A
2017-01-01
Today there is a rapidly expanding number of sequenced genomes. Cataloging protein interaction domains such as the Src Homology 2 (SH2) domain across these various genomes can be accomplished with ease due to existing algorithms and prediction models. An evolutionary analysis of SH2 domains provides a step towards understanding how SH2 proteins integrated with existing signaling networks to position phosphotyrosine signaling as a crucial driver of robust cellular communication networks in metazoans. However, organizing and tracing SH2 domains across organisms and understanding their evolutionary trajectory remains a challenge. This chapter describes several methodologies for analyzing the evolutionary trajectory of SH2 domains, including a global SH2 domain classification system, which facilitates annotation of new SH2 sequences essential for tracing the lineage of SH2 domains throughout eukaryote evolution. This classification utilizes a combination of sequence homology, protein domain architecture and the boundary positions between introns and exons within the SH2 domain or genes encoding these domains. Discrete SH2 families can then be traced across various genomes to provide insight into their origins. Furthermore, additional methods for examining potential mechanisms of SH2 domain divergence, from structural changes to alterations in protein domain content and genome duplication, will be discussed. Therefore, a better understanding of SH2 domain evolution may enhance our insight into the emergence of phosphotyrosine signaling and the expansion of protein interaction domains.
High-throughput search for new permanent magnet materials.
Goll, D; Loeffler, R; Herbst, J; Karimi, R; Schneider, G
2014-02-12
The currently highest-performance Fe-Nd-B magnets show limited cost-effectiveness and lifetime due to their rare-earth (RE) content. The demand for novel hard magnetic phases with more widely available RE metals, with reduced RE content or, even better, completely free of RE metals is therefore tremendous. The chances are that such materials still exist given the large number of as yet unexplored alloy systems. To discover such phases, an elaborate concept is necessary which can restrict and prioritize the search field while making use of efficient synthesis and analysis methods. It is shown that an efficient synthesis of new phases using heterogeneous non-equilibrium diffusion couples and reaction sintering is possible. Quantitative microstructure analysis of the domain pattern of the hard magnetic phases can be used to estimate the intrinsic magnetic parameters (saturation polarization from the domain contrast, anisotropy constant from the domain width, Curie temperature from the temperature dependence of the domain contrast). The probability of detecting TM-rich phases for a given system is high; the approach therefore enables one to scan through even higher-component systems with a single sample. The visualization of newly occurring hard magnetic phases via their typical domain structure and the correlation between domain structure and intrinsic magnetic properties allow an evaluation of the industrial relevance of these novel phases.
Xue, You-Lin; Wang, Hao; Riedy, Michael; Roberts, Brittany-Lee; Sun, Yuna; Song, Yong-Bo; Jones, Gary W; Masison, Daniel C; Song, Youtao
2018-05-01
Genetic screens using Saccharomyces cerevisiae have identified an array of Hsp40 (Ydj1p) J-domain mutants that are impaired in the ability to cure the yeast [URE3] prion through disrupting functional interactions with Hsp70. However, biochemical analysis of some of these Hsp40 J-domain mutants has so far failed to provide major insight into the specific functional changes in Hsp40-Hsp70 interactions. To explore the detailed structural and dynamic properties of the Hsp40 J-domain, 20 ns molecular dynamics simulations of 4 mutants (D9A, D36A, A30T, and F45S) and the wild-type J-domain were performed, followed by Hsp70 docking simulations. Results demonstrated that although the Hsp70 interaction mechanism of the mutants may vary, the major structural change was targeted to the critical HPD motif of the J-domain. Our computational analysis fits well with previous yeast genetics studies in highlighting the importance of J-domain function in prion propagation. During the molecular dynamics simulations, several important residues were identified and predicted to play an essential role in J-domain structure. Among these residues, Y26 and F45 were confirmed, using both in silico and in vivo methods, as being critical for Ydj1p function.
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-01-01
Traditional artillery guidance can significantly improve the attack accuracy and overall combat efficiency of projectiles, which makes it more adaptable to the information warfare of the future. Obviously, the accurate measurement of artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of an MR sensor measuring spin rate, the mathematical relationship model between the frequency of the MR sensor output and the projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of the MR sensor output, whose frequency varies with time, this paper proposes a Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on a rolling Blackman window (BCZT) which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of a 155 mm artillery projectile. After extracting the spin rate, the impact that launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment. PMID:27322266
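A rough sketch of the windowed zoom idea follows, using SciPy's chirp z-transform on Blackman-windowed segments of a synthetic sensor signal; the sampling rate, spin frequency, and frequency band are invented for illustration and are not the paper's values.

```python
# Illustrative sketch only: track a spin frequency by applying a Blackman
# window to rolling segments and zooming into a narrow band with the CZT.
import numpy as np
from scipy.signal import czt   # available in SciPy >= 1.8

fs = 10_000.0                           # assumed sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
spin_hz = 250.0                         # assumed spin frequency
sensor = np.sin(2 * np.pi * spin_hz * t)

win_len, m = 1024, 512                  # window length and number of CZT points
f1, f2 = 200.0, 300.0                   # frequency band of interest (Hz)
a = np.exp(2j * np.pi * f1 / fs)        # start of the zoomed band on the unit circle
w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))

estimates = []
for start in range(0, len(sensor) - win_len, win_len // 2):
    seg = sensor[start:start + win_len] * np.blackman(win_len)
    spectrum = np.abs(czt(seg, m=m, w=w, a=a))
    estimates.append(f1 + np.argmax(spectrum) * (f2 - f1) / m)

print("estimated spin rate (Hz):", np.mean(estimates))
```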
Interaction Analysis through Proteomic Phage Display
2014-01-01
Phage display is a powerful technique for profiling specificities of peptide binding domains. The method is suited for the identification of high-affinity ligands with inhibitor potential when using highly diverse combinatorial peptide phage libraries. Such experiments further provide consensus motifs for genome-wide scanning of ligands of potential biological relevance. A complementary but considerably less explored approach is to display expression products of genomic DNA, cDNA, open reading frames (ORFs), or oligonucleotide libraries designed to encode defined regions of a target proteome on phage particles. One of the main applications of such proteomic libraries has been the elucidation of antibody epitopes. This review is focused on the use of proteomic phage display to uncover protein-protein interactions of potential relevance for cellular function. The method is particularly suited for the discovery of interactions between peptide binding domains and their targets. We discuss the largely unexplored potential of this method in the discovery of domain-motif interactions of potential biological relevance. PMID:25295249
Vilor-Tejedor, Natàlia; Cáceres, Alejandro; Pujol, Jesús; Sunyer, Jordi; González, Juan R
2017-12-01
Joint analysis of genetic and neuroimaging data, known as Imaging Genetics (IG), offers an opportunity to deepen our knowledge of the biological mechanisms of neurodevelopmental domains. There has been exponential growth in the literature on IG studies, which challenges the standardization of analysis methods in this field. In this review we give a complete up-to-date account of IG studies on attention deficit hyperactivity disorder (ADHD) and related neurodevelopmental domains, which serves as a reference catalog for researchers working on this neurological disorder. We searched MEDLINE/Pubmed and identified 37 articles on IG of ADHD that met our eligibility criteria. We carefully cataloged these articles according to imaging technique, genes and brain region, and summarized the main results and characteristics of each study. We found that IG studies on ADHD generally focus on dopaminergic genes and the structure of basal ganglia using structural Magnetic Resonance Imaging (MRI). We found little research involving multiple genetic factors and brain regions because of the scarce use of multivariate strategies in data analysis. IG of ADHD and related neurodevelopmental domains is still in its early stages, and a lack of replicated findings is one of the most pressing challenges in the field.
Application of Feature-Oriented Domain Analysis to the Army Movement Control Domain (Appendices A-I)
1992-06-01
Cohen, James A. Hess, William E. Novak, & A. Spencer Peterson. Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21)... Application of Feature-Oriented Domain Analysis to the Army Movement Control Domain (Appendices A-I). Sholom G. Cohen, Jay L. Stanley, Jr., A. Spencer Peterson, Robert W... June 1992.
1992-12-01
Feature-oriented domain analysis (FODA) is one approach to domain analysis whose primary goal is to make domain products reusable (20:47). A domain model describes... Kang and others used the complete FODA methodology to successfully develop a window...
Analysis of spatial distribution of land cover maps accuracy
NASA Astrophysics Data System (ADS)
Khatami, R.; Mountrakis, G.; Stehman, S. V.
2017-12-01
Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations of accuracy will affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach of map accuracy assessment based on an error matrix does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. This research is the first to incorporate the spectral domain as an explanatory feature space for interpolating classification accuracy. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States. The performance of the predictions was evaluated using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements of AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domains yielded similar AUC; iv) for the larger sample size (i.e., very dense spatial sample) and per-class predictions, the spatial domain yielded larger AUC; v) increasing the sample size improved accuracy predictions with a greater benefit accruing to the spatial domain; and vi) the function used for interpolation had the smallest effect on AUC.
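As a toy illustration of the interpolation idea (not the study's code), the sketch below predicts per-pixel accuracy by Gaussian-weighted interpolation of binary correctness values from a synthetic spatial test sample; the coordinates, bandwidth, and sample size are assumptions.

```python
# Minimal sketch: per-pixel accuracy prediction via a Gaussian interpolation
# function over a spatial test sample (synthetic data, assumed bandwidth).
import numpy as np

rng = np.random.default_rng(0)
sample_xy = rng.uniform(0, 10_000, size=(200, 2))    # test-sample coordinates (m)
sample_correct = rng.integers(0, 2, size=200)        # 1 = pixel correctly classified

def predict_accuracy(xy, sample_xy, sample_correct, bandwidth=1500.0):
    d2 = ((sample_xy - xy) ** 2).sum(axis=1)          # squared distances to sample pixels
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))          # Gaussian weights
    return float((w * sample_correct).sum() / w.sum())  # predicted probability of correctness

print(predict_accuracy(np.array([5000.0, 5000.0]), sample_xy, sample_correct))
```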
The Sensitivity Analysis for the Flow Past Obstacles Problem with Respect to the Reynolds Number
Ito, Kazufumi; Li, Zhilin; Qiao, Zhonghua
2013-01-01
In this paper, numerical sensitivity analysis with respect to the Reynolds number for the flow past obstacle problem is presented. To carry out such analysis, at each time step, we need to solve the incompressible Navier-Stokes equations on irregular domains twice: once for the primary variables and once for the sensitivity variables with homogeneous boundary conditions. The Navier-Stokes solver is the augmented immersed interface method for Navier-Stokes equations on irregular domains. One of the most important contributions of this paper is that our analysis can predict the critical Reynolds number at which vortex shedding begins to develop in the wake of the obstacle. Some interesting experiments are shown to illustrate how the critical Reynolds number varies with different geometric settings. PMID:24910780
NASA Astrophysics Data System (ADS)
Wu, J.; Yao, W.; Zhang, J.; Li, Y.
2018-04-01
Labeling 3D point cloud data with traditional supervised learning methods requires considerable labelled samples, the collection of which is costly and time consuming. This work focuses on adopting the domain adaptation concept to transfer existing trained random forest classifiers (based on the source domain) to new data scenes (the target domain), which aims at reducing the dependence of accurate 3D semantic labelling in point clouds on training samples from the new data scene. Firstly, two random forest classifiers were trained with existing samples previously collected for other data. They differ in the decision tree construction algorithm used: C4.5 with the information gain ratio and CART with the Gini index. Secondly, four random forest classifiers adapted to the target domain are derived by transferring each tree in the source random forest models with two types of operations: structure expansion and reduction (SER) and structure transfer (STRUT). Finally, points in the target domain are labelled by fusing the four newly derived random forest classifiers using a weights-of-evidence based fusion model. To validate our method, experimental analysis was conducted using 3 datasets: one is used as the source domain data (Vaihingen data for 3D Semantic Labelling); the other two are used as the target domain data, from two cities in China (Jinmen city and Dunhuang city). Overall accuracies of 85.5 % and 83.3 % for 3D labelling were achieved for the Jinmen city and Dunhuang city data respectively, with only 1/3 of the newly labelled samples compared to the cases without domain adaptation.
Discovering body site and severity modifiers in clinical texts
Dligach, Dmitriy; Bethard, Steven; Becker, Lee; Miller, Timothy; Savova, Guergana K
2014-01-01
Objective: To research computational methods for discovering body site and severity modifiers in clinical texts. Methods: We cast the task of discovering body site and severity modifiers as a relation extraction problem in the context of a supervised machine learning framework. We utilize rich linguistic features to represent the pairs of relation arguments and delegate the decision about the nature of the relationship between them to a support vector machine model. We evaluate our models using two corpora that annotate body site and severity modifiers. We also compare the model performance to a number of rule-based baselines. We conduct cross-domain portability experiments. In addition, we carry out feature ablation experiments to determine the contribution of various feature groups. Finally, we perform error analysis and report the sources of errors. Results: The performance of our method for discovering body site modifiers achieves F1 of 0.740–0.908 and our method for discovering severity modifiers achieves F1 of 0.905–0.929. Discussion: Results indicate that both methods perform well on both in-domain and out-of-domain data, approaching the performance of human annotators. The most salient features are token and named entity features, although syntactic dependency features also contribute to the overall performance. The dominant sources of errors are infrequent patterns in the data and the inability of the system to discern deeper semantic structures. Conclusions: We investigated computational methods for discovering body site and severity modifiers in clinical texts. Our best system is released open source as part of the clinical Text Analysis and Knowledge Extraction System (cTAKES). PMID:24091648
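A schematic sketch of the modeling choice, a feature-based SVM that classifies candidate modifier-disorder argument pairs as related or not, is shown below; the features, labels, and examples are toy assumptions and are not drawn from cTAKES or the study's corpora.

```python
# Toy sketch of relation extraction as supervised classification: represent
# each candidate argument pair by simple features and let a linear SVM decide
# whether the pair is truly related (all features and examples are invented).
from sklearn.feature_extraction import DictVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

pairs = [
    {"arg1": "fracture", "arg2": "left arm", "token_dist": 2, "arg2_type": "anatomical_site"},
    {"arg1": "pain", "arg2": "severe", "token_dist": 1, "arg2_type": "severity"},
    {"arg1": "fracture", "arg2": "severe", "token_dist": 9, "arg2_type": "severity"},
    {"arg1": "rash", "arg2": "abdomen", "token_dist": 12, "arg2_type": "anatomical_site"},
]
labels = [1, 1, 0, 0]   # 1 = the modifier actually attaches to this disorder mention

model = make_pipeline(DictVectorizer(), LinearSVC())
model.fit(pairs, labels)
print(model.predict([{"arg1": "pain", "arg2": "mild", "token_dist": 1, "arg2_type": "severity"}]))
```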
Zero-moment point determination of worst-case manoeuvres leading to vehicle wheel lift
NASA Astrophysics Data System (ADS)
Lapapong, S.; Brown, A. A.; Swanson, K. S.; Brennan, S. N.
2012-01-01
This paper proposes a method to evaluate vehicle rollover propensity based on a frequency-domain representation of the zero-moment point (ZMP). Unlike other rollover metrics such as the static stability factor, which is based on the steady-state behaviour, and the load transfer ratio, which requires the calculation of tyre forces, the ZMP is based on a simplified kinematic model of the vehicle and the analysis of the contact point of the vehicle relative to the edge of the support polygon. Previous work has validated the use of the ZMP experimentally in its ability to predict wheel lift in the time domain. This work explores the use of the ZMP in the frequency domain to allow a chassis designer to understand how operating conditions and vehicle parameters affect rollover propensity. The ZMP analysis is then extended to calculate worst-case sinusoidal manoeuvres that lead to untripped wheel lift, and the analysis is tested across several vehicle configurations and compared with that of the standard Toyota J manoeuvre.
An ambiguity principle for assigning protein structural domains
Postic, Guillaume; Ghouzam, Yassine; Chebrek, Romain; Gelly, Jean-Christophe
2017-01-01
Ambiguity is the quality of being open to several interpretations. For an image, it arises when the contained elements can be delimited in two or more distinct ways, which may cause confusion. We postulate that it also applies to the analysis of protein three-dimensional structure, which consists in dividing the molecule into subunits called domains. Because different definitions of what constitutes a domain can be used to partition a given structure, the same protein may have different but equally valid domain annotations. However, knowledge and experience generally displace our ability to accept more than one way to decompose the structure of an object—in this case, a protein. This human bias in structure analysis is particularly harmful because it leads to ignoring potential avenues of research. We present an automated method capable of producing multiple alternative decompositions of protein structure (web server and source code available at www.dsimb.inserm.fr/sword/). Our innovative algorithm assigns structural domains through the hierarchical merging of protein units, which are evolutionarily preserved substructures that describe protein architecture at an intermediate level, between domain and secondary structure. To validate the use of these protein units for decomposing protein structures into domains, we set up an extensive benchmark made of expert annotations of structural domains and including state-of-the-art domain parsing algorithms. The relevance of our “multipartitioning” approach is shown through numerous examples of applications covering protein function, evolution, folding, and structure prediction. Finally, we introduce a measure for the structural ambiguity of protein molecules. PMID:28097215
Fractional domain varying-order differential denoising method
NASA Astrophysics Data System (ADS)
Zhang, Yan-Shan; Zhang, Feng; Li, Bing-Zhao; Tao, Ran
2014-10-01
Removal of noise is an important step in the image restoration process, and it remains a challenging problem in image processing. Denoising is a process used to remove the noise from the corrupted image, while retaining the edges and other detailed features as much as possible. Recently, denoising in the fractional domain is a hot research topic. The fractional-order anisotropic diffusion method can bring a less blocky effect and preserve edges in image denoising, a method that has received much interest in the literature. Based on this method, we propose a new method for image denoising, in which fractional-varying-order differential, rather than constant-order differential, is used. The theoretical analysis and experimental results show that compared with the state-of-the-art fractional-order anisotropic diffusion method, the proposed fractional-varying-order differential denoising model can preserve structure and texture well, while quickly removing noise, and yields good visual effects and better peak signal-to-noise ratio.
Analyzing reflective narratives to assess the ethical reasoning of pediatric residents.
Moon, Margaret; Taylor, Holly A; McDonald, Erin L; Hughes, Mark T; Beach, Mary Catherine; Carrese, Joseph A
2013-01-01
A limiting factor in ethics education in medical training has been difficulty in assessing competence in ethics. This study was conducted to test the concept that content analysis of pediatric residents' personal reflections about ethics experiences can identify changes in ethical sensitivity and reasoning over time. Analysis of written narratives focused on two of our ethics curriculum's goals: 1) to raise sensitivity to ethical issues in everyday clinical practice and 2) to enhance critical reflection on personal and professional values as they affect patient care. Content analysis of written reflections was guided by a tool developed to identify and assess the level of ethical reasoning in eight domains determined to be important aspects of ethical competence. Based on the assessment of narratives written at two times (12 to 16 months apart) during their training, residents showed significant progress in two specific domains: use of professional values, and use of personal values. Residents did not show decline in ethical reasoning in any domain. This study demonstrates that content analysis of personal narratives may provide a useful method for assessment of developing ethical sensitivity and reasoning.
Generalized fourier analyses of the advection-diffusion equation - Part II: two-dimensional domains
NASA Astrophysics Data System (ADS)
Voth, Thomas E.; Martinez, Mario J.; Christon, Mark A.
2004-07-01
Part I of this work presents a detailed multi-methods comparison of the spatial errors associated with the one-dimensional finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. In Part II we extend the analysis to two-dimensional domains and also consider the effects of wave propagation direction and grid aspect ratio on the phase speed, and the discrete and artificial diffusivities. The observed dependence of dispersive and diffusive behaviour on propagation direction makes comparison of methods more difficult relative to the one-dimensional results. For this reason, integrated (over propagation direction and wave number) error and anisotropy metrics are introduced to facilitate comparison among the various methods. With respect to these metrics, the consistent mass Galerkin and consistent mass control-volume finite element methods, and their streamline upwind derivatives, exhibit comparable accuracy, and generally out-perform their lumped mass counterparts and finite-difference based schemes. While this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common mathematical framework. Published in 2004 by John Wiley & Sons, Ltd.
Linear regression models and k-means clustering for statistical analysis of fNIRS data.
Bonomini, Viola; Zucchelli, Lucia; Re, Rebecca; Ieva, Francesca; Spinelli, Lorenzo; Contini, Davide; Paganoni, Anna; Torricelli, Alessandro
2015-02-01
We propose a new algorithm, based on a linear regression model, to statistically estimate the hemodynamic activations in fNIRS data sets. The main concern guiding the algorithm development was the minimization of assumptions and approximations made on the data set for the application of statistical tests. Further, we propose a K-means method to cluster fNIRS data (i.e. channels) as activated or not activated. The methods were validated both on simulated and in vivo fNIRS data. A time domain (TD) fNIRS technique was preferred because of its high performances in discriminating cortical activation and superficial physiological changes. However, the proposed method is also applicable to continuous wave or frequency domain fNIRS data sets.
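The two statistical steps can be sketched with synthetic data as follows; the block-design regressor, channel count, and effect size are assumptions, and this is not the authors' released algorithm.

```python
# Minimal sketch: regress each fNIRS channel on a task regressor, then cluster
# the fitted coefficients with k-means into "activated" vs "not activated".
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_samples, n_channels = 300, 16
task = (np.arange(n_samples) % 60 < 30).astype(float)   # assumed block-design regressor
data = rng.normal(0.0, 1.0, (n_samples, n_channels))
data[:, :5] += 0.8 * task[:, None]                       # channels 0-4 carry an activation

betas = np.array([
    LinearRegression().fit(task.reshape(-1, 1), data[:, ch]).coef_[0]
    for ch in range(n_channels)
])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(betas.reshape(-1, 1))
print(labels)   # one cluster collects the channels with large task coefficients
```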
A web-based system architecture for ontology-based data integration in the domain of IT benchmarking
NASA Astrophysics Data System (ADS)
Pfaff, Matthias; Krcmar, Helmut
2018-03-01
In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
NASA Astrophysics Data System (ADS)
Gnyba, M.; Wróbel, M. S.; Karpienko, K.; Milewska, D.; Jedrzejewska-Szczerska, M.
2015-07-01
In this article the simultaneous investigation of blood parameters by complementary optical methods, Raman spectroscopy and spectral-domain low-coherence interferometry, is presented. Thus, the mutual relationship between chemical and physical properties may be investigated, because low-coherence interferometry measures optical properties of the investigated object, while Raman spectroscopy gives information about its molecular composition. A series of in-vitro measurements was carried out to assess sufficient accuracy for monitoring of blood parameters. A large number of blood samples with various hematological parameters, collected from different donors, were measured in order to achieve statistical significance of the results and validation of the methods. Preliminary results indicate the benefits of combining the presented complementary methods and form the basis for development of a multimodal system for rapid and accurate optical determination of selected parameters in whole human blood. Future development of optical systems and multivariate calibration models is planned to extend the number of detected blood parameters and provide a robust quantitative multi-component analysis.
Dynamic gas temperature measurement system. Volume 2: Operation and program manual
NASA Technical Reports Server (NTRS)
Purpura, P. T.
1983-01-01
The hot section technology (HOST) dynamic gas temperature measurement system computer program acquires data from two type B thermocouples of different diameters. The analysis method determines the in situ value of an aerodynamic parameter T, containing the heat transfer coefficient, from the transfer function of the two thermocouples. This aerodynamic parameter is used to compute a frequency response spectrum and compensate the dynamic portion of the signal of the smaller thermocouple. The calculations for the aerodynamic parameter and the data compensation technique are discussed. Compensated data are presented in either the time or frequency domain, with time domain data shown as dynamic temperature vs. time.
Translation and validation of the Rhinosinusitis Disability Index for use in Nigeria.
Asoegwu, C N; Nwawolo, C C; Okubadejo, N U
2017-07-01
The Rhinosinusitis Disability Index (RSDI) is a validated and reliable measure of severity of chronic rhinosinusitis. The objective of this study was to translate and validate the instrument for use in Nigeria. This is a methodological study of 71 patients with chronic rhinosinusitis attending two Otolaryngology clinics in Lagos, Nigeria. Using standardized methods and trained translators, the RSDI was translated to vernacular (Yoruba language) and back-translated to culturally appropriate English. Data analysis comprised assessment of the item quality, content validity and internal consistency of the back-translated Rhinosinusitis Disability Index (bRSDI), and correlation to the original RSDI. Content validity assessment (floor and ceiling effects) showed 0% floor and ceiling effects for the total scores, 0% ceiling effects for all domains, 0% floor effect for the physical domain, and 9.9 and 8.5% floor effects for the functional and emotional domains, respectively. The mean item-own correlation for the physical domain was 0.54 ± 0.08, 0.72 ± 0.08 for the functional domain and 0.74 ± 0.07 for the emotional domain. All domain item-own correlations were higher than item-other domain correlations. The total Cronbach's alpha was 0.936 and was higher than 0.70 for all the domains, representing good internal consistency. Pearson correlation analysis showed strong correlation of the RSDI to the bRSDI (total score 0.881; p = 0.000, and domain subscores: physical 0.788; p = 0.000, functional 0.830; p = 0.000, and emotional 0.888; p = 0.000). The back-translated Rhinosinusitis Disability Index shows good face and content validity with good internal consistency while correlating linearly and significantly with the original Rhinosinusitis Disability Index, and is recommended for use in Nigeria.
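To make the internal-consistency statistic concrete, here is a small illustrative computation of Cronbach's alpha on synthetic questionnaire scores; the respondent count matches the study only by way of example, and the data are entirely simulated.

```python
# Illustrative only: Cronbach's alpha for the internal consistency of a set of
# questionnaire items (synthetic scores, not the study's data).
import numpy as np

def cronbach_alpha(items):
    """items: (respondents x items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

rng = np.random.default_rng(2)
latent = rng.normal(0.0, 1.0, 71)                              # 71 simulated respondents
scores = np.column_stack([latent + rng.normal(0.0, 0.5, 71) for _ in range(11)])
print(round(cronbach_alpha(scores), 3))                        # high alpha for correlated items
```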
NASA Astrophysics Data System (ADS)
Ha, Taewoo; Lee, Howon; Sim, Kyung Ik; Kim, Jonghyeon; Jo, Young Chan; Kim, Jae Hoon; Baek, Na Yeon; Kang, Dai-ill; Lee, Han Hyoung
2017-05-01
We have established optimal methods for terahertz time-domain spectroscopic analysis of highly absorbing pigments in powder form based on our investigation of representative traditional Chinese pigments, such as azurite [blue-based color pigment], Chinese vermilion [red-based color pigment], and arsenic yellow [yellow-based color pigment]. To accurately extract the optical constants in the terahertz region of 0.1 - 3 THz, we carried out transmission measurements in such a way that intense absorption peaks did not completely suppress the transmission level. This required preparation of pellet samples with optimized thicknesses and material densities. In some cases, mixing the pigments with polyethylene powder was required to minimize absorption due to certain peak features. The resulting distortion-free terahertz spectra of the investigated set of pigment species exhibited well-defined unique spectral fingerprints. Our study will be useful to future efforts to establish non-destructive analysis methods of traditional pigments, to construct their spectral databases, and to apply these tools to restoration of cultural heritage materials.
NASA Astrophysics Data System (ADS)
Yamanari, Masahiro; Miura, Masahiro; Makita, Shuichi; Yatagai, Toyohiko; Yasuno, Yoshiaki
2007-02-01
Birefringence of the retinal nerve fiber layer is measured by polarization-sensitive spectral domain optical coherence tomography using the B-scan-oriented polarization modulation method. Birefringence of the optical fiber and the cornea is compensated by Jones matrix based analysis. A three-dimensional phase retardation map around the optic nerve head and an en-face phase retardation map of the retinal nerve fiber layer are shown. Unlike scanning laser polarimetry, our system can measure the phase retardation quantitatively without using the bow-tie pattern of the birefringence in the macular region, which enables diagnosis of glaucoma even if the patients have macular disease.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
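A minimal sketch of this kind of moving-window regression featurization is shown below; the window length, step, and quadratic degree are arbitrary choices, and the code is not the released package.

```python
# Sketch: fit local quadratic regressions to moving windows of one series and
# summarize the coefficients over the whole interval as features (assumed
# window/step sizes, synthetic data).
import numpy as np

def window_coefficients(series, window=50, step=10):
    coeffs = []
    x = np.arange(window)
    for start in range(0, len(series) - window + 1, step):
        seg = series[start:start + window]
        coeffs.append(np.polyfit(x, seg, deg=2))   # local quadratic fit: curvature, slope, level
    return np.asarray(coeffs)

t = np.linspace(0.0, 10.0, 1000)
series = np.sin(t) + np.random.normal(0.0, 0.05, t.size)

coeffs = window_coefficients(series)
features = np.r_[coeffs.mean(axis=0), coeffs.std(axis=0)]   # interval-level summary features
print(features)
```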
An Instructor's Diagnostic Aid for Feedback in Training.
ERIC Educational Resources Information Center
Andrews, Dee H.; Uliano, Kevin C.
1988-01-01
Instructor's Diagnostic Aid for Feedback in Training (IDAFT) is a computer-assisted method based on error analysis, domains of learning, and events of instruction. Its use with Navy team instructors is currently being explored. (JOW)
Automatic prediction of protein domains from sequence information using a hybrid learning system.
Nagarajan, Niranjan; Yona, Golan
2004-06-12
We describe a novel method for detecting the domain structure of a protein from sequence information alone. The method is based on analyzing multiple sequence alignments that are derived from a database search. Multiple measures are defined to quantify the domain information content of each position along the sequence and are combined into a single predictor using a neural network. The output is further smoothed and post-processed using a probabilistic model to predict the most likely transition positions between domains. The method was assessed using the domain definitions in SCOP and CATH for proteins of known structure and was compared with several other existing methods. Our method performs well both in terms of accuracy and sensitivity. It improves significantly over the best methods available, even some of the semi-manual ones, while being fully automatic. Our method can also be used to suggest and verify domain partitions based on structural data. A few examples of predicted domain definitions and alternative partitions, as suggested by our method, are also discussed. An online domain-prediction server is available at http://biozon.org/tools/domains/
The application of domain-driven design in NMS
NASA Astrophysics Data System (ADS)
Zhang, Jinsong; Chen, Yan; Qin, Shengjun
2011-12-01
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, so requirements information cannot be expressed explicitly. The approach also tends to lead developers toward process-oriented programming, leaving the code between modules and between layers disordered, so it is hard to meet system scalability requirements. This paper proposes a software hierarchy based on a rich domain model, following domain-driven design, named FHRDM; the Webwork + Spring + Hibernate (WSH) framework is then adopted. Domain-driven design aims to construct a domain model which not only meets the demands of the field in which the software operates but also meets the needs of software development. In this way, problems in Navigational Maritime System (NMS) development, such as large business volumes, difficult requirements elicitation, high development costs and long development cycles, can be resolved successfully.
Squeglia, Flavia; Bachert, Beth; Romano, Maria; Lukomski, Slawomir; Berisio, Rita
2013-09-01
Streptococcal collagen-like proteins (Scls) are widely expressed by the well recognized human pathogen Streptococcus pyogenes. These surface proteins contain a signature central collagen-like region and an amino-terminal globular domain, termed the variable domain, which is protruded away from the cell surface by the collagen-like domain. Despite their recognized importance in bacterial pathogenicity, no structural information is presently available on proteins of the Scl class. The variable domain of Scl2 from invasive M3-type S. pyogenes has successfully been crystallized using vapour-diffusion methods. The crystals diffracted to 1.5 Å resolution and belonged to space group H32, with unit-cell parameters a = 44.23, b = 44.23, c = 227.83 Å. The crystal structure was solved by single-wavelength anomalous dispersion using anomalous signal from a europium chloride derivative.
Interface conditions for domain decomposition with radical grid refinement
NASA Technical Reports Server (NTRS)
Scroggs, Jeffrey S.
1991-01-01
Interface conditions for coupling the domains in a physically motivated domain decomposition method are discussed. The domain decomposition is based on an asymptotic-induced method for the numerical solution of hyperbolic conservation laws with small viscosity. The method consists of multiple stages. The first stage is to obtain a first approximation using a first-order method, such as the Godunov scheme. Subsequent stages of the method involve solving internal-layer problems via a domain decomposition. The method is derived and justified via singular perturbation techniques.
Ultrasonic isolation of the outer membrane of Escherichia coli with autodisplayed Z-domains.
Bong, Ji-Hong; Yoo, Gu; Park, Min; Kang, Min-Jung; Jose, Joachim; Pyun, Jae-Chul
2014-11-01
The outer membrane of Escherichia coli was previously isolated as a liposome-like outer membrane particle using enzymatic treatment with lysozyme; for immunoassays, the particles were subsequently layered on solid supports via hydrophobic interactions. This work presents an enzyme-free isolation method for the E. coli outer membrane with autodisplayed Z-domains using ultrasonication. First, the properties of the outer membrane particle, such as the particle size, zeta potential, and total protein, were compared with the properties of particles obtained using the previous preparation methods. Compared with the conventional isolation method using an enzyme treatment, the ultrasonic method exhibited a higher efficiency at isolating the outer membrane and less contamination by cytosolic proteins. The isolated outer membrane particles were layered on a gold surface, and the roughness and thickness of the resulting outer membrane layers were subsequently analyzed by AFM. Finally, the antibody-binding activity of two outer membrane layers with autodisplayed Z-domains, created from particles isolated using the enzymatic and ultrasonic methods, was measured using a fluorescein-labeled antibody as a model analyte; the activity of the outer membrane layer isolated with the ultrasonic method was estimated to be more than 20% higher than that obtained with the conventional enzymatic method. Copyright © 2014 Elsevier Inc. All rights reserved.
Large-Signal Lyapunov-Based Stability Analysis of DC/AC Inverters and Inverter-Based Microgrids
NASA Astrophysics Data System (ADS)
Kabalan, Mahmoud
Microgrid stability studies have been largely based on small-signal linearization techniques. However, the validity and magnitude of the linearization domain is limited to small perturbations. Thus, there is a need to examine microgrids with large-signal nonlinear techniques to fully understand and examine their stability. Large-signal stability analysis can be accomplished by Lyapunov-based mathematical methods. These Lyapunov methods estimate the domain of asymptotic stability of the studied system. A survey of Lyapunov-based large-signal stability studies showed that few large-signal studies have been completed on either individual systems (dc/ac inverters, dc/dc rectifiers, etc.) or microgrids. The research presented in this thesis addresses the large-signal stability of droop-controlled dc/ac inverters and inverter-based microgrids. Dc/ac power electronic inverters allow microgrids to be technically feasible. Thus, as a prelude to examining the stability of microgrids, the research presented in Chapter 3 analyzes the stability of inverters. First, the 13 th order large-signal nonlinear model of a droop-controlled dc/ac inverter connected to an infinite bus is presented. The singular perturbation method is used to decompose the nonlinear model into 11th, 9th, 7th, 5th, 3rd and 1st order models. Each model ignores certain control or structural components of the full order model. The aim of the study is to understand the accuracy and validity of the reduced order models in replicating the performance of the full order nonlinear model. The performance of each model is studied in three different areas: time domain simulations, Lyapunov's indirect method and domain of attraction estimation. The work aims to present the best model to use in each of the three domains of study. Results show that certain reduced order models are capable of accurately reproducing the performance of the full order model while others can be used to gain insights into those three areas of study. This will enable future studies to save computational effort and produce the most accurate results according to the needs of the study being performed. Moreover, the effect of grid (line) impedance on the accuracy of droop control is explored using the 5th order model. Simulation results show that traditional droop control is valid up to R/X line impedance value of 2. Furthermore, the 3rd order nonlinear model improves the currently available inverter-infinite bus models by accounting for grid impedance, active power-frequency droop and reactive power-voltage droop. Results show the 3rd order model's ability to account for voltage and reactive power changes during a transient event. Finally, the large-signal Lyapunov-based stability analysis is completed for a 3 bus microgrid system (made up of 2 inverters and 1 linear load). The thesis provides a systematic state space large-signal nonlinear mathematical modeling method of inverter-based microgrids. The inverters include the dc-side dynamics associated with dc sources. The mathematical model is then used to estimate the domain of asymptotic stability of the 3 bus microgrid. The three bus microgrid system was used as a case study to highlight the design and optimization capability of a large-signal-based approach. The study explores the effect of system component sizing, load transient and generation variations on the asymptotic stability of the microgrid. 
Essentially, this advancement gives microgrid designers and engineers the ability to manipulate the domain of asymptotic stability depending on performance requirements. Especially important, this research was able to couple the domain of asymptotic stability of the ac microgrid with that of the dc side voltage source. Time domain simulations were used to demonstrate the mathematical nonlinear analysis results.
Park, Hyunseok; Magee, Christopher L
2017-01-01
The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches to main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations. They have a high potential to miss some dominant patents from the identified main paths; in addition, the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from the high-persistence patents, which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and the solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of dominantly important patents.
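The search step can be sketched on a toy citation network as follows; the persistence scores are assumed to be given (in the paper they would come from the genetic knowledge persistence algorithm), and taking all ancestors and descendants is a simplification of the backward/forward path search.

```python
# Toy sketch: collect candidate main-path patents by walking backward (cited
# ancestors) and forward (citing descendants) from high-persistence patents.
# Persistence scores and the citation network are invented for illustration.
import networkx as nx

# Edge u -> v means patent v cites patent u, i.e. knowledge flows from u to v.
G = nx.DiGraph([("P1", "P3"), ("P2", "P3"), ("P3", "P4"), ("P3", "P5"), ("P5", "P6")])
persistence = {"P1": 0.1, "P2": 0.2, "P3": 0.9, "P4": 0.3, "P5": 0.8, "P6": 0.4}

high_persistence = [p for p, score in persistence.items() if score >= 0.7]

main_path_nodes = set()
for p in high_persistence:
    main_path_nodes |= {p} | nx.ancestors(G, p) | nx.descendants(G, p)

print(sorted(main_path_nodes))
```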
Modeling Heterogeneity in Students Seeking College Counseling
ERIC Educational Resources Information Center
Nordberg, Samuel S.
2013-01-01
Objective: A series of four studies explored the heuristic value of a method of grouping students in counseling by the severity of symptoms across eight domains. Method: Participants were over 50,000 college students in counseling, assessed with the CCAPS-62 and -34 as part of routine clinical care. Latent Profile Analysis was used to group…
Leksa, N.C.; Chiu, P.-L.; Bou-Assaf, G.M.; Quan, C.; Liu, Z.; Goodman, A.B.; Chambers, M.G.; Tsutakawa, S.E.; Hammel, M.; Peters, R.T.; Walz, T.; Kulman, J.D.
2017-01-01
Background: Fusion of the human IgG1 Fc domain to the C-terminal C2 domain of B domain-deleted (BDD) factor VIII (FVIII) results in the rFVIIIFc fusion protein that has a 1.5-fold longer half-life in humans. Objective: To assess the structural properties of rFVIIIFc by comparing its constituent FVIII and Fc elements with their respective isolated components and evaluating their structural independence within rFVIIIFc. Methods: rFVIIIFc and its isolated FVIII and Fc components were compared by hydrogen-deuterium exchange mass spectrometry (HDX-MS). The structure of rFVIIIFc was also evaluated by X-ray crystallography, small-angle X-ray scattering (SAXS), and electron microscopy (EM). The degree of steric interference by the appended Fc domain was assessed by EM and surface plasmon resonance (SPR). Results: HDX-MS analysis of rFVIIIFc revealed that fusion caused no structural perturbations in FVIII or Fc. The rFVIIIFc crystal structure showed that the FVIII component is indistinguishable from published BDD FVIII structures. The Fc domain was not observed, indicating high mobility. SAXS analysis was consistent with an ensemble of rigid-body models in which the Fc domain exists in a largely extended orientation relative to FVIII. Binding of Fab fragments of anti-C2 domain antibodies to BDD FVIII was visualized by EM, and the affinities of the corresponding intact antibodies for BDD FVIII and rFVIIIFc were comparable by SPR analysis. Conclusions: The FVIII and Fc components of rFVIIIFc are structurally indistinguishable from their isolated constituents and exhibit a high degree of structural independence, consistent with the functional comparability of rFVIIIFc and unmodified FVIII. PMID:28397397
ERIC Educational Resources Information Center
Embrey, Karen K.
2012-01-01
Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…
USDA-ARS?s Scientific Manuscript database
Adaptive waveform interpretation with Gaussian filtering (AWIGF) and the second-order bounded mean oscillation operator Z²(u,t,r) are TDR analysis methods based on second-order differentiation. AWIGF was originally designed for relatively long probe (greater than 150 mm) TDR waveforms, while Z s...
Hemalatha, G. R.; Rao, D. Satyanarayana; Guruprasad, L.
2007-01-01
We have identified four repeats and ten domains that are novel in proteins encoded by the Bacillus anthracis str. Ames proteome using automated in silico methods. A “repeat” corresponds to a region comprising less than 55-amino-acid residues that occur more than once in the protein sequence and sometimes present in tandem. A “domain” corresponds to a conserved region with greater than 55-amino-acid residues and may be present as single or multiple copies in the protein sequence. These correspond to (1) 57-amino-acid-residue PxV domain, (2) 122-amino-acid-residue FxF domain, (3) 111-amino-acid-residue YEFF domain, (4) 109-amino-acid-residue IMxxH domain, (5) 103-amino-acid-residue VxxT domain, (6) 84-amino-acid-residue ExW domain, (7) 104-amino-acid-residue NTGFIG domain, (8) 36-amino-acid-residue NxGK repeat, (9) 95-amino-acid-residue VYV domain, (10) 75-amino-acid-residue KEWE domain, (11) 59-amino-acid-residue AFL domain, (12) 53-amino-acid-residue RIDVK repeat, (13) (a) 41-amino-acid-residue AGQF repeat and (b) 42-amino-acid-residue GSAL repeat. A repeat or domain type is characterized by specific conserved sequence motifs. We discuss the presence of these repeats and domains in proteins from other genomes and their probable secondary structure. PMID:17538688
NASA Astrophysics Data System (ADS)
Wu, Jiangning; Wang, Xiaohuan
The rapidly increasing number of mobile phone users and types of services leads to a great accumulation of complaint information. How to use this information to enhance the quality of customer services is a pressing issue at present. To handle this kind of problem, the paper presents an approach to constructing a domain knowledge map for navigating explicit and tacit knowledge in two ways: building a Topic Map-based explicit knowledge navigation model, which includes domain TM construction, a semantic topic expansion algorithm and VSM-based similarity calculation; and building a Social Network Analysis-based tacit knowledge navigation model, which includes a multi-relational expert navigation algorithm and criteria to evaluate the performance of expert networks. In doing so, both customer managers and operators in call centers can find the appropriate knowledge and experts quickly and accurately. The experimental results show that the above method is very powerful for knowledge navigation.
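For the VSM-based similarity step specifically, a rough sketch using TF-IDF vectors and cosine similarity is shown below; the toy complaint text and knowledge entries are invented, and the system's semantic topic expansion is not modeled.

```python
# Sketch of VSM-based similarity only: score a complaint against knowledge-base
# entries with TF-IDF vectors and cosine similarity (toy texts, not the corpus).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

knowledge_base = [
    "billing error on monthly invoice",
    "no network signal in rural area",
    "how to reset voicemail password",
]
complaint = ["customer complains about a wrong charge on the invoice"]

vectorizer = TfidfVectorizer()
kb_matrix = vectorizer.fit_transform(knowledge_base)
similarities = cosine_similarity(vectorizer.transform(complaint), kb_matrix)

print(similarities)                 # row of similarity scores against each entry
print(int(similarities.argmax()))   # index of the most relevant knowledge entry
```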
Glaucoma diagnosis by mapping macula with Fourier domain optical coherence tomography
NASA Astrophysics Data System (ADS)
Tan, Ou; Lu, Ake; Chopra, Vik; Varma, Rohit; Hiroshi, Ishikawa; Schuman, Joel; Huang, David
2008-03-01
A new image segmentation method was developed to detect macular retinal sub-layer boundaries on a newly developed Fourier-Domain Optical Coherence Tomography (FD-OCT) system with a macular grid scan pattern. The segmentation results were used to create a thickness map of the macular ganglion cell complex (GCC), which contains the ganglion cell dendrites, cell bodies and axons. An overall average and several pattern analysis parameters were defined on the GCC thickness map and compared for the diagnosis of glaucoma. Intraclass correlation (ICC) was used to compare the reproducibility of the parameters. The area under the receiver operating characteristic curve (AROC) was calculated to compare diagnostic power. The results were also compared to the output of clinical time-domain OCT (TD-OCT). We found that GCC-based parameters had good repeatability and diagnostic power comparable to circumpapillary nerve fiber layer (cpNFL) thickness. Parameters based on pattern analysis can increase the diagnostic power of GCC macular mapping.
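As a purely illustrative computation of the AROC figure of merit (simulated thickness values, not the study's cohort), one might do the following:

```python
# Illustrative only: AROC for a GCC thickness parameter separating simulated
# glaucoma eyes from normal eyes (all values synthetic).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
labels = np.r_[np.ones(40), np.zeros(60)]                          # 1 = glaucoma
gcc_average = np.r_[rng.normal(80, 8, 40), rng.normal(95, 7, 60)]  # thinner GCC in glaucoma (um)

# Lower thickness indicates disease, so negate the parameter before scoring.
print("AROC:", round(roc_auc_score(labels, -gcc_average), 3))
```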
Zhang, T; Yang, M; Xiao, X; Feng, Z; Li, C; Zhou, Z; Ren, Q; Li, X
2014-03-01
Many infectious diseases exhibit repetitive or regular behaviour over time. Time-domain approaches, such as the seasonal autoregressive integrated moving average model, are often utilized to examine the cyclical behaviour of such diseases. The limitations of time-domain approaches include over-differencing and over-fitting; furthermore, the use of these approaches is inappropriate when the assumption of linearity may not hold. In this study, we implemented a simple and efficient procedure based on the fast Fourier transformation (FFT) approach to evaluate the epidemic dynamics of scarlet fever incidence (2004-2010) in China. This method demonstrated good internal and external validity and overcame some shortcomings of time-domain approaches. The procedure also elucidated the cycling behaviour in terms of environmental factors. We concluded that, under appropriate circumstances of data structure, spectral analysis based on the FFT approach may be applicable for the study of oscillating diseases.
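The core of such an FFT-based analysis, locating the dominant epidemic cycle with a periodogram, can be sketched on synthetic monthly counts as follows (the series below is invented, not the Chinese surveillance data):

```python
# Sketch: find the dominant cycle in a monthly incidence series with an FFT
# periodogram (synthetic data with an annual cycle plus noise).
import numpy as np

months = np.arange(84)                                   # 84 months, e.g. 2004-2010
incidence = 100 + 30 * np.sin(2 * np.pi * months / 12) + np.random.normal(0, 5, months.size)

detrended = incidence - incidence.mean()
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(months.size, d=1.0)              # cycles per month

dominant = freqs[1:][np.argmax(power[1:])]               # skip the zero-frequency bin
print("dominant period (months):", round(1.0 / dominant, 1))   # ~12 for an annual cycle
```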
Design sensitivity analysis using EAL. Part 1: Conventional design parameters
NASA Technical Reports Server (NTRS)
Dopker, B.; Choi, Kyung K.; Lee, J.
1986-01-01
A numerical implementation of design sensitivity analysis of built-up structures is presented, using the versatility and convenience of an existing finite element structural analysis code and its database management system. The finite element code used in the implementation presented is the Engineering Analysis Language (EAL), which is based on a hybrid method of analysis. It was shown that design sensitivity computations can be carried out using the database management system of EAL, without writing a separate program or creating a separate database. Conventional (sizing) design parameters such as the cross-sectional area of beams or the thickness of plates and plane elastic solid components are considered. Compliance, displacement, and stress functionals are considered as performance criteria. The method presented is being extended to implement shape design sensitivity analysis using a domain method and a design component method.
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
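The time-domain moments and spectral description mentioned above can be sketched as follows; this is a generic, hedged example on a simulated vibration signal, not the report's actual SSME processing chain, and the sampling rate, tone frequency, and impact model are assumptions.

```python
import numpy as np
from scipy import stats, signal

fs = 10_000                                    # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)

# Simulated vibration: a shaft tone, broadband noise, and periodic impacts that
# stand in for an incipient defect (all parameters are assumptions).
x = np.sin(2 * np.pi * 120 * t) + 0.3 * rng.standard_normal(t.size)
x[::500] += 2.0

# Time-domain statistical moments commonly trended for incipient-failure detection.
print("RMS      :", np.sqrt(np.mean(x ** 2)))
print("Skewness :", stats.skew(x))
print("Kurtosis :", stats.kurtosis(x, fisher=False))

# Spectral description via an averaged periodogram.
f, pxx = signal.welch(x, fs=fs, nperseg=2048)
print("Strongest spectral line near %.1f Hz" % f[np.argmax(pxx)])
```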
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang Shaojie; Tang Xiangyang; School of Automation, Xi'an University of Posts and Telecommunications, Xi'an, Shaanxi 710121
2012-09-15
Purpose: The suppression of noise in x-ray computed tomography (CT) imaging is of clinical relevance for diagnostic image quality and the potential for radiation dose saving. Toward this purpose, statistical noise reduction methods in either the image or projection domain have been proposed, which employ a multiscale decomposition to enhance the performance of noise suppression while maintaining image sharpness. Recognizing the advantages of noise suppression in the projection domain, the authors propose a projection domain multiscale penalized weighted least squares (PWLS) method, in which the angular sampling rate is explicitly taken into consideration to account for the possible variation of interview sampling rate in advanced clinical or preclinical applications. Methods: The projection domain multiscale PWLS method is derived by converting an isotropic diffusion partial differential equation in the image domain into the projection domain, wherein a multiscale decomposition is carried out. With adoption of the Markov random field or soft thresholding objective function, the projection domain multiscale PWLS method deals with noise at each scale. To compensate for the degradation in image sharpness caused by the projection domain multiscale PWLS method, an edge enhancement is carried out following the noise reduction. The performance of the proposed method is experimentally evaluated and verified using projection data simulated by computer and acquired by a CT scanner. Results: The preliminary results show that the proposed projection domain multiscale PWLS method outperforms the projection domain single-scale PWLS method and the image domain multiscale anisotropic diffusion method in noise reduction. In addition, the proposed method can preserve image sharpness very well while the occurrence of 'salt-and-pepper' noise and mosaic artifacts is avoided. Conclusions: Since the interview sampling rate is taken into account in the projection domain multiscale decomposition, the proposed method is anticipated to be useful in advanced clinical and preclinical applications where the interview sampling rate varies.
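The full projection-domain multiscale PWLS scheme is beyond a short example, but the flavour of multiscale thresholding on projection data can be sketched with PyWavelets as below. This is a simplified stand-in, not the authors' method; the projection profile, noise level, wavelet choice, and universal-threshold rule are all assumptions.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(2)

# Assumed 1-D projection profile (one detector row) with additive noise.
clean = np.exp(-((np.arange(512) - 256) / 80.0) ** 2)
noisy = clean + rng.normal(0, 0.05, clean.size)

# Multiscale decomposition of the projection, then soft thresholding at each scale.
coeffs = pywt.wavedec(noisy, 'db4', level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(noisy.size))          # universal threshold (assumed rule)
den_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(den_coeffs, 'db4')[:noisy.size]

print("RMSE before: %.4f" % np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE after : %.4f" % np.sqrt(np.mean((denoised - clean) ** 2)))
```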
1992-12-01
and add new attributes as needed (11:129). 2.2.3.2 Feature Oriented Domain Analysis: In their Feature-Oriented Domain Analysis (FODA) study, the... dissertation, The University of Texas at Austin, Austin, Texas, 1990. 12. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study... Table-of-contents entries: 2.2.2 Requirements Languages; 2.2.3 Domain Analysis; 2.2.4 ...
NASA Astrophysics Data System (ADS)
Escalas, M.; Queralt, P.; Ledo, J.; Marcuello, A.
2012-04-01
The magnetotelluric (MT) method is a passive electromagnetic technique which is currently used to characterize sites for the geological storage of CO2. These sites are usually located near industrialized, urban or farming areas, where man-made electromagnetic (EM) signals contaminate the MT data. The identification and characterization of the artificial EM sources which generate the so-called "cultural noise" is an important challenge in obtaining the most reliable results with the MT method. The polarization attributes of an EM signal (tilt angle, ellipticity and phase difference between its orthogonal components) are related to the character of its source. In a previous work (Escalas et al. 2011), we proposed a method to distinguish natural signal from cultural noise in raw MT data. It is based on the polarization analysis of the MT time-series in the time-frequency domain, using a wavelet scheme. We developed an algorithm to implement the method and tested it with both synthetic and field data. In 2010, we carried out a controlled-source electromagnetic (CSEM) experiment in the Hontomín site (the Research Laboratory on Geological Storage of CO2 in Spain). MT time-series were contaminated at different frequencies with the signal emitted by a controlled artificial EM source: two electric dipoles (1 km long, arranged in North-South and East-West directions). The analysis with our algorithm of the electric field time-series acquired in this experiment was successful: the polarization attributes of both the natural and artificial signal were obtained in the time-frequency domain, highlighting their differences. In the present work, the magnetic field time-series acquired in the Hontomín experiment have been processed. This new analysis of the polarization attributes of the magnetic field data has provided additional information to detect the contribution of the artificial source in the measured data. Moreover, the joint analysis of the polarization attributes of the electric and magnetic fields has been crucial to fully characterize the properties and the location of the noise source. Escalas, M., Queralt, P., Ledo, J., Marcuello, A., 2011. Identification of cultural noise sources in magnetotelluric data: estimating polarization attributes in the time-frequency domain using wavelet analysis. Geophysical Research Abstracts Vol. 13, EGU2011-6085. EGU General Assembly 2011.
Domain decomposition methods for systems of conservation laws: Spectral collocation approximations
NASA Technical Reports Server (NTRS)
Quarteroni, Alfio
1989-01-01
Hyperbolic systems of conservation laws are considered which are discretized in space by spectral collocation methods and advanced in time by finite difference schemes. At any time-level a domain decomposition method based on an iteration-by-subdomain procedure is introduced, yielding at each step a sequence of independent subproblems (one for each subdomain) that can be solved simultaneously. The method is set for a general nonlinear problem in several space variables. The convergence analysis, however, is carried out only for a linear one-dimensional system with continuous solutions. A precise form of the error reduction factor at each iteration is derived. Although the method is applied here to the case of spectral collocation approximation only, the idea is fairly general and can be used in a different context as well. For instance, its application to space discretization by finite differences is straightforward.
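A much-simplified sketch of the iteration-by-subdomain idea (here with an implicit upwind finite-difference step for 1-D linear advection rather than the paper's spectral collocation of a system) is shown below; the grid, time step, and number of sweeps are assumptions.

```python
import numpy as np

# 1-D linear advection u_t + a u_x = 0, a > 0, implicit upwind in time -- a very
# simplified stand-in for the hyperbolic systems treated in the abstract.
a, nx, dt, nsteps = 1.0, 100, 0.01, 20
x = np.linspace(0.0, 1.0, nx + 1)
dx = x[1] - x[0]
u = np.exp(-200.0 * (x - 0.25) ** 2)          # initial pulse
mid = nx // 2                                  # interface between the two subdomains

def upwind_sweep(u_old, inflow):
    """One implicit upwind time step on a subdomain, given its upstream inflow value."""
    c = a * dt / dx
    u_new = np.empty_like(u_old)
    left = inflow
    for j in range(u_old.size):
        u_new[j] = (u_old[j] + c * left) / (1.0 + c)
        left = u_new[j]
    return u_new

for n in range(nsteps):
    # Iteration by subdomain: at a fixed time level, each subproblem only needs the
    # latest interface value, so the subdomain solves are independent per sweep.
    interface = u[mid]                         # start from the old interface value
    for _ in range(3):                         # a few sweeps suffice for a > 0
        u1 = upwind_sweep(u[1:mid + 1], u[0])         # inflow from the physical boundary
        u2 = upwind_sweep(u[mid + 1:], interface)     # inflow from the neighbouring subdomain
        interface = u1[-1]
    u = np.concatenate(([u[0]], u1, u2))

print("Pulse peak has advected to x = %.2f" % x[np.argmax(u)])
```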
Performance analysis of FET microwave devices by use of extended spectral-element time-domain method
NASA Astrophysics Data System (ADS)
Sheng, Yijun; Xu, Kan; Wang, Daoxiang; Chen, Rushan
2013-05-01
The extended spectral-element time-domain (SETD) method is employed to analyse field effect transistor (FET) microwave devices. In order to impose the contribution of the FET microwave devices on the electromagnetic simulation, the SETD method is extended by introducing a lumped current term into the vector Helmholtz equation. The change of current on each lumped component can be expressed through the change of voltage via the corresponding equivalent-circuit model. The electric fields around a lumped component are influenced by the change of voltage on that component, and vice versa, so a global EM-circuit coupling can be built directly. The fully explicit solving scheme is maintained in this extended SETD method, which saves CPU time. Three practical FET microwave devices are analysed in this article. The numerical results demonstrate the ability and accuracy of this method.
An improved local radial point interpolation method for transient heat conduction analysis
NASA Astrophysics Data System (ADS)
Wang, Feng; Lin, Gao; Zheng, Bao-Jing; Hu, Zhi-Qiang
2013-06-01
The smoothing thin plate spline (STPS) interpolation using the penalty function method according to optimization theory is presented to deal with transient heat conduction problems. The smoothness conditions of the shape functions and derivatives can be satisfied so that distortions hardly occur. Local weak forms are developed using the weighted residual method locally from the partial differential equations of transient heat conduction. Here the Heaviside step function is used as the test function in each sub-domain to avoid the need for a domain integral. Essential boundary conditions can be implemented as in the finite element method (FEM) because the shape functions possess the Kronecker delta property. The traditional two-point difference method is selected for the time discretization scheme. Three selected numerical examples are presented in this paper to demonstrate the applicability and accuracy of the present approach compared with the traditional thin plate spline (TPS) radial basis functions.
No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.
Li, Xuelong; Guo, Qun; Lu, Xiaoqiang
2016-05-13
It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust across different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
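A heavily reduced sketch of such a pipeline — per-block 3D-DCT coefficient statistics fed to a linear SVR — is given below. The synthetic "videos", the three features, and the use of blur strength as a pseudo quality score are assumptions; the paper's actual NVS feature set and training protocol are richer.

```python
import numpy as np
from scipy.fft import dctn
from scipy.ndimage import gaussian_filter
from scipy.stats import kurtosis
from sklearn.svm import SVR

rng = np.random.default_rng(3)

def video_features(video, size=8):
    """Reduced spatiotemporal statistics of 3D-DCT coefficients over 8x8x8 blocks."""
    t, h, w = (d - d % size for d in video.shape)
    feats = []
    for k in range(0, t, size):
        for i in range(0, h, size):
            for j in range(0, w, size):
                c = dctn(video[k:k + size, i:i + size, j:j + size], norm='ortho').ravel()[1:]
                feats.append([np.log1p(np.abs(c)).mean(), c.std(), kurtosis(c)])
    return np.mean(feats, axis=0)

# Assumed toy data: one base clip blurred by different amounts stands in for
# distorted videos, and the blur strength plays the role of a subjective score.
base = rng.random((16, 32, 32))
videos, scores = [], []
for sigma in np.linspace(0.0, 3.0, 12):
    videos.append(gaussian_filter(base, sigma))
    scores.append(-sigma)                      # lower score = stronger distortion

X = np.array([video_features(v) for v in videos])
model = SVR(kernel='linear').fit(X[:-2], scores[:-2])
print("Predicted vs. true scores for held-out clips:",
      model.predict(X[-2:]), scores[-2:])
```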
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Kutler, Paul (Technical Monitor)
1998-01-01
Several stabilized discretization procedures for conservation law equations on triangulated domains will be considered. Specifically, numerical schemes based on upwind finite volume, fluctuation splitting, Galerkin least-squares, and space discontinuous Galerkin discretization will be considered in detail. A standard energy analysis for several of these methods will be given via entropy symmetrization. Next, we will present some relatively new theoretical results concerning congruence relationships for left or right symmetrized equations. These results suggest new variants of existing FV, DG, GLS, and FS methods which are computationally more efficient while retaining the pleasant theoretical properties achieved by entropy symmetrization. In addition, the task of Jacobian linearization of these schemes for use in Newton's method is greatly simplified owing to the exploitation of exact symmetries which exist in the system. The FV, FS and DG schemes also permit discrete maximum principle analysis and enforcement, which greatly adds to the robustness of the methods. Discrete maximum principle theory will be presented for general finite volume approximations on unstructured meshes. Next, we consider embedding these nonlinear space discretizations into exact and inexact Newton solvers which are preconditioned using a nonoverlapping (Schur complement) domain decomposition technique. Elements of nonoverlapping domain decomposition for elliptic problems will be reviewed, followed by the present extension to hyperbolic and elliptic-hyperbolic problems. Other issues of practical relevance, such as the meshing of geometries, code implementation, turbulence modeling, and global convergence, will be addressed as needed.
NASA Technical Reports Server (NTRS)
Barth, Timothy; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
Several stabilized discretization procedures for conservation law equations on triangulated domains will be considered. Specifically, numerical schemes based on upwind finite volume, fluctuation splitting, Galerkin least-squares, and space discontinuous Galerkin discretization will be considered in detail. A standard energy analysis for several of these methods will be given via entropy symmetrization. Next, we will present some relatively new theoretical results concerning congruence relationships for left or right symmetrized equations. These results suggest new variants of existing FV, DG, GLS and FS methods which are computationally more efficient while retaining the pleasant theoretical properties achieved by entropy symmetrization. In addition, the task of Jacobian linearization of these schemes for use in Newton's method is greatly simplified owing to the exploitation of exact symmetries which exist in the system. These variants have been implemented in the "ELF" library, for which example calculations will be shown. The FV, FS and DG schemes also permit discrete maximum principle analysis and enforcement, which greatly adds to the robustness of the methods. Some prevalent limiting strategies will be reviewed. Next, we consider embedding these nonlinear space discretizations into exact and inexact Newton solvers which are preconditioned using a nonoverlapping (Schur complement) domain decomposition technique. Elements of nonoverlapping domain decomposition for elliptic problems will be reviewed, followed by the present extension to hyperbolic and elliptic-hyperbolic problems. Other issues of practical relevance, such as the meshing of geometries, code implementation, turbulence modeling, and global convergence, will be addressed as needed.
NASA Astrophysics Data System (ADS)
Nguyen-Thanh, Nhon; Li, Weidong; Zhou, Kun
2018-03-01
This paper develops a coupling approach which integrates the meshfree method and isogeometric analysis (IGA) for static and free-vibration analyses of cracks in thin-shell structures. In this approach, the domain surrounding the cracks is represented by the meshfree method while the rest of the domain is meshed by IGA. The present approach is capable of preserving the geometry exactness and high continuity of IGA. Local refinement is achieved by adding nodes along the background cells in the meshfree domain. Moreover, the equivalent domain integral technique for three-dimensional problems is derived from the Kirchhoff-Love theory to compute the J-integral for the thin-shell model. The proposed approach is able to address problems involving through-the-thickness cracks without using additional rotational degrees of freedom, which facilitates the enrichment strategy for crack tips. The crack tip enrichment effects and the stress distribution and displacements around the crack tips are investigated. Free vibrations of cracks in thin shells are also analyzed. Numerical examples are presented to demonstrate the accuracy and computational efficiency of the coupling approach.
Modal element method for scattering of sound by absorbing bodies
NASA Technical Reports Server (NTRS)
Baumeister, Kenneth J.; Kreider, Kevin L.
1992-01-01
The modal element method for acoustic scattering from a 2-D body is presented. The body may be acoustically soft (absorbing) or hard (reflecting). The infinite computational region is divided into two subdomains - the bounded finite element domain, which is characterized by complicated geometry and/or variable material properties, and the surrounding unbounded homogeneous domain. The acoustic pressure field is represented approximately in the finite element domain by a finite element solution, and is represented analytically by an eigenfunction expansion in the homogeneous domain. The two solutions are coupled by the continuity of pressure and velocity across the interface between the two subdomains. Also, for hard bodies, a compact modal ring grid system is introduced, for which computing requirements are drastically reduced. Analysis of 2-D scattering from solid and coated (acoustically treated) bodies is presented, and several simple numerical examples are discussed. In addition, criteria are presented for determining the number of modes needed to accurately resolve the scattered pressure field from a solid cylinder as a function of the frequency of the incoming wave and the radius of the cylinder.
Recent Development of Anticancer Therapeutics Targeting Akt
Morrow, John K.; Du-Cuny, Lei; Chen, Lu; Meuillet, Emmanuelle J.; Mash, Eugene A.; Powis, Garth; Zhang, Shuxing
2013-01-01
The serine/threonine kinase Akt has proven to be a significant signaling target, involved in various biological functions. Because of its cardinal role in numerous cellular responses, Akt has been implicated in many human diseases, particularly cancer. It has been established that Akt is a viable and feasible target for anticancer therapeutics. Analysis of all Akt kinases reveals conserved homology for an N-terminal regulatory domain, which contains a pleckstrin-homology (PH) domain for cellular translocation, a kinase domain with serine/threonine specificity, and a C-terminal extension domain. These well defined regions have been targeted, and various approaches, including in silico methods, have been implemented to develop Akt inhibitors. In spite of unique techniques and a prolific body of knowledge surrounding Akt, no targeted Akt therapeutics have reached the market yet. Here we will highlight successes and challenges to date on the development of anticancer agents modulating the Akt pathway in recent patents as well as discuss the methods employed for this task. Special attention will be given to patents with focus on those discoveries using computer-aided drug design approaches. PMID:21110830
Analysis of Nanodomain Composition in High-Impact Polypropylene by Atomic Force Microscopy-Infrared.
Tang, Fuguang; Bao, Peite; Su, Zhaohui
2016-05-03
In this paper, compositions of nanodomains in a commercial high-impact polypropylene (HIPP) were investigated by an atomic force microscopy-infrared (AFM-IR) technique. An AFM-IR quantitative analysis method was established for the first time, which was then employed to analyze the polyethylene content in the nanoscopic domains of the rubber particles dispersed in the polypropylene matrix. It was found that the polyethylene content in the matrix was close to zero and was high in the rubbery intermediate layers, both as expected. However, the major component of the rigid cores of the rubber particles was found to be polypropylene rather than polyethylene, contrary to what was previously believed. The finding provides new insight into the complicated structure of HIPPs, and the AFM-IR quantitative method reported here offers a useful tool for assessing compositions of nanoscopic domains in complex polymeric systems.
NASA Astrophysics Data System (ADS)
Fernández Pantoja, M.; Yarovoy, A. G.; Rubio Bretones, A.; González García, S.
2009-12-01
This paper presents a procedure to extend the method of moments in the time domain for the transient analysis of thin-wire antennas to include cases where the antennas are located over a lossy half-space. This extended technique is based on the reflection coefficient (RC) approach, which approximates the fields incident on the ground interface as plane waves and calculates the time domain RC using the inverse Fourier transform of the Fresnel equations. The implementation presented in this paper uses general expressions for the RC which extend its range of applicability to lossy grounds, and is shown to be accurate and fast for antennas located not too near the ground. The resulting general-purpose procedure, able to treat arbitrarily oriented thin-wire antennas, is appropriate for all kinds of half-spaces, including lossy cases, and it has turned out to be as computationally fast solving the problem of an arbitrary ground as dealing with a perfect electric conductor ground plane. Results show a numerical validation of the method for different half-spaces, paying special attention to the influence of the antenna-to-ground distance on the accuracy of the results.
Discovering body site and severity modifiers in clinical texts.
Dligach, Dmitriy; Bethard, Steven; Becker, Lee; Miller, Timothy; Savova, Guergana K
2014-01-01
To research computational methods for discovering body site and severity modifiers in clinical texts. We cast the task of discovering body site and severity modifiers as a relation extraction problem in the context of a supervised machine learning framework. We utilize rich linguistic features to represent the pairs of relation arguments and delegate the decision about the nature of the relationship between them to a support vector machine model. We evaluate our models using two corpora that annotate body site and severity modifiers. We also compare the model performance to a number of rule-based baselines. We conduct cross-domain portability experiments. In addition, we carry out feature ablation experiments to determine the contribution of various feature groups. Finally, we perform error analysis and report the sources of errors. The performance of our method for discovering body site modifiers achieves F1 of 0.740-0.908 and our method for discovering severity modifiers achieves F1 of 0.905-0.929. Results indicate that both methods perform well on both in-domain and out-of-domain data, approaching the performance of human annotators. The most salient features are token and named entity features, although syntactic dependency features also contribute to the overall performance. The dominant sources of errors are infrequent patterns in the data and the inability of the system to discern deeper semantic structures. We investigated computational methods for discovering body site and severity modifiers in clinical texts. Our best system is released open source as part of the clinical Text Analysis and Knowledge Extraction System (cTAKES).
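A toy sketch of this kind of supervised relation-extraction setup (candidate argument pairs represented by token and entity-type features, classified by a linear SVM) is shown below; the feature dictionary and example annotations are invented for illustration and are not the cTAKES feature set.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.svm import LinearSVC

# Assumed toy examples: each candidate pair of annotations is reduced to a small
# dictionary of token / named-entity features; labels mark true body-site relations.
pairs = [
    {"arg1_text": "pain", "arg2_text": "knee", "arg2_type": "AnatomicalSite", "tokens_between": 1},
    {"arg1_text": "edema", "arg2_text": "ankle", "arg2_type": "AnatomicalSite", "tokens_between": 2},
    {"arg1_text": "pain", "arg2_text": "severe", "arg2_type": "Modifier", "tokens_between": 1},
    {"arg1_text": "lesion", "arg2_text": "aspirin", "arg2_type": "Drug", "tokens_between": 5},
]
labels = [1, 1, 0, 0]   # 1 = located-at relation holds

vec = DictVectorizer()
X = vec.fit_transform(pairs)     # one-hot encodes string features, keeps numeric ones
clf = LinearSVC().fit(X, labels)

test = {"arg1_text": "swelling", "arg2_text": "wrist", "arg2_type": "AnatomicalSite", "tokens_between": 2}
print("Predicted relation label:", clf.predict(vec.transform([test]))[0])
```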
A framework for joint image-and-shape analysis
NASA Astrophysics Data System (ADS)
Gao, Yi; Tannenbaum, Allen; Bouix, Sylvain
2014-03-01
Techniques in medical image analysis are often used for comparison or regression on the intensities of images. In general, the domain of the image is a given Cartesian grid. Shape analysis, on the other hand, studies the similarities and differences among spatial objects of arbitrary geometry and topology. Usually, there is no function defined on the domain of shapes. Recently, there has been a growing need for defining and analyzing functions defined on the shape space, and for a coupled analysis of both the shapes and the functions defined on them. Following this direction, in this work we present a coupled analysis for both images and shapes. As a result, statistically significant discrepancies in both the image intensities and the underlying shapes are detected. The method is applied to both brain images for schizophrenia and heart images for atrial fibrillation patients.
New methods for engineering site characterization using reflection and surface wave seismic survey
NASA Astrophysics Data System (ADS)
Chaiprakaikeow, Susit
This study presents two new seismic testing methods for engineering application: a new shallow seismic reflection method and Time Filtered Analysis of Surface Waves (TFASW). Both methods are described in this dissertation. The new shallow seismic reflection method was developed to measure reflection at a single point using two to four receivers, assuming homogeneous, horizontal layering. It uses one or more shakers driven by a swept sine function as a source, and the cross-correlation technique to identify wave arrivals. The phase difference between the source forcing function and the ground motion due to the dynamic response of the shaker-ground interface was corrected by using a reference geophone. Attenuated high-frequency energy was also recovered using whitening in the frequency domain. The new shallow seismic reflection testing was performed at the crest of Porcupine Dam in Paradise, Utah. The testing used two horizontal Vibroseis sources and four receivers for spacings between 6 and 300 ft. Unfortunately, the results showed no clear evidence of the reflectors despite correction of the magnitude and phase of the signals. However, an improvement in the shape of the cross-correlations was noticed after the corrections. The results showed distinct primary lobes in the corrected cross-correlated signals up to 150 ft offset. More consistent maximum peaks were observed in the corrected waveforms. TFASW is a new surface (Rayleigh) wave method to determine the shear wave velocity profile at a site. It is a time domain method, as opposed to the Spectral Analysis of Surface Waves (SASW) method, which is a frequency domain method. This method uses digital filtering to optimize the bandwidth used to determine the dispersion curve. Results from tests at three different sites in Utah indicated good agreement between the dispersion curves measured using the TFASW and SASW methods. The advantage of the TFASW method is that the dispersion curves had less scatter at long wavelengths as a result of the wider bandwidth used in those tests.
Multiresolution persistent homology for excessively large biomolecular datasets
NASA Astrophysics Data System (ADS)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei
2015-10-01
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable when using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification which, to our knowledge, is the first time that persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei, E-mail: wei@math.msu.edu
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable when using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification which, to our knowledge, is the first time that persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
NASA Astrophysics Data System (ADS)
Xuan, Albert L.; Shinghal, Rajjan
1989-03-01
As the need for knowledge-based systems increases, an increasing number of domain experts are becoming interested in taking a more active part in the building of knowledge-based systems. However, such a domain expert often must deal with a large number of unfamiliar terms, concepts, facts, procedures and principles based on different approaches and schools of thought. He (for brevity, we shall use masculine pronouns for both genders) may need the help of a knowledge engineer (KE) in building the knowledge-based system but may encounter a number of problems. For instance, much of the early interaction between him and the knowledge engineer may be spent in educating each other about their separate kinds of expertise. Since the knowledge engineer will usually be ignorant of the knowledge domain while the domain expert (DE) will have little knowledge about knowledge-based systems, a great deal of time will be wasted on these issues as the DE and the KE train each other to the point where a fruitful interaction can occur. In some situations, it may not even be possible for the DE to find a suitable KE to work with because he has no time to train the latter in his domain. This will engender the need for the DE to be more knowledgeable about knowledge-based systems and for KEs to find methods and techniques which will allow them to learn new domains as fast as they can. In any event, it is likely that the process of building knowledge-based systems will be smoother and more efficient if the domain expert is knowledgeable about the methods and techniques of knowledge-based systems building.
Beausoleil, N J; Mellor, D J
2015-01-01
Many pest control activities have the potential to impact negatively on the welfare of animals, and animal welfare is an important consideration in the development, implementation and evaluation of ethically defensible vertebrate pest control. Thus, reliable and accurate methods for assessing welfare impacts are required. The Five Domains model provides a systematic method for identifying potential or actual welfare impacts associated with an event or situation in four physical or functional domains (nutrition, environment, health or functional status, behaviour) and one mental domain (overall mental or affective state). Here we evaluate the advantages and limitations of the Five Domains model for this purpose and illustrate them using specific examples from a recent assessment of the welfare impacts of poisons used to lethally control possums in New Zealand. The model has a number of advantages which include the following: the systematic identification of a wide range of impacts associated with a variety of control tools; the production of relative rankings of tools in terms of their welfare impacts; the easy incorporation of new information into assessments; and the highlighting of additional information needed. For example, a recent analysis of sodium fluoroacetate (1080) poisoning in possums revealed the need for more information on the period from the onset of clinical signs to the point at which consciousness is lost, as well as on the level of consciousness during or after the occurrence of muscle spasms and seizures. The model is also valuable because it clearly separates physical or functional and affective impacts, encourages more comprehensive consideration of negative affective experiences than has occurred in the past, and allows development and evaluation of targeted mitigation strategies. Caution must be used in interpreting and applying the outputs of the model, most importantly because relative rankings or grades are fundamentally qualitative in nature. Certain domains are more useful for evaluating impacts associated with slower/longer-acting tools than for faster-acting methods, and it may be easier to identify impacts in some domains than others. Overall, we conclude that the Five Domains model advances evaluation of the animal welfare impacts of vertebrate pest control methods, provided users are cognisant of its limitations.
NASA Technical Reports Server (NTRS)
Vanel, Florence O.; Baysal, Oktay
1995-01-01
Important characteristics of aeroacoustic wave propagation are mostly encoded in their dispersion relations. Hence, a computational aeroacoustic (CAA) algorithm which reasonably preserves these relations was investigated. It was derived using an optimization procedure to ensure that the numerical derivatives preserved the wave number and angular frequency of the differential terms in the linearized, 2-D Euler equations. Then, simulations were performed to validate the scheme and a compatible set of discretized boundary conditions. The computational results were found to agree favorably with the exact solutions. The boundary conditions were transparent to the outgoing waves, except when the disturbance source was close to a boundary. The time-domain data generated by such CAA solutions were often intractable until their spectra were analyzed. Therefore, the relative merits of three different spectral analysis methods were included in the study. For simple, periodic waves, the periodogram method produced better estimates of the steep-sloped spectra than the Blackman-Tukey method. Also, for this problem, the Hanning window was more effective when used with the weighted-overlapped-segment-averaging method, and the Blackman-Tukey method gave better results than the periodogram method. Finally, it was demonstrated that the representation of time-domain data was significantly dependent on the particular spectral analysis method employed.
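The spectral-estimation comparison described above can be illustrated generically as follows: a raw periodogram versus Hann-windowed, weighted-overlapped-segment averaging (Welch's method) on a simple two-tone signal; the signal, sampling rate, and segment length are assumptions.

```python
import numpy as np
from scipy import signal

fs = 1_000
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(9)

# Simple periodic pressure-like signal: two tones plus broadband noise (assumed).
x = (np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 120 * t)
     + 0.5 * rng.standard_normal(t.size))

# Raw periodogram (rectangular window) versus weighted-overlapped-segment
# averaging with a Hanning ('hann') window, i.e. Welch's method.
f_p, p_per = signal.periodogram(x, fs=fs)
f_w, p_welch = signal.welch(x, fs=fs, window='hann', nperseg=1024, noverlap=512)

for name, f, p in [("periodogram", f_p, p_per), ("Welch / Hann", f_w, p_welch)]:
    print(f"{name:12s}: strongest line at {f[np.argmax(p)]:.1f} Hz")
```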
Numerical analysis of laser ablation using the axisymmetric two-temperature model
NASA Astrophysics Data System (ADS)
Dziatkiewicz, Jolanta; Majchrzak, Ewa
2018-01-01
Laser ablation of the axisymmetric micro-domain is analyzed. To describe the thermal processes occurring in the micro-domain the two-temperature hyperbolic model supplemented by the boundary and initial conditions is used. This model takes into account the phase changes of material (solid-liquid and liquid-vapour) and the ablation process. At the stage of numerical computations the finite difference method with staggered grid is used. In the final part the results of computations are shown.
New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Lung, Shun-Fat
2017-01-01
A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. Benefits of accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
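A hedged sketch of the general idea — estimate cepstral coefficients for replicated epochs and feed them to a standard linear discriminant — is given below. It does not implement the article's stochastic-cepstra model; the simulated groups, epoch length, and number of coefficients are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)

def cepstral_coeffs(x, n_coef=12):
    """First cepstral coefficients: inverse FFT of the log periodogram."""
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2 / x.size
    return np.fft.irfft(np.log(spec + 1e-12))[:n_coef]

def make_epoch(freq, n=256):
    """A replicate epoch whose dominant frequency varies within its group."""
    t = np.arange(n)
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)

# Two groups with within-group variability in the dominant frequency (assumed data).
X = np.array([cepstral_coeffs(make_epoch(rng.normal(0.05, 0.005))) for _ in range(30)]
             + [cepstral_coeffs(make_epoch(rng.normal(0.12, 0.005))) for _ in range(30)])
y = np.array([0] * 30 + [1] * 30)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("Training accuracy of the cepstral discriminant:", lda.score(X, y))
```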
Pires, Jennifer M; Ferreira, Ana M; Rocha, Filipa; Andrade, Luis G; Campos, Inês; Margalho, Paulo; Laíns, Jorge
2018-05-09
Bowel function is frequently compromised after spinal cord injury (SCI). Despite its crucial importance in patients' lives, there is still scarce literature on the deleterious impact of Neurogenic Bowel Dysfunction (NBD) on SCI patients' lives, and only a few studies correlate NBD severity with quality of life (QoL). To our knowledge there are no studies assessing the impact of NBD in the context of ICF domains. To assess NBD after SCI using ICF domains and to assess its impact on QoL. Retrospective data analysis and cross-sectional phone survey. Outpatient spinal cord injury setting. Portuguese adult spinal cord injury patients. Retrospective analysis of demographic data, lesion characteristics and bowel management methods at last inpatient discharge. Cross-sectional phone survey assessing current bowel management methods, the Neurogenic Bowel Dysfunction Score and a Likert scale questionnaire about the impact on ICF domains and QoL. 64 patients answered the questionnaire. The majority were male (65.6%), mean age 56.6±15.6 years, with an AIS A lesion (39.1%) of traumatic cause (71.9%). The main bowel management methods were contact laxatives, suppositories and osmotic laxatives. 50.1% of patients scored moderate or severe NBD. Considering ICF domains, the greatest impact was on personal and environmental factors, with 39.1% reporting impact on financial costs, 45.3% on need of assistance, 45.3% on emotional health and 46.9% on loss of privacy. There was a significant association between severity of NBD and negative impact on QoL (p<0.05). The study confirms the major impact of NBD on personal and environmental factors of the ICF and on the quality of life of the SCI population. These findings confirm that it is relevant to identify the main ICF domains affected by NBD after SCI in order to address targeted interventions, working toward changes in health policies and psychosocial aspects.
Frequency domain analysis of errors in cross-correlations of ambient seismic noise
NASA Astrophysics Data System (ADS)
Liu, Xin; Ben-Zion, Yehuda; Zigone, Dimitri
2016-12-01
We analyse random errors (variances) in cross-correlations of ambient seismic noise in the frequency domain, which differ from previous time domain methods. Extending previous theoretical results on ensemble averaged cross-spectrum, we estimate confidence interval of stacked cross-spectrum of finite amount of data at each frequency using non-overlapping windows with fixed length. The extended theory also connects amplitude and phase variances with the variance of each complex spectrum value. Analysis of synthetic stationary ambient noise is used to estimate the confidence interval of stacked cross-spectrum obtained with different length of noise data corresponding to different number of evenly spaced windows of the same duration. This method allows estimating Signal/Noise Ratio (SNR) of noise cross-correlation in the frequency domain, without specifying filter bandwidth or signal/noise windows that are needed for time domain SNR estimations. Based on synthetic ambient noise data, we also compare the probability distributions, causal part amplitude and SNR of stacked cross-spectrum function using one-bit normalization or pre-whitening with those obtained without these pre-processing steps. Natural continuous noise records contain both ambient noise and small earthquakes that are inseparable from the noise with the existing pre-processing steps. Using probability distributions of random cross-spectrum values based on the theoretical results provides an effective way to exclude such small earthquakes, and additional data segments (outliers) contaminated by signals of different statistics (e.g. rain, cultural noise), from continuous noise waveforms. This technique is applied to constrain values and uncertainties of amplitude and phase velocity of stacked noise cross-spectrum at different frequencies, using data from southern California at both regional scale (˜35 km) and dense linear array (˜20 m) across the plate-boundary faults. A block bootstrap resampling method is used to account for temporal correlation of noise cross-spectrum at low frequencies (0.05-0.2 Hz) near the ocean microseismic peaks.
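A minimal frequency-domain sketch of stacking cross-spectra over non-overlapping windows, with a crude per-frequency scatter used as an SNR proxy, is shown below; the synthetic two-station noise, window length, and SNR definition are assumptions and do not reproduce the paper's theoretical confidence intervals.

```python
import numpy as np

rng = np.random.default_rng(5)
fs, win, n_win = 10.0, 512, 40                 # Hz, samples per window, window count

# Synthetic "ambient noise" at two stations: a common wavefield delayed by 8 samples
# at station B plus incoherent noise at each station (all values are assumptions).
common = rng.standard_normal(win * n_win + 8)
a = common[:-8] + 0.5 * rng.standard_normal(win * n_win)
b = common[8:] + 0.5 * rng.standard_normal(win * n_win)

freqs = np.fft.rfftfreq(win, d=1 / fs)
cross = []
for k in range(n_win):                         # non-overlapping windows of fixed length
    fa = np.fft.rfft(a[k * win:(k + 1) * win])
    fb = np.fft.rfft(b[k * win:(k + 1) * win])
    cross.append(fa * np.conj(fb))
cross = np.array(cross)

stacked = cross.mean(axis=0)                   # stacked cross-spectrum
scatter = cross.std(axis=0) / np.sqrt(n_win)   # crude per-frequency uncertainty
snr = np.abs(stacked) / (scatter + 1e-20)

band = (freqs > 0.5) & (freqs < 2.0)
print("Median cross-spectral SNR in the 0.5-2 Hz band: %.1f" % np.median(snr[band]))
```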
Time Domain Diffraction by Composite Structures
NASA Astrophysics Data System (ADS)
Riccio, Giovanni; Frongillo, Marcello
2017-04-01
Time domain (TD) diffraction problems are receiving great attention because of the widespread use of ultra wide band (UWB) communication and radar systems. It is commonly accepted that, due to the large bandwidth of UWB signals, the analysis of wave propagation mechanisms in the TD framework is preferable to frequency domain (FD) data processing. Furthermore, the analysis of transient scattering phenomena is also of importance for predicting the effects of electromagnetic pulses on civil structures. Diffraction in the TD framework represents a challenging problem, and numerical discretization techniques can be used to support research and industry activities. Unfortunately, these methods become rapidly intractable when considering excitation pulses with high frequency content. This contribution deals with the TD diffraction phenomenon related to composite structures containing a dielectric wedge with arbitrary apex angle when illuminated by a plane wave. The approach is the same as that used in [1]-[3]. The transient diffracted field originated by an arbitrary-function plane wave is evaluated via a convolution integral involving the TD diffraction coefficients, which are determined in closed form starting from the knowledge of the corresponding FD counterparts. In particular, the inverse Laplace transform is applied to the FD Uniform Asymptotic Physical Optics (FD-UAPO) diffraction coefficients available for the internal region of the structure and the surrounding space. For each observation domain, the FD-UAPO expressions are obtained by considering electric and magnetic equivalent PO surface currents located on the interfaces. The surface radiation integral using these sources is assumed as the starting point and manipulated to obtain integrals that can be solved by means of the Steepest Descent Method and the Multiplicative Method. [1] G. Gennarelli and G. Riccio, "Time domain diffraction by a right-angled penetrable wedge," IEEE Trans. Antennas Propag., Vol. 60, 2829-2833, 2012. [2] G. Gennarelli and G. Riccio, "Obtuse-angled penetrable wedges: a time domain solution for the diffraction coefficients," J. Electromagn. Waves Appl., Vol. 27, 2020-2028, 2013. [3] M. Frongillo, G. Gennarelli and G. Riccio, "TD-UAPO diffracted field evaluation for penetrable wedges with acute apex angle," J. Opt. Soc. Am. A, Vol. 32, 1271-1275, 2015.
On the interpretation of domain averaged Fermi hole analyses of correlated wavefunctions.
Francisco, E; Martín Pendás, A; Costales, Aurora
2014-03-14
Few methods allow for a physically sound analysis of chemical bonds in cases where electron correlation may be a relevant factor. The domain averaged Fermi hole (DAFH) analysis, a tool first proposed by Robert Ponec in the 1990s to provide interpretations of the chemical bonding existing between two fragments Ω and Ω' that divide the real space exhaustively, is one of them. This method allows for a partition of the delocalization index or bond order between Ω and Ω' into one-electron contributions, but the chemical interpretation of its parameters has been firmly established only for single determinant wavefunctions. In this paper we report a general interpretation based on the concept of excluded density that is also valid for correlated descriptions. Both analytical models and actual computations on a set of simple molecules (H2, N2, LiH, and CO) are discussed, and a classification of the possible DAFH situations is presented. Our results show that this kind of analysis may reveal several correlation-assisted bonding patterns that might be difficult to detect using other methods. In agreement with previous knowledge, we find that the effective bond order in covalent links decreases due to localization of electrons driven by Coulomb correlation.
Detailed Vibration Analysis of Pinion Gear with Time-Frequency Methods
NASA Technical Reports Server (NTRS)
Mosher, Marianne; Pryor, Anna H.; Lewicki, David G.
2003-01-01
In this paper, the authors show a detailed analysis of the vibration signal from the destructive testing of a spiral bevel gear and pinion pair containing seeded faults. The vibration signal is analyzed in the time domain, the frequency domain and with four time-frequency transforms: the Short Time Frequency Transform (STFT), the Wigner-Ville Distribution with the Choi-Williams kernel (WV-CW), the Continuous Wavelet Transform (CWT) and the Discrete Wavelet Transform (DWT). Vibration data of bevel gear tooth fatigue cracks, under a variety of operating load levels and damage conditions, are analyzed using these methods. A new metric for automatic anomaly detection is developed and can be produced from any systematic numerical representation of the vibration signals. This new metric reveals indications of gear damage with all of the time-frequency transforms, as well as the time and frequency representations, on this data set. Analysis with the CWT detects changes in the signal at low torque levels not found with the other transforms. The WV-CW and CWT use considerably more resources than the STFT and the DWT. More testing of the new metric is needed to determine its value for automatic anomaly detection and to develop fault detection methods based on the metric.
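The STFT-based portion of such an analysis can be sketched as follows on a simulated gear-mesh signal with a fault-like modulation appearing halfway through the record; the signal model and the simple baseline-deviation indicator are assumptions, not the authors' metric.

```python
import numpy as np
from scipy import signal

fs = 20_000
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(6)

# Simulated gear-mesh tone with a fault-like 30 Hz modulation appearing at t = 1 s.
mesh = np.sin(2 * np.pi * 1_000 * t)
fault = (t > 1.0) * 0.4 * np.sin(2 * np.pi * 30 * t) * mesh
x = mesh + fault + 0.1 * rng.standard_normal(t.size)

f, tt, Z = signal.stft(x, fs=fs, nperseg=1024)

# Simple anomaly indicator (an assumption, not the paper's metric): deviation of each
# time slice's spectrum from the average spectrum of the first 0.5 s (healthy baseline).
baseline = np.abs(Z[:, tt < 0.5]).mean(axis=1)
metric = np.linalg.norm(np.abs(Z) - baseline[:, None], axis=0)
print("Mean indicator before 1 s: %.3f, after 1 s: %.3f"
      % (metric[tt < 1.0].mean(), metric[tt > 1.0].mean()))
```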
NASA Astrophysics Data System (ADS)
Heller, Nicholas Walter Medicus
Powder coatings are becoming ubiquitous in the coating marketplace due to the absence of solvents in their formulation, but they have yet to see implementation in low-reflectance outdoor applications. This demand could be met by utilizing polymer blends formulated with low loadings of matting agents and pigments. The goal of this research is a thorough characterization of prototype low-reflectance coatings through several analytical techniques. Prototypical thermoset blends consist of functionalized polyurethanes rendered immiscible by differences in polar and hydrogen bonding characteristics, resulting in a surface roughened by droplet domains. Analysis of both pigmented and control clear films was performed. This research project had three primary aims: (1) determine the composition of the resin components of the polymer blend; (2) to monitor the evolution of domains before and during curing of clear polymer blends; (3) to monitor the evolution of these domains when pigments are added to these blends. The clear films enabled unhindered analysis by Fourier transform infrared (FTIR) and Raman spectroscopy on the binder. However, these domains provided no spectroscopic signatures despite their observation by optical microscopy. This necessitated the development of a new procedure for cross-section preparation that leaves no contamination from polishing media, which enabled Raman mapping of the morphology via an introduced marker peak from styrene monomer. The clears were analyzed as a powder and as films that were quenched at various cure-times using FTIR, Raman, transmission electron microscopy (TEM), and thermomechanical methods to construct a model of coating evolution based on cure parameters and polymer dynamics. Domains were observed in the powder, and underwent varying rates of coarsening as the cure progressed. TEM, scanning electron microscopy and thermomechanical methods were also used on pigmented systems at different states of the cure, including in powder form. TEM analysis additionally revealed the encapsulation of pigment particles by the domains, which helped explain the interaction between phase separation and pigment materials. The knowledge gained from fundamental characterization could be used to enable future generations of durable powder coatings with dead matte finishes.
Molecular Mechanics of the α-Actinin Rod Domain: Bending, Torsional, and Extensional Behavior
Golji, Javad; Collins, Robert; Mofrad, Mohammad R. K.
2009-01-01
α-Actinin is an actin crosslinking molecule that can serve as a scaffold and maintain dynamic actin filament networks. As a crosslinker in the stressed cytoskeleton, α-actinin can retain conformation, function, and strength. α-Actinin has an actin binding domain and a calmodulin homology domain separated by a long rod domain. Using molecular dynamics and normal mode analysis, we suggest that the α-actinin rod domain has flexible terminal regions which can twist and extend under mechanical stress, yet has a highly rigid interior region stabilized by aromatic packing within each spectrin repeat, by electrostatic interactions between the spectrin repeats, and by strong salt bridges between its two anti-parallel monomers. By exploring the natural vibrations of the α-actinin rod domain and by conducting bending molecular dynamics simulations we also predict that bending of the rod domain is possible with minimal force. We introduce computational methods for analyzing the torsional strain of molecules using rotating constraints. Molecular dynamics extension of the α-actinin rod is also performed, demonstrating transduction of the unfolding forces across salt bridges to the associated monomer of the α-actinin rod domain. PMID:19436721
Examination of the 8th grade students' TIMSS mathematics success in terms of different variables
NASA Astrophysics Data System (ADS)
Kaleli-Yılmaz, Gül; Hanci, Alper
2016-07-01
The aim of this study is to determine how the TIMSS mathematics success of 8th grade students differs according to school type, gender, mathematics report mark, parents' education level, cognitive domains and cognitive domains by gender. The relational survey method was used in the study. Six hundred fifty-two 8th grade students studying in the same city in Turkey participated in this study. A 45-question test made up of items chosen from the TIMSS 2011 mathematics questionnaire was used as the data collection tool. Quantitative data analysis methods were used in the data analysis; frequency, percentage, average, standard deviation, independent samples t-test, one-way analysis of variance and post-hoc tests were applied to the data using the SPSS package software. At the end of the study, it was determined that school type, mathematics school mark, parents' education level and cognitive domains influenced the students' TIMSS mathematics success, but gender was a neutral element. Moreover, it was seen that schools which are very successful in national exams are more successful in the TIMSS exam; students whose mathematics school marks are 5 and whose parents graduated from university are more successful in TIMSS exams than others. This study was produced from Alper Hanci's master's thesis, supervised by Asst. Prof. Gül Kaleli-Yılmaz.
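The statistical comparisons described (an independent samples t-test for gender and a one-way ANOVA across school types) can be sketched with simulated scores as below; all numbers are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Simulated TIMSS-style scores (assumed values, for illustration only).
girls = rng.normal(500, 80, 320)
boys = rng.normal(500, 80, 332)
school_a = rng.normal(560, 70, 200)   # e.g. schools selective in national exams
school_b = rng.normal(500, 75, 250)
school_c = rng.normal(470, 80, 202)

# Gender: independent samples t-test.
t_stat, p_gender = stats.ttest_ind(girls, boys, equal_var=False)
print(f"Gender difference: t = {t_stat:.2f}, p = {p_gender:.3f}")

# School type: one-way analysis of variance (post-hoc tests would follow if p < .05).
f_stat, p_school = stats.f_oneway(school_a, school_b, school_c)
print(f"School type effect: F = {f_stat:.2f}, p = {p_school:.3g}")
```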
Variational Methods in Sensitivity Analysis and Optimization for Aerodynamic Applications
NASA Technical Reports Server (NTRS)
Ibrahim, A. H.; Hou, G. J.-W.; Tiwari, S. N. (Principal Investigator)
1996-01-01
Variational method (VM) sensitivity analysis, which is the continuous alternative to discrete sensitivity analysis, is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational method uses the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The determination of the sensitivity derivatives of the performance index or functional entails the coupled solution of the state and costate equations. As the stable and converged numerical solution of the costate equations with their boundary conditions is a priori unknown, numerical stability analysis is performed on both the state and costate equations. Thereafter, based on the amplification factors obtained by solving the generalized eigenvalue equations, the stability behavior of the costate equations is discussed and compared with that of the state (Euler) equations. The stability analysis of the costate equations suggests that a converged and stable solution of the costate equations is possible only if the computational domain of the costate equations is transformed to take into account the reverse-flow nature of the costate equations. The application of the variational methods to aerodynamic shape optimization problems is demonstrated for internal flow problems in the supersonic Mach number range. The study shows that, while maintaining the accuracy of the functional sensitivity derivatives within a reasonable range for engineering prediction purposes, the variational methods show a substantial gain in computational efficiency, i.e., computer time and memory, when compared with finite difference sensitivity analysis.
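A discrete toy analogue of the costate (adjoint) idea is sketched below: for a parameter-dependent linear state equation A(p)u = b and functional J = c^T u, a single adjoint solve yields all sensitivity derivatives, checked here against finite differences. The matrix, functional, and parameterization are assumptions, not the aerodynamic equations of the report.

```python
import numpy as np

n = 5
rng = np.random.default_rng(10)
b, c = rng.random(n), rng.random(n)

def A(p):
    """Assumed parameter-dependent state matrix (a stand-in for a discretized operator)."""
    return (np.diag(2.0 + p)
            + np.diag(-0.5 * np.ones(n - 1), 1)
            + np.diag(-0.5 * np.ones(n - 1), -1))

p0 = np.ones(n)
u = np.linalg.solve(A(p0), b)              # state solution
lam = np.linalg.solve(A(p0).T, c)          # single costate (adjoint) solution

# Here dA/dp_k has a single 1 on the diagonal, so dJ/dp_k = -lam_k * u_k.
grad_adjoint = -lam * u

# Finite-difference check (one extra state solve per design parameter).
eps, grad_fd = 1e-6, np.zeros(n)
for k in range(n):
    p = p0.copy()
    p[k] += eps
    grad_fd[k] = (c @ np.linalg.solve(A(p), b) - c @ u) / eps

print("max |adjoint - finite difference| =", np.max(np.abs(grad_adjoint - grad_fd)))
```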
Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning
ERIC Educational Resources Information Center
Popescu, Paul Stefan
2015-01-01
In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that come to enhance this emerging research domain, Intelligent Data Analysis (IDA) and Human Computer Interaction (HCI). HCI specific research methodologies can be…
1994-03-25
metrics [DISA93b]. The Software Engineering Institute (SEI) has developed a domain analysis process (Feature-Oriented Domain Analysis - FODA) and is... and expresses the range of variability of these decisions. 3.2.2.3 Feature Oriented Domain Analysis: Feature Oriented Domain Analysis (FODA) is a domain... documents created in this phase. From a purely profit-oriented business point of view, a company may develop its own analysis of a government or commercial
Ion mobility spectrometer using frequency-domain separation
Martin, Stephen J.; Butler, Michael A.; Frye, Gregory C.; Schubert, W. Kent
1998-01-01
An apparatus and method is provided for separating and analyzing chemical species in an ion mobility spectrometer using a frequency-domain technique wherein the ions generated from the chemical species are selectively transported through an ion flow channel having a moving electrical potential therein. The moving electrical potential allows the ions to be selected according to ion mobility, with certain of the ions being transported to an ion detector and other of the ions being effectively discriminated against. The apparatus and method have applications for sensitive chemical detection and analysis for monitoring of exhaust gases, hazardous waste sites, industrial processes, aerospace systems, non-proliferation, and treaty verification. The apparatus can be formed as a microelectromechanical device (i.e. a micromachine).
A parallel algorithm for nonlinear convection-diffusion equations
NASA Technical Reports Server (NTRS)
Scroggs, Jeffrey S.
1990-01-01
A parallel algorithm for the efficient solution of nonlinear time-dependent convection-diffusion equations with a small parameter on the diffusion term is presented. The method is based on a physically motivated domain decomposition that is dictated by singular perturbation analysis. The analysis is used to determine regions where certain reduced equations may be solved in place of the full equation. The method is suitable for the solution of problems arising in the simulation of fluid dynamics. Experimental results for a nonlinear equation in two dimensions are presented.
Development of the Expert System Domain Advisor and Analysis Tool
1991-09-01
analysis. Typical of the current methods in use at this time is the "TAROT metric". This method defines a decision rule whose output is whether to go...TAROT METRIC. INTRODUCTION: The system chart of ESEM, Figure 1, shows the following three risk-based decision points: i. at project initiation...decisions. Evaluation Factors for ES Development -- FACTORS and POSSIBLE VALUE RATINGS: TAROT metric (overall suitability): Poor, Fair
An efficient solution procedure for the thermoelastic analysis of truss space structures
NASA Technical Reports Server (NTRS)
Givoli, D.; Rand, O.
1992-01-01
A solution procedure is proposed for the thermal and thermoelastic analysis of truss space structures in periodic motion. In this method, the spatial domain is first discretized using a consistent finite element formulation. The resulting semi-discrete equations in time are then solved analytically using Fourier decomposition. Full advantage is taken of geometrical symmetry. An algorithm is presented for the calculation of the heat flux distribution. The method is demonstrated via a numerical example of a cylindrically shaped space structure.
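As an aside, the harmonic-by-harmonic solution of a periodically forced semi-discrete system can be sketched in a few lines. The snippet below is an illustrative toy (a 2-DOF system with made-up matrices), not the truss model of the abstract: the periodic forcing is expanded in Fourier harmonics and each harmonic amplitude is obtained from (K - w_n^2 M) q_n = f_n.

```python
# Minimal sketch (toy 2-DOF system, not the paper's truss model): solve
# M q'' + K q = f(t) with periodic f(t) by Fourier decomposition in time.
# Each harmonic amplitude satisfies (K - w_n^2 M) q_n = f_n; it is assumed
# that no forcing harmonic coincides with a natural frequency.
import numpy as np

M = np.diag([1.0, 2.0])
K = np.array([[4.0, -1.0],
              [-1.0, 3.0]])
T = 2.0 * np.pi                          # forcing period
omega = 2.0 * np.pi / T
nt = 256
t = np.linspace(0.0, T, nt, endpoint=False)

f = np.vstack([np.cos(omega * t) + 0.3 * np.cos(3 * omega * t),
               0.5 * np.sin(2 * omega * t)])    # periodic forcing per DOF

F = np.fft.fft(f, axis=1) / nt                  # Fourier coefficients f_n
harmonics = np.fft.fftfreq(nt, d=T / nt) * T    # harmonic numbers n

Q = np.zeros_like(F, dtype=complex)
for k in range(nt):
    w_k = harmonics[k] * omega
    Q[:, k] = np.linalg.solve(K - w_k ** 2 * M, F[:, k])

q = np.real(np.fft.ifft(Q * nt, axis=1))        # steady periodic response
print(q[:, 0])                                   # response sampled at t = 0
```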
Kubios HRV--heart rate variability analysis software.
Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A
2014-01-01
Kubios HRV is an advanced and easy-to-use software package for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), Matlab MAT-file, or as a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
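For orientation, the most common parameters mentioned above can be computed in a few lines. The sketch below is not Kubios code; it computes SDNN, RMSSD, and an LF/HF ratio from a synthetic RR series, with the standard (but here assumed) band limits of 0.04-0.15 Hz (LF) and 0.15-0.40 Hz (HF).

```python
# Minimal sketch (not Kubios HRV itself): time-domain SDNN/RMSSD and a
# Welch-based LF/HF ratio from a synthetic beat-to-beat RR interval series.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
rr = (0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.arange(300))
      + 0.02 * rng.standard_normal(300))            # RR intervals in seconds

sdnn = np.std(rr, ddof=1) * 1000.0                   # ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000.0  # ms

# Frequency-domain analysis needs evenly sampled data: interpolate the RR
# tachogram onto a 4 Hz grid before estimating the spectrum with Welch.
t_beats = np.cumsum(rr)
fs = 4.0
t_even = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)
rr_even = np.interp(t_even, t_beats, rr)

f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df        # LF band power
hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df        # HF band power

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  LF/HF={lf / hf:.2f}")
```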
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
NASA Technical Reports Server (NTRS)
Walston, W. H., Jr.
1986-01-01
The comparative computational efficiencies of the finite element (FEM), boundary element (BEM), and hybrid boundary element-finite element (HVFEM) analysis techniques are evaluated for representative bounded domain interior and unbounded domain exterior problems in elastostatics. Computational efficiency is carefully defined in this study as the computer time required to attain a specified level of solution accuracy. The study found the FEM superior to the BEM for the interior problem, while the reverse was true for the exterior problem. The hybrid analysis technique was found to be comparable or superior to both the FEM and BEM for both the interior and exterior problems.
NASA Technical Reports Server (NTRS)
Yao, Tse-Min; Choi, Kyung K.
1987-01-01
An automatic regridding method and a three-dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three-dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three-dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.
Alison, Jennifer A; Zafiropoulos, Bill; Heard, Robert
2017-01-01
Objective The aim of this study was to identify key factors affecting research capacity and engagement of allied health professionals working in a large metropolitan health service. Identifying such factors will assist in determining strategies for building research capacity in allied health. Materials and methods A total of 276 allied health professionals working within the Sydney Local Health District (SLHD) completed the Research Capacity in Context Tool (RCCT) that measures research capacity and culture across three domains: organization, team, and individual. An exploratory factor analysis was undertaken to identify common themes within each of these domains. Correlations were performed between demographic variables and the identified factors to determine possible relationships. Results Research capacity and culture success/skill levels were reported to be higher within the organization and team domains compared to the individual domain (median [interquartile range, IQR] 6 [5–8], 6 [5–8], 5 [3–7], respectively; Friedman χ2(2)=42.04, p<0.001). Exploratory factor analyses were performed to identify factors that were perceived by allied health respondents to affect research capacity. Factors identified within the organization domain were infrastructure for research (eg, funds and equipment) and research culture (eg, senior manager’s support for research); within the team domain the factors were research orientation (eg, dissemination of results at research seminars) and research support (eg, providing staff research training). Within the individual domain, only one factor was identified which was the research skill of the individual (eg, literature evaluation, submitting ethics applications and data analysis, and writing for publication). Conclusion The reported skill/success levels in research were lower for the individual domain compared to the organization or team domains. Key factors were identified in each domain that impacted on allied health research capacity. As these factors were different in each domain, various strategies may be required at the level of the organization, team, and individual to support and build allied health research capacity. PMID:28860795
Biometric identification based on novel frequency domain facial asymmetry measures
NASA Astrophysics Data System (ADS)
Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.
2005-03-01
In the modern world, the ever-growing need to ensure a system's security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial domain asymmetry measures based on raw intensity values suggests that the frequency domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods comparing the asymmetry measures across different individuals and across different expressions is presented.
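To make the idea concrete, one simple frequency-domain asymmetry feature can be built from the difference between an aligned face and its mirror image. The sketch below is illustrative only and is not the feature set of the paper; the random stand-in image, the ring binning, and the normalization are all assumptions.

```python
# Minimal sketch (illustrative only, not the paper's biometric): a simple
# frequency-domain asymmetry descriptor built from the difference between an
# aligned face image and its horizontal mirror.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((64, 64))                 # stand-in for an aligned face image

diff = img - img[:, ::-1]                  # asymmetry image (zero if symmetric)
spec = np.fft.fftshift(np.abs(np.fft.fft2(diff)))

# A simple per-frequency-band feature vector: energy in concentric rings.
cy, cx = np.indices(spec.shape)
r = np.hypot(cy - spec.shape[0] / 2, cx - spec.shape[1] / 2).astype(int)
features = np.array([spec[r == k].sum() for k in range(1, 16)])
features /= features.sum()                 # normalize to make a descriptor

print(features.round(3))
```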
Frequency-domain method for discrete frequency noise prediction of rotors in arbitrary steady motion
NASA Astrophysics Data System (ADS)
Gennaretti, M.; Testa, C.; Bernardini, G.
2012-12-01
A novel frequency-domain formulation for the prediction of the tonal noise emitted by rotors in arbitrary steady motion is presented. It is derived from Farassat's 'Formulation 1A', that is a time-domain boundary integral representation for the solution of the Ffowcs-Williams and Hawkings equation, and represents noise as harmonic response to body kinematics and aerodynamic loads via frequency-response-function matrices. The proposed frequency-domain solver is applicable to rotor configurations for which sound pressure levels of discrete tones are much higher than those of broadband noise. The numerical investigation concerns the analysis of noise produced by an advancing helicopter rotor in blade-vortex interaction conditions, as well as the examination of pressure disturbances radiated by the interaction of a marine propeller with a non-uniform inflow.
NASA Technical Reports Server (NTRS)
Kreider, Kevin L.; Baumeister, Kenneth J.
1996-01-01
An explicit finite difference real time iteration scheme is developed to study harmonic sound propagation in aircraft engine nacelles. To reduce storage requirements for future large 3D problems, the time-dependent potential form of the acoustic wave equation is used. To ensure that the finite difference scheme is both explicit and stable for a harmonic monochromatic sound field, a parabolic (in time) approximation is introduced to reduce the order of the governing equation. The analysis begins with a harmonic sound source radiating into a quiescent duct. This fully explicit iteration method then calculates stepwise in time to obtain the 'steady state' harmonic solutions of the acoustic field. For stability, application of conventional impedance boundary conditions requires coupling to explicit hyperbolic difference equations at the boundary. The introduction of the time parameter eliminates the large matrix storage requirements normally associated with frequency domain solutions, and time marching attains the steady state quickly enough to make the method favorable when compared to frequency domain methods. For validation, this transient-frequency domain method is applied to sound propagation in a 2D hard wall duct with plug flow.
The signaling helix: a common functional theme in diverse signaling proteins
Anantharaman, Vivek; Balaji, S; Aravind, L
2006-01-01
Background The mechanism by which the signals are transmitted between receptor and effector domains in multi-domain signaling proteins is poorly understood. Results Using sensitive sequence analysis methods we identify a conserved helical segment of around 40 residues in a wide range of signaling proteins, including numerous sensor histidine kinases such as Sln1p, and receptor guanylyl cyclases such as the atrial natriuretic peptide receptor and nitric oxide receptors. We term this helical segment the signaling (S)-helix and present evidence that it forms a novel parallel coiled-coil element, distinct from previously known helical segments in signaling proteins, such as the Dimerization-Histidine phosphotransfer module of histidine kinases, the intra-cellular domains of the chemotaxis receptors, inter-GAF domain helical linkers and the α-helical HAMP module. Analysis of domain architectures allowed us to reconstruct the domain-neighborhood graph for the S-helix, which showed that the S-helix almost always occurs between two signaling domains. Several striking patterns in the domain neighborhood of the S-helix also became evident from the graph. It most often separates diverse N-terminal sensory domains from various C-terminal catalytic signaling domains such as histidine kinases, cNMP cyclase, PP2C phosphatases, NtrC-like AAA+ ATPases and diguanylate cyclases. It might also occur between two sensory domains such as PAS domains and occasionally between a DNA-binding HTH domain and a sensory domain. The sequence conservation pattern of the S-helix revealed the presence of a unique constellation of polar residues in the dimer-interface positions within the central heptad of the coiled-coil formed by the S-helix. Conclusion Combining these observations with previously reported mutagenesis studies on different S-helix-containing proteins we suggest that it functions as a switch that prevents constitutive activation of linked downstream signaling domains. However, upon occurrence of specific conformational changes due to binding of ligand or other sensory inputs in a linked upstream domain it transmits the signal to the downstream domain. Thus, the S-helix represents one of the most prevalent functional themes involved in the flow of signals between modules in diverse prokaryote-type multi-domain signaling proteins. Reviewers This article was reviewed by Frank Eisenhaber, Arcady Mushegian and Sandor Pongor. PMID:16953892
Analysis of Technique to Extract Data from the Web for Improved Performance
NASA Astrophysics Data System (ADS)
Gupta, Neena; Singh, Manish
2010-11-01
The World Wide Web is rapidly leading the world into an amazing electronic era in which anyone can publish in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts the records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts the query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.
Discussion summary: Fictitious domain methods
NASA Technical Reports Server (NTRS)
Glowinski, Rowland; Rodrigue, Garry
1991-01-01
Fictitious domain methods are constructed in the following manner: Suppose a partial differential equation is to be solved on an open bounded set, Omega, in 2-D or 3-D. Let R be a rectangular domain containing the closure of Omega. The partial differential equation is first solved on R. Using the solution on R, the solution of the equation on Omega is then recovered by some procedure. The advantage of the fictitious domain method is that in many cases the solution of a partial differential equation on a rectangular region is easier to compute than on a nonrectangular region. Fictitious domain methods for solving elliptic PDEs on general regions are also very efficient when used on a parallel computer. The reason is that one can use the many domain decomposition methods that are available for solving the PDE on the fictitious rectangular region. The discussion on fictitious domain methods began with a talk by R. Glowinski in which he gave some examples of a variational approach to fictitious domain methods for solving the Helmholtz and Navier-Stokes equations.
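One concrete variant of this idea is a penalty fictitious-domain method: the problem on Omega is embedded in the rectangle R, and a large penalty outside Omega pushes the solution toward the boundary condition there. The sketch below is a minimal illustration of that variant (not the variational approach mentioned in the talk); the disk geometry, penalty value, and plain Jacobi iteration are simplifying assumptions.

```python
# Minimal sketch of a penalty fictitious-domain method: solve -Lap(u) = 1 with
# u = 0 on the boundary of a disk Omega by embedding Omega in the unit square R
# and penalizing u outside Omega. Jacobi iteration is used only for simplicity.
import numpy as np

n = 49                                   # grid points per direction on R
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
outside = (X - 0.5) ** 2 + (Y - 0.5) ** 2 > 0.35 ** 2   # points of R \ Omega
penalty = 1.0e6

u = np.zeros((n, n))
f = np.ones((n, n))

# Jacobi sweeps for the penalized problem (-Lap + penalty * chi_outside) u = f
for _ in range(20000):
    nb = u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
    denom = 4.0 + h * h * penalty * outside[1:-1, 1:-1]
    u_new = u.copy()
    u_new[1:-1, 1:-1] = (nb + h * h * f[1:-1, 1:-1]) / denom
    u = u_new

# For the exact problem on the disk of radius 0.35, max u = r^2/4 = 0.030625.
print(u.max())
```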
Parallel eigenanalysis of finite element models in a completely connected architecture
NASA Technical Reports Server (NTRS)
Akl, F. A.; Morel, M. R.
1989-01-01
A parallel algorithm is presented for the solution of the generalized eigenproblem in linear elastic finite element analysis, [K]{phi} = [M]{phi}[omega], where [K] and [M] are of order N and [omega] is of order q. The concurrent solution of the eigenproblem is based on the multifrontal/modified subspace method and is achieved in a completely connected parallel architecture in which each processor is allowed to communicate with all other processors. The algorithm was successfully implemented on a tightly coupled multiple-instruction multiple-data parallel processing machine, Cray X-MP. A finite element model is divided into m domains each of which is assumed to process n elements. Each domain is then assigned to a processor or to a logical processor (task) if the number of domains exceeds the number of physical processors. The macrotasking library routines are used in mapping each domain to a user task. Computational speed-up and efficiency are used to determine the effectiveness of the algorithm. The effect of the number of domains, the number of degrees of freedom located along the global fronts and the dimension of the subspace on the performance of the algorithm are investigated. A parallel finite element dynamic analysis program, p-feda, is documented and the performance of its subroutines in a parallel environment is analyzed.
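For a small model, the same generalized eigenproblem can be solved directly with a standard library routine, which is useful as a serial reference when validating a parallel subspace implementation. The snippet below is such a reference sketch (random symmetric positive definite matrices standing in for the stiffness and mass matrices), not the multifrontal/modified subspace algorithm of the paper.

```python
# Minimal serial reference sketch (not the parallel multifrontal/subspace
# algorithm): solve [K]{phi} = [M]{phi}[omega] for the q lowest modes of a
# small model with a dense generalized eigensolver.
import numpy as np
from scipy.linalg import eigh

n, q = 12, 4
rng = np.random.default_rng(2)
A = rng.random((n, n))
K = A @ A.T + n * np.eye(n)           # symmetric positive definite "stiffness"
M = np.diag(rng.random(n) + 1.0)      # lumped "mass" matrix

w, phi = eigh(K, M)                   # generalized eigenvalues and vectors
print(w[:q])                          # the q lowest generalized eigenvalues
```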
Nie, Guangning; Yang, Hongyan; Liu, Jian; Zhao, ChunMei; Wang, Xiaoyun
2017-01-01
Objective: The Menopause-Specific Quality-of-Life (MENQOL) questionnaire was developed as a specific tool to measure the health-related quality-of-life of postmenopausal women. Thus far, the Chinese version questionnaire has not been subjected to psychometric assessment with a large sample. This study aims to evaluate the validity and reliability of the Chinese version of the MENQOL specific to postmenopausal women in China. Methods: A total of 1,137 menopausal symptomatic and 491 menopausal asymptomatic women from eight cities in China were recruited using a convenience sampling method. Psychometric properties were evaluated by descriptive statistics, validity, and reliability. Reliability was assessed for each subscale of the MENQOL through internal consistency reliability with Cronbach's α and intersubscale correlations. Item-domain correlations, principal components analysis (PCA), and confirmatory factor analysis were performed to determine construct validity. t tests were used to compare the differences between the menopausal symptomatic and asymptomatic women and to evaluate the discriminant validity. Pearson correlation coefficients were calculated between MENQOL scores and the Kupperman index to assess criterion-related validity. Results: The most common symptoms in Chinese menopausal symptomatic women were “experiencing poor memory” (94.4%), “feeling tired or worn out” (93.8%), “aching in muscle and joints” (89.4%), “low backache” (86.9%), “decrease in physical strength” (86.6%), “aches in back of neck or head” (86.2%), “difficulty sleeping” (83.6%), “accomplishing less than I used to” (83.4%), “feeling a lack of energy” (83.3%), “change in your sexual desire” (81%), and “hot flash” (80.7%), among others. The symptoms of “increased facial hair” were rarely seen (9.9%). The vasomotor domain, as well as the psychosocial, physical, and sexual domains, showed high reliability (Cronbach's α 0.84, 0.87, 0.89, and 0.86, respectively). Item-domain correlation analysis showed that all items correlated more strongly with their own domains than with other domains. In the PCA, after deleting the “increased facial hair” item, items in the vasomotor, sexual, and psychosocial subscales loaded on their respective domains by and large, and items in the physical subscale divided into two factors. The PCA revealed a latent structure of the Chinese version of MENQOL nearly identical to the original MENQOL domains. The confirmatory factor analysis demonstrated that the questionnaire fits well with a four-domain model. The MENQOL can discriminate between menopausal symptomatic and asymptomatic women, as it showed good discriminant validity. Criterion-related validity was confirmed by a significant correlation between MENQOL scores and the Kupperman index. Conclusions: This study showed that the Chinese version of MENQOL has good psychometric properties and would be suitable to measure the health-related quality-of-life of Chinese menopausal women except for item 21 (increased facial hair). PMID:27922934
Satisfaction Domains Differ between the Patient and Their Family in Adult Intensive Care Units
Song, Ge; Sim, Pei Zhen; Ting, Kit Cheng; Yoo, Jeffrey Kwang Sui; Wang, Qing Li; Mascuri, Raudhah Binte Haji Mohamad; Ong, Venetia Hui Ling; Phua, Jason; Kowitlawakul, Yanika
2016-01-01
Background. Patients' and families' satisfaction data from Asian intensive care units (ICUs) are lacking. Objective. Domains of patient and family satisfaction and the contribution of each domain to general satisfaction were studied. Method. Over 3 months, adult patients across 4 ICUs staying for more than 48 hours, with an abbreviated mental test score of 7 or above and able to understand English, and their immediate family members were surveyed with separate validated satisfaction questionnaires. Results. Two hundred patients and 194 families were included in the final analysis. A significant difference in the satisfaction scores was observed between the ICUs. Patients were most and least satisfied in the communication (4.2 out of 5) and decision-making (2.9 out of 5) domains, respectively. Families were most and least satisfied in the relationship-with-doctors (3.9 out of 5) and family-involvement (3.3 out of 5) domains, respectively. The domains contributing most to general satisfaction were the illness management domain for patients (β coefficient = 0.44) and the characteristics of doctors and nurses domain for families (β coefficient = 0.45). Discussion. In an Asian ICU community, patients and families differ in their expectations and valuations of health care processes. Health care providers have difficult tasks in attending to these different domains. PMID:28044138
Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer's Disease.
Cheng, Bo; Liu, Mingxia; Shen, Dinggang; Li, Zuoyong; Zhang, Daoqiang
2017-04-01
Recently, transfer learning has been successfully applied to early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most existing methods use data from only a single auxiliary domain and thus cannot exploit the intrinsically useful correlation information from multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multiple auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary domain data to improve learning performance in the target domain, compared with several state-of-the-art methods.
Probabilistic analysis of wind-induced vibration mitigation of structures by fluid viscous dampers
NASA Astrophysics Data System (ADS)
Chen, Jianbing; Zeng, Xiaoshu; Peng, Yongbo
2017-11-01
High-rise buildings often suffer from excessively large wind-induced vibrations, so vibration control systems may be necessary. Fluid viscous dampers (FVDs) with a nonlinear power law against velocity are widely employed. With the transition of design methods from traditional frequency domain approaches to more refined direct time domain approaches, difficulties in the time integration of these systems sometimes occur. In the present paper, the underlying reason for the difficulty is first revealed by identifying that the equations of motion of high-rise buildings installed with FVDs are sometimes stiff differential equations. An approach effective for stiff differential systems, the backward difference formula (BDF), is then introduced and verified to be effective for the equations of motion of wind-induced vibration controlled systems. Comparative studies are performed among several methods, including the Newmark method, the KR-alpha method, the energy-based linearization method, and the statistical linearization method. Based on the above results, a 20-story steel frame structure is taken as a practical example. In particular, the randomness of structural parameters and of the wind loading input is emphasized. The extreme values of the responses are examined, showing the effectiveness of the proposed approach and also the need for refined probabilistic analysis in the design of wind-induced vibration mitigation systems.
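To make the stiffness issue concrete, the single-degree-of-freedom sketch below integrates an oscillator with a nonlinear power-law fluid viscous damper using the BDF option of a general-purpose stiff solver. It is only an illustrative toy under assumed parameter values, not the paper's 20-story model or its loading.

```python
# Minimal sketch (not the paper's 20-story model): a single-degree-of-freedom
# oscillator with a nonlinear power-law fluid viscous damper,
#   m*x'' + c*sign(x')*|x'|^alpha + k*x = F(t),
# integrated with the stiff BDF scheme available in SciPy.
import numpy as np
from scipy.integrate import solve_ivp

m, k = 1.0, 400.0                       # mass and stiffness (assumed values)
c, alpha = 5.0, 0.35                    # FVD coefficient and velocity exponent
F = lambda t: 50.0 * np.sin(4.0 * t)    # stand-in for a wind-load history

def rhs(t, y):
    x, v = y
    damper = c * np.sign(v) * np.abs(v) ** alpha
    return [v, (F(t) - damper - k * x) / m]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], method="BDF",
                rtol=1e-6, atol=1e-9, dense_output=True)
print(sol.success, float(np.max(np.abs(sol.y[0]))))   # peak displacement
```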
Event extraction of bacteria biotopes: a knowledge-intensive NLP-based approach.
Ratkovic, Zorana; Golik, Wiktoria; Warnier, Pierre
2012-06-26
Bacteria biotopes cover a wide range of diverse habitats including animal and plant hosts, natural, medical and industrial environments. The high volume of publications in the microbiology domain provides a rich source of up-to-date information on bacteria biotopes. This information, as found in scientific articles, is expressed in natural language and is rarely available in a structured format, such as a database. This information is of great importance for fundamental research and microbiology applications (e.g., medicine, agronomy, food, bioenergy). The automatic extraction of this information from texts will provide a great benefit to the field. We present a new method for extracting relationships between bacteria and their locations using the Alvis framework. Recognition of bacteria and their locations was achieved using a pattern-based approach and domain lexical resources. For the detection of environment locations, we propose a new approach that combines lexical information and the syntactic-semantic analysis of corpus terms to overcome the incompleteness of lexical resources. Bacteria location relations extend over sentence borders, and we developed domain-specific rules for dealing with bacteria anaphors. We participated in the BioNLP 2011 Bacteria Biotope (BB) task with the Alvis system. Official evaluation results show that it achieves the best performance of participating systems. New developments since then have increased the F-score by 4.1 points. We have shown that the combination of semantic analysis and domain-adapted resources is both effective and efficient for event information extraction in the bacteria biotope domain. We plan to adapt the method to deal with a larger set of location types and a large-scale scientific article corpus to enable microbiologists to integrate and use the extracted knowledge in combination with experimental data.
Dynamics and allostery of the ionotropic glutamate receptors and the ligand binding domain.
Tobi, Dror
2016-02-01
The dynamics of the ligand-binding domain (LBD) and the intact ionotropic glutamate receptor (iGluR) were studied using Gaussian Network Model (GNM) analysis. The dynamics of LBDs with various allosteric modulators are compared using a novel method of multiple alignment of GNM modes of motion. The analysis reveals that allosteric effectors change the dynamics of amino acids at the upper lobe interface of the LBD dimer as well as at the hinge region between the upper and lower lobes. For the intact glutamate receptor, the analysis shows that the clamshell-like movement of the LBD upper and lower lobes is coupled to the bending of the trans-membrane domain (TMD) helices, which may open the channel pore. The results offer new insight into the mechanism of action of allosteric modulators on the iGluR and support the notion of TMD helix bending as a possible mechanism for channel opening. In addition, the study validates the methodology of multiple GNM mode alignment as a useful tool to study allosteric effects and their relation to protein dynamics. © 2015 Wiley Periodicals, Inc.
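For readers unfamiliar with the GNM, the core calculation is small: a Kirchhoff (connectivity) matrix is built from residue contacts within a cutoff, and its nonzero modes give relative residue fluctuations. The sketch below runs on synthetic C-alpha coordinates with assumed parameters, not on the LBD or iGluR structures of the study.

```python
# Minimal sketch of a Gaussian Network Model (GNM) calculation on synthetic
# C-alpha coordinates (not the iGluR/LBD structures of the study): build the
# Kirchhoff matrix from contacts within a cutoff, then use the nonzero modes
# to estimate residue fluctuations.
import numpy as np

rng = np.random.default_rng(3)
coords = np.cumsum(rng.normal(scale=2.0, size=(60, 3)), axis=0)  # fake chain
cutoff = 7.0                                                      # Angstrom

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
contact = (dist < cutoff) & ~np.eye(len(coords), dtype=bool)

kirchhoff = -contact.astype(float)
np.fill_diagonal(kirchhoff, contact.sum(axis=1))

w, v = np.linalg.eigh(kirchhoff)
nonzero = w > 1e-8                       # discard the trivial zero mode
msf = np.sum(v[:, nonzero] ** 2 / w[nonzero], axis=1)  # ~ mean-square fluct.
print(msf.round(2))
```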
3-D surface profilometry based on modulation measurement by applying wavelet transform method
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao; Wei, Yongchao
2017-01-01
A new analysis of 3-D surface profilometry based on the modulation measurement technique with the application of the Wavelet Transform method is proposed. As a tool noted for its multi-resolution and localization in the time and frequency domains, the Wavelet Transform method, with its good localized time-frequency analysis ability and effective de-noising capacity, can extract the modulation distribution more accurately than the Fourier Transform method. Especially for the analysis of complex objects, more details of the measured object can be retained. In this paper, the theoretical derivation of the Wavelet Transform method that obtains the modulation values from a captured fringe pattern is given. Both computer simulation and an elementary experiment are used to show the validity of the proposed method by making a comparison with the results of the Fourier Transform method. The results show that the Wavelet Transform method performs better than the Fourier Transform method in modulation retrieval.
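The modulation-extraction step can be illustrated in one dimension. The sketch below is an assumption-laden toy, not the paper's profilometry pipeline: a fringe-like signal with a slowly varying envelope is analyzed with a complex Morlet wavelet tuned to the carrier frequency, and the coefficient magnitude tracks the local modulation.

```python
# Minimal 1-D sketch (not the paper's profilometry pipeline): estimate the
# local modulation of a fringe-like signal with a complex Morlet wavelet
# tuned to the (assumed known) carrier frequency.
import numpy as np

n = 1024
x = np.arange(n)
envelope = 1.0 + 0.5 * np.exp(-((x - 512) / 150.0) ** 2)   # true modulation
carrier = 0.08                                              # fringes per pixel
signal = envelope * np.cos(2 * np.pi * carrier * x)

width = 40.0                                                # wavelet width
t = np.arange(-4 * width, 4 * width + 1)
morlet = np.exp(2j * np.pi * carrier * t) * np.exp(-t ** 2 / (2 * width ** 2))

coeff = np.convolve(signal, morlet, mode="same")            # wavelet coefficients
modulation = np.abs(coeff)
modulation *= envelope.max() / modulation.max()             # rescale to compare

interior = slice(200, 824)                                  # away from edges
print(np.max(np.abs(modulation - envelope)[interior]))      # small residual
```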
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2012-06-01
The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain, and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model, using three well-known methodological approaches in a triangulated fashion: (a) object-oriented system analysis; (b) task analysis; and (c) prototyping. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all used methods and decide which requirements are critical for the system's success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Terascale Optimal PDE Simulations (TOPS) Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Professor Olof B. Widlund
2007-07-09
Our work has focused on the development and analysis of domain decomposition algorithms for a variety of problems arising in continuum mechanics modeling. In particular, we have extended and analyzed FETI-DP and BDDC algorithms; these iterative solvers were first introduced and studied by Charbel Farhat and his collaborators, see [11, 45, 12], and by Clark Dohrmann of SANDIA, Albuquerque, see [43, 2, 1], respectively. These two closely related families of methods are of particular interest since they are used more extensively than other iterative substructuring methods to solve very large and difficult problems. Thus, the FETI algorithms are part of the SALINAS system developed by the SANDIA National Laboratories for very large scale computations, and as already noted, BDDC was first developed by a SANDIA scientist, Dr. Clark Dohrmann. The FETI algorithms are also making inroads in commercial engineering software systems. We also note that the analysis of these algorithms poses very real mathematical challenges. The success in developing this theory has, in several instances, led to significant improvements in the performance of these algorithms. A very desirable feature of these iterative substructuring and other domain decomposition algorithms is that they respect the memory hierarchy of modern parallel and distributed computing systems, which is essential for approaching peak floating point performance. The development of improved methods, together with more powerful computer systems, is making it possible to carry out simulations in three dimensions, with quite high resolution, relatively easily. This work is supported by high quality software systems, such as Argonne's PETSc library, which facilitates code development as well as the access to a variety of parallel and distributed computer systems. The success in finding scalable and robust domain decomposition algorithms for very large number of processors and very large finite element problems is, e.g., illustrated in [24, 25, 26]. This work is based on [29, 31]. Our work over these five and half years has, in our opinion, helped advance the knowledge of domain decomposition methods significantly. We see these methods as providing valuable alternatives to other iterative methods, in particular, those based on multi-grid. In our opinion, our accomplishments also match the goals of the TOPS project quite closely.
A GPU-based calculation using the three-dimensional FDTD method for electromagnetic field analysis.
Nagaoka, Tomoaki; Watanabe, Soichi
2010-01-01
Numerical simulations with numerical human models using the finite-difference time domain (FDTD) method have recently been performed frequently in a number of fields in biomedical engineering. However, the FDTD calculation runs too slowly. We focus, therefore, on general purpose programming on the graphics processing unit (GPGPU). The three-dimensional FDTD method was implemented on the GPU using the Compute Unified Device Architecture (CUDA). In this study, we used the NVIDIA Tesla C1060 as a GPGPU board. The performance of the GPU is evaluated in comparison with the performance of a conventional CPU and a vector supercomputer. The results indicate that three-dimensional FDTD calculations using a GPU can significantly reduce run time in comparison with a conventional CPU, even for a native GPU implementation of the three-dimensional FDTD method, while the GPU/CPU speed ratio varies with the calculation domain and thread block size.
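The core of such a calculation is a simple explicit stencil update, which is what makes it map well onto a GPU. The sketch below shows one leapfrog update of the 3-D scalar wave equation in NumPy as a stand-in for the data-parallel kernel; it is not the study's CUDA code, and the grid size, source, and CFL factor are assumptions.

```python
# Minimal sketch (not the study's CUDA code): the core stencil of an explicit
# FDTD-style update for the 3-D scalar wave equation. On a GPU each cell
# update would typically map to one thread; NumPy vectorization stands in
# for that data parallelism here.
import numpy as np

n, c, dx = 64, 1.0, 1.0e-3
dt = 0.5 * dx / (c * np.sqrt(3.0))        # CFL-limited time step
coef = (c * dt / dx) ** 2

u_prev = np.zeros((n, n, n))
u_curr = np.zeros((n, n, n))
u_curr[n // 2, n // 2, n // 2] = 1.0      # impulsive source in the center

for _ in range(50):
    lap = (u_curr[2:, 1:-1, 1:-1] + u_curr[:-2, 1:-1, 1:-1] +
           u_curr[1:-1, 2:, 1:-1] + u_curr[1:-1, :-2, 1:-1] +
           u_curr[1:-1, 1:-1, 2:] + u_curr[1:-1, 1:-1, :-2] -
           6.0 * u_curr[1:-1, 1:-1, 1:-1])
    u_next = np.copy(u_curr)
    u_next[1:-1, 1:-1, 1:-1] = (2.0 * u_curr[1:-1, 1:-1, 1:-1]
                                - u_prev[1:-1, 1:-1, 1:-1] + coef * lap)
    u_prev, u_curr = u_curr, u_next

print(np.abs(u_curr).max())
```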
Bright, T.J.
2013-01-01
Summary Background Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate these methods to the development of an antibiotic clinical decision support system (CDS). Methods The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link with the identified informatics solutions and functional requirements. Results The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion This study presents one example explicating content analysis of focus group data and the analysis process to functional requirements from narrative data. Illustration of this 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586
Walkup, Ward G; Kennedy, Mary B
2014-06-01
PDZ (PSD-95, DiscsLarge, ZO1) domains function in nature as protein binding domains within scaffold and membrane-associated proteins. They comprise ∼90 residues and make specific, high affinity interactions with complementary C-terminal peptide sequences, with other PDZ domains, and with phospholipids. We hypothesized that the specific, strong interactions of PDZ domains with their ligands would make them well suited for use in affinity chromatography. Here we describe a novel affinity chromatography method applicable for the purification of proteins that contain PDZ domain-binding ligands, either naturally or introduced by genetic engineering. We created a series of affinity resins comprised of PDZ domains from the scaffold protein PSD-95, or from neuronal nitric oxide synthase (nNOS), coupled to solid supports. We used them to purify heterologously expressed neuronal proteins or protein domains containing endogenous PDZ domain ligands, eluting the proteins with free PDZ domain peptide ligands. We show that Proteins of Interest (POIs) lacking endogenous PDZ domain ligands can be engineered as fusion products containing C-terminal PDZ domain ligand peptides or internal, N- or C-terminal PDZ domains and then can be purified by the same method. Using this method, we recovered recombinant GFP fused to a PDZ domain ligand in active form as verified by fluorescence yield. Similarly, chloramphenicol acetyltransferase (CAT) and β-Galactosidase (LacZ) fused to a C-terminal PDZ domain ligand or an N-terminal PDZ domain were purified in active form as assessed by enzymatic assay. In general, PDZ domains and ligands derived from PSD-95 were superior to those from nNOS for this method. PDZ Domain Affinity Chromatography promises to be a versatile and effective method for purification of a wide variety of natural and recombinant proteins. Copyright © 2014 Elsevier Inc. All rights reserved.
Transitioning Domain Analysis: An Industry Experience.
1996-06-01
Implementation: Analysis of Operator Services' Requirements Process; Preliminary Planning for FODA Training by SEI...an academic and industry partnership took feature oriented domain analysis ( FODA ) from a methodology that is still being defined to a well-documented...to pilot the use of the Software Engineering Institute (SEI) domain analysis methodology known as feature-oriented domain analysis ( FODA ). Supported
An equivalent domain integral method for three-dimensional mixed-mode fracture problems
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Raju, I. S.
1991-01-01
A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front, and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III, representing the severity of the crack front in the three modes of deformation. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, and the results were found to agree well with those available in the literature. The method lends itself to use as a post-processing subroutine in a general purpose finite element program.
An equivalent domain integral method for three-dimensional mixed-mode fracture problems
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Raju, I. S.
1992-01-01
A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front, and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III, representing the severity of the crack front in the three modes of deformation. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, and the results were found to agree well with those available in the literature. The method lends itself to use as a post-processing subroutine in a general purpose finite element program.
NASA Astrophysics Data System (ADS)
Pandey, Rishi Kumar; Mishra, Hradyesh Kumar
2017-11-01
In this paper, a semi-analytic numerical technique for the solution of the time-space fractional telegraph equation is applied. This numerical technique is based on coupling the homotopy analysis method with the Sumudu transform. It shows a clear advantage over mesh-based methods such as the finite difference method, and also over polynomial methods such as the perturbation and Adomian decomposition methods. It easily transforms the complex fractional-order derivatives into the simple time domain and allows the results to be interpreted in the same sense.
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
Gap analysis: a method to assess core competency development in the curriculum.
Fater, Kerry H
2013-01-01
To determine the extent to which safety and quality improvement core competency development occurs in an undergraduate nursing program. Rapid change and the increased complexity of health care environments demand that health care professionals are adequately prepared to provide high quality, safe care. A gap analysis compared the present state of competency development to a desirable (ideal) state. The core competencies, Nurse of the Future Nursing Core Competencies, reflect the ideal state and represent minimal expectations for entry into practice from pre-licensure programs. Findings from the gap analysis suggest significant strengths in numerous competency domains, deficiencies in two competency domains, and areas of redundancy in the curriculum. Gap analysis provides valuable data to direct curriculum revision. Opportunities for competency development were identified, and strategies were created jointly with the practice partner, thereby enhancing the relevant knowledge, attitudes, and skills nurses need for clinical practice currently and in the future.
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
Data-Flow Based Model Analysis
NASA Technical Reports Server (NTRS)
Saad, Christian; Bauer, Bernhard
2010-01-01
The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases, and it has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information are still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph makes it possible to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
Patterning by area selective oxidation
Nam, Chang-Yong; Kamcev, Jovan; Black, Charles T.; Grubbs, Robert
2015-12-29
Technologies are described for methods for producing a pattern of a material on a substrate. The methods may comprise receiving a patterned block copolymer on a substrate. The patterned block copolymer may include a first polymer block domain and a second polymer block domain. The method may comprise exposing the patterned block copolymer to a light effective to oxidize the first polymer block domain in the patterned block copolymer. The method may comprise applying a precursor to the block copolymer. The precursor may infuse into the oxidized first polymer block domain and generate the material. The method may comprise applying a removal agent to the block copolymer. The removal agent may be effective to remove the first polymer block domain and the second polymer block domain from the substrate, and may not be effective to remove the material in the oxidized first polymer block domain.
Accuracy of the domain method for the material derivative approach to shape design sensitivities
NASA Technical Reports Server (NTRS)
Yang, R. J.; Botkin, M. E.
1987-01-01
Numerical accuracy for the boundary and domain methods of the material derivative approach to shape design sensitivities is investigated through the use of mesh refinement. The results show that the domain method is generally more accurate than the boundary method, using the finite element technique. It is also shown that the domain method is equivalent, under certain assumptions, to the implicit differentiation approach not only theoretically but also numerically.
Chang, Shan; Zhang, Da-Wei; Xu, Lei; Wan, Hua; Hou, Ting-Jun; Kong, Ren
2016-11-01
RNA-binding protein with multiple splicing (RBPMS) is critical for axon guidance, smooth muscle plasticity, and regulation of cancer cell proliferation and migration. Recently, different states of the RNA-recognition motif (RRM) of RBPMS, one in its free form and another in complex with CAC-containing RNA, were determined by X-ray crystallography. In this article, the free RRM domain, its wild type complex and 2 mutant complex systems are studied by molecular dynamics (MD) simulations. Through comparison of the free RRM domain and complex systems, it is found that RNA binding helps stabilize the RNA-binding interface of the RRM domain, especially the C-terminal loop. Although both R38Q and T103A/K104A mutations reduce the binding affinity of the RRM domain and RNA, the underlying mechanisms are different. Principal component analysis (PCA) and Molecular mechanics Poisson-Boltzmann surface area (MM/PBSA) methods were used to explore the dynamical and recognition mechanisms of the RRM domain and RNA. The R38Q mutation is positioned on the homodimerization interface and mainly induces large fluctuations of the RRM domains. This mutation does not act directly on the RNA-binding interface, but some interfacial hydrogen bonds are weakened. In contrast, the T103A/K104A mutations are located on the RNA-binding interface of the RRM domain. These mutations break most of the high-occupancy hydrogen bonds in the RNA-binding interface. Meanwhile, the key interfacial residues lose their favorable energy contributions upon RNA binding. The ranking of calculated binding energies in the 3 complex systems is consistent with that of the experimental binding affinities. These results will be helpful in understanding the RNA recognition mechanisms of the RRM domain.
Chang, Shan; Zhang, Da-Wei; Xu, Lei; Wan, Hua; Hou, Ting-Jun; Kong, Ren
2016-01-01
RNA-binding protein with multiple splicing (RBPMS) is critical for axon guidance, smooth muscle plasticity, and regulation of cancer cell proliferation and migration. Recently, different states of the RNA-recognition motif (RRM) of RBPMS, one in its free form and another in complex with CAC-containing RNA, were determined by X-ray crystallography. In this article, the free RRM domain, its wild type complex and 2 mutant complex systems are studied by molecular dynamics (MD) simulations. Through comparison of the free RRM domain and complex systems, it is found that RNA binding helps stabilize the RNA-binding interface of the RRM domain, especially the C-terminal loop. Although both R38Q and T103A/K104A mutations reduce the binding affinity of the RRM domain and RNA, the underlying mechanisms are different. Principal component analysis (PCA) and Molecular mechanics Poisson-Boltzmann surface area (MM/PBSA) methods were used to explore the dynamical and recognition mechanisms of the RRM domain and RNA. The R38Q mutation is positioned on the homodimerization interface and mainly induces large fluctuations of the RRM domains. This mutation does not act directly on the RNA-binding interface, but some interfacial hydrogen bonds are weakened. In contrast, the T103A/K104A mutations are located on the RNA-binding interface of the RRM domain. These mutations break most of the high-occupancy hydrogen bonds in the RNA-binding interface. Meanwhile, the key interfacial residues lose their favorable energy contributions upon RNA binding. The ranking of calculated binding energies in the 3 complex systems is consistent with that of the experimental binding affinities. These results will be helpful in understanding the RNA recognition mechanisms of the RRM domain. PMID:27592836
Statistical analysis of life history calendar data.
Eerola, Mervi; Helske, Satu
2016-04-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare the model-based probabilistic event history analysis and the model-free data mining method, sequence analysis. In event history analysis, we estimate instead of transition hazards the cumulative prediction probabilities of life events in the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns while event history analysis is needed for causal inquiries. © The Author(s) 2012.
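One of the dissimilarity metrics commonly compared in sequence analysis is optimal matching, an edit distance with substitution and insertion/deletion costs. The sketch below is an illustrative toy implementation with invented state sequences and costs; it is not the analysis of the article.

```python
# Minimal sketch of an optimal-matching dissimilarity (an edit distance with
# substitution and indel costs), one of the metrics typically compared in
# sequence analysis. The toy state sequences and costs are illustrative only.
def optimal_matching(seq_a, seq_b, sub_cost=2.0, indel_cost=1.0):
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel_cost
    for j in range(1, m + 1):
        d[0][j] = j * indel_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j - 1] + sub,      # substitution / match
                          d[i - 1][j] + indel_cost,   # deletion
                          d[i][j - 1] + indel_cost)   # insertion
    return d[n][m]

# Two toy yearly life-course trajectories (S = studies, E = employment, P = parenthood)
a = list("SSSEEEEPPP")
b = list("SSEEEEEEPP")
print(optimal_matching(a, b))
```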
Identification of YTH Domain-Containing Proteins as the Readers for N1-Methyladenosine in RNA.
Dai, Xiaoxia; Wang, Tianlu; Gonzalez, Gwendolyn; Wang, Yinsheng
2018-06-05
N1-methyladenosine (m1A) is an important post-transcriptional modification in RNA; however, the exact biological role of m1A remains to be determined. By employing a quantitative proteomics method, we identified multiple putative protein readers of m1A in RNA, including several YTH domain family proteins. We showed that YTHDF1-3 and YTHDC1, but not YTHDC2, could bind directly to m1A in RNA. We also found that Trp432 in YTHDF2, a conserved residue in the hydrophobic pocket of the YTH domain that is necessary for its binding to N6-methyladenosine (m6A), is required for its recognition of m1A. An analysis of previously published data revealed transcriptome-wide colocalization of YTH domain-containing proteins and m1A sites in HeLa cells, suggesting that YTH domain-containing proteins can bind to m1A in cells. Together, our results uncovered YTH domain-containing proteins as readers for m1A in RNA and provided new insight into the functions of m1A in RNA biology.
Superbinder SH2 domains act as antagonists of cell signaling.
Kaneko, Tomonori; Huang, Haiming; Cao, Xuan; Li, Xing; Li, Chengjun; Voss, Courtney; Sidhu, Sachdev S; Li, Shawn S C
2012-09-25
Protein-ligand interactions mediated by modular domains, which often play important roles in regulating cellular functions, are generally of moderate affinities. We examined the Src homology 2 (SH2) domain, a modular domain that recognizes phosphorylated tyrosine (pTyr) residues, to investigate how the binding affinity of a modular domain for its ligand influences the structure and cellular function of the protein. We used the phage display method to perform directed evolution of the pTyr-binding residues in the SH2 domain of the tyrosine kinase Fyn and identified three amino acid substitutions that critically affected binding. We generated three SH2 domain triple-point mutants that were "superbinders" with much higher affinities for pTyr-containing peptides than the natural domain. Crystallographic analysis of one of these superbinders revealed that the superbinder SH2 domain recognized the pTyr moiety in a bipartite binding mode: A hydrophobic surface encompassed the phenyl ring, and a positively charged site engaged the phosphate. When expressed in mammalian cells, the superbinder SH2 domains blocked epidermal growth factor receptor signaling and inhibited anchorage-independent cell proliferation, suggesting that pTyr superbinders might be explored for therapeutic applications and useful as biological research tools. Although the SH2 domain fold can support much higher affinity for its ligand than is observed in nature, our results suggest that natural SH2 domains are not optimized for ligand binding but for specificity and flexibility, which are likely properties important for their function in signaling and regulatory processes.
Motor current signature analysis method for diagnosing motor operated devices
Haynes, Howard D.; Eissenberg, David M.
1990-01-01
A motor current noise signature analysis method and apparatus for remotely monitoring the operating characteristics of an electric motor-operated device such as a motor-operated valve. Frequency-domain signal analysis techniques are applied to a conditioned motor current signal to distinctly identify various operating parameters of the motor-driven device from the motor current signature. The signature may be recorded and compared with subsequent signatures to detect operating abnormalities and degradation of the device. This diagnostic method does not require special equipment to be installed on the motor-operated device, and the current sensing may be performed at remote control locations, e.g., where the motor-operated devices are used in inaccessible or hostile environments.
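A minimal sketch of the frequency-domain step described above, assuming the motor current has already been sampled and conditioned: the amplitude spectrum of the current record serves as the signature, and a later record is compared against a stored baseline. The windowing choice and deviation metric are illustrative assumptions.

    import numpy as np

    def current_signature(current, fs):
        """Return (frequencies, amplitude spectrum) of a motor-current record."""
        current = current - np.mean(current)                 # remove the DC component
        window = np.hanning(len(current))
        spectrum = np.abs(np.fft.rfft(current * window))
        freqs = np.fft.rfftfreq(len(current), d=1.0 / fs)
        return freqs, spectrum

    def signature_deviation(baseline, later):
        """Crude degradation indicator: relative change between two signatures."""
        return np.linalg.norm(later - baseline) / np.linalg.norm(baseline)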
On the unsupervised analysis of domain-specific Chinese texts
Deng, Ke; Bol, Peter K.; Li, Kate J.; Liu, Jun S.
2016-01-01
With the growing availability of digitized text data both publicly and privately, there is a great need for effective computational tools to automatically extract information from texts. Because the Chinese language differs most significantly from alphabet-based languages in not specifying word boundaries, most existing Chinese text-mining methods require a prespecified vocabulary and/or a large relevant training corpus, which may not be available in some applications. We introduce an unsupervised method, top-down word discovery and segmentation (TopWORDS), for simultaneously discovering and segmenting words and phrases from large volumes of unstructured Chinese texts, and propose ways to order discovered words and conduct higher-level context analyses. TopWORDS is particularly useful for mining online and domain-specific texts where the underlying vocabulary is unknown or the texts of interest differ significantly from available training corpora. When outputs from TopWORDS are fed into context analysis tools such as topic modeling, word embedding, and association pattern finding, the results are as good as or better than those obtained using the outputs of a supervised segmentation method. PMID:27185919
Transient thermal stresses of work roll by coupled thermoelasticity
NASA Astrophysics Data System (ADS)
Lai, W. B.; Chen, T. C.; Weng, C. I.
1991-01-01
A numerical method, based on a two-dimensional plane strain model, is developed to predict the transient responses (including the distributions of temperature, thermal deformation, and thermal stress) of a work roll during strip rolling by coupled thermoelasticity. The method consists of discretizing the space domain of the problem by the finite element method first, and then treating the time domain by implicit time integration techniques. In order to avoid the difficulty in analysis due to relative movement between the work roll and its thermal boundary, the energy equation is formulated with respect to a fixed Eulerian reference frame. The effect of the thermoelastic coupling term, which is generally disregarded in strip rolling, can be considered and assessed. The influences of some important process parameters, such as the rotational speed of the roll and the intensity of the heat flux, on the transient solutions are also included and discussed. Furthermore, since the stress history at any point of the roll in both transient and steady state can be accurately evaluated, it is possible to perform thermal fatigue analysis for the roll using these data.
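A minimal sketch of the solution strategy the abstract describes (spatial discretization followed by implicit time integration), applied here to a 1D transient conduction problem rather than the full coupled roll model; the grid, diffusivity, time step, and boundary temperatures are illustrative assumptions.

    import numpy as np

    def backward_euler_heat(n=50, length=1.0, alpha=1.0e-5, dt=0.1, steps=100):
        """Implicit (backward Euler) time stepping for 1D transient conduction."""
        dx = length / (n - 1)
        r = alpha * dt / dx**2
        # Assemble the system matrix of (I - dt * alpha * Laplacian).
        A = np.zeros((n, n))
        for i in range(1, n - 1):
            A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1.0 + 2.0 * r, -r
        A[0, 0] = A[-1, -1] = 1.0                    # fixed-temperature boundaries
        T = np.zeros(n)
        for _ in range(steps):
            rhs = T.copy()
            rhs[0], rhs[-1] = 100.0, 0.0             # heated surface / cooled side (illustrative)
            T = np.linalg.solve(A, rhs)              # unconditionally stable update
        return T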
NASA Astrophysics Data System (ADS)
Huan, Huiting; Mandelis, Andreas; Liu, Lixian
2018-04-01
Determining and keeping track of a material's mechanical performance is very important for safety in the aerospace industry. The mechanical strength of alloy materials is precisely quantified in terms of its stress-strain relation. It has been proven that frequency-domain photothermoacoustic (FD-PTA) techniques are effective methods for characterizing the stress-strain relation of metallic alloys. PTA methodologies include photothermal (PT) diffusion and laser thermoelastic photoacoustic ultrasound (PAUS) generation which must be separately discussed because the relevant frequency ranges and signal detection principles are widely different. In this paper, a detailed theoretical analysis of the connection between thermoelastic parameters and stress/strain tensor is presented with respect to FD-PTA nondestructive testing. Based on the theoretical model, a finite element method (FEM) was further implemented to simulate the PT and PAUS signals at very different frequency ranges as an important analysis tool of experimental data. The change in the stress-strain relation has an impact on both thermal and elastic properties, verified by FEM and results/signals from both PT and PAUS experiments.
A method of directly extracting multiwave angle-domain common-image gathers
NASA Astrophysics Data System (ADS)
Han, Jianguang; Wang, Yun
2017-10-01
Angle-domain common-image gathers (ADCIGs) can provide an effective way for migration velocity analysis and amplitude-versus-angle analysis in oil-gas seismic exploration. On the basis of multi-component Gaussian beam prestack depth migration (GB-PSDM), an alternative method of directly extracting multiwave ADCIGs is presented in this paper. We first introduce multi-component GB-PSDM, where wavefield separation is performed to obtain separated PP- and PS-wave seismic records before migration imaging of multiwave seismic data. Then, the principle of extracting PP- and PS-ADCIGs using GB-PSDM is presented. The propagation angle can be obtained from the real-valued travel time of the Gaussian beam in the course of GB-PSDM, which can be used to calculate the incidence and reflection angles. Two kinds of ADCIGs can be extracted for the PS-wave: one based on the P-wave incidence angle and the other based on the S-wave reflection angle. In this paper, we use the incident angle to plot the ADCIGs for both PP- and PS-waves. Finally, tests on synthetic examples show that the method introduced here is accurate and effective.
Semi-automated contour recognition using DICOMautomaton
NASA Astrophysics Data System (ADS)
Clark, H.; Wu, J.; Moiseenko, V.; Lee, R.; Gill, B.; Duzenli, C.; Thomas, S.
2014-03-01
Purpose: A system has been developed which recognizes and classifies Digital Imaging and Communication in Medicine contour data with minimal human intervention. It allows researchers to overcome obstacles which tax analysis and mining systems, including inconsistent naming conventions and differences in data age or resolution. Methods: Lexicographic and geometric analysis is used for recognition. Well-known lexicographic methods implemented include Levenshtein-Damerau, bag-of-characters, Double Metaphone, Soundex, and (word and character)-N-grams. Geometrical implementations include 3D Fourier Descriptors, probability spheres, boolean overlap, simple feature comparison (e.g. eccentricity, volume) and rule-based techniques. Both analyses implement custom, domain-specific modules (e.g. emphasis differentiating left/right organ variants). Contour labels from 60 head and neck patients are used for cross-validation. Results: Mixed-lexicographical methods show an effective improvement in more than 10% of recognition attempts compared with a pure Levenshtein-Damerau approach when withholding 70% of the lexicon. Domain-specific and geometrical techniques further boost performance. Conclusions: DICOMautomaton allows users to recognize contours semi-automatically. As usage increases and the lexicon is filled with additional structures, performance improves, increasing the overall utility of the system.
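As an illustration of the lexicographic side of the recognition pipeline, the sketch below implements the Damerau-Levenshtein distance (one of the methods listed above) and applies it to two hypothetical contour labels; the label strings are illustrative only.

    def damerau_levenshtein(a, b):
        """Edit distance allowing insertions, deletions, substitutions and adjacent transpositions."""
        d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i in range(len(a) + 1):
            d[i][0] = i
        for j in range(len(b) + 1):
            d[0][j] = j
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,             # deletion
                              d[i][j - 1] + 1,             # insertion
                              d[i - 1][j - 1] + cost)      # substitution
                if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                    d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)   # transposition
        return d[len(a)][len(b)]

    print(damerau_levenshtein("lt_parotid", "parotid_lt"))   # hypothetical contour labels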
2017-01-01
Background Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, their scope of work characterized patients as biomedical machines rather than as actors involved in their own care. Objective An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. Methods The methods to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Results Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but were combined into a comprehensive, systematic, and contextual overview. The model is helpful for understanding user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Conclusion Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. Our hierarchy is the first in a future set that could explore new treatment paradigms. Future hierarchies could model the patient as a controller and could be useful for mobile app development. PMID:28754650
A Fast Alignment-Free Approach for De Novo Detection of Protein Conserved Regions
Abnousi, Armen; Broschat, Shira L.; Kalyanaraman, Ananth
2016-01-01
Background Identifying conserved regions in protein sequences is a fundamental operation, occurring in numerous sequence-driven analysis pipelines. It is used as a way to decode domain-rich regions within proteins, to compute protein clusters, to annotate sequence function, and to compute evolutionary relationships among protein sequences. A number of approaches exist for identifying and characterizing protein families based on their domains, and because domains represent conserved portions of a protein sequence, the primary computation involved in protein family characterization is identification of such conserved regions. However, identifying conserved regions from large collections (millions) of protein sequences presents significant challenges. Methods In this paper we present a new, alignment-free method for detecting conserved regions in protein sequences called NADDA (No-Alignment Domain Detection Algorithm). Our method exploits the abundance of exact-matching short subsequences (k-mers) to quickly detect conserved regions, and the power of machine learning is used to improve the prediction accuracy of detection. We present a parallel implementation of NADDA using the MapReduce framework and show that our method is highly scalable. Results We have compared NADDA with the Pfam and InterPro databases. For known domains annotated by Pfam, accuracy is 83%, sensitivity 96%, and specificity 44%. For sequences with new domains not present in the training set, an average accuracy of 63% is achieved when compared to Pfam. A boost in results in comparison with InterPro demonstrates the ability of NADDA to capture conserved regions beyond those present in Pfam. We have also compared NADDA with ADDA and MKDOM2, assuming Pfam as ground truth. On average, NADDA shows comparable accuracy and more balanced sensitivity and specificity and, being alignment-free, is significantly faster. Excluding the one-time cost of training, runtimes on a single processor were 49 s, 10,566 s, and 456 s for NADDA, ADDA, and MKDOM2, respectively, for a data set comprising approximately 2500 sequences. PMID:27552220
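As a rough sketch of the k-mer idea behind an alignment-free detector like NADDA, the fragment below counts in how many sequences each exact k-mer occurs and flags positions covered by widely shared k-mers as candidate conserved regions; the k-mer length and threshold are illustrative, and the published algorithm additionally uses a trained classifier, which is omitted here.

    from collections import Counter

    def conserved_positions(sequences, k=5, min_count=3):
        """Flag positions covered by k-mers shared by at least min_count sequences."""
        counts = Counter()
        for seq in sequences:
            for kmer in {seq[i:i + k] for i in range(len(seq) - k + 1)}:
                counts[kmer] += 1                        # count sequences, not occurrences
        flags = []
        for seq in sequences:
            mask = [False] * len(seq)
            for i in range(len(seq) - k + 1):
                if counts[seq[i:i + k]] >= min_count:
                    for j in range(i, i + k):
                        mask[j] = True                   # position lies in a shared k-mer
            flags.append(mask)
        return flags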
Spectral element method for elastic and acoustic waves in frequency domain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Linlin; Zhou, Yuanguo; Wang, Jia-Min
Numerical techniques in the time domain are widespread in seismic and acoustic modeling. In some applications, however, frequency-domain techniques can be advantageous over the time-domain approach when narrow-band results are desired, especially if multiple sources can be handled more conveniently in the frequency domain. Moreover, the medium attenuation effects can be more accurately and conveniently modeled in the frequency domain. In this paper, we present a spectral-element method (SEM) in the frequency domain to simulate elastic and acoustic waves in anisotropic, heterogeneous, and lossy media. The SEM is based upon the finite-element framework and has exponential convergence because of the use of GLL basis functions. The anisotropic perfectly matched layer is employed to truncate the boundary for unbounded problems. Compared with the conventional finite-element method, the number of unknowns in the SEM is significantly reduced, and higher order accuracy is obtained due to its spectral accuracy. To account for the acoustic-solid interaction, the domain decomposition method (DDM) based upon the discontinuous Galerkin spectral-element method is proposed. Numerical experiments show the proposed method can be an efficient alternative for accurate calculation of elastic and acoustic waves in the frequency domain.
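A small sketch related to the spectral-element discretization mentioned above: the Gauss-Lobatto-Legendre (GLL) collocation points on the reference element are the endpoints plus the roots of the derivative of the Legendre polynomial of the chosen order. The polynomial order below is illustrative.

    import numpy as np
    from numpy.polynomial import legendre

    def gll_points(order):
        """Gauss-Lobatto-Legendre nodes on [-1, 1] for a given polynomial order."""
        coeffs = np.zeros(order + 1)
        coeffs[-1] = 1.0                                 # P_order in the Legendre basis
        interior = legendre.legroots(legendre.legder(coeffs))
        return np.concatenate(([-1.0], np.sort(interior), [1.0]))

    print(gll_points(4))                                 # the 5 nodes of the 4th-order rule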
ERIC Educational Resources Information Center
Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.
This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…
Sensitivity Analysis of Hydraulic Head to Locations of Model Boundaries
Lu, Zhiming
2018-01-30
Sensitivity analysis is an important component of many modeling activities in hydrology. Numerous studies have been conducted in calculating various sensitivities. Most of these sensitivity analyses focus on the sensitivity of state variables (e.g., hydraulic head) to parameters representing medium properties such as hydraulic conductivity, or to prescribed values such as constant head or flux at boundaries, while few studies address the sensitivity of the state variables to shape parameters or design parameters that control the model domain. Instead, these shape parameters are typically assumed to be known in the model. In this study, based on the flow equation, we derive the equation (and its associated initial and boundary conditions) for the sensitivity of hydraulic head to shape parameters using the continuous sensitivity equation (CSE) approach. These sensitivity equations can be solved numerically in general or analytically in some simplified cases. Finally, the approach has been demonstrated through two examples, and the results compare favorably with those from analytical solutions or numerical finite difference methods with perturbed model domains, while the numerical shortcomings of the finite difference method are avoided.
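As a small worked check in the spirit of the comparison mentioned above, consider 1D steady flow between two fixed-head boundaries: the head is h(x) = h0 + (hL - h0) x / L, so its sensitivity to the location L of the right boundary is dh/dL = -(hL - h0) x / L^2, which can be verified against a finite-difference estimate obtained by perturbing the domain length. The heads and geometry below are illustrative, not the study's test cases.

    def head(x, L, h0=10.0, hL=5.0):
        """Steady 1D head with fixed heads h0 at x = 0 and hL at x = L."""
        return h0 + (hL - h0) * x / L

    def analytic_sensitivity(x, L, h0=10.0, hL=5.0):
        """Closed-form dh/dL for the same problem."""
        return -(hL - h0) * x / L**2

    def fd_sensitivity(x, L, dL=1.0e-4):
        """Finite-difference estimate from perturbing the boundary location."""
        return (head(x, L + dL) - head(x, L - dL)) / (2.0 * dL)

    x, L = 60.0, 100.0
    print(analytic_sensitivity(x, L), fd_sensitivity(x, L))    # both approx. 0.03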
Computational domain discretization in numerical analysis of flow within granular materials
NASA Astrophysics Data System (ADS)
Sosnowski, Marcin
2018-06-01
The discretization of the computational domain is a crucial step in Computational Fluid Dynamics (CFD) because it influences not only the numerical stability of the analysed model but also the agreement between the obtained results and real data. Modelling flow in packed beds of granular materials is a very challenging task in terms of discretization due to the existence of narrow spaces between spherical granules contacting tangentially at a single point. The standard approach to this issue results in a low-quality mesh and, in consequence, unreliable results. Therefore the common method is to reduce the diameter of the modelled granules in order to eliminate the single-point contact between the individual granules. The drawback of this method is that it distorts, among other things, the flow and the contact heat resistance. Therefore an innovative method is proposed in the paper: the single-point contact is extended to a cylinder-shaped contact volume. Such an approach eliminates the low-quality mesh elements and at the same time introduces only slight distortion to the flow as well as to the contact heat transfer. The performed analysis of numerous test cases proves the great potential of the proposed method for meshing packed beds of granular materials.
Structural domains and main-chain flexibility in prion proteins.
Blinov, N; Berjanskii, M; Wishart, D S; Stepanova, M
2009-02-24
In this study we describe a novel approach to define structural domains and to characterize the local flexibility in both human and chicken prion proteins. The approach we use is based on a comprehensive theory of collective dynamics in proteins that was recently developed. This method determines the essential collective coordinates, which can be found from molecular dynamics trajectories via principal component analysis. Under this particular framework, we are able to identify the domains where atoms move coherently while at the same time determining the local main-chain flexibility for each residue. We have verified this approach by comparing our results for the predicted dynamic domain systems with the computed main-chain flexibility profiles and the NMR-derived random coil indexes for human and chicken prion proteins. The three sets of data show excellent agreement. Additionally, we demonstrate that the dynamic domains calculated in this fashion provide a highly sensitive measure of protein collective structure and dynamics. Furthermore, such an analysis is capable of revealing structural and dynamic properties of proteins that are inaccessible to the conventional assessment of secondary structure. Using the collective dynamics simulation approach described here along with high-temperature simulations of unfolding of the human prion protein, we have explored whether locations of relatively low stability could be identified where the unfolding process could potentially be facilitated. According to our analysis, the locations of relatively low stability may be associated with the beta-sheet formed by strands S1 and S2 and the adjacent loops, whereas helix HC appears to be a relatively stable part of the protein. We suggest that this kind of structural analysis may provide a useful background for a more quantitative assessment of potential routes of spontaneous misfolding in prion proteins.
Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer’s Disease
Cheng, Bo; Liu, Mingxia; Li, Zuoyong
2017-01-01
Recently, transfer learning has been successfully applied in early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most existing methods only use data from a single auxiliary domain, and thus cannot utilize the intrinsic useful correlation information from multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multiple auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary domain data to improve the learning performance in the target domain, compared with several state-of-the-art methods. PMID:27928657
Review on the Modeling of Electrostatic MEMS
Chuang, Wan-Chun; Lee, Hsin-Li; Chang, Pei-Zen; Hu, Yuh-Chung
2010-01-01
Electrostatic-driven microelectromechanical systems devices, in most cases, consist of couplings of such energy domains as electromechanics, optical electricity, thermoelectricity, and electromagnetism. Their nonlinear working state makes their analysis complex and complicated. This article introduces the physical model of pull-in voltage, dynamic characteristic analysis, air damping effect, reliability, numerical modeling method, and application of electrostatic-driven MEMS devices. PMID:22219707
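As a worked example of the pull-in voltage model referred to above, the sketch below uses the standard lumped parallel-plate approximation, V_PI = sqrt(8 k g^3 / (27 eps0 A)), where k is the suspension stiffness, g the initial gap, and A the electrode area; the numerical values are illustrative.

    import math

    EPS0 = 8.854e-12                 # vacuum permittivity, F/m

    def pull_in_voltage(k, gap, area):
        """Lumped-model pull-in voltage of a parallel-plate electrostatic actuator."""
        return math.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

    # Illustrative values: k = 10 N/m, 2 um gap, 100 um x 100 um electrode (~16 V)
    print(pull_in_voltage(10.0, 2.0e-6, 100.0e-6 * 100.0e-6))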
Koehler Leman, Julia; Bonneau, Richard
2018-04-03
Membrane proteins composed of soluble and membrane domains are often studied one domain at a time. However, to understand the biological function of entire protein systems and their interactions with each other and drugs, knowledge of full-length structures or models is required. Although few computational methods exist that could potentially be used to model full-length constructs of membrane proteins, none of these methods are perfectly suited for the problem at hand. Existing methods require an interface or knowledge of the relative orientations of the domains or are not designed for domain assembly, and none of them are developed for membrane proteins. Here we describe the first domain assembly protocol specifically designed for membrane proteins that assembles intra- and extracellular soluble domains and the transmembrane domain into models of the full-length membrane protein. Our protocol does not require an interface between the domains and samples possible domain orientations based on backbone dihedrals in the flexible linker regions, created via fragment insertion, while keeping the transmembrane domain fixed in the membrane. For five examples tested, our method mp_domain_assembly, implemented in RosettaMP, samples domain orientations close to the known structure and is best used in conjunction with experimental data to reduce the conformational search space.
DOE R&D Accomplishments Database
Chandonia, John-Marc; Hon, Gary; Walker, Nigel S.; Lo Conte, Loredana; Koehl, Patrice; Levitt, Michael; Brenner, Steven E.
2003-09-15
The ASTRAL compendium provides several databases and tools to aid in the analysis of protein structures, particularly through the use of their sequences. Partially derived from the SCOP database of protein structure domains, it includes sequences for each domain and other resources useful for studying these sequences and domain structures. The current release of ASTRAL contains 54,745 domains, more than three times as many as the initial release four years ago. ASTRAL has undergone major transformations in the past two years. In addition to several complete updates each year, ASTRAL is now updated on a weekly basis with preliminary classifications of domains from newly released PDB structures. These classifications are available as a stand-alone database, as well as available integrated into other ASTRAL databases such as representative subsets. To enhance the utility of ASTRAL to structural biologists, all SCOP domains are now made available as PDB-style coordinate files as well as sequences. In addition to sequences and representative subsets based on SCOP domains, sequences and subsets based on PDB chains are newly included in ASTRAL. Several search tools have been added to ASTRAL to facilitate retrieval of data by individual users and automated methods.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
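As a minimal sketch of the coverage-oriented fitness idea described above, the fragment below scores each test case by the number of distinct code blocks it exercises and mutates the fittest inputs to form the next generation; run_and_trace is a caller-supplied placeholder for the instrumented target, and the selection and mutation parameters are illustrative, not the paper's implementation.

    import random

    def fitness(test_case, run_and_trace):
        """Score a test case by the number of distinct code blocks it covers."""
        return len(run_and_trace(test_case))         # run_and_trace returns the set of covered blocks

    def next_generation(population, run_and_trace, keep=10, mutations=5):
        ranked = sorted(population, key=lambda t: fitness(t, run_and_trace), reverse=True)
        survivors = ranked[:keep]
        children = []
        for parent in survivors:
            for _ in range(mutations):
                child = bytearray(parent)
                child[random.randrange(len(child))] = random.randrange(256)   # single-byte mutation
                children.append(bytes(child))
        return survivors + children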
TDR method for determining IC parameters
NASA Astrophysics Data System (ADS)
Timoshenkov, V.; Rodionov, D.; Khlybov, A.
2016-12-01
Frequency-domain simulation is a widely used approach for determining integrated circuit parameters, and it can be found in most of the software tools used in the IC industry. Time-domain simulation has seen increasing use in recent years because of several advantages; in particular, it is applicable to the analysis of nonlinear and nonstationary systems, where frequency-domain analysis is not. The resolution of time-domain systems makes it possible to see heterogeneities at distances of about 1 mm and to determine their parameters and properties. The authors use an approach based on detecting signals reflected from heterogeneities: time-domain reflectometry (TDR). Field-effect transistor technology scaled to 30-60 nm gate lengths and 10 nm gate dielectrics, together with heterojunction bipolar transistors with 10-30 nm base widths, allows the fabrication of digital ICs with 20 GHz clock frequencies and RF ICs with bandwidths of tens of GHz. Such devices and operating speeds require signals to be transmitted over microwave lines. Local heterogeneities can appear inside the signal path due to connections between different parts of the signal lines (stripline to RF-connector pin, stripline to IC package pin). These heterogeneities distort signals and reduce the bandwidth of RF devices. Time-domain analysis of transmitted and reflected signals makes it possible to locate heterogeneities, determine their properties and parameters, and build equivalent circuits. Experimental results are provided and show the possibility of inductance and capacitance measurement up to 25 GHz. The measurements include results of signal-path analysis on an IC and on a printed circuit board (PCB) used for 12 GHz RF chips. The dielectric constant versus frequency was also measured up to 35 GHz.
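A minimal sketch of how a sampled TDR trace can be turned into an impedance profile along the signal path: each reflection-coefficient sample is converted to a local impedance with Z = Z0 (1 + rho) / (1 - rho), and the time axis is mapped to distance through the propagation velocity (halved for the round trip). The reference impedance and propagation speed are illustrative assumptions.

    import numpy as np

    def impedance_profile(reflection, dt, z0=50.0, velocity=2.0e8):
        """Convert a TDR reflection-coefficient record into impedance vs. distance.

        reflection: samples of rho(t) in (-1, 1); dt: sampling interval, s."""
        rho = np.clip(np.asarray(reflection, dtype=float), -0.999, 0.999)
        impedance = z0 * (1.0 + rho) / (1.0 - rho)
        distance = 0.5 * velocity * dt * np.arange(len(rho))   # round-trip correction
        return distance, impedance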
Choi, Seong Hee; Zhang, Yu; Jiang, Jack J.; Bless, Diane M.; Welham, Nathan V.
2011-01-01
Objective The primary goal of this study was to evaluate a nonlinear dynamic approach to the acoustic analysis of dysphonia associated with vocal fold scar and sulcus vocalis. Study Design Case-control study. Methods Acoustic voice samples from scar/sulcus patients and age/sex-matched controls were analyzed using correlation dimension (D2) and phase plots, time-domain based perturbation indices (jitter, shimmer, signal-to-noise ratio [SNR]), and an auditory-perceptual rating scheme. Signal typing was performed to identify samples with bifurcations and aperiodicity. Results Type 2 and 3 acoustic signals were highly represented in the scar/sulcus patient group. When data were analyzed irrespective of signal type, all perceptual and acoustic indices successfully distinguished scar/sulcus patients from controls. Removal of type 2 and 3 signals eliminated the previously identified differences between experimental groups for all acoustic indices except D2. The strongest perceptual-acoustic correlation in our dataset was observed for SNR; the weakest correlation was observed for D2. Conclusions These findings suggest that D2 is inferior to time-domain based perturbation measures for the analysis of dysphonia associated with scar/sulcus; however, time-domain based algorithms are inherently susceptible to inflation under highly aperiodic (i.e., type 2 and 3) signal conditions. Auditory-perceptual analysis, unhindered by signal aperiodicity, is therefore a robust strategy for distinguishing scar/sulcus patient voices from normal voices. Future acoustic analysis research in this area should consider alternative (e.g., frequency- and quefrency-domain based) measures alongside additional nonlinear approaches. PMID:22516315
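As a brief illustration of the time-domain perturbation indices referred to above, the sketch below computes local jitter and shimmer from already-extracted cycle-to-cycle period and peak-amplitude sequences; the extraction of those sequences from the voice waveform is assumed to have been done beforehand.

    import numpy as np

    def jitter_percent(periods):
        """Mean absolute difference of consecutive periods, relative to the mean period."""
        periods = np.asarray(periods, dtype=float)
        return 100.0 * np.mean(np.abs(np.diff(periods))) / np.mean(periods)

    def shimmer_percent(amplitudes):
        """Mean absolute difference of consecutive peak amplitudes, relative to the mean."""
        amplitudes = np.asarray(amplitudes, dtype=float)
        return 100.0 * np.mean(np.abs(np.diff(amplitudes))) / np.mean(amplitudes)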