An equivalent domain integral method in the two-dimensional analysis of mixed mode crack problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1990-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented.
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1989-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The total and product integrals consist of the sum of an area or domain integral and line integrals on the crack faces. The line integrals vanish only when the crack faces are traction free and the loading is either pure mode I or pure mode II or a combination of both with only the square-root singular term in the stress field. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all problems analyzed. The EDI method, when applied to a problem of an interface crack in two different materials, showed that the mode I and mode II components are domain dependent while the total integral is not. This behavior is caused by the presence of the oscillatory part of the singularity in bimaterial crack problems. The EDI method thus shows behavior similar to the virtual crack closure method for bimaterial problems.
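The conversion at the heart of the EDI method can be illustrated in isolation: a contour integral of a divergence-free field around the crack tip is traded for an area integral weighted by the gradient of a smooth function q that equals 1 on the inner contour and 0 on the outer one. The sketch below is not the authors' formulation; a generic stand-in field (whose flux through any enclosing contour is 2π) replaces the J-integral integrand, and the identity is checked numerically:

```python
import numpy as np

# Equivalent domain integral identity: for a divergence-free field F,
# the flux through a contour enclosing the origin equals
# -integral of F . grad(q) over the annulus where q ramps from 1 to 0.
r1, r2, h = 0.5, 1.5, 0.005
xs = np.arange(-2 + h / 2, 2, h)            # midpoint-rule grid
X, Y = np.meshgrid(xs, xs)
R = np.hypot(X, Y)
ramp = (R > r1) & (R < r2)                  # annulus where q ramps from 1 to 0
# q(r) = (r2 - r)/(r2 - r1) on the ramp, so grad q = -(x, y)/(r*(r2 - r1))
qx = np.where(ramp, -X / (R * (r2 - r1)), 0.0)
qy = np.where(ramp, -Y / (R * (r2 - r1)), 0.0)
Fx, Fy = X / R**2, Y / R**2                 # stand-in for the J integrand
flux = -np.sum(Fx * qx + Fy * qy) * h * h   # equivalent domain integral
print(flux)  # close to 2*pi
```

The contour integral never has to be evaluated on a path: only an area sum over one "layer" of cells appears, which is exactly why a single ring of elements suffices in the finite element setting.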
An equivalent domain integral for analysis of two-dimensional mixed mode problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1989-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies subjected to mixed mode loading is presented. The total and product integrals consist of the sum of an area or domain integral and line integrals on the crack faces. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all the problems analyzed.
Efficient integration method for fictitious domain approaches
NASA Astrophysics Data System (ADS)
Duczek, Sascha; Gabbert, Ulrich
2015-10-01
In the current article, we present an efficient and accurate numerical method for the integration of the system matrices in fictitious domain approaches such as the finite cell method (FCM). In the framework of the FCM, the physical domain is embedded in a geometrically larger domain of simple shape which is discretized using a regular Cartesian grid of cells. Therefore, a spacetree-based adaptive quadrature technique is normally deployed to resolve the geometry of the structure. Depending on the complexity of the structure under investigation, this method accounts for most of the computational effort. To reduce the computational costs of computing the system matrices, an efficient quadrature scheme based on the divergence theorem (Gauß-Ostrogradsky theorem) is proposed. Using this theorem, the dimension of the integral is reduced by one, i.e. instead of solving the integral over the whole domain, only its contour needs to be considered. In the current paper, we present the general principles of the integration method and its implementation. Results for several two-dimensional benchmark problems highlight its properties. The efficiency of the proposed method is compared to conventional spacetree-based integration techniques.
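The dimension-reduction idea can be made concrete with the simplest possible case. In the sketch below, the area and first moment of a polygonal physical domain are computed purely from boundary (edge) contributions via the divergence theorem; the per-edge formulas are the standard exact ones for polygons, not the paper's quadrature scheme:

```python
# Domain integrals over a polygon computed purely from its boundary,
# using the divergence theorem: the 2D integral becomes a sum over edges.
def boundary_moments(verts):
    """Return (area, integral of x dA) for a counter-clockwise polygon."""
    area = moment_x = 0.0
    n = len(verts)
    for k in range(n):
        x0, y0 = verts[k]
        x1, y1 = verts[(k + 1) % n]
        cross = x0 * y1 - x1 * y0
        area += cross / 2.0               # from  A = 0.5 * contour(x dy - y dx)
        moment_x += (x0 + x1) * cross / 6.0
    return area, moment_x

# L-shaped domain: unit square with the upper-right quarter removed
L_shape = [(0, 0), (1, 0), (1, 0.5), (0.5, 0.5), (0.5, 1), (0, 1)]
print(boundary_moments(L_shape))  # (0.75, 0.3125)
```

The same reduction applied to the FCM integrands replaces an adaptive area quadrature over a complicated physical domain by line quadrature along its contour.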
An extension of the finite cell method using boolean operations
NASA Astrophysics Data System (ADS)
Abedian, Alireza; Düster, Alexander
2017-05-01
In the finite cell method, the fictitious domain approach is combined with high-order finite elements. The geometry of the problem is taken into account by integrating the finite cell formulation over the physical domain to obtain the corresponding stiffness matrix and load vector. In this contribution, an extension of the FCM is presented wherein both the physical and fictitious domain of an element are simultaneously evaluated during the integration. In the proposed extension of the finite cell method, the contribution of the stiffness matrix over the fictitious domain is subtracted from the cell, resulting in the desired stiffness matrix which reflects the contribution of the physical domain only. This method results in an exponential rate of convergence for porous domain problems with a smooth solution and accurate integration. In addition, it reduces the computational cost, especially when applying adaptive integration schemes based on the quadtree/octree. Based on 2D and 3D problems of linear elastostatics, numerical examples serve to demonstrate the efficiency and accuracy of the proposed method.
Jenkins, Chris; Pierson, Lyndon G.
2016-10-25
Techniques and mechanism to selectively provide resource access to a functional domain of a platform. In an embodiment, the platform includes both a report domain to monitor the functional domain and a policy domain to identify, based on such monitoring, a transition of the functional domain from a first integrity level to a second integrity level. In response to a change in integrity level, the policy domain may configure the enforcement domain to enforce against the functional domain one or more resource accessibility rules corresponding to the second integrity level. In another embodiment, the policy domain automatically initiates operations in aid of transitioning the platform from the second integrity level to a higher integrity level.
A multi-domain spectral method for time-fractional differential equations
NASA Astrophysics Data System (ADS)
Chen, Feng; Xu, Qinwu; Hesthaven, Jan S.
2015-07-01
This paper proposes an approach for high-order time integration within a multi-domain setting for time-fractional differential equations. Since the kernel is singular or nearly singular, two main difficulties arise after the domain decomposition: how to properly account for the history/memory part and how to perform the integration accurately. To address these issues, we propose a novel hybrid approach for the numerical integration based on the combination of three-term-recurrence relations of Jacobi polynomials and high-order Gauss quadrature. The different approximations used in the hybrid approach are justified theoretically and through numerical examples. Based on this, we propose a new multi-domain spectral method for high-order accurate time integrations and study its stability properties by identifying the method as a generalized linear method. Numerical experiments confirm hp-convergence for both time-fractional differential equations and time-fractional partial differential equations.
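The two ingredients named in the abstract, three-term recurrences and high-order Gauss quadrature, can be sketched together in a few lines. The example below uses Legendre polynomials (the α = β = 0 Jacobi case) and NumPy's Gauss-Legendre rule to verify an orthogonality integral; it is a minimal illustration, not the paper's hybrid scheme:

```python
import numpy as np

def legendre_recurrence(n, x):
    """Evaluate P_0..P_n at points x via the three-term recurrence
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
    P = [np.ones_like(x), x]
    for k in range(1, n):
        P.append(((2 * k + 1) * x * P[k] - k * P[k - 1]) / (k + 1))
    return P

# High-order Gauss quadrature verifies the orthogonality relation
# integral_{-1}^{1} P_m P_n dx = 2/(2n+1) delta_mn.
x, w = np.polynomial.legendre.leggauss(16)
P = legendre_recurrence(5, x)
print(np.sum(w * P[3] * P[3]))  # 2/7
print(np.sum(w * P[2] * P[3]))  # 0 to machine precision
```

In the multi-domain setting, the recurrence supplies stable evaluations of the basis inside each element while the Gauss rule handles the (nearly) singular memory integrals element by element.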
Numerical time-domain electromagnetics based on finite-difference and convolution
NASA Astrophysics Data System (ADS)
Lin, Yuanqu
Time-domain methods possess a number of advantages over their frequency-domain counterparts for the solution of wideband, nonlinear, and time-varying electromagnetic scattering and radiation phenomena. Time domain integral equation (TDIE)-based methods, which incorporate the beneficial properties of the integral equation method, are thus well suited for solving broadband scattering problems for homogeneous scatterers. Widespread adoption of TDIE solvers has been retarded relative to other techniques by their inefficiency, inaccuracy and instability. Moreover, two-dimensional (2D) problems are especially problematic, because 2D Green's functions have infinite temporal support, exacerbating these difficulties. This thesis proposes a finite difference delay modeling (FDDM) scheme for the solution of the integral equations of 2D transient electromagnetic scattering problems. The method discretizes the integral equations temporally using first- and second-order finite differences to map Laplace-domain equations into the Z domain before transforming to the discrete time domain. The resulting procedure is unconditionally stable because of the nature of the Laplace- to Z-domain mapping. The first FDDM method developed in this thesis uses second-order Lagrange basis functions with Galerkin's method for spatial discretization. The second application of the FDDM method discretizes the space using a locally-corrected Nystrom method, which accelerates the precomputation phase and achieves high order accuracy. The Fast Fourier Transform (FFT) is applied to accelerate the marching-on-time process in both methods. While FDDM methods demonstrate impressive accuracy and stability in solving wideband scattering problems for homogeneous scatterers, they still have limitations in analyzing interactions between several inhomogeneous scatterers.
Therefore, this thesis devises a multi-region finite-difference time-domain (MR-FDTD) scheme based on domain-optimal Green's functions for solving sparsely-populated problems. The scheme uses a discrete Green's function (DGF) on the FDTD lattice to truncate the local subregions, and thus reduces reflection error on the local boundary. A continuous Green's function (CGF) is implemented to pass the influence of external fields into each FDTD region which mitigates the numerical dispersion and anisotropy of standard FDTD. Numerical results will illustrate the accuracy and stability of the proposed techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochiai, Yoshihiro
Heat-conduction analysis under steady state without heat generation can easily be treated by the boundary element method. Heat conduction with heat generation, however, can be solved approximately, without a domain integral, by an improved multiple-reciprocity boundary element method. The conventional multiple-reciprocity boundary element method is not suitable for complicated heat generation. In the improved multiple-reciprocity boundary element method, on the other hand, the domain integral in each step is divided into point, line, and area integrals. In order to solve the problem, contour lines of heat generation, which approximate the actual heat generation, are used.
NASA Astrophysics Data System (ADS)
Zheng, Chang-Jun; Bi, Chuan-Xing; Zhang, Chuanzeng; Gao, Hai-Feng; Chen, Hai-Bo
2018-04-01
The vibration behavior of thin elastic structures can be noticeably influenced by the surrounding water, which represents a kind of heavy fluid. Since the feedback of the acoustic pressure onto the structure cannot be neglected in this case, a strong coupled scheme between the structural and fluid domains is usually required. In this work, a coupled finite element and boundary element (FE-BE) solver is developed for the free vibration analysis of structures submerged in an infinite fluid domain or a semi-infinite fluid domain with a free water surface. The structure is modeled by the finite element method (FEM). The compressibility of the fluid is taken into account, and hence the Helmholtz equation serves as the governing equation of the fluid domain. The boundary element method (BEM) is employed to model the fluid domain, and a boundary integral formulation with a half-space fundamental solution is used to satisfy the Dirichlet boundary condition on the free water surface exactly. The resulting nonlinear eigenvalue problem (NEVP) is converted into a small linear one by using a contour integral method. Adequate modifications are suggested to improve the efficiency of the contour integral method and avoid missing the eigenfrequencies of interest. The Burton-Miller method is used to filter out the fictitious eigenfrequencies of the boundary integral formulations. Numerical examples are given to demonstrate the accuracy and applicability of the developed eigensolver, and also show that the fluid-loading effect strongly depends on both the water depth and the mode shapes.
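A minimal version of the contour integral method for nonlinear eigenvalue problems can be sketched as follows; this is the form popularized by Beyn, which may differ in detail from the modified method of the paper, and the sanity check deliberately uses a linear problem so the enclosed spectrum is known exactly:

```python
import numpy as np

def beyn(T, center, radius, size, probes=4, nodes=64, tol=1e-8):
    """Contour-integral eigensolver (Beyn's method): eigenvalues of the
    nonlinear problem T(z) v = 0 inside a circular contour are extracted
    from two probed contour moments A0 and A1."""
    rng = np.random.default_rng(0)
    V = rng.standard_normal((size, probes))
    A0 = np.zeros((size, probes), dtype=complex)
    A1 = np.zeros((size, probes), dtype=complex)
    for j in range(nodes):                  # trapezoidal rule on the circle
        w = radius * np.exp(2j * np.pi * j / nodes)
        S = np.linalg.solve(T(center + w), V)
        A0 += w * S / nodes                 # (1/(2*pi*i)) contour moments
        A1 += w * (center + w) * S / nodes
    U, s, Wh = np.linalg.svd(A0, full_matrices=False)
    k = int(np.sum(s > tol * s[0]))         # numerical rank = eigenvalue count
    B = U[:, :k].conj().T @ A1 @ Wh[:k].conj().T / s[:k]
    return np.linalg.eigvals(B)

# Sanity check on a linear problem with known spectrum {1, 2, 5}:
# the contour |z - 1.5| = 1 encloses exactly the eigenvalues 1 and 2.
A = np.diag([1.0, 2.0, 5.0])
eigs = beyn(lambda z: A - z * np.eye(3), center=1.5, radius=1.0, size=3)
print(np.sort(eigs.real))  # close to [1, 2]
```

The rank test on the singular values is what determines how many eigenfrequencies lie inside the contour; choosing the contour and the probe count carelessly is precisely how eigenfrequencies of interest get missed, which motivates the modifications mentioned in the abstract.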
An equivalent domain integral method for three-dimensional mixed-mode fracture problems
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Raju, I. S.
1991-01-01
A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III, representing the severity of the crack front in the three modes of deformation. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, and the results were found to agree well with those available in the literature. The method lends itself to be used as a post-processing subroutine in a general purpose finite element program.
An equivalent domain integral method for three-dimensional mixed-mode fracture problems
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Raju, I. S.
1992-01-01
A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III, representing the severity of the crack front in the three modes of deformation. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, and the results were found to agree well with those available in the literature. The method lends itself to be used as a post-processing subroutine in a general purpose finite element program.
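The decomposition method rests on splitting the fields about the crack plane into symmetric and antisymmetric parts. The sketch below applies that split to an arbitrary smooth displacement field sampled on a y-symmetric grid (a stand-in, not a crack-tip field): the mode I part is symmetric in y for u_x and antisymmetric for u_y, the mode II part is the reverse, and the two parts sum back to the original field exactly:

```python
import numpy as np

# Symmetric/antisymmetric decomposition of a displacement field about
# the crack plane y = 0, as used to separate mode I and mode II parts.
y = np.linspace(-1, 1, 41)          # grid symmetric about the crack plane
x = np.linspace(0.1, 1, 20)
X, Y = np.meshgrid(x, y)            # y varies along axis 0
ux = np.sin(X + Y) + X * Y          # arbitrary smooth stand-in fields
uy = np.cos(X) * Y + Y**3 + X

# Reversing axis 0 evaluates a field at (x, -y) on this symmetric grid.
ux_I,  uy_I  = (ux + ux[::-1]) / 2, (uy - uy[::-1]) / 2   # mode I part
ux_II, uy_II = (ux - ux[::-1]) / 2, (uy + uy[::-1]) / 2   # mode II part
```

Each part is then fed through the same domain integral separately, yielding the individual J sub I and J sub II values.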
Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis
Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana
2017-01-01
Background: Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637
Time-domain hybrid method for simulating large amplitude motions of ships advancing in waves
NASA Astrophysics Data System (ADS)
Liu, Shukui; Papanikolaou, Apostolos D.
2011-03-01
Typical results obtained by a newly developed, nonlinear time domain hybrid method for simulating large amplitude motions of ships advancing with constant forward speed in waves are presented. The method is hybrid in that it combines a time-domain transient Green function method and a Rankine source method. The present approach employs a simple double integration algorithm with respect to time to simulate the free-surface boundary condition. During the simulation, the diffraction and radiation forces are computed by pressure integration over the mean wetted surface, whereas the incident wave and hydrostatic restoring forces/moments are calculated on the instantaneously wetted surface of the hull. Typical numerical results of applying the method to the seakeeping performance of a standard containership, namely the ITTC S175, are herein presented. Comparisons have been made between the results from the present method, the frequency domain 3D panel method (NEWDRIFT) of NTUA-SDL and available experimental data, and good agreement has been observed for all studied cases.
Jang, Hae-Won; Ih, Jeong-Guon
2013-03-01
The time domain boundary element method (TBEM) to calculate the exterior sound field using the Kirchhoff integral has difficulties with non-uniqueness and exponential divergence. In this work, a method to stabilize the TBEM calculation for the exterior problem is suggested. The time domain CHIEF (Combined Helmholtz Integral Equation Formulation) method is newly formulated to suppress low order fictitious internal modes. This method constrains the surface Kirchhoff integral by forcing the pressures at additional interior points to be zero when the shortest retarded time between boundary nodes and an interior point elapses. However, even after using the CHIEF method, the TBEM calculation suffers from exponential divergence due to the remaining unstable high order fictitious modes at frequencies higher than the frequency limit of the boundary element model. For complete stabilization, such troublesome modes are selectively adjusted by projecting the time response onto the eigenspace. In a test example of a transiently pulsating sphere, the final average error norm of the stabilized response compared to the analytic solution is 2.5%.
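The linear-algebra skeleton of the CHIEF idea, stripped of all acoustics, can be sketched as follows (an assumption-level illustration only, not the paper's time-domain formulation): a rank-deficient boundary system admits a family of solutions differing by a fictitious mode, and appending interior-point constraint rows and solving the overdetermined system by least squares singles out the physical one:

```python
import numpy as np

# Toy rank-deficient "boundary" system A p = b: every p_true + t*n
# solves it, where n plays the role of a fictitious internal mode.
n = np.array([1.0, 1.0]) / np.sqrt(2.0)   # fictitious-mode direction
A = np.eye(2) - np.outer(n, n)            # singular system matrix
p_true = np.array([1.0, -1.0])            # desired solution (n . p_true = 0)
b = A @ p_true

# CHIEF-style fix: append constraint rows forcing the (zero) pressures
# at interior points, then solve the overdetermined system.
C = n[None, :]                            # constraint row(s)
A_aug = np.vstack([A, C])
b_aug = np.concatenate([b, [0.0]])
p, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
print(p)  # recovers [1, -1]
```

In the actual method the constraint rows come from the retarded Kirchhoff integral evaluated at interior points, and, as the abstract notes, this only removes the low order fictitious modes.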
Enhancing Autonomy of Aerial Systems Via Integration of Visual Sensors into Their Avionics Suite
2016-09-01
aerial platform for subsequent visual sensor integration. Subject terms: autonomous system, quadrotors, direct method, inverse dynamics in the virtual domain (IDVD). [Report-form and table-of-contents fragments; recoverable acronyms: GPS (Global-Positioning System), ILP (integer linear program), INS (inertial-navigation system).]
A cross-domain communication resource scheduling method for grid-enabled communication networks
NASA Astrophysics Data System (ADS)
Zheng, Xiangquan; Wen, Xiang; Zhang, Yongding
2011-10-01
To support a wide range of grid applications in environments where various heterogeneous communication networks coexist, it is important to enable advanced capabilities for on-demand, dynamic integration and efficient co-sharing of cross-domain heterogeneous communication resources, thus providing communication services that no single communication resource can afford on its own. Based on plug-and-play co-sharing and soft integration of communication resources, a grid-enabled communication network is flexibly built up to provide on-demand communication services for grid applications with various quality-of-service requirements. Based on an analysis of joint job and communication resource scheduling in grid-enabled communication networks (GECNs), this paper presents a cross-multi-domain cooperative communication resource scheduling method and describes its main processes, such as traffic requirement resolution for communication services, cross-multi-domain negotiation on communication resources, and on-demand communication resource scheduling. The presented method provides communication service capability for cross-domain traffic delivery in GECNs. Further research towards validation and implementation of the presented method is pointed out at last.
NASA Astrophysics Data System (ADS)
Sablik, Thomas; Velten, Jörg; Kummert, Anton
2015-03-01
A novel system for automatic privacy protection in digital media based on spectral domain watermarking and JPEG compression is described in the present paper. In a first step, private areas are detected; to this end, a detection method is presented. The implemented method uses Haar cascades to detect faces. Integral images are used to speed up calculations and the detection. Multiple detections of one face are combined. Succeeding steps comprise embedding the data into the image as part of JPEG compression using spectral domain methods and protecting the area of privacy. The embedding process is integrated into and adapted to JPEG compression. A spread spectrum watermarking method is used to embed the size and position of the private areas into the cover image. Different embedding methods are compared regarding their robustness. Moreover, the performance of the method on tampered images is presented.
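The integral-image trick mentioned above reduces any rectangle sum over the image to four table look-ups, which is what makes Haar-cascade feature evaluation fast. A self-contained sketch of the data structure (generic, not the paper's implementation):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom, left:right] from four table look-ups."""
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

img = np.arange(24).reshape(4, 6)
ii = integral_image(img)
print(rect_sum(ii, 1, 2, 3, 5))  # equals img[1:3, 2:5].sum(), i.e. 72
```

A Haar feature is just a signed combination of a few such rectangle sums, so its cost is constant regardless of window size.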
NASA Astrophysics Data System (ADS)
Liang, Hui; Chen, Xiaobo
2017-10-01
A novel multi-domain method based on an analytical control surface is proposed by combining the use of free-surface Green function and Rankine source function. A cylindrical control surface is introduced to subdivide the fluid domain into external and internal domains. Unlike the traditional domain decomposition strategy or multi-block method, the control surface here is not panelized, on which the velocity potential and normal velocity components are analytically expressed as a series of base functions composed of Laguerre function in vertical coordinate and Fourier series in the circumference. Free-surface Green function is applied in the external domain, and the boundary integral equation is constructed on the control surface in the sense of Galerkin collocation via integrating test functions orthogonal to base functions over the control surface. The external solution gives rise to the so-called Dirichlet-to-Neumann [DN2] and Neumann-to-Dirichlet [ND2] relations on the control surface. Irregular frequencies, which are only dependent on the radius of the control surface, are present in the external solution, and they are removed by extending the boundary integral equation to the interior free surface (circular disc) on which the null normal derivative of potential is imposed, and the dipole distribution is expressed as Fourier-Bessel expansion on the disc. In the internal domain, where the Rankine source function is adopted, new boundary integral equations are formulated. The point collocation is imposed over the body surface and free surface, while the collocation of the Galerkin type is applied on the control surface. The present method is valid in the computation of both linear and second-order mean drift wave loads. Furthermore, the second-order mean drift force based on the middle-field formulation can be calculated analytically by using the coefficients of the Fourier-Laguerre expansion.
Kitto, Simon; Bell, Mary; Peller, Jennifer; Sargeant, Joan; Etchells, Edward; Reeves, Scott; Silver, Ivan
2013-03-01
Public and professional concern about health care quality, safety and efficiency is growing. Continuing education, knowledge translation, patient safety and quality improvement have made concerted efforts to address these issues. However, a coordinated and integrated effort across these domains is lacking. This article explores and discusses the similarities and differences amongst the four domains in relation to their missions, stakeholders, methods, and limitations. This paper highlights the potential for a more integrated and collaborative partnership to promote networking and information sharing amongst the four domains. This potential rests on the premise that an integrated approach may result in the development and implementation of more holistic and effective interdisciplinary interventions. In conclusion, an outline of current research that is informed by the preliminary findings in this paper is also briefly discussed. The research concerns a comprehensive mapping of the relationships between the domains to gain an understanding of potential dissonances between how the domains represent themselves, their work and the work of their 'partner' domains.
Meshfree Modeling of Munitions Penetration in Soils
2017-04-01
[Front-matter fragments; recoverable information: Figure 2, "Nodal smoothing domain for the modified stabilized nonconforming nodal integration"; Figure 17, discretization of the projectile. Acronyms: DEM (discrete element methods), FEM (finite element methods), MSNNI (modified stabilized nonconforming nodal integration), RK.]
Integrated Semantics Service Platform for the Internet of Things: A Case Study of a Smart Office
Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok
2015-01-01
The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability. PMID:25608216
Integrated semantics service platform for the Internet of Things: a case study of a smart office.
Ryu, Minwoo; Kim, Jaeho; Yun, Jaeseok
2015-01-19
The Internet of Things (IoT) allows machines and devices in the world to connect with each other and generate a huge amount of data, which has a great potential to provide useful knowledge across service domains. Combining the context of IoT with semantic technologies, we can build integrated semantic systems to support semantic interoperability. In this paper, we propose an integrated semantic service platform (ISSP) to support ontological models in various IoT-based service domains of a smart city. In particular, we address three main problems for providing integrated semantic services together with IoT systems: semantic discovery, dynamic semantic representation, and semantic data repository for IoT resources. To show the feasibility of the ISSP, we develop a prototype service for a smart office using the ISSP, which can provide a preset, personalized office environment by interpreting user text input via a smartphone. We also discuss a scenario to show how the ISSP-based method would help build a smart city, where services in each service domain can discover and exploit IoT resources that are wanted across domains. We expect that our method could eventually contribute to providing people in a smart city with more integrated, comprehensive services based on semantic interoperability.
Domain identification in impedance computed tomography by spline collocation method
NASA Technical Reports Server (NTRS)
Kojima, Fumio
1990-01-01
A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.
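The forward structure being inverted here, an integral equation of the second kind, can be sketched with a quadrature (Nyström-style) collocation scheme. Note the paper itself uses spline collocation, and the kernel and right-hand side below are made up so the exact solution is known:

```python
import numpy as np

# Fredholm equation of the second kind:
#   u(x) - int_0^1 K(x, t) u(t) dt = f(x),  with  K(x, t) = x*t.
# Choosing f(x) = 2x/3 makes the exact solution u(x) = x.
m = 8
t, w = np.polynomial.legendre.leggauss(m)
t, w = (t + 1) / 2, w / 2                 # map Gauss nodes/weights to [0, 1]

K = np.outer(t, t)                        # K(x_i, t_j) at collocation nodes
f = 2 * t / 3
# Collocate at the quadrature nodes: (I - K W) u = f
u = np.linalg.solve(np.eye(m) - K * w[None, :], f)
print(np.max(np.abs(u - t)))  # ~ machine precision
```

Second-kind equations are well posed in this way because the identity dominates the compact integral operator; the inverse (domain-identification) problem wraps an outer optimization around solves of this form.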
Enhanced line integral convolution with flow feature detection
DOT National Transportation Integrated Search
1995-01-01
Prepared ca. 1995. The Line Integral Convolution (LIC) method, which blurs white noise textures along a vector field, is an effective way to visualize overall flow patterns in a 2D domain [Cabral & Leedom '93]. The method produces a flow texture image...
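A deliberately minimal LIC sketch, using unit Euler steps and nearest-neighbour sampling rather than the exact-advection convolution kernel of Cabral & Leedom, shows the basic mechanism of blurring noise along streamlines:

```python
import numpy as np

def lic(noise, u, v, L=5):
    """Minimal line integral convolution: average the noise texture along
    a streamline of the field (u, v) traced L unit steps each way."""
    H, W = noise.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            total, count = noise[i, j], 1
            for sign in (1.0, -1.0):        # trace forward, then backward
                y, x = float(i), float(j)
                for _ in range(L):
                    fu = u[int(round(y)), int(round(x))]
                    fv = v[int(round(y)), int(round(x))]
                    norm = np.hypot(fu, fv)
                    if norm == 0.0:
                        break
                    x += sign * fu / norm   # u: flow along columns
                    y += sign * fv / norm   # v: flow along rows
                    ii, jj = int(round(y)), int(round(x))
                    if not (0 <= ii < H and 0 <= jj < W):
                        break
                    total += noise[ii, jj]
                    count += 1
            out[i, j] = total / count
    return out

rng = np.random.default_rng(1)
noise = rng.random((32, 32))
tex = lic(noise, u=np.ones((32, 32)), v=np.zeros((32, 32)))
# For a uniform horizontal field, each interior pixel is the mean of
# its row segment, so streaks appear along the flow direction.
print(np.isclose(tex[8, 16], noise[8, 11:22].mean()))  # True
```

Because pixels along one streamline share most of their convolution window, intensity becomes correlated along the flow and stays uncorrelated across it, which is what renders the flow pattern visible.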
An Operator-Integration-Factor Splitting (OIFS) method for Incompressible Flows in Moving Domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patel, Saumil S.; Fischer, Paul F.; Min, Misun
In this paper, we present a characteristic-based numerical procedure for simulating incompressible flows in domains with moving boundaries. Our approach utilizes an operator-integration-factor splitting technique to help produce an efficient and stable numerical scheme. Using the spectral element method and an arbitrary Lagrangian-Eulerian formulation, we investigate flows where the convective acceleration effects are non-negligible. Several examples, ranging from laminar to turbulent flows, are considered. Comparisons with a standard, semi-implicit time-stepping procedure illustrate the improved performance of the scheme.
NASA Astrophysics Data System (ADS)
Nguyen-Thanh, Nhon; Li, Weidong; Zhou, Kun
2018-03-01
This paper develops a coupling approach which integrates the meshfree method and isogeometric analysis (IGA) for static and free-vibration analyses of cracks in thin-shell structures. In this approach, the domain surrounding the cracks is represented by the meshfree method while the rest of the domain is meshed by IGA. The present approach is capable of preserving the geometry exactness and high continuity of IGA. Local refinement is achieved by adding nodes along the background cells in the meshfree domain. Moreover, the equivalent domain integral technique for three-dimensional problems is derived from the additional Kirchhoff-Love theory to compute the J-integral for the thin-shell model. The proposed approach is able to address problems involving through-the-thickness cracks without using additional rotational degrees of freedom, which facilitates the enrichment strategy for crack tips. The crack tip enrichment effects and the stress distribution and displacements around the crack tips are investigated. Free vibrations of cracks in thin shells are also analyzed. Numerical examples are presented to demonstrate the accuracy and computational efficiency of the coupling approach.
ERIC Educational Resources Information Center
Muis, Krista R.; Trevors, Gregory; Duffy, Melissa; Ranellucci, John; Foy, Michael J.
2016-01-01
The purpose of this study was to empirically scrutinize Muis, Bendixen, and Haerle's (2006) Theory of Integrated Domains in Epistemology framework. Secondary, college, undergraduate, and graduate students completed self-reports designed to measure their domain-specific and domain-general epistemic beliefs for mathematics, psychology, and general…
Cause and Effect: Testing a Mechanism and Method for the Cognitive Integration of Basic Science.
Kulasegaram, Kulamakan; Manzone, Julian C; Ku, Cheryl; Skye, Aimee; Wadey, Veronica; Woods, Nicole N
2015-11-01
Methods of integrating basic science with clinical knowledge are still debated in medical training. One possibility is increasing the spatial and temporal proximity of clinical content to basic science. An alternative model argues that teaching must purposefully expose relationships between the domains. The authors compared different methods of integrating basic science: causal explanations linking basic science to clinical features, presenting both domains separately but in proximity, and simply presenting clinical features alone. First-year undergraduate health professions students were randomized to four conditions: (1) science-causal explanations (SC), (2) basic science before clinical concepts (BC), (3) clinical concepts before basic science (CB), and (4) clinical features list only (FL). Based on assigned conditions, participants were given explanations for four disorders in neurology or rheumatology, followed by a memory quiz and a diagnostic test consisting of 12 cases, both of which were repeated after one week. Ninety-four participants completed the study. No difference was found on memory test performance, but on the diagnostic test, a condition-by-time interaction was found (F[3,88] = 3.05, P < .03, ηp² = 0.10). Although all groups had similar immediate performance, the SC group had a minimal decrease in performance on delayed testing; the CB and FL groups had the greatest decreases. These results suggest that creating proximity between basic science and clinical concepts may not guarantee cognitive integration. Although cause-and-effect explanations may not be possible for all domains, making explicit and specific connections between domains will likely facilitate the benefits of integration for learners.
A Fourier collocation time domain method for numerically solving Maxwell's equations
NASA Technical Reports Server (NTRS)
Shebalin, John V.
1991-01-01
A new method for solving Maxwell's equations in the time domain for arbitrary values of permittivity, conductivity, and permeability is presented. Spatial derivatives are found by a Fourier transform method and time integration is performed using a second order, semi-implicit procedure. Electric and magnetic fields are collocated on the same grid points, rather than on interleaved points, as in the Finite Difference Time Domain (FDTD) method. Numerical results are presented for the propagation of a 2-D Transverse Electromagnetic (TEM) mode out of a parallel plate waveguide and into a dielectric and conducting medium.
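The core of such a collocation scheme, computing spatial derivatives spectrally via the Fourier transform, can be sketched as follows. This is a generic illustration of the derivative step only, not the paper's implementation; the semi-implicit time integration and the handling of permittivity, conductivity, and permeability are omitted.

```python
import numpy as np

def fourier_derivative(f, L):
    """Spectral derivative of a periodic sample f on a domain of length L:
    transform, multiply by i*k, transform back."""
    n = f.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)  # wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

# The derivative of sin(x) on [0, 2*pi) is cos(x), recovered to near
# machine precision because sin(x) is exactly resolved on the grid.
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
df = fourier_derivative(np.sin(x), 2.0 * np.pi)
```

For smooth periodic fields this converges spectrally, which is the main advantage over the finite-difference spatial stencils of FDTD.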
High-speed extended-term time-domain simulation for online cascading analysis of power system
NASA Astrophysics Data System (ADS)
Fu, Chuan
A high-speed extended-term (HSET) time domain simulator (TDS), intended to become part of an energy management system (EMS), has been newly developed for use in online extended-term dynamic cascading analysis of power systems. HSET-TDS includes the following attributes for providing situational awareness of high-consequence events: (i) online analysis, including n-1 and n-k events, (ii) ability to simulate both fast and slow dynamics for 1-3 hours in advance, (iii) inclusion of rigorous protection-system modeling, (iv) intelligence for corrective action ID, storage, and fast retrieval, and (v) high-speed execution. Very fast online computational capability is the most desired attribute of this simulator. Based on the process of solving the differential-algebraic equations describing the dynamics of a power system, HSET-TDS seeks to develop computational efficiency at each of the following hierarchical levels: (i) hardware, (ii) strategies, (iii) integration methods, (iv) nonlinear solvers, and (v) linear solver libraries. This thesis first describes the Hammer-Hollingsworth 4 (HH4) implicit integration method. Like the trapezoidal rule, HH4 is symmetrically A-stable, but it possesses greater high-order precision (h⁴) than the trapezoidal rule. Such precision enables larger integration steps and therefore improves simulation efficiency for variable-step-size implementations. This thesis provides the underlying theory on which we advocate the use of HH4 over other numerical integration methods for power system time-domain simulation. Second, motivated by the need to perform high-speed extended-term time domain simulation (HSET-TDS) for online purposes, this thesis presents principles for designing numerical solvers of differential-algebraic systems associated with power system time-domain simulation, including DAE construction strategies (Direct Solution Method), integration methods (HH4), nonlinear solvers (Very Dishonest Newton), and linear solvers (SuperLU).
We have implemented a design appropriate for HSET-TDS, and we compare it to various solvers, including the commercial-grade PSSE program, with respect to computational efficiency and accuracy, using as examples the New England 39-bus system, the expanded 8775-bus system, and the PJM 13029-bus system. Third, we have explored a stiffness-decoupling method, intended to be part of a parallel design of time domain simulation software for supercomputers. The stiffness-decoupling method combines the advantages of implicit methods (A-stability) and explicit methods (less computation). With the new stiffness detection method proposed herein, the stiffness can be captured. The expanded 975-bus system is used to test simulation efficiency. Finally, several parallel strategies for supercomputer deployment to simulate power system dynamics are proposed and compared. Design A partitions the task via scale with the stiffness-decoupling method, waveform relaxation, and a parallel linear solver. Design B partitions the task via the time axis using a highly precise integration method, the order-8 Kuntzmann-Butcher method (KB8). The strategy of partitioning events is designed to partition the whole simulation via the time axis through a simulated sequence of cascading events. Of all the strategies proposed, the strategy of partitioning cascading events is recommended, since the sub-tasks for each processor are totally independent, and therefore minimal communication time is needed.
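As a point of reference for the A-stable implicit integrators discussed above, here is a minimal sketch of the trapezoidal rule applied to a stiff linear test system; HH4 itself is a higher-order implicit scheme and is not reproduced here, and the linear test problem is an illustrative assumption, not a power-system model.

```python
import numpy as np

def trapezoidal_step(A, y, h):
    """One step of the A-stable implicit trapezoidal rule for y' = A y.
    For a linear system the implicit equation is solved directly:
    (I - h/2 A) y_{n+1} = (I + h/2 A) y_n."""
    I = np.eye(A.shape[0])
    return np.linalg.solve(I - 0.5 * h * A, (I + 0.5 * h * A) @ y)

# Stiff scalar test problem y' = -1000 y: the iteration stays stable
# even with a step size far larger than the 1/1000 time constant.
A = np.array([[-1000.0]])
y = np.array([1.0])
for _ in range(10):
    y = trapezoidal_step(A, y, 0.1)
```

An explicit method such as forward Euler would diverge violently at this step size, which is the motivation for using A-stable implicit schemes (and higher-order ones like HH4) in stiff power-system dynamics.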
On the Analysis Methods for the Time Domain and Frequency Domain Response of Buried Objects*
NASA Astrophysics Data System (ADS)
Poljak, Dragan; Šesnić, Silvestar; Cvetković, Mario
2014-05-01
There has been a continuous interest in the analysis of ground-penetrating radar systems and related applications in civil engineering [1]. Consequently, a deeper insight of scattering phenomena occurring in a lossy half-space, as well as the development of sophisticated numerical methods based on Finite Difference Time Domain (FDTD) method, Finite Element Method (FEM), Boundary Element Method (BEM), Method of Moments (MoM) and various hybrid methods, is required, e.g. [2], [3]. The present paper deals with certain techniques for time and frequency domain analysis, respectively, of buried conducting and dielectric objects. Time domain analysis is related to the assessment of a transient response of a horizontal straight thin wire buried in a lossy half-space using a rigorous antenna theory (AT) approach. The AT approach is based on the space-time integral equation of the Pocklington type (time domain electric field integral equation for thin wires). The influence of the earth-air interface is taken into account via the simplified reflection coefficient arising from the Modified Image Theory (MIT). The obtained results for the transient current induced along the electrode due to the transmitted plane wave excitation are compared to the numerical results calculated via an approximate transmission line (TL) approach and the AT approach based on the space-frequency variant of the Pocklington integro-differential approach, respectively. It is worth noting that the space-frequency Pocklington equation is numerically solved via the Galerkin-Bubnov variant of the Indirect Boundary Element Method (GB-IBEM) and the corresponding transient response is obtained by the aid of inverse fast Fourier transform (IFFT). The results calculated by means of different approaches agree satisfactorily. 
Frequency domain analysis is related to the assessment of the frequency domain response of a dielectric sphere using the full wave model based on the set of coupled electric field integral equations for surfaces. The numerical solution is carried out by means of an improved variant of the Method of Moments (MoM), providing a numerically stable and efficient procedure for the extraction of singularities arising in the integral expressions. The proposed analysis method is compared to the results obtained by using some commercial software packages. A satisfactory agreement has been achieved. Both approaches discussed throughout this work and demonstrated on canonical geometries could also be useful for benchmarking purposes. References: [1] L. Pajewski et al., Applications of Ground Penetrating Radar in Civil Engineering - COST Action TU1208, 2013. [2] U. Oguz, L. Gurel, Frequency Responses of Ground-Penetrating Radars Operating Over Highly Lossy Grounds, IEEE Trans. Geosci. Remote Sensing, Vol. 40, No. 6, 2002. [3] D. Poljak, Advanced Modeling in Computational Electromagnetic Compatibility, John Wiley and Sons, New York, 2007. *This work benefited from networking activities carried out within the EU-funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar."
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip methods, were used to demonstrate the implementation of information in the HP ontology.
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
Fluid-structure interaction of turbulent boundary layer over a compliant surface
NASA Astrophysics Data System (ADS)
Anantharamu, Sreevatsa; Mahesh, Krishnan
2016-11-01
Turbulent flows induce unsteady loads on surfaces in contact with them, which affect material stresses, surface vibrations and far-field acoustics. We are developing a numerical methodology to study the coupled interaction of a turbulent boundary layer with the underlying surface. The surface is modeled as a linear elastic solid, while the fluid follows the spatially filtered incompressible Navier-Stokes equations. A finite-volume incompressible large-eddy simulation approach based on the algorithm of Mahesh et al. is used in the fluid domain. The discrete kinetic-energy-conserving property of the method ensures robustness at high Reynolds numbers. The linear elastic model in the solid domain is integrated in space using the finite element method and in time using the Newmark time integration method. The fluid and solid domain solvers are coupled using both weak and strong coupling methods. Details of the algorithm, validation, and relevant results will be presented. This work is supported by NSWCCD, ONR.
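The Newmark scheme mentioned for the solid domain can be illustrated on a single-degree-of-freedom oscillator. This is a textbook average-acceleration sketch under assumed scalar coefficients, not the authors' finite element solver.

```python
import numpy as np

def newmark(m, c, k, f, u0, v0, h, nsteps, beta=0.25, gamma=0.5):
    """Newmark-beta integration of m*u'' + c*u' + k*u = f(t) for one DOF.
    beta=1/4, gamma=1/2 is the unconditionally stable average-acceleration rule."""
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m          # consistent initial acceleration
    for n in range(nsteps):
        t1 = (n + 1) * h
        # Standard rearrangement into an effective stiffness and load.
        keff = k + gamma * c / (beta * h) + m / (beta * h**2)
        feff = (f(t1)
                + m * (u / (beta * h**2) + v / (beta * h) + (0.5 / beta - 1.0) * a)
                + c * (gamma * u / (beta * h) + (gamma / beta - 1.0) * v
                       + h * (0.5 * gamma / beta - 1.0) * a))
        u1 = feff / keff
        a1 = (u1 - u - h * v) / (beta * h**2) - (0.5 / beta - 1.0) * a
        v1 = v + h * ((1.0 - gamma) * a + gamma * a1)
        u, v, a = u1, v1, a1
    return u, v

# Undamped oscillator u'' + u = 0 with u(0) = 1: after one full period
# the displacement should return very close to 1 (no algorithmic damping).
u, v = newmark(1.0, 0.0, 1.0, lambda t: 0.0, 1.0, 0.0, 2.0 * np.pi / 2000, 2000)
```

The average-acceleration variant conserves energy for linear problems, which is why it is a common default for structural dynamics.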
Calculation of Moment Matrix Elements for Bilinear Quadrilaterals and Higher-Order Basis Functions
2016-01-06
methods are known as boundary integral equation (BIE) methods, and the present study falls into this category. The numerical solution of the BIE is … iterated integrals. The inner integral involves the product of the free-space Green's function for the Helmholtz equation multiplied by an appropriate … Website: http://www.wipl-d.com/ 5. Y. Zhang and T. K. Sarkar, Parallel Solution of Integral Equation-Based EM Problems in the Frequency Domain. New
Angle-domain inverse scattering migration/inversion in isotropic media
NASA Astrophysics Data System (ADS)
Li, Wuqun; Mao, Weijian; Li, Xuelei; Ouyang, Wei; Liang, Quan
2018-07-01
The classical seismic asymptotic inversion can be transformed into a problem of inversion of the generalized Radon transform (GRT). In such methods, the combined parameters are linearly attached to the scattered wave-field by the Born approximation and recovered by applying an inverse GRT operator to the scattered wave-field data. The typical GRT-style true-amplitude inversion procedure contains an amplitude compensation process after the weighted migration, via division by an illumination-associated matrix whose elements are integrals over scattering angles. It is somewhat intuitive to perform the generalized linear inversion and the inversion of the GRT together through this process for direct inversion. However, such an operation becomes imprecise when the illumination at the image point is limited, which easily renders the matrix inaccurate and unstable. This paper formulates the GRT true-amplitude inversion framework in an angle-domain version, which naturally removes the external integral term related to the illumination in the conventional case. We solve the linearized integral equation for combined parameters at different fixed scattering-angle values. With this step, we obtain high-quality angle-domain common-image gathers (CIGs) in the migration loop, which provide correct amplitude-versus-angle (AVA) behavior and a reasonable illumination range for subsurface image points. We then solve the over-determined problem for each parameter in the combination by a standard optimization operation. The angle-domain GRT inversion method avoids calculating the inaccurate and unstable illumination matrix. Compared with the conventional method, the angle-domain method obtains more accurate amplitude information and a wider amplitude-preserved range. Several model tests demonstrate its effectiveness and practicability.
NASA Astrophysics Data System (ADS)
Zheng, Chang-Jun; Chen, Hai-Bo; Chen, Lei-Lei
2013-04-01
This paper presents a novel wideband fast multipole boundary element approach to 3D half-space/plane-symmetric acoustic wave problems. The half-space fundamental solution is employed in the boundary integral equations so that the tree structure required in the fast multipole algorithm is constructed for the boundary elements in the real domain only. Moreover, a set of symmetric relations between the multipole expansion coefficients of the real and image domains are derived, and the half-space fundamental solution is modified for the purpose of applying such relations to avoid calculating, translating and saving the multipole/local expansion coefficients of the image domain. The wideband adaptive multilevel fast multipole algorithm associated with the iterative solver GMRES is employed so that the present method is accurate and efficient for both low- and high-frequency acoustic wave problems. As for exterior acoustic problems, the Burton-Miller method is adopted to tackle the fictitious eigenfrequency problem involved in the conventional boundary integral equation method. Details on the implementation of the present method are described, and numerical examples are given to demonstrate its accuracy and efficiency.
IDEF5 Ontology Description Capture Method: Concept Paper
NASA Technical Reports Server (NTRS)
Menzel, Christopher P.; Mayer, Richard J.
1990-01-01
The results of research towards an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity that can be understood to be at work across the full range of human inquiry prompted by the persistent effort to understand the world in which it has found itself - and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and which captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open-architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis was demonstrated to be an effective first step in the construction of robust knowledge-based systems.
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1983-01-01
Development of a unified control synthesis methodology for complex and/or non-conventional flight vehicles, and prediction techniques for the handling characteristics of such vehicles are reported. Identification of pilot dynamics and objectives, using time domain and frequency domain methods is proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciraolo, Giulio, E-mail: g.ciraolo@math.unipa.it; Gargano, Francesco, E-mail: gargano@math.unipa.it; Sciacca, Vincenzo, E-mail: sciacca@math.unipa.it
2013-08-01
We study a new approach to the problem of transparent boundary conditions for the Helmholtz equation in unbounded domains. Our approach is based on the minimization of an integral functional arising from a volume integral formulation of the radiation condition. The index of refraction does not need to be constant at infinity and may have some angular dependency as well as perturbations. We prove analytical results on the convergence of the approximate solution. Numerical examples for different shapes of the artificial boundary and for non-constant indexes of refraction will be presented.
Fictitious domain method for fully resolved reacting gas-solid flow simulation
NASA Astrophysics Data System (ADS)
Zhang, Longhui; Liu, Kai; You, Changfu
2015-10-01
Fully resolved simulation (FRS) for gas-solid multiphase flow considers solid objects as finite-sized regions in flow fields, and their behaviours are predicted by solving equations in both fluid and solid regions directly. Fixed-mesh numerical methods, such as the fictitious domain method, are preferred for solving FRS problems and have been widely researched. However, for reacting gas-solid flows no suitable fictitious domain numerical method has been developed. This work presents a new fictitious domain finite element method for FRS of reacting particulate flows. Low-Mach-number reacting flow governing equations are solved sequentially on a regular background mesh. Particles are immersed in the mesh and driven by their surface forces and torques integrated on immersed interfaces. Additional treatments of energy and surface reactions are developed. Several numerical test cases validated the method, and a simulation of a falling array of burning carbon particles demonstrated its capability for solving moving reacting particle cluster problems.
Investigation of system integration methods for bubble domain flight recorders
NASA Technical Reports Server (NTRS)
Chen, T. T.; Bohning, O. D.
1975-01-01
System integration methods for bubble domain flight recorders are investigated. Bubble memory module packaging and assembly, the control electronics design and construction, field coils, and permanent magnet bias structure design are studied. A small 60-kbit engineering model was built and tested to demonstrate the feasibility of the bubble recorder. Based on the various studies performed, a projection is made for a 50,000,000-bit prototype recorder. It is estimated that the recorder will occupy 190 cubic in., weigh 12 lb, and consume 12 W of power when all four of its tracks are operated in parallel at a 150 kHz data rate.
DIMA 3.0: Domain Interaction Map.
Luo, Qibin; Pagel, Philipp; Vilne, Baiba; Frishman, Dmitrij
2011-01-01
Domain Interaction MAp (DIMA, available at http://webclu.bio.wzw.tum.de/dima) is a database of predicted and known interactions between protein domains. It integrates 5807 structurally known interactions imported from the iPfam and 3did databases and 46,900 domain interactions predicted by four computational methods: domain phylogenetic profiling, the domain pair exclusion algorithm, correlated mutations, and domain interaction prediction in a discriminative way. Additionally, predictions are filtered to exclude those domain pairs that are reported as non-interacting by the Negatome database. The DIMA Web site allows users to calculate domain interaction networks either for a domain of interest or for entire organisms, and to explore them interactively using the Flash-based Cytoscape Web software.
Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan
2011-12-01
In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.
Explicit finite-difference simulation of optical integrated devices on massive parallel computers.
Sterkenburgh, T; Michels, R M; Dress, P; Franke, H
1997-02-20
An explicit method for the numerical simulation of optical integrated circuits by means of the finite-difference time-domain (FDTD) method is presented. This method, based on an explicit solution of Maxwell's equations, is well established in microwave technology. Although the simulation areas are small, we verified the behavior of three interesting problems with typical aspects of integrated optical devices, especially nonparaxial problems. Because numerical losses are within acceptable limits, we suggest the use of the FDTD method to achieve promising quantitative simulation results.
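A minimal one-dimensional illustration of the FDTD idea, leapfrog updates of staggered electric and magnetic field components, might look like this. The normalized units, hard source, and grid size are assumptions for the sketch; the paper's device simulations are of course far richer.

```python
import numpy as np

def fdtd_1d(nsteps, n=200):
    """1D vacuum FDTD (Yee leapfrog) in normalized units at the Courant
    limit (c*dt = dx), where the scheme propagates pulses exactly.
    A hard source injects a Gaussian pulse at the left boundary."""
    ez = np.zeros(n)  # electric field samples
    hy = np.zeros(n)  # magnetic field samples, staggered half a cell
    for t in range(nsteps):
        hy[:-1] += ez[1:] - ez[:-1]                    # update H from curl E
        ez[1:] += hy[1:] - hy[:-1]                     # update E from curl H
        ez[0] = np.exp(-0.5 * ((t - 30) / 10.0) ** 2)  # Gaussian hard source
    return ez

# The pulse peak enters at step 30 and travels one cell per step,
# so after 100 steps it sits near cell 70.
ez = fdtd_1d(100)
```

At the Courant limit the 1D scheme is dispersion-free, which makes this a convenient sanity check before moving to 2D/3D device geometries where numerical dispersion and boundary conditions dominate the error budget.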
Unsteady transonic flows - Introduction, current trends, applications
NASA Technical Reports Server (NTRS)
Yates, E. C., Jr.
1985-01-01
The computational treatment of unsteady transonic flows is discussed, reviewing the historical development and current techniques. The fundamental physical principles are outlined; the governing equations are introduced; three-dimensional linearized and two-dimensional linear-perturbation theories in frequency domain are described in detail; and consideration is given to frequency-domain FEMs and time-domain finite-difference and integral-equation methods. Extensive graphs and diagrams are included.
Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA
2006-12-19
A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
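The described "get" behavior, deriving a missing attribute value via a transformation method over other attributes, can be sketched as follows. The class and attribute names (TranslationRecord, mass_g, mass_kg) are illustrative assumptions, not taken from the patent.

```python
class TranslationRecord:
    """Sketch of a generated translation-library class: the 'get' method
    derives a missing attribute by calling a transformation over another
    attribute, as the metadata-driven mediator generation describes."""

    def __init__(self, mass_g=None, mass_kg=None):
        self._mass_g = mass_g
        self._mass_kg = mass_kg

    def get_mass_kg(self):
        # Derive the value via a unit-conversion transformation if missing.
        if self._mass_kg is None and self._mass_g is not None:
            self._mass_kg = self._mass_g / 1000.0
        return self._mass_kg

    def set_mass_kg(self, value):
        self._mass_kg = value

# A record loaded with only grams still answers a kilograms query.
rec = TranslationRecord(mass_g=2500.0)
```

The point of generating such accessors from metadata is that every data source mapped into the warehouse presents the same derived view, regardless of which raw attributes it actually stores.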
NASA Astrophysics Data System (ADS)
Nastos, C. V.; Theodosiou, T. C.; Rekatsinas, C. S.; Saravanos, D. A.
2018-03-01
An efficient numerical method is developed for the simulation of the dynamic response and the prediction of wave propagation in composite plate structures. The method is termed the finite wavelet domain method and takes advantage of the outstanding properties of compactly supported 2D Daubechies wavelet scaling functions for the spatial interpolation of displacements in a finite domain of a plate structure. The development of the 2D wavelet element, based on the first-order shear deformation laminated plate theory, is described, and the equivalent stiffness and mass matrices and force vectors are calculated and synthesized in the wavelet domain. The transient response is predicted using the explicit central difference time integration scheme. Numerical results for the simulation of wave propagation in isotropic, quasi-isotropic and cross-ply laminated plates are presented and demonstrate the high spatial convergence and problem-size reduction obtained by the present method.
ERIC Educational Resources Information Center
Wäschle, Kristin; Lehmann, Thomas; Brauch, Nicola; Nückles, Matthias
2015-01-01
Becoming a history teacher requires the integration of pedagogical knowledge, pedagogical content knowledge, and content knowledge. Because the integration of knowledge from different disciplines is a complex task, we investigated prompted learning journals as a method to support teacher students' knowledge integration. Fifty-two preservice…
Method and apparatus to debug an integrated circuit chip via synchronous clock stop and scan
Bellofatto, Ralph E [Ridgefield, CT; Ellavsky, Matthew R [Rochester, MN; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Gooding, Thomas M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Hehenberger, Lance G [Leander, TX; Ohmacht, Martin [Yorktown Heights, NY
2012-03-20
An apparatus and method for evaluating the state of an electronic or integrated circuit (IC), each IC including one or more processor elements for controlling operations of IC sub-units, and each IC supporting multiple frequency clock domains. The method comprises: generating a synchronized set of enable signals in correspondence with one or more IC sub-units for starting operation of one or more IC sub-units according to a determined timing configuration; counting, in response to one signal of the synchronized set of enable signals, a number of main processor IC clock cycles; upon attaining a desired clock cycle number, generating a stop signal for each unique frequency clock domain to synchronously stop a functional clock for each respective frequency clock domain; and, upon synchronously stopping all on-chip functional clocks on all frequency clock domains in a deterministic fashion, scanning out data values at a desired IC chip state. The apparatus and methodology enable construction of a cycle-by-cycle view of any part of the state of a running IC chip, using a combination of on-chip circuitry and software.
Time-Domain Computation Of Electromagnetic Fields In MMICs
NASA Technical Reports Server (NTRS)
Lansing, Faiza S.; Rascoe, Daniel L.
1995-01-01
Maxwell's equations solved on three-dimensional, conformal orthogonal grids by finite-difference techniques. Method of computing frequency-dependent electrical parameters of monolithic microwave integrated circuit (MMIC) involves time-domain computation of propagation of electromagnetic field in response to excitation by single pulse at input terminal, followed by computation of Fourier transforms to obtain frequency-domain response from time-domain response. Parameters computed include electric and magnetic fields, voltages, currents, impedances, scattering parameters, and effective dielectric constants. Powerful and efficient means for analyzing performance of even complicated MMICs.
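The pulse-excitation-plus-Fourier-transform workflow described above can be sketched generically: one broadband time-domain run yields the frequency response as the ratio of output to input spectra. The signal names and the pure-delay test case below are illustrative assumptions, not the MMIC solver itself.

```python
import numpy as np

def frequency_response(excitation, response, dt):
    """Broadband transfer function from a single pulsed time-domain run:
    the ratio of output to input spectra over the returned frequency grid."""
    X = np.fft.rfft(excitation)
    Y = np.fft.rfft(response)
    freqs = np.fft.rfftfreq(len(excitation), d=dt)
    return freqs, Y / X

# Sanity check on a known system: a pure (circular) delay of d samples has
# unit magnitude and linear phase wherever the input spectrum is non-negligible.
n, d = 256, 10
x = np.exp(-0.5 * ((np.arange(n) - 40) / 6.0) ** 2)  # Gaussian input pulse
y = np.roll(x, d)                                    # delayed output
freqs, H = frequency_response(x, y, dt=1.0)
```

In practice the ratio is only meaningful over the excitation pulse's bandwidth; outside it, the division amplifies numerical noise, which is why FDTD characterization chooses the pulse width to cover the frequency band of interest.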
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ming, Yang; Wu, Zi-jian; Xu, Fei, E-mail: feixu@nju.edu.cn
The nonmaximally entangled state is a special kind of entangled state, which has important applications in quantum information processing. It has been generated in quantum circuits based on bulk optical elements. However, corresponding schemes in integrated quantum circuits have been rarely considered. In this Letter, we propose an effective solution to this problem. An electro-optically tunable nonmaximally mode-entangled photon state is generated in an on-chip domain-engineered lithium niobate (LN) waveguide. Spontaneous parametric down-conversion and electro-optic interaction are effectively combined through suitable domain design to transform the entangled state into our desired form. Moreover, this is a flexible approach to entanglement architectures. Other kinds of reconfigurable entanglements are also achievable through this method. LN provides a very promising platform for future quantum circuit integration.
A method for integrating multiple components in a decision support system
Donald Nute; Walter D. Potter; Zhiyuan Cheng; Mayukh Dass; Astrid Glende; Frederick Maierv; Cy Routh; Hajime Uchiyama; Jin Wang; Sarah Witzig; Mark Twery; Peter Knopp; Scott Thomasma; H. Michael Rauscher
2005-01-01
We present a flexible, extensible method for integrating multiple tools into a single large decision support system (DSS) using a forest ecosystem management DSS (NED-2) as an example. In our approach, a rich ontology for the target domain is developed and implemented in the internal data model for the DSS. Semi-autonomous agents control external components and...
The boundary element method applied to 3D magneto-electro-elastic dynamic problems
NASA Astrophysics Data System (ADS)
Igumnov, L. A.; Markov, I. P.; Kuznetsov, Iu A.
2017-11-01
Due to their coupling properties, magneto-electro-elastic materials have a wide range of applications. They exhibit general anisotropic behaviour. Three-dimensional transient analyses of magneto-electro-elastic solids can hardly be found in the literature. In this work, a 3D direct boundary element formulation based on weakly singular boundary integral equations in the Laplace domain is presented for solving dynamic linear magneto-electro-elastic problems. Integral expressions of the three-dimensional fundamental solutions are employed. Spatial discretization is based on a collocation method with mixed boundary elements. The convolution quadrature method is used as a numerical inverse Laplace transform scheme to obtain time-domain solutions. Numerical examples are provided to illustrate the capability of the proposed approach to treat highly dynamic problems.
NASA Astrophysics Data System (ADS)
Leuchter, S.; Reinert, F.; Müller, W.
2014-06-01
Procurement and design of system architectures capable of network-centric operations demand an assessment scheme for comparing alternative realizations. In this contribution an assessment method for system architectures in the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex, distributed software-system perspective, focusing on communication, interfaces, and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. The method takes approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise; the method is therefore applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
NASA Technical Reports Server (NTRS)
Claus, R. O.; Bennett, K. D.; Jackson, B. S.
1986-01-01
The application of fiber-optical time domain reflectometry (OTDR) to nondestructive quantitative measurements of distributed internal strain in graphite-epoxy composites, using optical fiber waveguides imbedded between plies, is discussed. The basic OTDR measurement system is described, together with the methods used to imbed optical fibers within composites. Measurement results, system limitations, and the effect of the imbedded fiber on the integrity of the host composite material are considered.
Libbrecht, Maxwell W.; Ay, Ferhat; Hoffman, Michael M.; Gilbert, David M.; Bilmes, Jeffrey A.; Noble, William Stafford
2015-01-01
The genomic neighborhood of a gene influences its activity, a behavior that is attributable in part to domain-scale regulation. Previous genomic studies have identified many types of regulatory domains. However, due to the difficulty of integrating genomics data sets, the relationships among these domain types are poorly understood. Semi-automated genome annotation (SAGA) algorithms facilitate human interpretation of heterogeneous collections of genomics data by simultaneously partitioning the human genome and assigning labels to the resulting genomic segments. However, existing SAGA methods cannot integrate inherently pairwise chromatin conformation data. We developed a new computational method, called graph-based regularization (GBR), for expressing a pairwise prior that encourages certain pairs of genomic loci to receive the same label in a genome annotation. We used GBR to exploit chromatin conformation information during genome annotation by encouraging positions that are close in 3D to occupy the same type of domain. Using this approach, we produced a model of chromatin domains in eight human cell types, thereby revealing the relationships among known domain types. Through this model, we identified clusters of tightly regulated genes expressed in only a small number of cell types, which we term “specific expression domains.” We found that domain boundaries marked by promoters and CTCF motifs are consistent between cell types even when domain activity changes. Finally, we showed that GBR can be used to transfer information from well-studied cell types to less well-characterized cell types during genome annotation, making it possible to produce high-quality annotations of the hundreds of cell types with limited available data. PMID:25677182
A Kernel-free Boundary Integral Method for Elliptic Boundary Value Problems
Ying, Wenjun; Henriquez, Craig S.
2013-01-01
This paper presents a class of kernel-free boundary integral (KFBI) methods for general elliptic boundary value problems (BVPs). The boundary integral equations reformulated from the BVPs are solved iteratively with the GMRES method. During the iteration, the boundary and volume integrals involving Green's functions are approximated by structured grid-based numerical solutions, which avoids the need to know the analytical expressions of Green's functions. The KFBI method assumes that the larger regular domain, which embeds the original complex domain, can be easily partitioned into a hierarchy of structured grids so that fast elliptic solvers such as the fast Fourier transform (FFT) based Poisson/Helmholtz solvers or those based on geometric multigrid iterations are applicable. The structured grid-based solutions are obtained with a standard finite difference method (FDM) or finite element method (FEM), where the right hand side of the resulting linear system is appropriately modified at irregular grid nodes to recover the formal accuracy of the underlying numerical scheme. Numerical results demonstrating the efficiency and accuracy of the KFBI methods are presented. It is observed that the number of GMRES iterations used by the method for solving isotropic and moderately anisotropic BVPs is independent of the sizes of the grids that are employed to approximate the boundary and volume integrals. With the standard second-order FEMs and FDMs, the KFBI method shows a second-order convergence rate in accuracy for all of the tested Dirichlet/Neumann BVPs when the anisotropy of the diffusion tensor is not too strong. PMID:23519600
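The "fast elliptic solver on a structured embedding grid" ingredient can be sketched in isolation. Below is a minimal FFT (discrete sine transform) Poisson solver for the 5-point Laplacian on a uniform Cartesian grid with homogeneous Dirichlet data; it is only the structured-grid component, not the KFBI iteration, and the grid size is arbitrary.

```python
import numpy as np
from scipy.fft import dstn, idstn

def poisson_fft(f, h):
    """Solve -Laplacian(u) = f on the unit square, u = 0 on the boundary,
    using the 5-point stencil diagonalized by the type-I DST."""
    n = f.shape[0]                       # n x n interior grid
    k = np.arange(1, n + 1)
    lam = (2.0 - 2.0 * np.cos(np.pi * k / (n + 1))) / h ** 2  # 1D eigenvalues
    F = dstn(f, type=1)
    return idstn(F / (lam[:, None] + lam[None, :]), type=1)

n = 63
h = 1.0 / (n + 1)
x = np.arange(1, n + 1) * h
X, Y = np.meshgrid(x, x, indexing="ij")
u_exact = np.sin(np.pi * X) * np.sin(np.pi * Y)
f = 2.0 * np.pi ** 2 * u_exact           # -Laplacian of the exact solution
u = poisson_fft(f, h)
print(np.max(np.abs(u - u_exact)))       # O(h^2) discretization error
```

Each GMRES iteration of the KFBI method calls a solver of this kind on the embedding rectangle, with the right-hand side corrected at irregular nodes near the true boundary.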
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, John L.
1998-11-09
Leaks are detected in a multi-layered geomembrane liner by a two-dimensional time domain reflectometry (TDR) technique. The TDR geomembrane liner is constructed with an electrically conductive detection layer positioned between two electrically non-conductive dielectric layers, which are each positioned between the detection layer and an electrically conductive reference layer. The integrity of the TDR geomembrane liner is determined by generating electrical pulses within the detection layer and measuring the time delay for any reflected electrical energy caused by absorption of moisture by a dielectric layer.
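The measured time delay converts to a reflection-site location once a propagation speed in the detection layer is assumed. A minimal sketch (the 0.66 velocity factor is a hypothetical value, not taken from the patent):

```python
def leak_distance(round_trip_delay_s, velocity_factor=0.66):
    """Distance along the detection layer to a reflection site.

    velocity_factor is an assumed fraction of the speed of light for the
    pulse in the detection layer; the factor 0.5 accounts for the
    out-and-back travel of the reflected pulse.
    """
    c = 2.998e8  # speed of light, m/s
    return 0.5 * round_trip_delay_s * velocity_factor * c

print(leak_distance(100e-9))  # a 100 ns round trip locates a site ~9.9 m away
```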
A web-based system architecture for ontology-based data integration in the domain of IT benchmarking
NASA Astrophysics Data System (ADS)
Pfaff, Matthias; Krcmar, Helmut
2018-03-01
In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
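The semi-automatic mapping recommender could, at its simplest, rank ontology concepts by string similarity with database column names. The sketch below is an assumption about one plausible ingredient of such a recommender, with invented concept and column names:

```python
from difflib import get_close_matches

# Hypothetical ITBM ontology concepts (names invented for illustration)
ontology_concepts = ["serverOperatingCost", "numberOfIncidents",
                     "storageCapacity", "serviceAvailability"]

def normalize(name):
    """Crude canonical form: lowercase with separators stripped."""
    return name.lower().replace("_", "").replace("-", "")

def recommend(column, concepts, cutoff=0.6):
    """Rank ontology concepts a database column might map to."""
    canon = {normalize(c): c for c in concepts}
    hits = get_close_matches(normalize(column), list(canon), cutoff=cutoff)
    return [canon[h] for h in hits]

print(recommend("srv_operating_cost", ontology_concepts))
```

A production recommender would combine such lexical scores with schema context and user feedback; this only shows why visualizing ranked candidates for human confirmation is a natural interface.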
Multi-domain boundary element method for axi-symmetric layered linear acoustic systems
NASA Astrophysics Data System (ADS)
Reiter, Paul; Ziegelwanger, Harald
2017-12-01
Homogeneous porous materials like rock wool or synthetic foam are the main tool for acoustic absorption. The conventional absorbing structure for sound-proofing consists of one or multiple absorbers placed in front of a rigid wall, with or without air gaps in between. Various models exist to describe these so-called multi-layered acoustic systems mathematically for incoming plane waves. However, there is no efficient method to calculate the sound field in a half space above a multi-layered acoustic system for an incoming spherical wave. In this work, an axi-symmetric multi-domain boundary element method (BEM) for absorbing multi-layered acoustic systems and incoming spherical waves is introduced. In the proposed BEM formulation, a complex wave number is used to model absorbing materials as a fluid, and a coordinate transformation is introduced which simplifies the singular integrals of the conventional BEM to non-singular radial and angular integrals. The radial and angular parts are integrated analytically and numerically, respectively. The output of the method can be interpreted as a numerical half-space Green's function for grounds consisting of layered materials.
Comparison of direct and flow integration based charge density population analyses.
Francisco, E; Martín Pendas, A; Blanco, M A; Costales, A
2007-12-06
Different exhaustive and fuzzy partitions of the molecular electron density ρ into atomic densities ρ_A are used to compute the atomic charges Q_A of a representative set of molecules. The Q_A's derived from a direct integration of ρ_A are compared to those obtained from integrating the deformation density ρ_def = ρ − ρ⁰ within each atomic domain. Our analysis shows that the latter methods tend to give Q_A's similar to those of the (arbitrary) reference atomic densities ρ_A⁰ used in the definition of the promolecular density, ρ⁰ = Σ_A ρ_A⁰. Moreover, we show that the basis-set independence of these charges is a sign not of their intrinsic quality, as commonly stated, but of the practical insensitivity to the basis set of the atomic domains employed in this type of method.
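The deformation-density charge computation with fuzzy domains can be illustrated by a 1D toy using Gaussian stand-in atomic densities and Hirshfeld-style promolecular weights; all profiles and numbers below are invented, not taken from the paper.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gauss(center, nelec):
    """Gaussian 'atomic density' normalized to nelec electrons."""
    g = np.exp(-((x - center) ** 2))
    return nelec * g / (g.sum() * dx)

rho_A0 = gauss(-1.0, 7.0)        # reference density of atom A
rho_B0 = gauss(+1.0, 1.0)        # reference density of atom B
rho_pro = rho_A0 + rho_B0        # promolecular density rho0

# "Molecular" density: promolecule plus a small charge transfer B -> A
delta = 0.2 * (gauss(-1.0, 1.0) - gauss(+1.0, 1.0))
rho = rho_pro + delta

# Hirshfeld-style fuzzy domains: weight by the promolecular share,
# then integrate the deformation density rho - rho0 in each domain.
wA = rho_A0 / rho_pro
QA = -np.sum(wA * (rho - rho_pro)) * dx        # charge change on A
QB = -np.sum((1.0 - wA) * (rho - rho_pro)) * dx
print(QA, QB)   # A gains density (negative charge), B loses it
```

Because only ρ − ρ⁰ is integrated, whatever charge the reference densities ρ_A⁰ carry is inherited unchanged, which is the paper's point about these charges tracking the (arbitrary) promolecular reference.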
NASA Technical Reports Server (NTRS)
Hu, Fang Q.; Pizzo, Michelle E.; Nark, Douglas M.
2016-01-01
Based on the time domain boundary integral equation formulation of the linear convective wave equation, a computational tool dubbed Time Domain Fast Acoustic Scattering Toolkit (TD-FAST) has recently been under development. The time domain approach has a distinct advantage in that the solutions at all frequencies are obtained in a single computation. In this paper, the formulation of the integral equation, as well as its stabilization by the Burton-Miller type reformulation, is extended to cases of a constant mean flow in an arbitrary direction. In addition, a "Source Surface" is also introduced in the formulation that can be employed to encapsulate regions of noise sources and to facilitate coupling with CFD simulations. This is particularly useful for applications where the noise sources are not easily described by analytical source terms. Numerical examples are presented to assess the accuracy of the formulation, including a computation of noise shielding by a thin barrier motivated by recent Historical Baseline F31A31 open rotor noise shielding experiments. Furthermore, spatial resolution requirements of the time domain boundary element method are also assessed using points-per-wavelength metrics. It is found that, using only constant basis functions and high-order quadrature for surface integration, relative errors of less than 2% may be obtained when the surface spatial resolution is 5 points-per-wavelength (PPW) or 25 points-per-wavelength squared (PPW^2).
Modeling of Graphene Planar Grating in the THz Range by the Method of Singular Integral Equations
NASA Astrophysics Data System (ADS)
Kaliberda, Mstislav E.; Lytvynenko, Leonid M.; Pogarsky, Sergey A.
2018-04-01
Diffraction of the H-polarized electromagnetic wave by the planar graphene grating in the THz range is considered. The scattering and absorption characteristics are studied. The scattered field is represented in the spectral domain via unknown spectral function. The mathematical model is based on the graphene surface impedance and the method of singular integral equations. The numerical solution is obtained by the Nystrom-type method of discrete singularities.
Enhanced Line Integral Convolution with Flow Feature Detection
NASA Technical Reports Server (NTRS)
Lane, David; Okada, Arthur
1996-01-01
The Line Integral Convolution (LIC) method, which blurs white noise textures along a vector field, is an effective way to visualize overall flow patterns in a 2D domain. The method produces a flow texture image based on the input velocity field defined in the domain. Because of the nature of the algorithm, the texture image tends to be blurry. This sometimes makes it difficult to identify boundaries where flow separation and reattachments occur. We present techniques to enhance LIC texture images and use colored texture images to highlight flow separation and reattachment boundaries. Our techniques have been applied to several flow fields defined in 3D curvilinear multi-block grids and scientists have found the results to be very useful.
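The core LIC loop, tracing a short streamline through each pixel and averaging noise values along it, can be sketched naively (nearest-neighbour sampling and periodic boundaries; a real implementation would interpolate the field and apply a filter kernel):

```python
import numpy as np

def lic(noise, vx, vy, length=20, h=0.5):
    """Average the noise texture along streamlines of the field (vx, vy)."""
    ny, nx = noise.shape
    out = np.zeros_like(noise)
    for iy in range(ny):
        for ix in range(nx):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):          # trace both directions
                x, y = float(ix), float(iy)
                for _ in range(length):
                    i = int(round(x)) % nx    # periodic boundaries
                    j = int(round(y)) % ny
                    total += noise[j, i]
                    count += 1
                    u, v = vx[j, i], vy[j, i]
                    norm = np.hypot(u, v) or 1.0
                    x += sign * h * u / norm  # unit step along the field
                    y += sign * h * v / norm
            out[iy, ix] = total / count
    return out

rng = np.random.default_rng(0)
noise = rng.random((32, 32))
vx, vy = np.ones((32, 32)), np.zeros((32, 32))  # uniform horizontal flow
smeared = lic(noise, vx, vy)
print(noise.std(), smeared.std())  # averaging along x lowers the variance
```

The blurring this averaging produces is exactly the effect the enhancement techniques in the paper aim to counteract at separation and reattachment boundaries.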
ERIC Educational Resources Information Center
Horner, Jennifer; Minifie, Fred D.
2011-01-01
Purpose: In this series of articles--"Research Ethics I", "Research Ethics II", and "Research Ethics III"--the authors provide a comprehensive review of the 9 core domains for the responsible conduct of research (RCR) as articulated by the Office of Research Integrity. Method: In "Research Ethics III", they review the RCR domains of publication…
Multiple-image hiding using super resolution reconstruction in high-frequency domains
NASA Astrophysics Data System (ADS)
Li, Xiao-Wei; Zhao, Wu-Xiang; Wang, Jun; Wang, Qiong-Hua
2017-12-01
In this paper, a robust multiple-image hiding method using computer-generated integral imaging and a modified super-resolution reconstruction algorithm is proposed. In our work, the host image is first transformed into frequency domains by cellular automata (CA); to preserve the quality of the stego-image, the secret images are embedded into the CA high-frequency domains. The proposed method has the following advantages: (1) robustness to geometric attacks, owing to the memory-distributed property of elemental images; (2) increased quality of the reconstructed secret images, as the scheme utilizes the modified super-resolution reconstruction algorithm. The simulation results show that the proposed multiple-image hiding method outperforms other similar hiding methods and is robust to common attacks, e.g., Gaussian noise and JPEG compression attacks.
Large scale healthcare data integration and analysis using the semantic web.
Timm, John; Renly, Sondra; Farkash, Ariel
2011-01-01
Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU-funded project, where we used our method on the Essential Hypertension disease model using a CDA template. Our domain-expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.
Ghadie, Mohamed Ali; Lambourne, Luke; Vidal, Marc; Xia, Yu
2017-08-01
Alternative splicing is known to remodel protein-protein interaction networks ("interactomes"), yet large-scale determination of isoform-specific interactions remains challenging. We present a domain-based method to predict the isoform interactome from the reference interactome. First, we construct the domain-resolved reference interactome by mapping known domain-domain interactions onto experimentally-determined interactions between reference proteins. Then, we construct the isoform interactome by predicting that an isoform loses an interaction if it loses the domain mediating the interaction. Our prediction framework is of high-quality when assessed by experimental data. The predicted human isoform interactome reveals extensive network remodeling by alternative splicing. Protein pairs interacting with different isoforms of the same gene tend to be more divergent in biological function, tissue expression, and disease phenotype than protein pairs interacting with the same isoforms. Our prediction method complements experimental efforts, and demonstrates that integrating structural domain information with interactomes provides insights into the functional impact of alternative splicing.
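The prediction rule, that an isoform loses an interaction when it loses the mediating domain, reduces to simple set logic. The miniature interactome below is entirely invented for illustration; real inputs would be curated domain-domain interaction databases and experimentally determined interactomes.

```python
# Known domain-domain interactions (DDIs); names are illustrative only
ddis = {("Pkinase", "SH2"), ("SH3", "PRM")}

protein_domains = {"P1": {"Pkinase", "SH3"}, "P2": {"SH2"}, "P3": {"PRM"}}
isoform_domains = {"P1-iso2": {"Pkinase"}}   # an isoform of P1 that lost SH3

def interacts(doms_a, doms_b):
    """True if some known DDI pairs a domain on one side with the other."""
    return any((d1 in doms_a and d2 in doms_b) or
               (d2 in doms_a and d1 in doms_b) for d1, d2 in ddis)

iso = isoform_domains["P1-iso2"]
print(interacts(iso, protein_domains["P2"]))  # True: Pkinase-SH2 is intact
print(interacts(iso, protein_domains["P3"]))  # False: SH3-PRM lost with SH3
```

Applied genome-wide, this rule turns a domain-resolved reference interactome into isoform-specific interaction predictions.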
Integrating language models into classifiers for BCI communication: a review
NASA Astrophysics Data System (ADS)
Speier, W.; Arnold, C.; Pouratian, N.
2016-06-01
Objective. The present review systematically examines the integration of language models to improve classifier performance in brain-computer interface (BCI) communication systems. Approach. The domain of natural language has been studied extensively in linguistics and has been used in the natural language processing field in applications including information extraction, machine translation, and speech recognition. While these methods have been used for years in traditional augmentative and assistive communication devices, information about the output domain has largely been ignored in BCI communication systems. Over the last few years, BCI communication systems have started to leverage this information through the inclusion of language models. Main results. Although this movement began only recently, studies have already shown the potential of language integration in BCI communication, and it has become a growing field in BCI research. BCI communication systems using language models in their classifiers have progressed down several parallel paths, including word completion, signal classification, integration of process models, dynamic stopping, unsupervised learning, error correction, and evaluation. Significance. Each of these methods has shown significant progress, but they have largely been addressed separately. Combining these methods could realize the full potential of language models, yielding further performance improvements. This integration should be a priority as the field works to create a BCI system that meets the needs of the amyotrophic lateral sclerosis population.
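The classifier-plus-language-model fusion can be sketched as a Bayesian update, posterior ∝ likelihood × prior. The letter set, classifier likelihoods, and bigram prior below are invented numbers for illustration, not taken from any cited BCI study:

```python
import numpy as np

letters = ["a", "e", "q", "u"]

# Classifier likelihoods P(signal | letter) from (hypothetical) EEG
# evidence: nearly flat, i.e. the signal alone is ambiguous.
likelihood = np.array([0.24, 0.26, 0.27, 0.23])

# Bigram language-model prior P(letter | history) after the user typed "q":
# in English, "u" almost always follows "q".
prior_after_q = np.array([0.05, 0.05, 0.01, 0.89])

posterior = likelihood * prior_after_q
posterior /= posterior.sum()               # normalize

print(letters[int(np.argmax(likelihood))])  # signal alone picks "q" again
print(letters[int(np.argmax(posterior))])   # the prior resolves it to "u"
```

The same multiplicative structure underlies word completion and dynamic stopping: the language model reshapes the decision before, or instead of, collecting more signal.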
A time domain inverse dynamic method for the end point tracking control of a flexible manipulator
NASA Technical Reports Server (NTRS)
Kwon, Dong-Soo; Book, Wayne J.
1991-01-01
The inverse dynamic equation of a flexible manipulator was solved in the time domain. By dividing the inverse system equation into causal and anticausal parts, we calculated the torque and the trajectories of all state variables for a given end-point trajectory. The interpretation of this method in the frequency domain is explained in detail using the two-sided Laplace transform and the convolution integral. Open-loop control by the inverse dynamic method shows excellent results in simulation. For real applications, a practical control strategy is proposed by adding a feedback tracking control loop to the inverse dynamic feedforward control, and its good experimental performance is presented.
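The frequency-domain reading of inverse-dynamics feedforward can be sketched for a toy minimum-phase plant; the flexible-arm model in the paper is more involved and needs the causal/anticausal split precisely because its inverse is not stably causal. Here the feedforward input is built by FFT differentiation and verified by simulating the plant:

```python
import numpy as np
from scipy.signal import lsim

# Toy plant G(s) = 1/(s^2 + 2s + 1); its exact inverse is the
# differential operator u = y'' + 2y' + y, evaluated spectrally.
n = 4096
t = np.linspace(0.0, 20.0, n, endpoint=False)
y_d = np.exp(-(((t - 10.0) / 1.5) ** 2))     # desired end-point trajectory

omega = 2j * np.pi * np.fft.rfftfreq(n, t[1] - t[0])
U_hat = (omega ** 2 + 2.0 * omega + 1.0) * np.fft.rfft(y_d)
u = np.fft.irfft(U_hat, n)                   # feedforward input

# Drive the plant with the inverse-dynamics input: the output tracks y_d
_, y, _ = lsim(([1.0], [1.0, 2.0, 1.0]), U=u, T=t)
print(np.max(np.abs(y - y_d)))               # small tracking error
```

For a non-minimum-phase flexible link the division by G(iω) amplifies an unstable part, which is why the paper integrates the anticausal component backward in time instead.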
High-efficiency power transfer for silicon-based photonic devices
NASA Astrophysics Data System (ADS)
Son, Gyeongho; Yu, Kyoungsik
2018-02-01
We demonstrate efficient coupling of 1550 nm guided light from a standard single-mode optical fiber to a silicon waveguide using the finite-difference time-domain (FDTD) method, and propose a fabrication method for tapered optical fibers for efficient power transfer to silicon-based photonic integrated circuits. Adiabatically varying fiber core diameters with a small tapering angle can be obtained using the tube-etching method with hydrofluoric acid and standard single-mode fibers covered by plastic jackets. The optical power transmission of the fundamental HE11 and TE-like modes between the fiber tapers and the inversely tapered silicon waveguides was calculated with the FDTD method to be more than 99% at a wavelength of 1550 nm. The proposed method for adiabatic fiber tapering can be applied in quantum optics, silicon-based photonic integrated circuits, and nanophotonics. Furthermore, efficient coupling within the telecommunication C-band is a promising approach for future quantum networks.
Wavelet-like bases for thin-wire integral equations in electromagnetics
NASA Astrophysics Data System (ADS)
Francomano, E.; Tortorici, A.; Toscano, E.; Ala, G.; Viola, F.
2005-03-01
In this paper, wavelets are used in solving, by the method of moments, a modified version of the thin-wire electric field integral equation in the frequency domain. The time-domain electromagnetic quantities are obtained by using the inverse discrete fast Fourier transform. The retarded scalar electric and vector magnetic potentials are employed to obtain the integral formulation. The discretized model generated by applying the direct method of moments via a point-matching procedure results in a linear system with a dense matrix, which has to be solved for each frequency of the Fourier spectrum of the time-domain impressed source. Therefore, an orthogonal wavelet-like basis transform is used to sparsify the moment matrix. In particular, dyadic and M-band wavelet transforms have been adopted, generating different sparse matrix structures. This leads to an efficient solution of the resulting sparse matrix equation. Moreover, a wavelet preconditioner is used to accelerate the convergence rate of the iterative solver. These numerical features are used in analyzing the transient behavior of a lightning protection system; in particular, the transient performance during operation of the earth termination system of a lightning protection system, or of the earth electrode of an electric power substation, is examined. The numerical results, obtained for a complex structure, are discussed and the features of the method are underlined.
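The moment-matrix sparsification idea can be sketched with an orthonormal Haar transform (a dyadic wavelet, as in the paper, though the stand-in kernel, matrix size, and threshold below are invented): transform the dense matrix, drop small entries, and solve in the wavelet basis.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar wavelet transform matrix; n must be a power of 2."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        m = H.shape[0]
        H = np.vstack([np.kron(H, [1.0, 1.0]),
                       np.kron(np.eye(m), [1.0, -1.0])]) / np.sqrt(2.0)
    return H

# Stand-in dense "moment matrix": a smooth, decaying interaction kernel
n = 64
i = np.arange(n)
A = 1.0 / (1.0 + np.abs(i[:, None] - i[None, :]))
b = np.ones(n)

W = haar_matrix(n)
Aw = W @ A @ W.T                                 # matrix in the wavelet basis
tau = 1e-5 * np.abs(Aw).max()
Aw_sparse = np.where(np.abs(Aw) > tau, Aw, 0.0)  # drop small entries

x_dense = np.linalg.solve(A, b)                  # reference dense solve
x_sparse = W.T @ np.linalg.solve(Aw_sparse, W @ b)

frac_zero = np.mean(Aw_sparse == 0.0)
err = np.linalg.norm(x_sparse - x_dense) / np.linalg.norm(x_dense)
print(frac_zero, err)   # a sizeable fraction dropped, solution barely moves
```

In practice the thresholded matrix is stored in a sparse format and attacked with a preconditioned iterative solver, which is where the per-frequency savings claimed in the paper come from.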
Spacer capture and integration by a type I-F Cas1-Cas2-3 CRISPR adaptation complex.
Fagerlund, Robert D; Wilkinson, Max E; Klykov, Oleg; Barendregt, Arjan; Pearce, F Grant; Kieper, Sebastian N; Maxwell, Howard W R; Capolupo, Angela; Heck, Albert J R; Krause, Kurt L; Bostina, Mihnea; Scheltema, Richard A; Staals, Raymond H J; Fineran, Peter C
2017-06-27
CRISPR-Cas adaptive immune systems capture DNA fragments from invading bacteriophages and plasmids and integrate them as spacers into bacterial CRISPR arrays. In type I-E and II-A CRISPR-Cas systems, this adaptation process is driven by Cas1-Cas2 complexes. Type I-F systems, however, contain a unique fusion of Cas2, with the type I effector helicase and nuclease for invader destruction, Cas3. By using biochemical, structural, and biophysical methods, we present a structural model of the 400-kDa Cas1₄-Cas2-3₂ complex from Pectobacterium atrosepticum with bound protospacer substrate DNA. Two Cas1 dimers assemble on a Cas2 domain dimeric core, which is flanked by two Cas3 domains forming a groove where the protospacer binds to Cas1-Cas2. We developed a sensitive in vitro assay and demonstrated that Cas1-Cas2-3 catalyzed spacer integration into CRISPR arrays. The integrase domain of Cas1 was necessary, whereas integration was independent of the helicase or nuclease activities of Cas3. Integration required at least partially duplex protospacers with free 3'-OH groups, and leader-proximal integration was stimulated by integration host factor. In a coupled capture and integration assay, Cas1-Cas2-3 processed and integrated protospacers independent of Cas3 activity. These results provide insight into the structure of protospacer-bound type I Cas1-Cas2-3 adaptation complexes and their integration mechanism.
ERIC Educational Resources Information Center
Markic, Silvija; Eilks, Ingo
2012-01-01
The study presented in this paper integrates data from four combined research studies, which are both qualitative and quantitative in nature. The studies describe freshman science student teachers' beliefs about teaching and learning. These freshmen intend to become teachers in Germany in one of four science teaching domains (secondary biology,…
NASA Astrophysics Data System (ADS)
Liliawati, W.; Utama, J. A.; Fauziah, H.
2016-08-01
The curriculum in Indonesia recommends that science teachers in elementary and intermediate schools have interdisciplinary ability in science. However, integrated learning has still not been implemented optimally. This research designs and applies integrated learning with the Susan Loucks-Horsley model on the theme of light pollution. It shows students' achievement in terms of the new taxonomy of science education, with five domains: knowing and understanding, science process skills, creativity, attitude, and connecting and applying. This research uses mixed methods with a concurrent embedded design. The subjects were 27 grade 8 junior high school students in Bandung. The instruments employed were a 28-question test of concept mastery, observation sheets, and a moral dilemma test. The results show that integrated learning with the Susan Loucks-Horsley model is able to increase students' achievement and positive character on the light pollution theme: the average normalized gain in the knowing and understanding domain falls in the low category; the average percentages for the science process skill, creativity, and connecting domains fall in the good category; and in the attitudinal domain the average percentage is over 75% for both moral knowing and moral feeling.
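The "average normalized gain" reported for the knowing-and-understanding domain is presumably Hake's normalized gain, ⟨g⟩ = (post − pre)/(max − pre); a minimal sketch with hypothetical scores:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement
    (from pre-test score up to the maximum) actually realized."""
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post percentages, for illustration only.
pre_scores = [40.0, 55.0, 30.0]
post_scores = [55.0, 70.0, 50.0]
gains = [normalized_gain(a, b) for a, b in zip(pre_scores, post_scores)]
avg_gain = sum(gains) / len(gains)  # < 0.3 falls in the "low" category
```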
NASA Astrophysics Data System (ADS)
Lovell, Amy Elizabeth
Computational electromagnetics (CEM) provides numerical methods to simulate electromagnetic waves interacting with their environment. Boundary integral equation (BIE) based methods, which solve Maxwell's equations in homogeneous or piecewise homogeneous media, are both efficient and accurate, especially for scattering and radiation problems. The development and analysis of electromagnetic BIEs have been a very active topic in CEM research. Indeed, there are still many open problems that need to be addressed or further studied. A short and important list includes (1) closed-form or quasi-analytical solutions to time-domain integral equations, (2) catastrophic cancellations at low frequencies, (3) ill-conditioning due to high mesh density, multi-scale discretization, and growing electrical size, and (4) lack of flexibility due to re-meshing when an increasing number of forward numerical simulations is involved in the electromagnetic design process. This dissertation addresses several of these aspects of boundary integral equations in computational electromagnetics. The first contribution of the dissertation is to construct quasi-analytical solutions to time-dependent boundary integral equations using a direct approach. Direct inverse Fourier transform of the time-harmonic solutions is not stable because the inverse Fourier transform of the spherical Hankel functions does not exist. Using new addition theorems for the time-domain Green's function and dyadic Green's functions, time-domain integral equations governing transient scattering problems for spherical objects are solved directly and stably for the first time. In addition, the direct time-dependent solutions, together with the newly proposed time-domain dyadic Green's functions, enrich the time-domain spherical multipole theory. The second contribution is a novel method of moments (MoM) framework to solve electromagnetic boundary integral equations on subdivision surfaces.
The aim is to avoid the meshing and re-meshing stages and so accelerate the design process when the geometry needs to be updated. Two schemes for constructing basis functions on the subdivision surface have been explored: one uses div-conforming basis functions, and the other is a rigorous iso-geometric approach based on subdivision basis functions with better smoothness properties. This new framework provides better accuracy, more stability, and high flexibility. The third contribution is a new stable integral equation formulation that avoids catastrophic cancellations due to low-frequency breakdown or dense-mesh breakdown. Many of the conventional integral equations and their associated post-processing operations suffer from numerical catastrophic cancellations, which can lead to ill-conditioning of the linear systems or serious accuracy problems; examples include low-frequency breakdown and dense-mesh breakdown. Another instability may come from nontrivial null spaces of the integral operators involved, which might be related to spurious resonance or topology breakdown. This dissertation presents several sets of new boundary integral equations and studies their analytical properties. The first proposed formulation leads to scalar boundary integral equations in which only scalar unknowns are involved. Besides the requirement of gaining more stability and better conditioning in the resulting linear systems, multi-physics simulation is another driving force for new formulations; formulations based on scalar and vector potentials (rather than the electromagnetic fields) have been studied for this purpose. These contributions address different stages of boundary integral equations in an almost independent manner: for example, the isogeometric analysis framework can be used to solve different boundary integral equations, and the time-dependent solutions to integral equations from different formulations can be obtained through the same proposed methodology.
ERIC Educational Resources Information Center
Kamruzzaman, M.
2014-01-01
This study reports an action research undertaken at Queensland University of Technology. It evaluates the effectiveness of the integration of geographic information systems (GIS) within the substantive domains of an existing land use planning course in 2011. Using student performance, learning experience survey, and questionnaire survey data, it…
Klaseboer, Evert; Sepehrirahnama, Shahrokh; Chan, Derek Y C
2017-08-01
The general space-time evolution of the scattering of an incident acoustic plane wave pulse by an arbitrary configuration of targets is treated by employing a recently developed non-singular boundary integral method to solve the Helmholtz equation in the frequency domain from which the space-time solution of the wave equation is obtained using the fast Fourier transform. The non-singular boundary integral solution can enforce the radiation boundary condition at infinity exactly and can account for multiple scattering effects at all spacings between scatterers without adverse effects on the numerical precision. More generally, the absence of singular kernels in the non-singular integral equation confers high numerical stability and precision for smaller numbers of degrees of freedom. The use of fast Fourier transform to obtain the time dependence is not constrained to discrete time steps and is particularly efficient for studying the response to different incident pulses by the same configuration of scatterers. The precision that can be attained using a smaller number of Fourier components is also quantified.
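The frequency-to-time pipeline described above can be sketched as follows. The boundary-integral solve is replaced here by a pure time delay (a hypothetical stand-in transfer function with a known exact effect); the point is the synthesis step: solve once per frequency, multiply by the incident pulse spectrum, and inverse-FFT.

```python
import numpy as np

n, dt = 1024, 1e-3
t = np.arange(n) * dt
pulse = np.exp(-0.5 * ((t - 0.1) / 0.01) ** 2)   # incident Gaussian pulse

freqs = np.fft.rfftfreq(n, dt)
delay = 0.2                                      # hypothetical "scatterer"
H = np.exp(-2j * np.pi * freqs * delay)          # frequency-domain solution

# One frequency-domain solve per FFT bin, then inverse FFT for the
# time response to this particular incident pulse.
response = np.fft.irfft(np.fft.rfft(pulse) * H, n)
peak_time = t[np.argmax(response)]               # expect 0.1 + 0.2 = 0.3 s
```

Because the transfer function is computed once, the response to a different incident pulse on the same configuration costs only two more FFTs, which is the efficiency the abstract points out.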
Adhikari, Badri; Hou, Jie; Cheng, Jianlin
2018-03-01
In this study, we report the evaluation of the residue-residue contacts predicted by our three different methods in the CASP12 experiment, focusing on studying the impact of multiple sequence alignment, residue coevolution, and machine learning on contact prediction. The first method (MULTICOM-NOVEL) uses only traditional features (sequence profile, secondary structure, and solvent accessibility) with deep learning to predict contacts and serves as a baseline. The second method (MULTICOM-CONSTRUCT) uses our new alignment algorithm to generate deep multiple sequence alignments to derive coevolution-based features, which are integrated by a neural network method to predict contacts. The third method (MULTICOM-CLUSTER) is a consensus combination of the predictions of the first two methods. We evaluated our methods on 94 CASP12 domains. On a subset of 38 free-modeling domains, our methods achieved an average precision of up to 41.7% for top L/5 long-range contact predictions. The comparison of the three methods shows that the quality and effective depth of multiple sequence alignments, coevolution-based features, and the machine learning integration of coevolution-based and traditional features drive the quality of predicted protein contacts. On the full CASP12 dataset, the coevolution-based features alone improve the average precision from 28.4% to 41.6%, and the machine learning integration of all the features further raises the precision to 56.3%, when top L/5 predicted long-range contacts are evaluated. The correlation between the precision of contact prediction and the logarithm of the number of effective sequences in the alignments is 0.66. © 2017 Wiley Periodicals, Inc.
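The top-L/5 long-range precision metric quoted above can be computed as follows; this is a generic sketch of a CASP-style evaluation (the ≥ 24-residue separation cutoff is the usual long-range convention), with toy data:

```python
def topk_precision(scores, true_contacts, L, min_sep=24):
    """Precision of the top-L/5 predicted long-range contacts.

    scores: {(i, j): predicted probability} with i < j;
    true_contacts: set of real contact pairs;
    long-range: sequence separation >= min_sep (usual CASP convention).
    """
    long_range = [(p, s) for p, s in scores.items() if p[1] - p[0] >= min_sep]
    long_range.sort(key=lambda ps: ps[1], reverse=True)
    top = [p for p, _ in long_range[: max(L // 5, 1)]]
    hits = sum(p in true_contacts for p in top)
    return hits / len(top) if top else 0.0

# Toy example: domain length L = 10, so the top 2 predictions count;
# the pair (4, 10) is excluded as short-range despite its high score.
scores = {(1, 30): 0.9, (2, 40): 0.8, (3, 50): 0.1, (4, 10): 0.95}
true_contacts = {(1, 30), (3, 50)}
prec = topk_precision(scores, true_contacts, L=10)
```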
NASA Technical Reports Server (NTRS)
Madsen, Niel K.
1992-01-01
Several new discrete surface integral (DSI) methods for solving Maxwell's equations in the time domain are presented. These methods, which allow the use of general nonorthogonal mixed-polyhedral unstructured grids, are direct generalizations of the canonical staggered-grid finite difference method. These methods are conservative in that they locally preserve divergence or charge. Employing mixed polyhedral cells (hexahedral, tetrahedral, etc.), these methods allow more accurate modeling of non-rectangular structures and objects, because the traditional stair-stepped boundary approximations associated with orthogonal-grid finite difference methods can be avoided. Numerical results demonstrating the accuracy of these new methods are presented.
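The canonical staggered-grid (Yee) finite difference method that the DSI methods generalize can be sketched in one dimension; this toy version (unit wave speed, Courant number 1, perfectly conducting ends) is for illustration only, not the DSI scheme itself:

```python
import numpy as np

# 1D staggered-grid (Yee) scheme: E on integer nodes, H on half nodes,
# leap-frog in time. Unit wave speed and Courant number c*dt/dx = 1.
n, steps = 200, 120
x = np.arange(n)
E = np.exp(-0.5 * ((x - 40) / 6.0) ** 2)   # initial pulse; H starts at 0
H = np.zeros(n - 1)

for _ in range(steps):
    H += np.diff(E)        # update H from the spatial difference of E
    E[1:-1] += np.diff(H)  # update E; fixed E[0] = E[-1] = 0 act as PEC walls

peak = np.abs(E).max()     # pulse splits in two and propagates without growth
```

On an unstructured polyhedral grid the same leap-frog structure survives, but the simple `np.diff` stencils become discrete surface integrals over cell faces.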
Integrative methods for analyzing big data in precision medicine.
Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša
2016-03-01
We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With advances in technologies for capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Wavelet transformation to determine impedance spectra of lithium-ion rechargeable battery
NASA Astrophysics Data System (ADS)
Hoshi, Yoshinao; Yakabe, Natsuki; Isobe, Koichiro; Saito, Toshiki; Shitanda, Isao; Itagaki, Masayuki
2016-05-01
A new analytical method is proposed to determine the electrochemical impedance of lithium-ion rechargeable batteries (LIRB) from time domain data by wavelet transformation (WT). The WT is a waveform analysis method that can transform data in the time domain to the frequency domain while retaining time information. In this transformation, the frequency domain data are obtained by the convolution integral of a mother wavelet and original time domain data. A complex Morlet mother wavelet (CMMW) is used to obtain the complex number data in the frequency domain. The CMMW is expressed by combining a Gaussian function and sinusoidal term. The theory to select a set of suitable conditions for variables and constants related to the CMMW, i.e., band, scale, and time parameters, is established by determining impedance spectra from wavelet coefficients using input voltage to the equivalent circuit and the output current. The impedance spectrum of LIRB determined by WT agrees well with that measured using a frequency response analyzer.
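The core operation, convolving the time record with a complex Morlet wavelet to obtain a complex coefficient at a chosen frequency and time, can be sketched as below. The circuit here is a hypothetical purely resistive cell, chosen so that the recovered impedance has a known exact value; the parameters are illustrative and not taken from the paper.

```python
import numpy as np

def morlet_coeff(x, t, f0, t0, sigma):
    """Wavelet coefficient of signal x at time t0 and centre frequency f0,
    using a complex Morlet (Gaussian-windowed complex exponential)."""
    w = np.exp(-0.5 * ((t - t0) / sigma) ** 2) \
        * np.exp(-2j * np.pi * f0 * (t - t0))
    return np.sum(x * w) * (t[1] - t[0])   # discretized convolution integral

fs = 10_000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
f0 = 50.0                                  # analysis frequency, Hz

# Hypothetical purely resistive cell: v(t) = R * i(t), with a current
# whose amplitude drifts in time (which a plain FFT would smear).
R = 2.5
i_t = (1.0 + 0.5 * t) * np.sin(2 * np.pi * f0 * t)
v_t = R * i_t

sigma = 5.0 / f0                           # Gaussian window spans ~5 periods
Zi = morlet_coeff(i_t, t, f0, 0.5, sigma)
Zv = morlet_coeff(v_t, t, f0, 0.5, sigma)
Z = Zv / Zi                                # impedance estimate at f0
```

Taking the ratio of the voltage and current coefficients at the same (f0, t0) yields the impedance at that frequency while retaining time localization, which is the property the abstract highlights.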
NASA Astrophysics Data System (ADS)
Tanaka, Yoshiyuki; Klemann, Volker; Okuno, Jun'ichi
2009-09-01
Normal mode approaches for calculating the viscoelastic responses of self-gravitating and compressible spherical earth models have an intrinsic problem of determining the roots of the secular equation and the associated residues in the Laplace domain. To bypass this problem, a method based on numerical inverse Laplace integration was developed by Tanaka et al. (2006, 2007) for computing the viscoelastic deformation caused by an internal dislocation. The advantage of this approach is that the root-finding problem is avoided without imposing additional constraints on the governing equations and earth models. In this study, we apply the same algorithm to computations of viscoelastic responses to a surface load and show that the results obtained by this approach agree well with those obtained by a time-domain approach that does not require determination of the normal modes in the Laplace domain. Using the elastic earth model PREM and a convex viscosity profile, we calculate viscoelastic load Love numbers (h, l, k) for compressible and incompressible models. Comparisons between the results show that the effects due to compressibility are consistent with results obtained by previous studies and that the rate differences between the two models total 10-40%. This will serve as an independent method to confirm results obtained by time-domain approaches and will usefully increase the reliability of postglacial rebound modeling.
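The numerical inverse Laplace integration idea can be illustrated with a crude Bromwich-line quadrature; real schemes, including the one referenced above, choose the contour and quadrature far more carefully. F(s) = 1/(s+1) is used because its inverse transform, e^{-t}, is known exactly.

```python
import numpy as np

def inverse_laplace(F, t, a=1.0, omega_max=200.0, n=4001):
    """Invert a Laplace transform along the Bromwich line s = a + i*w:
    f(t) = (e^{a t} / pi) * Re[ int_0^inf F(a + i w) e^{i w t} dw ],
    here by plain trapezoidal quadrature; a must lie to the right of
    all singularities of F. A crude illustrative sketch only."""
    w = np.linspace(0.0, omega_max, n)
    dw = w[1] - w[0]
    g = np.real(F(a + 1j * w) * np.exp(1j * w * t))
    integral = dw * (g.sum() - 0.5 * (g[0] + g[-1]))
    return np.exp(a * t) / np.pi * integral

# Check against a transform pair with a known inverse: 1/(s+1) -> e^{-t}.
f1 = inverse_laplace(lambda s: 1.0 / (s + 1.0), 1.0)
f2 = inverse_laplace(lambda s: 1.0 / (s + 1.0), 0.5)
```

No roots or residues of the transform are ever located, which is precisely the advantage claimed for this class of methods.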
Video teleconsultation service: who is needed to do what, to get it implemented in daily care?
Visser, Jacqueline J W; Bloo, J K C; Grobbe, F A; Vollenbroek-Hutten, M M R
2010-05-01
In telemedicine, technology is used to deliver services. Because of this, various actors beyond those involved in traditional care are expected to be involved and to cooperate in delivering these services. The aim of this study was to establish a clear understanding of these actors and their roles and interrelationships in the delivery of telemedicine, using a video teleconsultation service as a study case. A business modeling approach as described in the Freeband Business Blueprint Method was used. The method brings together the four domains that make up a business model, that is, service, technology, organization, and finance, and covers the integration of these domains; it uses several multidisciplinary workshops addressing each of the four domains. Results for the four domains showed that (1) the video teleconsultation service is a store-and-forward video teleconsultation for healthcare providers; the service is accepted and adds value to the quality of care, but the market is small; (2) the technology consists of a secured Internet Web-based application, a standard personal computer, a broadband Internet connection, and a digital camera; (3) a new role, and probably a new entity, responsible for delivering the integrated service to healthcare professionals, was identified; and finally (4) financial reimbursement for the service delivery is expected to be most successful when set up through healthcare insurance companies. Pricing needs to account for the fees of healthcare professionals as well as for technical aspects, education, and future innovation. Implementation of the video teleconsultation service requires multidisciplinary cooperation and integration. Challenging aspects include the small market size and the slow speed of implementation. This supports the argument that accumulation of several telemedicine applications is necessary to make delivery financially feasible for at least some of the actors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saygin, H.; Hebert, A.
The calculation of a dilution cross section σ̄ₑ is the most important step in the self-shielding formalism based on the equivalence principle. If a dilution cross section that accurately characterizes the physical situation can be calculated, it can then be used for calculating the effective resonance integrals and obtaining accurate self-shielded cross sections. A new technique for the calculation of equivalent cross sections, based on the formalism of Riemann integration in the resolved energy domain, is proposed. This new method is compared to the generalized Stamm'ler method, which is also based on an equivalence principle, for a two-region cylindrical cell and for a small pressurized water reactor assembly in two dimensions. The accuracy of each computing approach is assessed against reference results obtained from a fine-group slowing-down code named CESCOL. It is shown that the proposed method leads to slightly better performance than the generalized Stamm'ler approach.
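The role the dilution cross section plays in computing an effective resonance integral can be sketched with a single hypothetical resonance and the standard flux-depression weighting φ(E) ≈ σe/(σt(E) + σe); none of the numbers below come from the paper.

```python
import numpy as np

E = np.linspace(1.0, 100.0, 200_001)          # energy grid, eV
dE = E[1] - E[0]

# One hypothetical Breit-Wigner-like absorption resonance (barns).
E0, gamma, peak = 6.7, 0.03, 2.0e4
sigma_a = peak * (gamma / 2) ** 2 / ((E - E0) ** 2 + (gamma / 2) ** 2)
sigma_t = sigma_a + 10.0                      # plus flat potential scattering

def eff_resonance_integral(sigma_e):
    """Effective resonance integral for dilution cross section sigma_e,
    using the flux-depression weighting phi(E) = sigma_e/(sigma_t+sigma_e)."""
    phi = sigma_e / (sigma_t + sigma_e)
    y = sigma_a * phi / E                     # 1/E flux weighting
    return dE * (y.sum() - 0.5 * (y[0] + y[-1]))

i_tight = eff_resonance_integral(10.0)        # strong self-shielding
i_loose = eff_resonance_integral(1.0e3)
i_dilute = eff_resonance_integral(1.0e6)      # near infinite dilution
```

As σe grows (more dilution, less self-shielding) the effective integral rises monotonically toward the infinite-dilution resonance integral, which is why an accurate σ̄ₑ directly controls the quality of the self-shielded cross sections.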
Diffusion phenomenon for linear dissipative wave equations in an exterior domain
NASA Astrophysics Data System (ADS)
Ikehata, Ryo
Under general conditions on the initial data, we derive the crucial estimates which imply the diffusion phenomenon for dissipative linear wave equations in an exterior domain. In deriving the diffusion phenomenon for dissipative wave equations, the time integral method developed by Ikehata and Matsuyama (Sci. Math. Japon. 55 (2002) 33) plays an effective role.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
method, Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique… Mission Area: Command and Control. Keywords: mixed-signal circuit simulation; parasitic extraction; time-domain simulation; IC design flow; model order reduction.
Time Domain Diffraction by Composite Structures
NASA Astrophysics Data System (ADS)
Riccio, Giovanni; Frongillo, Marcello
2017-04-01
Time domain (TD) diffraction problems are receiving great attention because of the widespread use of ultra wide band (UWB) communication and radar systems. It is commonly accepted that, due to the large bandwidth of UWB signals, analysis of the wave propagation mechanisms in the TD framework is preferable to frequency domain (FD) data processing. Furthermore, the analysis of transient scattering phenomena is also important for predicting the effects of electromagnetic pulses on civil structures. Diffraction in the TD framework represents a challenging problem, and numerical discretization techniques can be used to support research and industry activities. Unfortunately, these methods rapidly become intractable when considering excitation pulses with high frequency content. This contribution deals with the TD diffraction phenomenon for composite structures containing a dielectric wedge with arbitrary apex angle illuminated by a plane wave. The approach is the same as that used in [1]-[3]. The transient diffracted field originated by a plane wave with arbitrary time dependence is evaluated via a convolution integral involving the TD diffraction coefficients, which are determined in closed form starting from the knowledge of the corresponding FD counterparts. In particular, the inverse Laplace transform is applied to the FD Uniform Asymptotic Physical Optics (FD-UAPO) diffraction coefficients available for the internal region of the structure and the surrounding space. For each observation domain, the FD-UAPO expressions are obtained by considering electric and magnetic equivalent PO surface currents located on the interfaces. The surface radiation integrals using these sources are taken as the starting point and manipulated to obtain integrals that can be solved by means of the Steepest Descent Method and the Multiplicative Method. [1] G. Gennarelli and G. Riccio, "Time domain diffraction by a right-angled penetrable wedge," IEEE Trans. Antennas Propag., Vol.
60, 2829-2833, 2012. [2] G. Gennarelli and G. Riccio, "Obtuse-angled penetrable wedges: a time domain solution for the diffraction coefficients," J. Electromagn. Waves Appl., Vol. 27, 2020-2028, 2013. [3] M. Frongillo, G. Gennarelli and G. Riccio, "TD-UAPO diffracted field evaluation for penetrable wedges with acute apex angle," J. Opt. Soc. Am. A, Vol. 32, 1271-1275, 2015.
Integral equation approach to time-dependent kinematic dynamos in finite domains
NASA Astrophysics Data System (ADS)
Xu, Mingtian; Stefani, Frank; Gerbeth, Gunter
2004-11-01
The homogeneous dynamo effect is at the root of cosmic magnetic field generation. With only a very few exceptions, the numerical treatment of homogeneous dynamos is carried out in the framework of the differential equation approach. The present paper tries to facilitate the use of integral equations in dynamo research. Apart from the pedagogical value to illustrate dynamo action within the well-known picture of the Biot-Savart law, the integral equation approach has a number of practical advantages. The first advantage is its proven numerical robustness and stability. The second and perhaps most important advantage is its applicability to dynamos in arbitrary geometries. The third advantage is its intimate connection to inverse problems relevant not only for dynamos but also for technical applications of magnetohydrodynamics. The paper provides the first general formulation and application of the integral equation approach to time-dependent kinematic dynamos, with stationary dynamo sources, in finite domains. The time dependence is restricted to the magnetic field, whereas the velocity or corresponding mean-field sources of dynamo action are supposed to be stationary. For the spherically symmetric α² dynamo model it is shown how the general formulation is reduced to a coupled system of two radial integral equations for the defining scalars of the poloidal and toroidal field components. The integral equation formulation for spherical dynamos with general stationary velocity fields is also derived. Two numerical examples—the α² dynamo model with radially varying α and the Bullard-Gellman model—illustrate the equivalence of the approach with the usual differential equation method. The main advantage of the method is exemplified by the treatment of an α² dynamo in rectangular domains.
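The Biot-Savart picture invoked above can be made concrete with a simple midpoint-rule quadrature over straight current segments, checked against the analytic field μ0·I/(2R) at the centre of a circular loop; the discretization is purely illustrative.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def biot_savart(r_obs, segments, currents):
    """B at r_obs from straight current segments, midpoint rule:
    dB = mu0/(4*pi) * I * dl x r / |r|^3."""
    B = np.zeros(3)
    for (a, b), I in zip(segments, currents):
        dl = b - a
        r = r_obs - 0.5 * (a + b)
        B += MU0 / (4 * np.pi) * I * np.cross(dl, r) / np.linalg.norm(r) ** 3
    return B

# Unit-radius loop in the xy-plane carrying 1 A, split into 400 segments.
n, R, I = 400, 1.0, 1.0
th = np.linspace(0.0, 2 * np.pi, n + 1)
pts = np.stack([R * np.cos(th), R * np.sin(th), np.zeros(n + 1)], axis=1)
segments = list(zip(pts[:-1], pts[1:]))
B = biot_savart(np.zeros(3), segments, [I] * n)

B_exact = MU0 * I / (2 * R)   # analytic field at the centre of the loop
```

An integral-equation dynamo solver replaces the prescribed currents with induced ones and closes the loop self-consistently, but the kernel being integrated is this same Biot-Savart kernel.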
NASA Astrophysics Data System (ADS)
Fallahi, Arya; Oswald, Benedikt; Leidenberger, Patrick
2012-04-01
We study a 3-dimensional, dual-field, fully explicit method for the solution of Maxwell's equations in the time domain on unstructured, tetrahedral grids. The algorithm uses the element level time domain (ELTD) discretization of the electric and magnetic vector wave equations. In particular, the suitability of the method for the numerical analysis of nanometer-structured systems in the optical region of the electromagnetic spectrum is investigated. The details of the theory and its implementation as a computer code are introduced, and its convergence behavior as well as the conditions for stable time domain integration are examined. Here, we restrict ourselves to non-dispersive dielectric material properties, since dielectric dispersion will be treated in a subsequent paper. Analytically solvable problems are analyzed in order to benchmark the method. Finally, a dielectric microlens is considered to demonstrate the potential of the method. A flexible method of 2nd-order accuracy is obtained that is applicable to a wide range of nano-optical configurations and can be a serious competitor to more conventional finite difference time domain schemes, which operate only on hexahedral grids. The ELTD scheme can resolve geometries with a wide span of characteristic length scales and with the appropriate level of detail, using small tetrahedra where delicate, physically relevant details must be modeled.
Porcino, Antony; MacDougall, Colleen
2009-01-01
Background: Since the late 1980s, several taxonomies have been developed to help map and describe the interrelationships of complementary and alternative medicine (CAM) modalities. In these taxonomies, several issues are often incompletely addressed: (1) a simple categorization process that clearly isolates a modality to a single conceptual category; (2) clear delineation of verticality, that is, a differentiation of the scale being observed, from individually applied techniques, through modalities (therapies), to whole medical systems; and (3) recognition of CAM as part of the general field of health care. Methods: Development of the Integrated Taxonomy of Health Care (ITHC) involved three stages: (1) development of a precise, uniform health glossary; (2) analysis of the extant taxonomies; and (3) use of an iterative process of classifying modalities and medical systems into categories until a failure to singularly classify a modality occurred, requiring a return to the glossary and adjustment of the classifying protocol. Results: A full vertical taxonomy was developed that includes and clearly differentiates between techniques, modalities, domains (clusters of similar modalities), systems of health care (coordinated care systems involving multiple modalities), and integrative health care. Domains are the classical primary focus of taxonomies. The ITHC has eleven domains: chemical/substance-based work, device-based work, soft tissue–focused manipulation, skeletal manipulation, fitness/movement instruction, mind–body integration/classical somatics work, mental/emotional–based work, bio-energy work based on physical manipulation, bio-energy modulation, spiritual-based work, and unique assessments. Modalities are assigned to the domains based on the primary mode of interaction with the client, according to the literature of the practitioners.
Conclusions: The ITHC has several strengths: little interpretation is used while successfully assigning modalities to single domains; the issue of taxonomic verticality is fully resolved; and the design fully integrates the complementary health care fields of biomedicine and CAM. PMID:21589735
FastMag: Fast micromagnetic simulator for complex magnetic structures (invited)
NASA Astrophysics Data System (ADS)
Chang, R.; Li, S.; Lubarda, M. V.; Livshitz, B.; Lomakin, V.
2011-04-01
A fast micromagnetic simulator (FastMag) for general problems is presented. FastMag solves the Landau-Lifshitz-Gilbert equation and can handle multiscale problems with high computational efficiency. The simulator derives its high performance from efficient methods for evaluating the effective field and from implementations on massively parallel graphics processing unit (GPU) architectures. FastMag discretizes the computational domain into tetrahedral elements and therefore is highly flexible for general problems. The magnetostatic field is computed via the superposition principle for both volume and surface parts of the computational domain. This is accomplished by implementing efficient quadrature rules and analytical integration for overlapping elements in which the integral kernel is singular. The discretized superposition integrals are then computed using a nonuniform grid interpolation method, which evaluates the field from N sources at N collocated observers in O(N) operations. This approach allows handling objects of arbitrary shape, allows the field outside the magnetized domains to be calculated easily, does not require solving a linear system of equations, and requires little memory. FastMag is implemented on GPUs, with GPU-central processing unit speed-ups of 2 orders of magnitude. Simulations are shown of a large array of magnetic dots and of a recording head fully discretized down to the exchange length, with over a hundred million tetrahedral elements, on an inexpensive desktop computer.
NASA Astrophysics Data System (ADS)
Chen, Li-Chieh; Huang, Mei-Jiau
2017-02-01
A 2D simulation method for a rigid body moving in an incompressible viscous fluid is proposed. It combines an immersed-boundary method, the DFFD (direct forcing fictitious domain) method, with the spectral element method; the former is employed for efficiently capturing the two-way FSI (fluid-structure interaction), while the geometric flexibility of the latter is utilized for any co-existing stationary and complicated solid or flow boundaries. A pseudo body force is imposed within the solid domain to enforce the rigid body motion, and a Lagrangian mesh composed of triangular elements is employed for tracing the rigid body. In particular, a so-called sub-cell scheme is proposed to smooth the discontinuity at the fluid-solid interface and to execute integrations involving Eulerian variables over the moving-solid domain. The accuracy of the proposed method is verified through the agreement of the simulation results for some typical flows with analytical solutions or the existing literature.
NASA Astrophysics Data System (ADS)
Luk, B. L.; Liu, K. P.; Tong, F.; Man, K. F.
2010-05-01
The impact-acoustics method utilizes the information contained in the acoustic signals generated by tapping a structure with a small metal object. It offers a convenient and cost-efficient way to inspect tile-wall bonding integrity. However, surface irregularities cause abnormal multiple bounces in practical inspections, and the spectral characteristics of these bounces can easily be confused with signals obtained from different bonding qualities. As a result, they degrade classic frequency-domain, feature-based classification methods. Another crucial practical difficulty is additive environmental noise, which may also cause feature mismatch and false judgments. To solve this problem, the work described in this paper develops a robust inspection method that applies a model-based strategy and utilizes wavelet-domain features with hidden Markov modeling, deriving a bonding integrity recognition approach with enhanced immunity to surface roughness as well as to environmental noise. With the help of specially designed artificial sample slabs, experiments were carried out with impact acoustic signals contaminated by real environmental noise acquired under practical inspection conditions. The results are compared with those of the classic method to demonstrate the effectiveness of the proposed approach.
Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?
Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C
2017-09-01
Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. 
Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely been surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys assessing health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often shaped by deeper aspects than those measured by existing surveys. Incorporating targeted items within these domains in the design of surveys should enhance the capture of data relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.
Domain-Invariant Partial-Least-Squares Regression.
Nikzad-Langerodi, Ramin; Zellinger, Werner; Lughofer, Edwin; Saminger-Platz, Susanne
2018-05-11
Multivariate calibration models often fail to extrapolate beyond the calibration samples because of changes associated with the instrumental response, environmental condition, or sample matrix. Most of the current methods used to adapt a source calibration model to a target domain exclusively apply to calibration transfer between similar analytical devices, while generic methods for calibration-model adaptation are largely missing. To fill this gap, we here introduce domain-invariant partial-least-squares (di-PLS) regression, which extends ordinary PLS by a domain regularizer in order to align the source and target distributions in the latent-variable space. We show that a domain-invariant weight vector can be derived in closed form, which allows the integration of (partially) labeled data from the source and target domains as well as entirely unlabeled data from the latter. We test our approach on a simulated data set where the aim is to desensitize a source calibration model to an unknown interfering agent in the target domain (i.e., unsupervised model adaptation). In addition, we demonstrate unsupervised, semisupervised, and supervised model adaptation by di-PLS on two real-world near-infrared (NIR) spectroscopic data sets.
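The domain-regularized weight vector described in this abstract can be pictured with a small sketch. This is a minimal, hypothetical formulation (not necessarily the authors' exact closed form): the first latent direction trades covariance with the response on the labeled source data against the source-target covariance mismatch along that direction.

```python
import numpy as np

# Illustrative sketch of a domain-regularized PLS weight vector. The
# trade-off matrix M below is an assumption for illustration, not the
# published di-PLS closed form.
rng = np.random.default_rng(1)
Xs = rng.standard_normal((100, 10))              # labeled source spectra
y = Xs @ rng.standard_normal(10)                 # source response
Xt = Xs + 0.5 * rng.standard_normal((100, 10))   # shifted, unlabeled target

def di_pls_weight(Xs, y, Xt, gamma=10.0):
    cov_y = Xs.T @ y                             # covariance with the response
    Cs = np.cov(Xs, rowvar=False)                # source covariance
    Ct = np.cov(Xt, rowvar=False)                # target covariance
    D = Cs - Ct                                  # domain discrepancy
    M = np.outer(cov_y, cov_y) - gamma * D @ D   # symmetric trade-off matrix
    vals, vecs = np.linalg.eigh(M)
    w = vecs[:, -1]                              # dominant eigenvector
    return w / np.linalg.norm(w)

w = di_pls_weight(Xs, y, Xt)
print(w.shape)  # (10,)
```

Increasing `gamma` pushes the latent direction towards subspaces where source and target distributions agree, at the cost of predictive covariance; `gamma = 0` recovers the ordinary first PLS weight.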
Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J
2011-07-01
The IntFOLD server is a novel independent server that integrates several cutting edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0 for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.
Field-scale comparison of frequency- and time-domain spectral induced polarization
NASA Astrophysics Data System (ADS)
Maurya, P. K.; Fiandaca, G.; Christiansen, A. V.; Auken, E.
2018-05-01
In this paper we present a comparison study of the time-domain (TD) and frequency-domain (FD) spectral induced polarization (IP) methods in terms of acquisition time, data quality, and spectral information retrieved from inversion. We collected TDIP and FDIP surface measurements on three profiles with identical electrode setups, at two different field sites with different lithology. In addition, TDIP data were collected in two boreholes using the El-Log drilling technique, in which apparent formation resistivity and chargeability values are measured during drilling using electrodes integrated within the stem auger.
Huang, Chien-Hung; Peng, Huai-Shun; Ng, Ka-Lok
2015-01-01
Many proteins are known to be associated with cancer, yet their precise functional role in disease pathogenesis often remains unclear. A strategy to gain a better understanding of the function of these proteins is to combine different types of proteomics data. In this study, we extended Aragues's method by employing protein-protein interaction (PPI) data, domain-domain interaction (DDI) data, weighted domain frequency scores (DFS), and cancer linker degree (CLD) data to predict cancer proteins. Performance was benchmarked in three kinds of experiments: (I) using individual algorithms, (II) combining algorithms, and (III) combining algorithms of the same classification type. When compared with Aragues's method, our proposed methods, that is, the machine learning algorithm and majority voting, are significantly superior in all seven performance measures. We demonstrated the accuracy of the proposed method on two independent datasets. The best algorithm achieved hit ratios of 89.4% and 72.8% for the lung cancer dataset and the lung cancer microarray study, respectively. It is anticipated that the current research could help in understanding disease mechanisms and diagnosis.
How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?
NASA Astrophysics Data System (ADS)
Wachowicz, Monica
2000-04-01
This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).
Hendriks, Anna-Marie; Habraken, Jolanda M.; Kremers, Stef P. J.; Jansen, Maria W. J.; van Oers, Hans; Schuit, Albertine J.
2016-01-01
Background. Limited physical activity (PA) is a risk factor for childhood obesity. In the Netherlands, as in many other countries worldwide, local policy officials bear responsibility for integrated PA policies, involving both health and nonhealth domains. In practice, their development seems hampered. We explore which obstacles local policy officials perceive in their efforts. Methods. Fifteen semistructured interviews were held with policy officials from health and nonhealth policy domains, working at the strategic, tactical, and operational levels, in three relatively large municipalities. Questions focused on exploring perceived barriers to integrated PA policies. The interviews were deductively coded by applying the Behavior Change Ball framework. Findings. Childhood obesity prevention appeared on the governmental agenda and all officials understood its multicausal nature. However, operational officials had not yet developed a tradition of developing integrated PA policies, due to insufficient boundary-spanning skills and structural and cultural differences between the domains. Tactical-level officials did not sufficiently support intersectoral collaboration, and strategic-level officials mainly focused on public-private partnerships. Conclusion. Developing integrated PA policies is a bottom-up innovation process that needs to be supported by governmental leaders through better guidance of the organizational processes leading to such policies. Operational-level officials can assist in this by making progress in intersectoral collaboration visible. PMID:27668255
Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis.
Suter, Esther; Oelke, Nelly D; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana
2017-11-13
Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement, and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals, and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, "overall integration" tools may be useful for a broad assessment of the overall state of a system. Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.
Numerical integration of discontinuous functions: moment fitting and smart octree
NASA Astrophysics Data System (ADS)
Hubrich, Simeon; Di Stolfo, Paolo; Kudela, László; Kollmannsberger, Stefan; Rank, Ernst; Schröder, Andreas; Düster, Alexander
2017-11-01
A fast and simple grid generation can be achieved by non-standard discretization methods in which the mesh does not conform to the boundary or the internal interfaces of the problem. However, this simplification leads to discontinuous integrands for intersected elements, and standard quadrature rules therefore no longer perform well. Consequently, special methods are required for the numerical integration. To this end, we present two approaches to obtain quadrature rules for arbitrary domains. The first approach is based on an extension of the moment fitting method combined with an optimization strategy for the position and weights of the quadrature points. In the second approach, we apply the smart octree, which generates curved sub-cells for the integration mesh. To demonstrate the performance of the proposed methods, we consider several numerical examples, showing that the methods lead to efficient quadrature rules with fewer integration points and high accuracy.
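The moment fitting idea can be sketched in a few lines: a fixed set of points in a non-conforming cell receives weights chosen so that the quadrature rule reproduces known monomial moments over the physical domain (here a triangle, whose moments have a closed form). The point layout and basis below are illustrative, not the paper's optimized variant.

```python
import numpy as np
from math import factorial

# Fixed quadrature points: 3x3 Gauss-Legendre tensor grid on the
# fictitious cell [0,1]^2, which does NOT conform to the physical domain.
g = np.array([0.1127016654, 0.5, 0.8872983346])
X, Y = np.meshgrid(g, g)
pts = np.column_stack([X.ravel(), Y.ravel()])

# Physical domain: the triangle x + y <= 1, with exact monomial moments
#   \int x^i y^j dx dy = i! j! / (i + j + 2)!
basis = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]
A = np.array([[x**i * y**j for (x, y) in pts] for (i, j) in basis])
b = np.array([factorial(i) * factorial(j) / factorial(i + j + 2)
              for (i, j) in basis])

# Moment fitting: choose weights so the rule reproduces the moments.
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# The fitted rule integrates any polynomial in the basis span over the
# triangle, even though the points live on the non-conforming square.
approx = w @ (pts[:, 0] + pts[:, 1])   # integral of x + y over the triangle
print(approx)  # ~ 1/3
```

In the paper the moments themselves come from integrating over the intersected cell, and point positions are optimized as well; the linear solve for the weights is the same idea.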
High order Nyström method for elastodynamic scattering
NASA Astrophysics Data System (ADS)
Chen, Kun; Gurrala, Praveen; Song, Jiming; Roberts, Ron
2016-02-01
Elastic waves in solids find important applications in ultrasonic non-destructive evaluation. The scattering of elastic waves has been treated using many approaches, such as the finite element method, the boundary element method, and the Kirchhoff approximation. In this work, we propose a novel, accurate, and efficient high order Nyström method to solve the boundary integral equations for elastodynamic scattering problems. This approach employs a high order geometry description for each element and high order interpolation for the fields inside it. Compared with the boundary element method, this approach chooses the interpolation nodes based on Gaussian quadrature, which renders the matrix elements for far field interactions free from integration and greatly simplifies the treatment of singularities and near singularities. The proposed approach employs a novel, efficient near-singularity treatment that enables the solver to handle extreme geometries such as a very thin penny-shaped crack. Numerical results are presented to validate the approach. By using the frequency domain response and performing the inverse Fourier transform, we also report the time domain response of flaw scattering.
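The core of a Nyström scheme, stripped of the elastodynamic kernel, can be shown on a 1-D Fredholm integral equation of the second kind: unknowns live at the quadrature nodes, so each smooth (far) matrix entry is just a kernel evaluation times a weight, with no per-element integration. The separable toy kernel is an assumption for illustration; in the paper the kernel is the elastodynamic Green's function and singular entries need the special treatment described.

```python
import numpy as np

# Nyström discretization of  u(x) - \int_0^1 K(x,t) u(t) dt = f(x).
n = 8
x, w = np.polynomial.legendre.leggauss(n)
x = 0.5 * (x + 1.0)           # map nodes from [-1, 1] to [0, 1]
w = 0.5 * w

K = np.outer(x, x)            # toy separable kernel K(x, t) = x*t
f = x                         # with f(x) = x the exact solution is u(x) = 1.5*x
A = np.eye(n) - K * w         # (I - K diag(w)) u = f at the nodes
u = np.linalg.solve(A, f)
print(np.max(np.abs(u - 1.5 * x)))  # ~ machine precision
```

Because the unknowns coincide with the Gaussian quadrature nodes, refining the rule raises the order of the whole scheme at once, which is what makes the high order variant attractive.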
2015-06-01
...and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create...
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
...integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based... Domain-specific languages (DSLs) drive both implementation and formal verification.
Mainali, Laxman; Camenisch, Theodore G; Hyde, James S; Subczynski, Witold K
2017-12-01
The presence of integral membrane proteins induces the formation of distinct domains in the lipid bilayer portion of biological membranes. Qualitative application of both continuous wave (CW) and saturation recovery (SR) electron paramagnetic resonance (EPR) spin-labeling methods allowed discrimination of the bulk, boundary, and trapped lipid domains. A recently developed method, which is based on the CW EPR spectra of phospholipid (PL) and cholesterol (Chol) analog spin labels, allows evaluation of the relative amount of PLs (% of total PLs) in the boundary plus trapped lipid domain and the relative amount of Chol (% of total Chol) in the trapped lipid domain [M. Raguz, L. Mainali, W. J. O'Brien, and W. K. Subczynski (2015), Exp. Eye Res., 140:179-186]. Here, a new method is presented that, based on SR EPR spin-labeling, allows quantitative evaluation of the relative amounts of PLs and Chol in the trapped lipid domain of intact membranes. This new method complements the existing one, allowing acquisition of more detailed information about the distribution of lipids between domains in intact membranes. The methodological transition of the SR EPR spin-labeling approach from qualitative to quantitative is demonstrated. The abilities of this method are illustrated for intact cortical and nuclear fiber cell plasma membranes from porcine eye lenses. Statistical analysis (Student's t-test) of the data allowed determination of the separations of mean values above which differences can be treated as statistically significant (P ≤ 0.05) and can be attributed to sources other than preparation/technique.
Kroj, Thomas; Chanclud, Emilie; Michel-Romiti, Corinne; Grand, Xavier; Morel, Jean-Benoit
2016-04-01
Plant immune receptors of the class of nucleotide-binding and leucine-rich repeat domain (NLR) proteins can contain additional domains besides the canonical NB-ARC (nucleotide-binding adaptor shared by APAF-1, R proteins, and CED-4) and leucine-rich repeat (LRR) domains. Recent research suggests that these additional domains act as integrated decoys recognizing effectors from pathogens. Proteins homologous to integrated decoys are suspected to be effector targets and involved in disease or resistance. Here, we scrutinized 31 entire plant genomes to identify putative integrated decoy domains in NLR proteins using an Interpro search. The involvement of the Zinc Finger-BED type (ZBED) protein containing a putative decoy domain, called BED, in rice (Oryza sativa) resistance was investigated by evaluating susceptibility to the blast fungus Magnaporthe oryzae in rice over-expression and knock-out mutants. This analysis showed that all plants tested had integrated various atypical protein domains into their NLR proteins (on average 3.5% of all NLR proteins). We also demonstrated that modifying the expression of the ZBED gene modified disease susceptibility. This study suggests that integration of decoy domains in NLR immune receptors is widespread and frequent in plants. The integrated decoy model is therefore a powerful concept to identify new proteins involved in disease resistance. Further in-depth examination of additional domains in NLR proteins promises to unravel many new proteins of the plant immune system. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
Pernice, W H; Payne, F P; Gallagher, D F
2007-09-03
We present a novel numerical scheme for the simulation of the field enhancement by metal nano-particles in the time domain. The algorithm is based on a combination of the finite-difference time-domain method and the pseudo-spectral time-domain method for dispersive materials. The hybrid solver leads to an efficient subgridding algorithm that does not suffer from spurious field spikes as do FDTD schemes. Simulation of the field enhancement by gold particles shows the expected exponential field profile. The enhancement factors are computed for single particles and particle arrays. Due to the geometry conforming mesh the algorithm is stable for long integration times and thus suitable for the simulation of resonance phenomena in coupled nano-particle structures.
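As context for the hybrid FDTD/PSTD solver described above, the plain 1-D FDTD leapfrog core it builds on can be sketched as follows. Grid size, Courant number, and the soft Gaussian source are arbitrary illustrative choices; the dispersive-material model and the pseudo-spectral subgrid are omitted.

```python
import numpy as np

# Minimal 1-D FDTD (Yee) update in vacuum, normalized units (c = 1, dx = 1).
nx, nt = 200, 150
courant = 0.5                    # dt/dx; must be <= 1 in 1-D for stability
ez = np.zeros(nx)                # electric field on integer grid points
hy = np.zeros(nx - 1)            # magnetic field staggered half a cell

for n in range(nt):
    hy += courant * np.diff(ez)                  # H update (leapfrog half step)
    ez[1:-1] += courant * np.diff(hy)            # E update, PEC at both ends
    ez[50] += np.exp(-((n - 30) / 8.0) ** 2)     # soft Gaussian source

# The run stays bounded for courant <= 1; the pseudo-spectral variant
# replaces np.diff with spectral derivatives on the fine subgrid.
print(np.max(np.abs(ez)))
```

The stability constraint on `courant` is exactly the issue the abstract's subgridding scheme addresses: a fine FDTD subgrid would force a tiny global time step, whereas the spectral treatment avoids the spurious field spikes of naive FDTD subgridding.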
Temporal abstraction and temporal Bayesian networks in clinical domains: a survey.
Orphanou, Kalia; Stassopoulou, Athena; Keravnou, Elpida
2014-03-01
Temporal abstraction (TA) of clinical data aims to abstract and interpret clinical data into meaningful higher-level interval concepts. Abstracted concepts are used for diagnostic, prediction and therapy planning purposes. On the other hand, temporal Bayesian networks (TBNs) are temporal extensions of the known probabilistic graphical models, Bayesian networks. TBNs can represent temporal relationships between events and their state changes, or the evolution of a process, through time. This paper offers a survey on techniques/methods from these two areas that were used independently in many clinical domains (e.g. diabetes, hepatitis, cancer) for various clinical tasks (e.g. diagnosis, prognosis). A main objective of this survey, in addition to presenting the key aspects of TA and TBNs, is to point out important benefits from a potential integration of TA and TBNs in medical domains and tasks. The motivation for integrating these two areas is their complementary function: TA provides clinicians with high level views of data while TBNs serve as a knowledge representation and reasoning tool under uncertainty, which is inherent in all clinical tasks. Key publications from these two areas of relevance to clinical systems, mainly circumscribed to the latest two decades, are reviewed and classified. TA techniques are compared on the basis of: (a) knowledge acquisition and representation for deriving TA concepts and (b) methodology for deriving basic and complex temporal abstractions. TBNs are compared on the basis of: (a) representation of time, (b) knowledge representation and acquisition, (c) inference methods and the computational demands of the network, and (d) their applications in medicine. 
The survey performs an extensive comparative analysis to illustrate the separate merits and limitations of various TA and TBN techniques used in clinical systems with the purpose of anticipating potential gains through an integration of the two techniques, thus leading to a unified methodology for clinical systems. The surveyed contributions are evaluated using frameworks of respective key features. In addition, for the evaluation of TBN methods, a unifying clinical domain (diabetes) is used. The main conclusion transpiring from this review is that techniques/methods from these two areas, that so far are being largely used independently of each other in clinical domains, could be effectively integrated in the context of medical decision-support systems. The anticipated key benefits of the perceived integration are: (a) during problem solving, the reasoning can be directed at different levels of temporal and/or conceptual abstractions since the nodes of the TBNs can be complex entities, temporally and structurally and (b) during model building, knowledge generated in the form of basic and/or complex abstractions, can be deployed in a TBN. Copyright © 2014 Elsevier B.V. All rights reserved.
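The complementary roles surveyed above are easy to picture with a toy temporal abstraction: timestamped numeric readings become symbolic interval concepts, the kind of higher-level node a temporal Bayesian network could then reason over. Thresholds and data below are invented for illustration, not clinical values.

```python
# Minimal state-based temporal abstraction: map readings to symbolic
# states and merge consecutive equal states into maximal intervals.
def abstract_states(readings, low=70, high=180):
    def state(v):
        return "low" if v < low else "high" if v > high else "normal"
    intervals = []
    for t, v in readings:
        s = state(v)
        if intervals and intervals[-1][2] == s:
            intervals[-1] = (intervals[-1][0], t, s)   # extend current interval
        else:
            intervals.append((t, t, s))                # open a new interval
    return intervals

readings = [(0, 90), (1, 95), (2, 200), (3, 210), (4, 100)]
print(abstract_states(readings))
# [(0, 1, 'normal'), (2, 3, 'high'), (4, 4, 'normal')]
```

In the integration envisioned by the survey, such intervals would feed a TBN as evidence on complex temporal nodes rather than as raw point measurements.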
NASA Technical Reports Server (NTRS)
Hu, Fang; Pizzo, Michelle E.; Nark, Douglas M.
2017-01-01
It is well known that, under the assumption of a constant uniform mean flow, the acoustic wave propagation equation can be formulated as a boundary integral equation in both the time domain and the frequency domain. Compared with solving partial differential equations, numerical methods based on the boundary integral equation have the advantage of a reduced spatial dimension and, hence, require only a surface mesh. However, the constant uniform mean flow assumption, while convenient for formulating the integral equation, does not satisfy the solid wall boundary condition wherever the body surface is not aligned with the uniform mean flow. In this paper, we argue that the proper boundary condition for the acoustic wave should not have its normal velocity be zero everywhere on the solid surfaces, as has been applied in the literature. A careful study of the acoustic energy conservation equation is presented that shows such a boundary condition in fact leads to erroneous source or sink points on solid surfaces not aligned with the mean flow. A new solid wall boundary condition is proposed that conserves the acoustic energy, and a new time domain boundary integral equation is derived. In addition to conserving the acoustic energy, another significant advantage of the new equation is that it is considerably simpler than previous formulations. In particular, tangential derivatives of the solution on the solid surfaces are no longer needed in the new formulation, which greatly simplifies numerical implementation. Furthermore, stabilization of the new integral equation by a Burton-Miller type reformulation is presented. The stability of the new formulation is studied theoretically as well as numerically by an eigenvalue analysis. Numerical solutions are also presented that demonstrate the stability of the new formulation.
Improving pairwise comparison of protein sequences with domain co-occurrence
Gascuel, Olivier
2018-01-01
Comparing and aligning protein sequences is an essential task in bioinformatics. More specifically, local alignment tools like BLAST are widely used for identifying conserved protein sub-sequences, which likely correspond to protein domains or functional motifs. However, to limit the number of false positives, these tools are used with stringent sequence-similarity thresholds and hence can miss several hits, especially for species that are phylogenetically distant from reference organisms. A solution to this problem is then to integrate additional contextual information to the procedure. Here, we propose to use domain co-occurrence to increase the sensitivity of pairwise sequence comparisons. Domain co-occurrence is a strong feature of proteins, since most protein domains tend to appear with a limited number of other domains on the same protein. We propose a method to take this information into account in a typical BLAST analysis and to construct new domain families on the basis of these results. We used Plasmodium falciparum as a case study to evaluate our method. The experimental findings showed an increase of 14% of the number of significant BLAST hits and an increase of 25% of the proteome area that can be covered with a domain. Our method identified 2240 new domains for which, in most cases, no model of the Pfam database could be linked. Moreover, our study of the quality of the new domains in terms of alignment and physicochemical properties show that they are close to that of standard Pfam domains. Source code of the proposed approach and supplementary data are available at: https://gite.lirmm.fr/menichelli/pairwise-comparison-with-cooccurrence PMID:29293498
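The co-occurrence idea can be sketched with toy data: a sub-threshold BLAST hit is promoted when the candidate domain frequently co-occurs with a domain already annotated on the protein. Domain names, counts, and the threshold here are hypothetical; the paper's actual scoring is more involved.

```python
from collections import Counter
from itertools import combinations

# Toy protein -> confidently assigned domains (hypothetical annotations).
annotations = {
    "P1": ["kinase", "SH2"],
    "P2": ["kinase", "SH2"],
    "P3": ["kinase", "SH3"],
}

# Count how often each ordered pair of domains appears on the same protein.
cooc = Counter()
for doms in annotations.values():
    for a, b in combinations(sorted(set(doms)), 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

def promote_weak_hit(protein, candidate, min_cooc=2):
    """Accept a sub-threshold hit if the candidate domain co-occurs often
    enough with a domain already present on the protein."""
    return any(cooc[(candidate, d)] >= min_cooc for d in annotations[protein])

print(promote_weak_hit("P3", "SH2"))   # kinase+SH2 seen twice -> True
print(promote_weak_hit("P3", "PDZ"))   # never co-occurs -> False
```

The principle is the one the abstract exploits: because most domains appear with only a few partner domains, the partners already found on a protein are strong contextual evidence for otherwise marginal hits.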
Studies of Coherent Synchrotron Radiation with the Discontinuous Galerkin Method
NASA Astrophysics Data System (ADS)
Bizzozero, David A.
In this thesis, we present methods for integrating Maxwell's equations in Frenet-Serret coordinates in several settings using discontinuous Galerkin (DG) finite element method codes in 1D, 2D, and 3D. We apply these routines to the study of coherent synchrotron radiation, an important topic in accelerator physics. We build upon the published computational work of T. Agoh and D. Zhou in solving Maxwell's equations in the frequency-domain using a paraxial approximation which reduces Maxwell's equations to a Schrodinger-like system. We also evolve Maxwell's equations in the time-domain using a Fourier series decomposition with 2D DG motivated by an experiment performed at the Canadian Light Source. A comparison between theory and experiment has been published (Phys. Rev. Lett. 114, 204801 (2015)). Lastly, we devise a novel approach to integrating Maxwell's equations with 3D DG using a Galilean transformation and demonstrate proof-of-concept. In the above studies, we examine the accuracy, efficiency, and convergence of DG.
Zhu, Bing; Chen, Yizhou; Zhao, Jian
2014-01-01
An integrated chassis control (ICC) system with active front steering (AFS) and yaw stability control (YSC) is introduced in this paper. The proposed ICC algorithm uses the improved Inverse Nyquist Array (INA) method based on a 2-degree-of-freedom (DOF) planar vehicle reference model to decouple the plant dynamics under different frequency bands, and the change of velocity and cornering stiffness were considered to calculate the analytical solution in the precompensator design so that the INA based algorithm runs well and fast on the nonlinear vehicle system. The stability of the system is guaranteed by dynamic compensator together with a proposed PI feedback controller. After the response analysis of the system on frequency domain and time domain, simulations under step steering maneuver were carried out using a 2-DOF vehicle model and a 14-DOF vehicle model by Matlab/Simulink. The results show that the system is decoupled and the vehicle handling and stability performance are significantly improved by the proposed method.
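The diagonal-dominance check at the heart of an INA design can be sketched as follows. The 2x2 transfer matrix below is an illustrative stand-in, not the paper's 2-DOF vehicle model, and Gershgorin row bands are used as the dominance criterion.

```python
import numpy as np

# INA idea: if the inverse transfer-function matrix is diagonally dominant
# at every frequency (each Gershgorin band excludes the origin), the two
# loops (front steering / yaw moment) can be closed nearly independently;
# otherwise a precompensator is designed to restore dominance.
def row_dominant(Ginv):
    d = np.abs(np.diag(Ginv))
    off = np.sum(np.abs(Ginv), axis=1) - d
    return bool(np.all(d > off))

omega = np.logspace(-1, 2, 100)
dominant = []
for wfreq in omega:
    s = 1j * wfreq
    Ginv = np.array([[s + 5.0, 0.5],      # strongly diagonal toy system
                     [0.3, s + 4.0]])
    dominant.append(row_dominant(Ginv))

print(all(dominant))  # True: no decoupling precompensator needed here
```

In the paper this test is applied per frequency band to the planar vehicle model, and the precompensator is recomputed analytically as velocity and cornering stiffness change.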
3-D Forward modeling of Induced Polarization Effects of Transient Electromagnetic Method
NASA Astrophysics Data System (ADS)
Wu, Y.; Ji, Y.; Guan, S.; Li, D.; Wang, A.
2017-12-01
In transient electromagnetic (TEM) detection, Induced polarization (IP) effects are so important that they cannot be ignored. The authors simulate the three-dimensional (3-D) induced polarization effects in time-domain directly by applying the finite-difference time-domain method (FDTD) based on Cole-Cole model. Due to the frequency dispersion characteristics of the electrical conductivity, the computations of convolution in the generalized Ohm's law of fractional order system makes the forward modeling particularly complicated. Firstly, we propose a method to approximate the fractional order function of Cole-Cole model using a lower order rational transfer function based on error minimum theory in the frequency domain. In this section, two auxiliary variables are introduced to transform nonlinear least square fitting problem of the fractional order system into a linear programming problem, thus avoiding having to solve a system of equations and nonlinear problems. Secondly, the time-domain expression of Cole-Cole model is obtained by using Inverse Laplace transform. Then, for the calculation of Ohm's law, we propose an e-index auxiliary equation of conductivity to transform the convolution to non-convolution integral; in this section, the trapezoid rule is applied to compute the integral. We then substitute the recursion equation into Maxwell's equations to derive the iterative equations of electromagnetic field using the FDTD method. Finally, we finish the stimulation of 3-D model and evaluate polarization parameters. The results are compared with those obtained from the digital filtering solution of the analytical equation in the homogeneous half space, as well as with the 3-D model results from the auxiliary ordinary differential equation method (ADE). Good agreements are obtained across the three methods. 
For the 3-D model, the proposed method offers higher efficiency and lower memory requirements: execution time and memory usage were reduced by 20% compared with the ADE method.
Chen, Chuan; Hendriks, Gijs A G M; van Sloun, Ruud J G; Hansen, Hendrik H G; de Korte, Chris L
2018-05-01
In this paper, a novel processing framework is introduced for Fourier-domain beamforming of plane-wave ultrasound data, which incorporates coherent compounding and angular weighting in the Fourier domain. Angular weighting implies spectral weighting by a 2-D steering-angle-dependent filtering template. The design of this filter is also optimized as part of this paper. Two widely used Fourier-domain plane-wave ultrasound beamforming methods, i.e., Lu's f-k and Stolt's f-k methods, were integrated in the framework. To enable coherent compounding in the Fourier domain for Stolt's f-k method, the original Stolt's f-k method was modified to align the spectra for different steering angles in k-space. The performance of the framework was compared for both methods with and without angular weighting using experimentally obtained data sets (phantom and in vivo) and data sets (phantom) provided by the IEEE IUS 2016 plane-wave beamforming challenge. The addition of angular weighting enhanced the image contrast while preserving image resolution. This resulted in images of quality equal to that of images obtained by conventionally used delay-and-sum (DAS) beamforming with apodization and coherent compounding. Given the lower computational load of the proposed framework compared with DAS, it can therefore be concluded that, to our knowledge, it outperforms commonly used beamforming methods such as Stolt's f-k, Lu's f-k, and DAS.
NASA Astrophysics Data System (ADS)
Čuma, Martin; Gribenko, Alexander; Zhdanov, Michael S.
2017-09-01
We have developed a multi-level parallel magnetotelluric (MT) integral-equation-based inversion program that uses a variable sensitivity domain. The limited sensitivity of the data, which decreases with increasing frequency, is exploited through a receiver sensitivity domain that also varies with frequency. We assess the effect of inverting principal impedances, the full impedance tensor, and the full tensor jointly with magnetovariational data (tipper). We first apply this method to several models and then invert the EarthScope MT data. We recover well the prominent features in the area, including the resistive structure associated with the Juan de Fuca slab subducting beneath the northwestern United States, the conductive zone of partially melted material above the subducting slab at the Cascade volcanic arc, conductive features in the Great Basin and in the area of Yellowstone associated with the hot spot, and resistive areas to the east corresponding to the older and more stable cratons.
An Empirical Human Controller Model for Preview Tracking Tasks.
van der El, Kasper; Pool, Daan M; Damveld, Herman J; van Paassen, Marinus Rene M; Mulder, Max
2016-11-01
Real-life tracking tasks often show preview information to the human controller about the future track to follow. The effect of preview on manual control behavior is still relatively unknown. This paper proposes a generic operator model for preview tracking, empirically derived from experimental measurements. Conditions included pursuit tracking, i.e., without preview information, and tracking with 1 s of preview. Controlled element dynamics varied between gain, single integrator, and double integrator. The model is derived in the frequency domain, after application of a black-box system identification method based on Fourier coefficients. Parameter estimates are obtained to assess the validity of the model in both the time domain and frequency domain. Measured behavior in all evaluated conditions can be captured with the commonly used quasi-linear operator model for compensatory tracking, extended with two viewpoints of the previewed target. The derived model provides new insights into how human operators use preview information in tracking tasks.
Regional climate model sensitivity to domain size
NASA Astrophysics Data System (ADS)
Leduc, Martin; Laprise, René
2009-05-01
Regional climate models are increasingly used to add small-scale features that are not present in their lateral boundary conditions (LBC). It is well known that the limited area over which a model is integrated must be large enough to allow the full development of small-scale features. On the other hand, integrations on very large domains have shown important departures from the driving data, unless large-scale nudging is applied. The issue of domain size is studied here by using the “perfect model” approach. This method consists first of generating a high-resolution climatic simulation, nicknamed big brother (BB), over a large domain of integration. The next step is to degrade this dataset with a low-pass filter emulating the usual coarse-resolution LBC. The filtered nesting data (FBB) are then used to drive a set of four simulations (LBs, for Little Brothers) with the same model, but on progressively smaller domain sizes. The LB statistics for a climate sample of four winter months are compared with BB over a common region. The time-average (stationary) and transient-eddy standard deviation patterns of the LB atmospheric fields generally improve in terms of spatial correlation with the reference (BB) as the domain gets smaller. The extraction of the small-scale features with a spectral filter reveals important underestimations of the transient-eddy variability in the vicinity of the inflow boundary, which can penalize the use of small domains (less than 100 × 100 grid points). The permanent “spatial spin-up” corresponds to the characteristic distance that the large-scale flow needs to travel before developing small-scale features. The spin-up distance tends to grow at higher levels in the atmosphere.
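The degradation of the big-brother fields into coarse-resolution nesting data is, at its core, a low-pass spectral filter. A minimal sketch of such an operation (a generic sharp FFT cutoff on a doubly periodic field; the filter used in actual big-brother experiments is more carefully designed) could look like:

```python
import numpy as np

def lowpass(field, k_cut):
    """Zero out all wavenumbers above k_cut (in grid units) via 2-D FFT."""
    ny, nx = field.shape
    ky = np.fft.fftfreq(ny) * ny
    kx = np.fft.fftfreq(nx) * nx
    K = np.sqrt(ky[:, None] ** 2 + kx[None, :] ** 2)
    spec = np.fft.fft2(field)
    spec[K > k_cut] = 0.0
    return np.real(np.fft.ifft2(spec))

# A field with one large-scale and one small-scale component
n = 64
x = np.arange(n) / n
X, Y = np.meshgrid(x, x)
large = np.sin(2 * np.pi * 2 * X)    # wavenumber 2: survives the filter
small = np.sin(2 * np.pi * 20 * Y)   # wavenumber 20: removed by the filter
filtered = lowpass(large + small, k_cut=8.0)
```

Here the wavenumber-20 component is removed while the wavenumber-2 component passes unchanged, emulating how the coarse-resolution LBC retains only the large scales.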
Evaluating Feynman integrals by the hypergeometry
NASA Astrophysics Data System (ADS)
Feng, Tai-Fu; Chang, Chao-Hsi; Chen, Jian-Bin; Gu, Zhi-Hua; Zhang, Hai-Bin
2018-02-01
The hypergeometric function method naturally provides analytic expressions for the scalar integrals of the Feynman diagrams concerned in certain connected regions of the independent kinematic variables, and also yields the systems of homogeneous linear partial differential equations satisfied by the corresponding scalar integrals. Taking as examples the one-loop B0 and massless C0 functions, as well as the scalar integrals of the two-loop vacuum and sunset diagrams, we verify that our expressions coincide with well-known results in the literature. Based on the multiple hypergeometric functions of the independent kinematic variables, the systems of homogeneous linear partial differential equations satisfied by these scalar integrals are established. Using the calculus of variations, one recognizes the system of linear partial differential equations as the stationary conditions of a functional under given restrictions, which is the cornerstone for continuing the scalar integrals to the whole kinematic domain numerically with finite element methods. In principle this method can be used to evaluate the scalar integrals of any Feynman diagram.
Poot, Antonius J.; de Waard, Claudia S.; Wind, Annet W.; Caljouw, Monique A. A.; Gussekloo, Jacobijn
2017-01-01
Evaluation of the implementation of integrated care can differ from trial-based research due to complexity. Therefore, we examined whether a theory-based method for process description of implementation can contribute to improvement of evidence-based care. MOVIT, a Dutch project aimed at implementing integrated care for older vulnerable persons in residential care homes, was used as a case study. The project activities were defined according to implementation taxonomy and mapped in a matrix of theoretical levels and domains. Project activities mainly targeted professionals (both individual and group). A few activities targeted the organizational level, whereas none targeted the policy level, or the patient, or the “social, political, and legal” domains. However, the resulting changes in care delivery arrangement had consequences for professionals, patients, organizations, and the social, political, and legal domains. A structured process description of a pragmatic implementation project can help assess the fidelity and quality of the implementation, and identify relevant contextual factors for immediate adaptation and future research. The description showed that, in the MOVIT project, there was a discrepancy between the levels and domains targeted by the implementation activities and those influenced by the resulting changes in delivery arrangement. This could have influenced, in particular, the adoption and sustainability of the project. PMID:29161944
Poot, Antonius J; de Waard, Claudia S; Wind, Annet W; Caljouw, Monique A A; Gussekloo, Jacobijn
2017-01-01
Evaluation of the implementation of integrated care can differ from trial-based research due to complexity. Therefore, we examined whether a theory-based method for process description of implementation can contribute to improvement of evidence-based care. MOVIT, a Dutch project aimed at implementing integrated care for older vulnerable persons in residential care homes, was used as a case study. The project activities were defined according to implementation taxonomy and mapped in a matrix of theoretical levels and domains. Project activities mainly targeted professionals (both individual and group). A few activities targeted the organizational level, whereas none targeted the policy level, or the patient, or the "social, political, and legal" domains. However, the resulting changes in care delivery arrangement had consequences for professionals, patients, organizations, and the social, political, and legal domains. A structured process description of a pragmatic implementation project can help assess the fidelity and quality of the implementation, and identify relevant contextual factors for immediate adaptation and future research. The description showed that, in the MOVIT project, there was a discrepancy between the levels and domains targeted by the implementation activities and those influenced by the resulting changes in delivery arrangement. This could have influenced, in particular, the adoption and sustainability of the project.
NASA Astrophysics Data System (ADS)
Perepelkin, Eugene; Tarelkin, Aleksandr
2018-02-01
A magnetostatics problem arises when searching for the distribution of the magnetic field generated by the magnet systems of many physics research facilities, e.g., accelerators. The domain in which the boundary-value problem is solved often has a piecewise smooth boundary. In this case, numerical calculation of the problem requires consideration of the solution behavior in the corner domain. In this work we obtain an upper estimate of the magnetic field growth using the integral formulation of the magnetostatic problem and, based on this estimate, propose a method for condensing the difference mesh near the corner domain of the vacuum in three-dimensional space.
Solution of steady and unsteady transonic-vortex flows using Euler and full-potential equations
NASA Technical Reports Server (NTRS)
Kandil, Osama A.; Chuang, Andrew H.; Hu, Hong
1989-01-01
Two methods are presented for inviscid transonic flows: unsteady Euler equations in a rotating frame of reference for transonic-vortex flows and integral solution of full-potential equation with and without embedded Euler domains for transonic airfoil flows. The computational results covered: steady and unsteady conical vortex flows; 3-D steady transonic vortex flow; and transonic airfoil flows. The results are in good agreement with other computational results and experimental data. The rotating frame of reference solution is potentially efficient as compared with the space fixed reference formulation with dynamic gridding. The integral equation solution with embedded Euler domain is computationally efficient and as accurate as the Euler equations.
Microtechnology management considering test and cost aspects for stacked 3D ICs with MEMS
NASA Astrophysics Data System (ADS)
Hahn, K.; Wahl, M.; Busch, R.; Grünewald, A.; Brück, R.
2018-01-01
Innovative automotive systems require complex semiconductor devices currently only available in consumer-grade quality. The European project TRACE will develop and demonstrate methods, processes, and tools that facilitate the use of Consumer Electronics (CE) components so they can be deployed more rapidly in the life-critical automotive domain. Consumer electronics increasingly use heterogeneous system integration methods and "More than Moore" technologies, which can combine different circuit domains (analog, digital, RF, MEMS) and which are integrated within SiP or 3D stacks. Making these technologies, or at least some of their process steps, available under automotive electronics requirements is an important goal in keeping pace with the growing demand for information processing within cars. The approach presented in this paper aims at a technology management and recommendation system that covers technology data, functional and non-functional constraints, and application scenarios, and that will include test planning and cost consideration capabilities.
Indirect boundary force measurements in beam-like structures using a derivative estimator
NASA Astrophysics Data System (ADS)
Chesne, Simon
2014-12-01
This paper proposes a new method for the identification of boundary forces (shear force or bending moment) in a beam, based on displacement measurements. The problem is considered in terms of the determination of the boundary spatial derivatives of transverse displacements. By assuming the displacement fields to be approximated by Taylor expansions in a domain close to the boundaries, the spatial derivatives can be estimated using specific point-wise derivative estimators. This approach makes it possible to extract the derivatives using a weighted spatial integration of the displacement field. Following the theoretical description, numerical simulations made with exact and noisy data are used to determine the relationship between the size of the integration domain and the wavelength of the vibrations. The simulations also highlight the self-regularization of the technique. Experimental measurements demonstrate the feasibility and accuracy of the proposed method.
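The core idea, recovering boundary derivatives from displacement samples in a small domain near the boundary, can be sketched with a plain least-squares Taylor fit (an illustrative stand-in: the paper's specific weighted point-wise estimators are not reproduced, and the sine displacement field is invented):

```python
import math
import numpy as np

def boundary_derivatives(x, w, order=3):
    """Estimate spatial derivatives of w at the boundary x = 0 by fitting
    a Taylor polynomial w(x) ~ sum_k c_k x^k / k! to displacement samples."""
    V = np.column_stack([x ** k / math.factorial(k) for k in range(order + 1)])
    c, *_ = np.linalg.lstsq(V, w, rcond=None)
    return c          # c[k] approximates the k-th derivative at x = 0

x = np.linspace(0.0, 0.2, 50)   # measurement points near the boundary
w = np.sin(2.0 * x)             # synthetic displacement field
d = boundary_derivatives(x, w)
# exact boundary values: w(0) = 0, w'(0) = 2, w''(0) = 0, w'''(0) = -8
```

For an Euler-Bernoulli beam, the bending moment and shear force are proportional to the second and third spatial derivatives of the transverse displacement, which is why estimating derivatives like d[2] and d[3] at the boundary yields the boundary forces.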
A flexible importance sampling method for integrating subgrid processes
Raut, E. K.; Larson, V. E.
2016-01-29
Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
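The category-weighted idea can be reduced to a one-dimensional toy (not SILHS itself; the "rain" fraction, the process rate, and the 50/50 sampling split are invented for illustration): draw a prescribed share of the sample points from each category and reweight by the resulting sampling density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "categories" of the grid box: the raining fraction [0, 0.1) and the
# remainder [0.1, 1). The process rate f is nonzero only where it rains.
def f(x):
    return np.where(x < 0.1, 100.0 * x, 0.0)

# exact grid-box average: integral of 100*x over [0, 0.1] = 0.5

n = 4000
frac_rain = 0.5                       # modeler-prescribed sampling fraction
n_rain = int(n * frac_rain)
x_rain = rng.uniform(0.0, 0.1, n_rain)
x_clear = rng.uniform(0.1, 1.0, n - n_rain)
x = np.concatenate([x_rain, x_clear])

# sampling density q(x): piecewise constant over the two categories
q = np.where(x < 0.1, frac_rain / 0.1, (1.0 - frac_rain) / 0.9)
estimate = np.mean(f(x) / q)          # importance-sampling estimator
```

Because half of the points land in the rain category, which occupies only 10% of the grid box but contains all of the process rate, the variance is much lower than uniform sampling at the same cost; the division by q keeps the estimator unbiased.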
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itagaki, Masafumi; Miyoshi, Yoshinori; Hirose, Hideyuki
A procedure is presented for the determination of geometric buckling for regular polygons. A new computation technique, the multiple reciprocity boundary element method (MRBEM), has been applied to solve the one-group neutron diffusion equation. The main difficulty in applying the ordinary boundary element method (BEM) to neutron diffusion problems has been the need to compute a domain integral resulting from the fission source. The MRBEM has been developed for transforming this type of domain integral into an equivalent boundary integral. The basic idea of the MRBEM is to apply the reciprocity theorem (Green's second formula) repeatedly, using a sequence of higher order fundamental solutions. The MRBEM requires discretization of the boundary only, rather than of the domain. This advantage is useful for extensive survey analyses of buckling for complex geometries. The results of the survey analyses indicate that the general form of geometric buckling is B_g^2 = (a_n/R_c)^2, where R_c represents the radius of the circumscribed circle of the regular polygon under consideration. The geometric constant a_n depends on the type of regular polygon and takes the value of π for a square and 2.405 for a circle, the extreme case with an infinite number of sides. Values of a_n for a triangle, pentagon, hexagon, and octagon have been calculated as 4.190, 2.821, 2.675, and 2.547, respectively.
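The two limiting values of a_n quoted above can be checked directly (a quick consistency sketch, not taken from the paper): for a square of side a the textbook buckling is B^2 = (pi/a)^2 + (pi/a)^2 and the circumscribed radius is a/sqrt(2), while for a circle B = j0/R with j0 ≈ 2.405 the first zero of the Bessel function J0.

```python
import math

# Square of side a: B^2 = (pi/a)^2 + (pi/a)^2, circumscribed radius a/sqrt(2)
a = 1.0
B_square = math.sqrt(2.0) * math.pi / a
R_c = a / math.sqrt(2.0)
a_n_square = B_square * R_c            # recovers pi

# Circle of radius R: B = j0 / R and R_c = R, so a_n is the Bessel zero j0
j0 = 2.404826                          # first zero of J0
R = 1.0
a_n_circle = (j0 / R) * R              # recovers 2.405
```

Both cases reduce exactly to the tabulated constants, consistent with the reported general form B_g = a_n / R_c.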
Acoustic 3D modeling by the method of integral equations
NASA Astrophysics Data System (ADS)
Malovichko, M.; Khokhlov, N.; Yavich, N.; Zhdanov, M.
2018-02-01
This paper presents a parallel algorithm for frequency-domain acoustic modeling by the method of integral equations (IE). The algorithm is applied to seismic simulation. The IE method reduces the size of the problem but leads to a dense system matrix. A tolerable memory consumption and numerical complexity were achieved by applying an iterative solver, accompanied by an effective matrix-vector multiplication operation based on the fast Fourier transform (FFT). We demonstrate that the IE system matrix is better conditioned than that of the finite-difference (FD) method, and discuss its relation to a specially preconditioned FD matrix. We considered several methods of matrix-vector multiplication for the free-space and layered host models. The developed algorithm and computer code were benchmarked against the FD time-domain solution. It was demonstrated that the method can accurately calculate the seismic field for models with sharp material boundaries and a point source and receiver located close to the free surface. We used OpenMP to speed up the matrix-vector multiplication, while MPI was used to speed up the solution of the system equations and to parallelize across multiple sources. Practical examples and efficiency tests are presented as well.
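The workhorse of such IE solvers, a fast matrix-vector product for a convolution-type system matrix, can be illustrated in one dimension (a generic Toeplitz example with an invented kernel, not the paper's acoustic integral-equation matrix): embed the Toeplitz matrix in a circulant one and multiply via the FFT in O(n log n).

```python
import numpy as np

def toeplitz_matvec(col, row, x):
    """Multiply the Toeplitz matrix defined by its first column and first
    row with x, via circulant embedding of size 2n and the FFT."""
    n = len(x)
    c = np.concatenate([col, [0.0], row[:0:-1]])   # circulant first column
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x, len(c)))
    return y[:n].real

n = 64
col = 1.0 / (1.0 + np.arange(n))          # invented kernel: first column
row = 1.0 / (1.0 + 2.0 * np.arange(n))    # first row (row[0] == col[0])
x = np.random.default_rng(1).standard_normal(n)

# dense reference for comparison
T = np.empty((n, n))
for i in range(n):
    for j in range(n):
        T[i, j] = col[i - j] if i >= j else row[j - i]
y_fft = toeplitz_matvec(col, row, x)
y_dense = T @ x
```

In 3-D the same trick applies block-wise (block-Toeplitz-with-Toeplitz-blocks matrices), which is what keeps the iterative solver's per-iteration cost and memory footprint tractable.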
Recent developments in learning control and system identification for robots and structures
NASA Technical Reports Server (NTRS)
Phan, M.; Juang, J.-N.; Longman, R. W.
1990-01-01
This paper reviews recent results in learning control and learning system identification, with particular emphasis on discrete-time formulations and their relation to adaptive theory. Related continuous-time results are also discussed. Among the topics presented are proportional, derivative, and integral learning controllers, and the time-domain formulation of discrete learning algorithms. Newly developed techniques are described, including the concept of the repetition domain, the repetition-domain formulation of learning control by linear feedback, model reference learning control, and indirect learning control with parameter estimation, as well as related basic concepts and recursive and non-recursive methods for learning identification.
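The repetition-domain viewpoint, updating the command from one execution of the task to the next, can be sketched with a generic P-type learning law (an invented example, not one of the paper's specific algorithms; the plant, gain, and trajectory are illustrative): u_{k+1}(t) = u_k(t) + gamma * e_k(t+1).

```python
import numpy as np

# First-order discrete plant y(t+1) = a*y(t) + b*u(t), executed repeatedly
a, b, gamma = 0.3, 1.0, 1.0
T = 50
t_axis = np.linspace(0.0, np.pi, T + 1)
y_ref = np.sin(t_axis)                 # desired output trajectory

u = np.zeros(T)
for trial in range(30):                # repetition-domain iteration
    y = np.zeros(T + 1)
    for t in range(T):
        y[t + 1] = a * y[t] + b * u[t]
    e = y_ref - y                      # tracking error of this trial
    u = u + gamma * e[1:]              # P-type learning update
final_err = np.max(np.abs(e[1:]))
```

With gamma*b = 1 the lifted trial-to-trial error operator is strictly lower triangular (nilpotent), so the tracking error contracts to numerical zero over repeated trials; more general convergence conditions are the subject of the theory the paper reviews.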
A boundary element method for steady incompressible thermoviscous flow
NASA Technical Reports Server (NTRS)
Dargush, G. F.; Banerjee, P. K.
1991-01-01
A boundary element formulation is presented for moderate Reynolds number, steady, incompressible, thermoviscous flows. The governing integral equations are written exclusively in terms of velocities and temperatures, thus eliminating the need for the computation of any gradients. Furthermore, with the introduction of reference velocities and temperatures, volume modeling can often be confined to only a small portion of the problem domain, typically near obstacles or walls. The numerical implementation includes higher order elements, adaptive integration and multiregion capability. Both the integral formulation and implementation are discussed in detail. Several examples illustrate the high level of accuracy that is obtainable with the current method.
System and method for measuring fluorescence of a sample
Riot, Vincent J
2015-03-24
The present disclosure provides a system and a method for measuring the fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains
NASA Astrophysics Data System (ADS)
Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao
2017-11-01
A new GT-based encryption scheme is proposed in this paper: a novel optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in gyrator transform (GT) domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. The QR code is then encrypted into the cipher text using multilevel fingerprint keys. The original image can be recovered easily by reading the high-quality retrieved QR code with a hand-held device. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are thus integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The approach of applying QR codes and fingerprints in GT domains holds much potential for future information security applications.
Analytic Method for Computing Instrument Pointing Jitter
NASA Technical Reports Server (NTRS)
Bayard, David
2003-01-01
A new method of calculating the root-mean-square (rms) pointing jitter of a scientific instrument (e.g., a camera, radar antenna, or telescope) is introduced based on a state-space concept. In comparison with the prior method of calculating the rms pointing jitter, the present method involves significantly less computation. The rms pointing jitter of an instrument (the square root of the jitter variance shown in the figure) is an important physical quantity which impacts the design of the instrument, its actuators, controls, sensory components, and sensor- output-sampling circuitry. Using the Sirlin, San Martin, and Lucke definition of pointing jitter, the prior method of computing the rms pointing jitter involves a frequency-domain integral of a rational polynomial multiplied by a transcendental weighting function, necessitating the use of numerical-integration techniques. In practice, numerical integration complicates the problem of calculating the rms pointing error. In contrast, the state-space method provides exact analytic expressions that can be evaluated without numerical integration.
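The state-space idea can be sketched for a plain white-noise-driven system (a hedged illustration: this uses the standard Lyapunov-equation variance computation, not the Sirlin, San Martin, and Lucke jitter weighting itself): the steady-state covariance P of x' = Ax + Bw solves A P + P A^T + B B^T = 0, and the output variance C P C^T follows without any frequency-domain integration.

```python
import numpy as np

# Damped oscillator x'' + 2*zeta*w0*x' + w0^2*x = w(t), unit white noise
w0, zeta = 1.0, 0.5
A = np.array([[0.0, 1.0],
              [-w0**2, -2.0 * zeta * w0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])             # observe position

# Solve A P + P A^T + B B^T = 0 via Kronecker vectorization
n = A.shape[0]
I = np.eye(n)
M = np.kron(A, I) + np.kron(I, A)
P = np.linalg.solve(M, -(B @ B.T).reshape(-1)).reshape(n, n)

rms = float(np.sqrt(C @ P @ C.T))      # analytic value: sqrt(1/(4*zeta*w0**3))
```

For this oscillator the closed form of the position variance is 1/(4*zeta*w0^3), which the Lyapunov solution reproduces exactly, illustrating how an exact algebraic evaluation can replace a numerical frequency-domain integral.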
A hybrid method for transient wave propagation in a multilayered solid
NASA Astrophysics Data System (ADS)
Tian, Jiayong; Xie, Zhoumin
2009-08-01
We present a hybrid method for the evaluation of transient elastic-wave propagation in a multilayered solid, integrating the reverberation matrix method with the theory of generalized rays. Adopting the reverberation matrix formulation, Laplace-Fourier domain solutions of elastic waves in the multilayered solid are expanded into the sum of a series of generalized-ray group integrals. Each generalized-ray group integral, containing the Kth power of the reverberation matrix R, represents the set of K-times reflections and refractions of source waves arriving at receivers in the multilayered solid, and has conventionally been computed by fast inverse Laplace transform (FILT) and fast Fourier transform (FFT) algorithms. However, the computational burden and low precision of the FILT-FFT algorithm limit the application of the reverberation matrix method. In this paper, we expand each generalized-ray group integral into the sum of a series of generalized-ray integrals, each of which is accurately evaluated by the Cagniard-de Hoop method from the theory of generalized rays. The numerical examples demonstrate that the proposed method makes it possible to calculate the early-time transient response of complex multilayered-solid configurations efficiently.
NASA Astrophysics Data System (ADS)
Brodaric, B.; Probst, F.
2007-12-01
Ontologies are being developed bottom-up in many geoscience domains to aid semantic-enabled computing. The contents of these ontologies are typically partitioned along domain boundaries, such as geology, geophysics, and hydrology, or are developed for specific data sets or processing needs. At the same time, very general foundational ontologies are being independently developed top-down to help facilitate the integration of knowledge across such domains, and to provide homogeneity to the organization of knowledge within the domains. In this work we investigate the suitability of integrating the DOLCE foundational ontology with concepts from two prominent geoscience knowledge representations, GeoSciML and SWEET, to investigate the alignment of the concepts found within the foundational and domain representations. The geoscience concepts are partially mapped to each other and to those in the foundational ontology, via the subclass and other relations, resulting in an integrated OWL-based ontology called DOLCE ROCKS. These preliminary results demonstrate variable alignment between the foundational and domain concepts, and also between the domain concepts. Further work is required to ascertain the impact of this integrated ontology approach on broader geoscience ontology design, on the unification of domain ontologies, and on their use within semantic-enabled geoscience applications.
Harton, Brenda B; Borrelli, Larry; Knupp, Ann; Rogers, Necolen; West, Vickie R
2009-01-01
Traditional nursing service orientation classes at an acute care hospital were integrated with orientation to the electronic medical record, blending the two components in a user-friendly format. The learner is introduced to the culture, processes, and documentation methods of the organization, with an opportunity to document online in a practice domain while lecture and discussion information is fresh.
Error analysis of multipoint flux domain decomposition methods for evolutionary diffusion problems
NASA Astrophysics Data System (ADS)
Arrarás, A.; Portero, L.; Yotov, I.
2014-01-01
We study space and time discretizations for mixed formulations of parabolic problems. The spatial approximation is based on the multipoint flux mixed finite element method, which reduces to an efficient cell-centered pressure system on general grids, including triangles, quadrilaterals, tetrahedra, and hexahedra. The time integration is performed by using a domain decomposition time-splitting technique combined with multiterm fractional step diagonally implicit Runge-Kutta methods. The resulting scheme is unconditionally stable and computationally efficient, as it reduces the global system to a collection of uncoupled subdomain problems that can be solved in parallel without the need for Schwarz-type iteration. Convergence analysis for both the semidiscrete and fully discrete schemes is presented.
Attentional selection in visual perception, memory and action: a quest for cross-domain integration
Schneider, Werner X.; Einhäuser, Wolfgang; Horstmann, Gernot
2013-01-01
For decades, the cognitive and neural sciences have benefitted greatly from a separation of mind and brain into distinct functional domains. The tremendous success of this approach notwithstanding, it is self-evident that such a view is incomplete. Goal-directed behaviour of an organism requires the joint functioning of perception, memory and sensorimotor control. Prime candidates for achieving integration across these functional domains are attentional processes. Consequently, this Theme Issue brings together studies of attentional selection from many fields, both experimental and theoretical, that are united in their quest to find overarching integrative principles of attention between perception, memory and action. In all domains, attention is understood as a combination of competition and priority control (‘bias’), with the task as a decisive driving factor to ensure coherent goal-directed behaviour and cognition. Using vision as the predominant model system for attentional selection, many studies of this Theme Issue place special emphasis on eye movements as a selection process that is both a fundamental action and serves a key function in perception. The Theme Issue spans a wide range of methods, from measuring human behaviour in the real world to recordings of single neurons in the non-human primate brain. We firmly believe that combining such a breadth of approaches is necessary not only for attentional selection, but also to take the next decisive step in all of the cognitive and neural sciences: to understand cognition and behaviour beyond isolated domains. PMID:24018715
NASA Astrophysics Data System (ADS)
Yamada, Hiroshi; Kawaguchi, Akira
Grid computing and web service technologies enable us to use networked resources in a coordinated manner. An integrated service is made of individual services running on coordinated resources. In order to achieve such coordinated services autonomously, the initiator of a coordinated service needs to know detailed service resource information. This information ranges from static attributes like the IP address of the application server to highly dynamic ones like the CPU load. The most famous wide-area service discovery mechanism based on names is DNS. Its hierarchical tree organization and caching methods take advantage of the static information managed. However, in order to integrate business applications in a virtual enterprise, we need a discovery mechanism to search for the optimal resources based on a given set of criteria (search keys). In this paper, we propose a communication protocol for exchanging service resource information among wide-area systems. We introduce the concept of the service domain, which consists of service providers managed under the same management policy. This concept of the service domain is similar to that of autonomous systems (ASs). In each service domain, the service resource information provider manages the service resource information of the service providers that exist in this service domain. The service resource information provider exchanges this information with other service resource information providers that belong to different service domains. We also verified the protocol's behavior and effectiveness using a simulation model developed for the proposed protocol.
Computer analysis of multicircuit shells of revolution by the field method
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1975-01-01
The field method, presented previously for the solution of even-order linear boundary value problems defined on one-dimensional open branch domains, is extended to boundary value problems defined on one-dimensional domains containing circuits. This method converts the boundary value problem into two successive numerically stable initial value problems, which may be solved by standard forward integration techniques. In addition, a new method for the treatment of singular boundary conditions is presented. This method, which amounts to a partial interchange of the roles of force and displacement variables, is problem independent with respect to both accuracy and speed of execution. This method was implemented in a computer program to calculate the static response of ring stiffened orthotropic multicircuit shells of revolution to asymmetric loads. Solutions are presented for sample problems which illustrate the accuracy and efficiency of the method.
Broadband impedance boundary conditions for the simulation of sound propagation in the time domain.
Bin, Jonghoon; Yousuff Hussaini, M; Lee, Soogab
2009-02-01
An accurate and practical surface impedance boundary condition in the time domain has been developed for application to broadband-frequency simulation in aeroacoustic problems. To show the capability of this method, two kinds of numerical simulations are performed and compared with the analytical/experimental results: one is acoustic wave reflection by a monopole source over an impedance surface and the other is acoustic wave propagation in a duct with a finite impedance wall. Both single-frequency and broadband-frequency simulations are performed within the framework of linearized Euler equations. A high-order dispersion-relation-preserving finite-difference method and a low-dissipation, low-dispersion Runge-Kutta method are used for spatial discretization and time integration, respectively. The results show excellent agreement with the analytical/experimental results at various frequencies. The method accurately predicts both the amplitude and the phase of acoustic pressure and ensures the well-posedness of the broadband time-domain impedance boundary condition.
The Golden Age of Software Architecture: A Comprehensive Survey
2006-02-01
UML [14], under the leadership of (at the time) Rational, integrated a number of design notations and developed a method for applying them. [The remainder of this record is search-snippet residue citing: Garlan, "Research directions in SA" [28]; Cremer et al., the SA for scenario control in the Iowa... environment; and the Domain-Specific Software Architecture Project report ADAGE-IBM-92-11, Version 2.0, November 1993 [23] (J. Cremer, J. Kearney, Y. Papelis, and others).]
2006-06-01
Horizontal Fusion, the JCDX team developed two web services: a Classification Policy Decision Service (cPDS) and a Federated Search Provider (FSP)... The cPDS web service primarily provides other systems with methods for handling labeled data, such as label comparison. The federated search provider... level domains. To provide defense-in-depth, cPDS and the Federated Search Provider are implemented on a separate server known as the JCDX Web
Bailey, Paul C; Schudoma, Christian; Jackson, William; Baggs, Erin; Dagdas, Gulay; Haerty, Wilfried; Moscou, Matthew; Krasileva, Ksenia V
2018-02-19
The plant immune system is innate and encoded in the germline. Using it efficiently, plants are capable of recognizing a diverse range of rapidly evolving pathogens. A recently described phenomenon shows that plant immune receptors are able to recognize pathogen effectors through the acquisition of exogenous protein domains from other plant genes. We show that plant immune receptors with integrated domains are distributed unevenly across their phylogeny in grasses. Using phylogenetic analysis, we uncover a major integration clade, whose members underwent repeated independent integration events producing diverse fusions. This clade is ancestral in grasses, with members often found on syntenic chromosomes. Analyses of these fusion events reveal that homologous receptors can be fused to diverse domains. Furthermore, we discover a 43-amino-acid motif associated with this dominant integration clade, located immediately upstream of the fusion site. Sequence analysis reveals that DNA transposition and/or ectopic recombination are the most likely mechanisms of formation for nucleotide binding leucine rich repeat proteins with integrated domains. The identification of this subclass of plant immune receptors that is naturally adapted to new domain integration will inform biotechnological approaches for generating synthetic receptors with novel pathogen "baits."
Lu, Cheng-Tsung; Huang, Kai-Yao; Su, Min-Gang; Lee, Tzong-Yi; Bretaña, Neil Arvin; Chang, Wen-Chi; Chen, Yi-Ju; Chen, Yu-Ju; Huang, Hsien-Da
2013-01-01
Protein modification is an extremely important post-translational regulation that adjusts the physical and chemical properties, conformation, stability, and activity of a protein, thus altering protein function. Due to the high throughput of mass spectrometry (MS)-based methods in identifying site-specific post-translational modifications (PTMs), dbPTM (http://dbPTM.mbc.nctu.edu.tw/) is updated to integrate experimental PTMs obtained from public resources as well as manually curated MS/MS peptides associated with PTMs from research articles. Version 3.0 of dbPTM aims to be an informative resource for investigating the substrate specificity of PTM sites and the functional association of PTMs between substrates and their interacting proteins. In order to investigate the substrate specificity of modification sites, a newly developed statistical method has been applied to identify significant substrate motifs for each type of PTM with sufficient experimental data. According to the data statistics in dbPTM, >60% of PTM sites are located in the functional domains of proteins. It is known that most PTMs can create binding sites for specific protein-interaction domains that work together for cellular function. Thus, this update integrates protein-protein interaction and domain-domain interaction to determine the functional association of PTM sites located in protein-interacting domains. Additionally, information on the structural topologies of transmembrane (TM) proteins is integrated into dbPTM in order to delineate the structural correlation between reported PTM sites and TM topologies. To facilitate the investigation of PTMs on TM proteins, the PTM substrate sites and the structural topology are graphically represented. Literature related to PTMs, orthologous conservation, and substrate motifs of PTMs are also provided in the resource. Finally, this version features an improved web interface to facilitate convenient access to the resource.
Framing effects in choices between multioutcome life-expectancy lotteries.
Bernstein, L M; Chapman, G B; Elstein, A S
1999-01-01
To explore framing or editing effects and a method to debias framing in a clinical context. Clinical scenarios using multioutcome life-expectancy lotteries of equal value required choices between two supplementary drugs that either prolonged or shortened life from the 20-year beneficial effect of a baseline drug. The effects of these supplementary drugs were presented in two conditions, using a between-subjects design. In segregated editing (n = 116) the effects were presented separately from the effects of the baseline drug. In integrated editing (n = 100), effects of supplementary and baseline drugs were combined in the lottery presentation. Each subject responded to 30 problems. To explore one method of debiasing, another 100 subjects made choices after viewing both segregated and integrated editings of 20 problems (dual framing). Statistically significant preference reversals between segregated and integrated editing of pure lotteries occurred only when one framing placed outcomes in the gain domain, and the other framing placed them in the loss domain. When both editings resulted in gain-domain outcomes only, there was no framing effect. A related pattern of framing-effect shifts from losses to gains appeared in mixed-lottery-choice problems. Responses to the dual framing condition did not consistently coincide with responses to either single framing. In some situations, dual framing eliminated or lessened framing effects. The results support two components of prospect theory: coding outcomes as gains or losses from a reference point, and an s-shaped utility function (concave in the gain domain, convex in the loss domain). Presenting both alternative editings of a complex situation prior to choice more fully informs the decision maker and may help to reduce framing effects. Given the extent to which preferences shift in response to alternative presentations, it is unclear which choice represents the subject's "true preferences."
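The gain/loss coding and s-shaped value function described above can be illustrated numerically. The sketch below uses the standard Tversky-Kahneman value-function form with their published parameters (α = 0.88, λ = 2.25 — illustrative values, not fitted to this study) to show why segregated and integrated editings of the same net outcome can be valued differently:

```python
def value(x, alpha=0.88, lam=2.25):
    # S-shaped prospect-theory value function:
    # concave for gains, convex and steeper (loss aversion) for losses
    return x**alpha if x >= 0 else -lam * ((-x) ** alpha)

# Hypothetical scenario: baseline drug adds 20 years, supplement removes 5
integrated = value(20 - 5)          # one combined gain of 15 years
segregated = value(20) + value(-5)  # a gain of 20 plus a separately coded loss of 5
print(integrated > segregated)      # True: integrated editing avoids coding a loss
```

Because the loss branch is steeper than the gain branch, the separately coded 5-year loss outweighs the extra curvature of the larger gain, producing exactly the kind of preference reversal between editings that the study reports.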
A Serviced-based Approach to Connect Seismological Infrastructures: Current Efforts at the IRIS DMC
NASA Astrophysics Data System (ADS)
Ahern, Tim; Trabant, Chad
2014-05-01
As part of the COOPEUS initiative to build infrastructure that connects European and US research infrastructures, IRIS has advocated for the development of federated services based upon internationally recognized standards using web services. By deploying International Federation of Digital Seismograph Networks (FDSN)-endorsed web services at multiple data centers in the US and Europe, we have shown that integration within the seismological domain can be realized. Deploying identical methods to invoke the web services at multiple centers significantly eases the way a scientist can access seismic data (time series, metadata, and earthquake catalogs) from distributed federated centers. IRIS has developed a federator that helps a user identify where seismic data from global seismic networks can be accessed. The web services based federator builds the appropriate URLs and returns them to client software running on the scientist's own computer. These URLs are then used to pull data directly from the distributed centers in a peer-based fashion. IRIS is also involved in deploying web services across horizontal domains. As part of the US National Science Foundation's (NSF) EarthCube effort, an IRIS-led EarthCube Building Blocks project is underway. When completed, this project will aid in the discovery, access, and usability of data across multiple geoscience domains. This presentation summarizes current IRIS efforts in building vertical integration infrastructure within seismology, working closely with 5 centers in Europe and 2 centers in the US, as well as first steps toward horizontal integration of data from 14 different domains in the US, in Europe, and around the world.
Fast Maximum Entropy Moment Closure Approach to Solving the Boltzmann Equation
NASA Astrophysics Data System (ADS)
Summy, Dustin; Pullin, Dale
2015-11-01
We describe a method for a moment-based solution of the Boltzmann Equation (BE). This is applicable to an arbitrary set of velocity moments whose transport is governed by partial-differential equations (PDEs) derived from the BE. The equations are unclosed, containing both higher-order moments and molecular-collision terms. These are evaluated using a maximum-entropy reconstruction of the velocity distribution function f(c, x, t), from the known moments, within a finite-box domain of single-particle velocity (c) space. Use of a finite domain alleviates known problems (Junk and Unterreiter, Continuum Mech. Thermodyn., 2002) concerning existence and uniqueness of the reconstruction. Unclosed moments are evaluated with quadrature while collision terms are calculated using any desired method. This allows integration of the moment PDEs in time. The high computational cost of the general method is greatly reduced by careful choice of the velocity moments, allowing the necessary integrals to be reduced from three- to one-dimensional in the case of strictly 1D flows. A method to extend this enhancement to fully 3D flows is discussed. Comparison with relaxation and shock-wave problems using the DSMC method will be presented. Partially supported by NSF grant DMS-1418903.
The SIETTE Automatic Assessment Environment
ERIC Educational Resources Information Center
Conejo, Ricardo; Guzmán, Eduardo; Trella, Monica
2016-01-01
This article describes the evolution and current state of the domain-independent Siette assessment environment. Siette supports different assessment methods--including classical test theory, item response theory, and computer adaptive testing--and integrates them with multidimensional student models used by intelligent educational systems.…
Optical device terahertz integration in a two-dimensional-three-dimensional heterostructure.
Feng, Zhifang; Lin, Jie; Feng, Shuai
2018-01-10
An off-planar integrated circuit including two wavelength division demultiplexers is designed, and its transmission properties are simulated and analyzed in detail by the finite-difference time-domain method. The results show that wavelength selection for different ports (0.404[c/a] at the B2 port, 0.389[c/a] at the B3 port, and 0.394[c/a] at the B4 port) can be realized by adjusting the parameters. Notably, off-planar integration between two complex devices is also realized. These simulation results hold valuable promise for all-optical integrated circuits, especially compact integration.
Investigation for connecting waveguide in off-planar integrated circuits.
Lin, Jie; Feng, Zhifang
2017-09-01
The transmission properties of a vertical waveguide connected by different devices in off-planar integrated circuits are designed, investigated, and analyzed in detail by the finite-difference time-domain method. The results show that both the guide bandwidth and the transmission efficiency can be adjusted effectively by shifting the vertical waveguide continuously. Surprisingly, a wide guide band (0.385[c/a]∼0.407[c/a]) and good transmission (-6 dB) are observed simultaneously in several directions when the vertical waveguide is located at a specific location. The results are very important for all-optical integrated circuits, especially in compact integration.
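The finite-difference time-domain method used in the two studies above follows, at its core, the standard Yee leapfrog update. A minimal 1D sketch in normalized units (Courant number 1; the grid size and Gaussian source are illustrative, not the authors' photonic-crystal setup):

```python
import numpy as np

n, steps = 400, 300
ez = np.zeros(n)  # electric field on integer grid points
hy = np.zeros(n)  # magnetic field, staggered half a cell

for t in range(steps):
    hy[:-1] += ez[1:] - ez[:-1]             # update H from the spatial difference of E
    ez[1:] += hy[1:] - hy[:-1]              # update E from the spatial difference of H
    ez[100] += np.exp(-((t - 30) / 10) ** 2)  # soft Gaussian source at cell 100

# At Courant number 1 the pulse travels exactly one cell per time step,
# so the right-going pulse peak sits near cell 100 + (299 - 30) = 369.
peak = 200 + np.argmax(np.abs(ez[200:]))
```

Frequency-domain quantities such as the transmission spectra quoted in these abstracts are then obtained by Fourier transforming the recorded time series at monitor points.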
An analytic approach to sunset diagrams in chiral perturbation theory: Theory and practice
NASA Astrophysics Data System (ADS)
Ananthanarayan, B.; Bijnens, Johan; Ghosh, Shayan; Hebbar, Aditya
2016-12-01
We demonstrate the use of several code implementations of the Mellin-Barnes method available in the public domain to derive analytic expressions for the sunset diagrams that arise in the two-loop contribution to the pion mass and decay constant in three-flavoured chiral perturbation theory. We also provide results for all possible two mass configurations of the sunset integral, and derive a new one-dimensional integral representation for the one mass sunset integral with arbitrary external momentum. Thoroughly annotated Mathematica notebooks are provided as ancillary files in the Electronic Supplementary Material to this paper, which may serve as pedagogical supplements to the methods described in this paper.
Derrac, Joaquín; Triguero, Isaac; Garcia, Salvador; Herrera, Francisco
2012-10-01
Cooperative coevolution is a successful trend of evolutionary computation which allows us to define partitions of the domain of a given problem, or to integrate several related techniques into one, by the use of evolutionary algorithms. It is possible to apply it to the development of advanced classification methods, which integrate several machine learning techniques into a single proposal. A novel approach integrating instance selection, instance weighting, and feature weighting into the framework of a coevolutionary model is presented in this paper. We compare it with a wide range of evolutionary and nonevolutionary related methods, in order to show the benefits of the employment of coevolution to apply the techniques considered simultaneously. The results obtained, contrasted through nonparametric statistical tests, show that our proposal outperforms other methods in the comparison, thus becoming a suitable tool in the task of enhancing the nearest neighbor classifier.
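As a toy illustration of why feature weighting matters for the nearest neighbor classifier (a minimal sketch, not the paper's coevolutionary algorithm): a zero weight on a noisy feature lets 1-NN recover the class structure carried by the informative feature.

```python
import numpy as np

def weighted_1nn(X_train, y_train, X_test, w):
    # 1-NN with per-feature weights applied inside the Euclidean distance
    d2 = (((X_test[:, None, :] - X_train[None, :, :]) * w) ** 2).sum(axis=-1)
    return y_train[d2.argmin(axis=1)]

X_train = np.array([[0.0, 5.0], [1.0, -5.0]])
y_train = np.array([0, 1])
X_test = np.array([[0.1, -4.0]])  # feature 0 points to class 0, feature 1 to class 1

print(weighted_1nn(X_train, y_train, X_test, np.array([1.0, 0.0])))  # [0]
print(weighted_1nn(X_train, y_train, X_test, np.array([0.0, 1.0])))  # [1]
```

In the paper's framework, such weights (together with instance selection and instance weights) are not hand-set but evolved cooperatively.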
Note on the eigensolution of a homogeneous equation with semi-infinite domain
NASA Technical Reports Server (NTRS)
Wadia, A. R.
1980-01-01
The 'variation-iteration' method using Green's functions to find the eigenvalues and the corresponding eigenfunctions of a homogeneous Fredholm integral equation is employed for the stability analysis of fluid hydromechanics problems with a semi-infinite domain of application. The objective of the study is to develop a suitable numerical approach to the solution of such equations in order to better understand the full set of equations for 'real-world' flow models. The study involves a search for a suitable value of the length of the domain which is a fair finite approximation to infinity, which makes the eigensolution an approximation dependent on the length of the interval chosen. In the examples investigated y = 1 = a seems to be the best approximation of infinity; for y greater than unity this method fails due to the polynomial nature of the Green's functions.
Modeling human response errors in synthetic flight simulator domain
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.
1992-01-01
This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.
An Ada Based Expert System for the Ada Version of SAtool II. Volume 1 and 2
1991-06-06
Integrated Computer-Aided Manufacturing (ICAM) (20). In fact, IDEF0 stands for ICAM Definition Method Zero. IDEF0 defines a subset of SA that omits... reasoning that has been programmed). An expert's knowledge is specific to one problem domain, as opposed to knowledge about general problem-solving... techniques. General problem domains are medicine, finance, science, or engineering, and so forth, in which an expert can solve specific problems very well
James, Susan; Harris, Sara; Foster, Gary; Clarke, Juanne; Gadermann, Anne; Morrison, Marie; Bezanson, Birdie Jane
2013-01-01
This article outlines a model for conducting psychotherapy with people of diverse cultural backgrounds. The theoretical foundation for the model is based on clinical and cultural psychology. Cultural psychology integrates psychology and anthropology in order to provide a complex understanding of both culture and the individual within his or her cultural context. The model proposed in this article is also based on our clinical experience and mixed-method research with the Portuguese community. The model demonstrates its value with ethnic minority clients by situating the clients within the context of their multi-layered social reality. The individual, familial, socio-cultural, and religio-moral domains are explored in two research projects, revealing the interrelation of these levels/contexts. The article is structured according to these domains. Study 1 is a quantitative study that validates the Agonias Questionnaire in Ontario. The results of this study are used to illustrate the individual domain of our proposed model. Study 2 is an ethnography conducted in the Azorean Islands, and the results of this study are integrated to illustrate the other three levels of the model, namely family, socio-cultural, and the religio-moral levels. PMID:23720642
The effectiveness of science domain-based science learning integrated with local potency
NASA Astrophysics Data System (ADS)
Kurniawati, Arifah Putri; Prasetyo, Zuhdan Kun; Wilujeng, Insih; Suryadarma, I. Gusti Putu
2017-08-01
This research aimed to determine the significant effect of science domain-based science learning integrated with local potency on science process skills. The research method used was a quasi-experimental design with a nonequivalent control group design. The population of this research was all students of class VII SMP Negeri 1 Muntilan. The sample was selected through cluster random sampling, namely class VII B as the experiment class (24 students) and class VII C as the control class (24 students). This research used a test instrument adapted from Agus Dwianto's research. The aspects of science process skills covered were observation, classification, interpretation, and communication. The data were analyzed with one-factor ANOVA at the 0.05 significance level and the normalized gain score. The significance level for science process skills from the one-factor ANOVA is 0.000, which is less than alpha (0.05). This means there was a significant effect of science domain-based science learning integrated with local potency on science process skills. The normalized gain scores are 0.29 (low category) in the control class and 0.67 (medium category) in the experiment class.
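The normalized gain score used in analyses like this is conventionally Hake's gain, g = (post − pre)/(100 − pre) for percentage scores, with g < 0.3 low, 0.3 ≤ g < 0.7 medium, and g ≥ 0.7 high. A quick computation with hypothetical pre/post values (illustrative, not the study's raw data):

```python
def normalized_gain(pre, post):
    # Hake's normalized gain for scores on a 0-100 scale:
    # fraction of the possible improvement that was actually achieved
    return (post - pre) / (100.0 - pre)

g = normalized_gain(pre=40.0, post=80.0)
print(round(g, 2))  # 0.67 -> medium category, matching the experiment class above
```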
NASA Technical Reports Server (NTRS)
Rochon, Gilbert L.
1989-01-01
A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.
Prediction of submarine scattered noise by the acoustic analogy
NASA Astrophysics Data System (ADS)
Testa, C.; Greco, L.
2018-07-01
The prediction of the noise scattered by a submarine subject to propeller tonal noise is addressed here through a non-standard frequency-domain formulation that extends the acoustic analogy to scattering problems. A boundary element method yields the scattered pressure upon the hull surface through the solution of a boundary integral equation, whereas the noise radiated into the fluid domain is evaluated by the corresponding boundary integral representation. The propeller-induced incident pressure field on the scatterer is obtained by combining an unsteady three-dimensional panel method with the Bernoulli equation. For each frequency of interest, numerical results concern sound pressure levels upon the hull and in the flowfield. The validity of the results is established by comparison with a time-marching hydrodynamic panel method that solves the propeller and hull jointly. Within the framework of potential-flow hydrodynamics, it is found that the proposed scattering formulation successfully captures noise magnitude and directivity both on the hull surface and in the flowfield, yielding a computationally efficient solution procedure that may be useful in preliminary design/multidisciplinary optimization applications.
Instruments Measuring Integrated Care: A Systematic Review of Measurement Properties
BAUTISTA, MARY ANN C.; NURJONO, MILAWATY; DESSERS, EZRA; VRIJHOEF, HUBERTUS JM
2016-01-01
Policy Points: Investigations on systematic methodologies for measuring integrated care should coincide with the growing interest in this field of research.A systematic review of instruments provides insights into integrated care measurement, including setting the research agenda for validating available instruments and informing the decision to develop new ones.This study is the first systematic review of instruments measuring integrated care with an evidence synthesis of the measurement properties.We found 209 index instruments measuring different constructs related to integrated care; the strength of evidence on the adequacy of the majority of their measurement properties remained largely unassessed. Context Integrated care is an important strategy for increasing health system performance. Despite its growing significance, detailed evidence on the measurement properties of integrated care instruments remains vague and limited. Our systematic review aims to provide evidence on the state of the art in measuring integrated care. Methods Our comprehensive systematic review framework builds on the Rainbow Model for Integrated Care (RMIC). We searched MEDLINE/PubMed for published articles on the measurement properties of instruments measuring integrated care and identified eligible articles using a standard set of selection criteria. We assessed the methodological quality of every validation study reported using the COSMIN checklist and extracted data on study and instrument characteristics. We also evaluated the measurement properties of each examined instrument per validation study and provided a best evidence synthesis on the adequacy of measurement properties of the index instruments. Findings From the 300 eligible articles, we assessed the methodological quality of 379 validation studies from which we identified 209 index instruments measuring integrated care constructs. 
The majority of studies reported on instruments measuring constructs related to care integration (33%) and patient‐centered care (49%); fewer studies measured care continuity/comprehensive care (15%) and care coordination/case management (3%). We mapped 84% of the measured constructs to the clinical integration domain of the RMIC, with fewer constructs related to the domains of professional (3.7%), organizational (3.4%), and functional (0.5%) integration. Only 8% of the instruments were mapped to a combination of domains; none were mapped exclusively to the system or normative integration domains. The majority of instruments were administered to either patients (60%) or health care providers (20%). Of the measurement properties, responsiveness (4%), measurement error (7%), and criterion (12%) and cross‐cultural validity (14%) were less commonly reported. We found <50% of the validation studies to be of good or excellent quality for any of the measurement properties. Only a minority of index instruments showed strong evidence of positive findings for internal consistency (15%), content validity (19%), and structural validity (7%); with moderate evidence of positive findings for internal consistency (14%) and construct validity (14%). Conclusions Our results suggest that the quality of measurement properties of instruments measuring integrated care is in need of improvement with the less‐studied constructs and domains to become part of newly developed instruments. PMID:27995711
Multimodal manifold-regularized transfer learning for MCI conversion prediction.
Cheng, Bo; Liu, Mingxia; Suk, Heung-Il; Shen, Dinggang; Zhang, Daoqiang
2015-12-01
As the early stage of Alzheimer's disease (AD), mild cognitive impairment (MCI) has a high chance of converting to AD. Effective prediction of such conversion from MCI to AD is of great importance for early diagnosis of AD and also for evaluating AD risk pre-symptomatically. Unlike most previous methods that used only the samples from a target domain to train a classifier, in this paper, we propose a novel multimodal manifold-regularized transfer learning (M2TL) method that jointly utilizes samples from another domain (e.g., AD vs. normal controls (NC)) as well as unlabeled samples to boost the performance of MCI conversion prediction. Specifically, the proposed M2TL method includes two key components. The first is a kernel-based maximum mean discrepancy criterion, which helps eliminate the potential negative effect induced by the distributional difference between the auxiliary domain (i.e., AD and NC) and the target domain (i.e., MCI converters (MCI-C) and MCI non-converters (MCI-NC)). The second is a semi-supervised multimodal manifold-regularized least squares classification method, where the target-domain samples, the auxiliary-domain samples, and the unlabeled samples can be jointly used for training the classifier. Furthermore, with the integration of a group sparsity constraint into the objective function, the proposed M2TL can select informative samples to build a robust classifier. Experimental results on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database validate the effectiveness of the proposed method, achieving a significantly improved classification accuracy of 80.1% for MCI conversion prediction and outperforming the state-of-the-art methods.
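The kernel-based maximum mean discrepancy criterion mentioned above compares two distributions through their mean embeddings in a reproducing kernel Hilbert space. A minimal sketch of the (biased) squared-MMD estimate with an RBF kernel (toy Gaussian data standing in for the two domains, not the ADNI features):

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # RBF (Gaussian) kernel matrix between row-wise sample sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Biased estimate of squared MMD: ||mean embedding(X) - mean embedding(Y)||^2
    return rbf(X, X, gamma).mean() + rbf(Y, Y, gamma).mean() - 2.0 * rbf(X, Y, gamma).mean()

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))   # same distribution
shift = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2)))  # shifted "domain"
# A distributional difference between domains shows up as a much larger MMD.
```

In the M2TL setting, minimizing a term of this form encourages the auxiliary-domain (AD/NC) features to align with the target-domain (MCI-C/MCI-NC) features before classification.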
A transformed path integral approach for solution of the Fokker-Planck equation
NASA Astrophysics Data System (ADS)
Subramaniam, Gnana M.; Vedula, Prakash
2017-10-01
A novel path integral (PI) based method for solution of the Fokker-Planck equation is presented. The proposed method, termed the transformed path integral (TPI) method, utilizes a new formulation for the underlying short-time propagator to perform the evolution of the probability density function (PDF) in a transformed computational domain where a more accurate representation of the PDF can be ensured. The new formulation, based on a dynamic transformation of the original state space with the statistics of the PDF as parameters, preserves the non-negativity of the PDF and incorporates short-time properties of the underlying stochastic process. New update equations for the state PDF in a transformed space and the parameters of the transformation (including mean and covariance) that better accommodate nonlinearities in drift and non-Gaussian behavior in distributions are proposed (based on properties of the SDE). Owing to the choice of transformation considered, the proposed method maps a fixed grid in transformed space to a dynamically adaptive grid in the original state space. The TPI method, in contrast to conventional methods such as Monte Carlo simulations and fixed grid approaches, is able to better represent the distributions (especially the tail information) and better address challenges in processes with large diffusion, large drift and large concentration of PDF. Additionally, in the proposed TPI method, error bounds on the probability in the computational domain can be obtained using the Chebyshev's inequality. The benefits of the TPI method over conventional methods are illustrated through simulations of linear and nonlinear drift processes in one-dimensional and multidimensional state spaces. The effects of spatial and temporal grid resolutions as well as that of the diffusion coefficient on the error in the PDF are also characterized.
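For context, the conventional fixed-grid path integral evolution that the TPI method improves upon can be sketched in a few lines: the PDF is advanced by repeatedly applying a Gaussian short-time propagator. The sketch below does this for a 1D Ornstein-Uhlenbeck process (illustrative parameters; this is the baseline scheme, not the authors' transformed method), recovering the stationary variance D/a:

```python
import numpy as np

# Ornstein-Uhlenbeck process: dx = -a*x dt + sqrt(2D) dW
a, D, dt = 1.0, 0.5, 0.01
x = np.linspace(-5, 5, 401)
dx = x[1] - x[0]

# Gaussian short-time propagator K[i, j] ~= p(x_i, t+dt | x_j, t)
mean = x - a * x * dt  # Euler drift step from each source point x_j
var = 2.0 * D * dt     # diffusion accumulated over one step
K = np.exp(-(x[:, None] - mean[None, :]) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

p = np.exp(-x**2 / (2 * 0.01))  # narrow initial Gaussian PDF
p /= (p * dx).sum()
for _ in range(1000):           # evolve to t = 10, well past the relaxation time 1/a
    p = (K @ p) * dx
var_num = (x**2 * p * dx).sum() # should approach the stationary variance D/a = 0.5
```

The TPI method replaces the fixed grid above with a dynamically transformed grid parameterized by the evolving statistics of the PDF, which is what lets it track large drift, large diffusion, and tail behavior more accurately.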
Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.
1985-01-01
Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.
NASA Astrophysics Data System (ADS)
Bekkouche, S.; Chouarfia, A.
2011-06-01
Image watermarking can be defined as a technique that allows the insertion of imperceptible and indelible digital data into an image. Beyond its original copyright application, watermarking can be used in other fields, particularly medicine, to help secure images shared over networks for telemedicine applications. In this report we study several watermarking methods and the results of their combination. The first, based on CDMA (Code Division Multiple Access) in the DWT and spatial domains, aims to verify image authenticity; the second, reversible watermarking using least significant bits (LSB) with cryptography tools and reversible contrast mapping (RCM), aims to check image integrity and preserve the confidentiality of patient data. The new watermarking scheme combines the LSB-based reversible watermarking method with cryptography tools and the CDMA method in the spatial and DWT domains, in order to verify the three security properties of medical data and patient information: integrity, authenticity, and confidentiality. Finally, we compared these methods in terms of medical image quality. An in-depth study of the characteristics of medical images would contribute to improving these methods, mitigating their limits, and optimizing the results. Tests were performed on MRI medical images, and quality measurements were made on the watermarked images to verify that the technique does not lead to a wrong diagnosis. The robustness of the watermarked images against attacks was assessed using PSNR, SNR, MSE, and MAE; the experimental results demonstrate that the proposed algorithm is more robust in the DWT domain than in the spatial domain.
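The LSB component of the reversible scheme can be sketched as follows (a generic illustration of least-significant-bit embedding and extraction, not the authors' full RCM/cryptography pipeline):

```python
import numpy as np

def embed_lsb(img, bits):
    # Hide one bit per pixel in the least significant bit plane
    out = img.copy()
    flat = out.ravel()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return out

def extract_lsb(img, n):
    # Recover the first n embedded bits
    return img.ravel()[:n] & 1

rng = np.random.default_rng(42)
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
bits = rng.integers(0, 2, size=16, dtype=np.uint8)
wm = embed_lsb(img, bits)
assert (extract_lsb(wm, 16) == bits).all()                   # watermark recovered
assert np.abs(wm.astype(int) - img.astype(int)).max() <= 1   # change is imperceptible
```

Each pixel changes by at most 1 gray level, which is what keeps the PSNR of the watermarked image high enough to avoid affecting diagnosis.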
Robust location of optical fiber modes via the argument principle method
NASA Astrophysics Data System (ADS)
Chen, Parry Y.; Sivan, Yonatan
2017-05-01
We implement a robust, globally convergent root search method for transcendental equations guaranteed to locate all complex roots within a specified search domain, based on Cauchy's residue theorem. Although several implementations of the argument principle already exist, ours has several advantages: it allows singularities within the search domain and branch points are not fatal to the method. Furthermore, our implementation is simple and is written in MATLAB, fulfilling the need for an easily integrated implementation which can be readily modified to accommodate the many variations of the argument principle method, each of which is suited to a different application. We apply the method to the step index fiber dispersion relation, which has become topical due to the recent proliferation of high index contrast fibers. We also find modes with permittivity as the eigenvalue, catering to recent numerical methods that expand the radiation of sources using eigenmodes.
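The contour-integral root counting at the heart of the argument principle method can be sketched in a few lines (a minimal illustration, not the authors' MATLAB implementation): the number of zeros minus poles of f inside a closed contour equals (1/2πi)∮ f'(z)/f(z) dz, which a trapezoidal rule evaluates very accurately for a smooth circular contour.

```python
import numpy as np

def count_roots(f, df, center=0.0, radius=2.0, n=2000):
    """Count zeros minus poles of f inside a circular contour by numerically
    integrating f'(z)/f(z) dz / (2*pi*i) (Cauchy's argument principle)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    z = center + radius * np.exp(1j * t)   # points on the contour
    dz = 1j * radius * np.exp(1j * t)      # dz/dt along the contour
    total = np.sum(df(z) / f(z) * dz) * (2.0 * np.pi / n) / (2.0j * np.pi)
    return int(round(total.real))

# z^2 + 1 has roots at +/- i, both inside |z| < 2
print(count_roots(lambda z: z**2 + 1, lambda z: 2 * z))  # -> 2
```

The trapezoidal rule converges exponentially for periodic analytic integrands, which is why a plain circular contour with a few thousand points suffices; full implementations refine the search domain recursively to isolate individual roots.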
A Fast Solver for Implicit Integration of the Vlasov--Poisson System in the Eulerian Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrett, C. Kristopher; Hauck, Cory D.
In this paper, we present a domain decomposition algorithm to accelerate the solution of Eulerian-type discretizations of the linear, steady-state Vlasov equation. The steady-state solver then forms a key component in the implementation of fully implicit or nearly fully implicit temporal integrators for the nonlinear Vlasov--Poisson system. The solver relies on a particular decomposition of phase space that enables the use of sweeping techniques commonly used in radiation transport applications. The original linear system for the phase space unknowns is then replaced by a smaller linear system involving only unknowns on the boundary between subdomains, which can then be solved efficiently with Krylov methods such as GMRES. Steady-state solves are combined to form an implicit Runge--Kutta time integrator, and the Vlasov equation is coupled self-consistently to the Poisson equation via a linearized procedure or a nonlinear fixed-point method for the electric field. Finally, numerical results for standard test problems demonstrate the efficiency of the domain decomposition approach when compared to the direct application of an iterative solver to the original linear system.
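The interface reduction described above can be illustrated on a generic block system (a toy sketch, not the paper's Vlasov discretization): eliminate the interior unknowns, apply GMRES to the Schur complement acting only on the interface unknowns, then back-substitute.

```python
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

rng = np.random.default_rng(0)
ni, nb = 8, 3                                  # interior / interface unknowns
n = ni + nb
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally dominant, nonsingular
b = rng.standard_normal(n)
Aii, Aib = A[:ni, :ni], A[:ni, ni:]
Abi, Abb = A[ni:, :ni], A[ni:, ni:]

def schur_mv(x):
    # Apply S = Abb - Abi Aii^{-1} Aib without ever forming S explicitly
    return Abb @ x - Abi @ np.linalg.solve(Aii, Aib @ x)

S = LinearOperator((nb, nb), matvec=schur_mv)
rhs = b[ni:] - Abi @ np.linalg.solve(Aii, b[:ni])
xb, info = gmres(S, rhs)                       # small interface-only solve
xi = np.linalg.solve(Aii, b[:ni] - Aib @ xb)   # back-substitute for interior
x = np.concatenate([xi, xb])
print(np.linalg.norm(A @ x - b))               # residual of the full system
```

In practice the interior solves correspond to the cheap transport sweeps, so each Schur matrix-vector product is one sweep per subdomain; only the small interface system ever sees the Krylov iteration.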
A Fast Solver for Implicit Integration of the Vlasov--Poisson System in the Eulerian Framework
Garrett, C. Kristopher; Hauck, Cory D.
2018-04-05
In this paper, we present a domain decomposition algorithm to accelerate the solution of Eulerian-type discretizations of the linear, steady-state Vlasov equation. The steady-state solver then forms a key component in the implementation of fully implicit or nearly fully implicit temporal integrators for the nonlinear Vlasov--Poisson system. The solver relies on a particular decomposition of phase space that enables the use of sweeping techniques commonly used in radiation transport applications. The original linear system for the phase space unknowns is then replaced by a smaller linear system involving only unknowns on the boundary between subdomains, which can then be solved efficiently with Krylov methods such as GMRES. Steady-state solves are combined to form an implicit Runge--Kutta time integrator, and the Vlasov equation is coupled self-consistently to the Poisson equation via a linearized procedure or a nonlinear fixed-point method for the electric field. Finally, numerical results for standard test problems demonstrate the efficiency of the domain decomposition approach when compared to the direct application of an iterative solver to the original linear system.
An analytical-numerical method for determining the mechanical response of a condenser microphone
Homentcovschi, Dorel; Miles, Ronald N.
2011-01-01
The paper is based on determining the reaction pressure on the diaphragm of a condenser microphone by numerically integrating the frequency-domain Stokes system describing the velocity and the pressure in the air domain beneath the diaphragm. Afterwards, the membrane displacement can be obtained analytically or numerically. The method is general and can be applied to any geometry of the backplate holes, slits, and backchamber. As examples, the method is applied to the Bruel & Kjaer (B&K) 4134 1/2-inch microphone, determining the mechanical sensitivity and the mechano-thermal noise over a range of frequencies, and also the displacement field of the membrane for two specified frequencies. These results compare well with the measured values published in the literature. Also, a new design, completely micromachined (including the backvolume), of the B&K micro-electro-mechanical systems (MEMS) 1/4-inch measurement microphone is proposed. It is shown that its mechanical performance is very similar to that of the B&K MEMS measurement microphone. PMID:22225026
An analytical-numerical method for determining the mechanical response of a condenser microphone.
Homentcovschi, Dorel; Miles, Ronald N
2011-12-01
The paper is based on determining the reaction pressure on the diaphragm of a condenser microphone by numerically integrating the frequency-domain Stokes system describing the velocity and the pressure in the air domain beneath the diaphragm. Afterwards, the membrane displacement can be obtained analytically or numerically. The method is general and can be applied to any geometry of the backplate holes, slits, and backchamber. As examples, the method is applied to the Bruel & Kjaer (B&K) 4134 1/2-inch microphone, determining the mechanical sensitivity and the mechano-thermal noise over a range of frequencies, and also the displacement field of the membrane for two specified frequencies. These results compare well with the measured values published in the literature. Also, a new design, completely micromachined (including the backvolume), of the B&K micro-electro-mechanical systems (MEMS) 1/4-inch measurement microphone is proposed. It is shown that its mechanical performance is very similar to that of the B&K MEMS measurement microphone. © 2011 Acoustical Society of America
Design of supercritical cascades with high solidity
NASA Technical Reports Server (NTRS)
Sanz, J. M.
1982-01-01
The method of complex characteristics of Garabedian and Korn was successfully used to design shockless cascades with solidities of up to one. A code was developed using this method and a new hodograph transformation of the flow onto an ellipse. This code allows the design of cascades with solidities of up to two and larger turning angles. The equations of potential flow are solved in a complex hodograph-like domain by setting up a characteristic initial value problem and integrating along suitable paths. The topology that the new mapping introduces permits a simpler construction of these paths of integration.
Indirect (source-free) integration method. I. Wave-forms from geodesic generic orbits of EMRIs
NASA Astrophysics Data System (ADS)
Ritter, Patxi; Aoudia, Sofiane; Spallicci, Alessandro D. A. M.; Cordier, Stéphane
2016-12-01
The Regge-Wheeler-Zerilli (RWZ) wave equation describes Schwarzschild-Droste black hole perturbations. The source term contains a Dirac distribution and its derivative. We have previously designed a method of integration in the time domain. It consists of a finite difference scheme where analytic expressions, dealing with the wave-function discontinuity through the jump conditions, replace the direct integration of the source and the potential. Herein, we successfully apply the same method to the geodesic generic orbits of EMRI (Extreme Mass Ratio Inspiral) sources, at second order. An EMRI is a Compact Star (CS) captured by a Super-Massive Black Hole (SMBH). These are considered the best probes for testing gravitation in the strong-field regime. The gravitational waveforms and the radiated energy and angular momentum at infinity are computed and extensively compared with other methods, for different orbits (circular, elliptic, parabolic, including zoom-whirl).
NASA Astrophysics Data System (ADS)
Thakore, Arun K.; Sauer, Frank
1994-05-01
The organization of modern medical care environments into disease-related clusters, such as a cancer center or a diabetes clinic, has the side effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without compromising the autonomy of the underlying databases.
Van de Cavey, Joris; Hartsuiker, Robert J
2016-01-01
Cognitive processing in many domains (e.g., sentence comprehension, music listening, and math solving) requires sequential information to be organized into an integrational structure. There appears to be some overlap in integrational processing across domains, as shown by cross-domain interference effects when, for example, linguistic and musical stimuli are jointly presented (Koelsch, Gunter, Wittfoth, & Sammler, 2005; Slevc, Rosenberg, & Patel, 2009). These findings support theories of overlapping resources for integrational processing across domains (cfr. SSIRH, Patel, 2003; SWM, Kljajevic, 2010). However, there are some limitations to the studies mentioned above, such as the frequent use of unnaturalistic integrational difficulties. In recent years, the idea has arisen that evidence for domain-generality in structural processing might also be yielded through priming paradigms (cfr. Scheepers, 2003). The rationale behind this is that integrational processing across domains regularly requires the processing of dependencies across short or long distances in the sequence, involving respectively fewer or more syntactic working memory resources (cfr. SWM, Kljajevic, 2010), and such processing decisions might persist over time. However, whereas recent studies have shown suggestive priming of integrational structure between language and arithmetic (though often dependent on arithmetic performance, cfr. Scheepers et al., 2011; Scheepers & Sturt, 2014), it remains to be investigated to what extent we can also find evidence for priming in other domains, such as music and action (cfr. SWM, Kljajevic, 2010). Experiment 1a showed structural priming from the processing of musical sequences onto the position in the sentence structure (early or late) to which a relative clause was attached in subsequent sentence completion. Importantly, Experiment 1b showed that a similar structural manipulation based on non-hierarchically ordered color sequences did not yield any priming effect, suggesting that the priming effect is not based on linear order, but on integrational dependency. Finally, Experiment 2 presented primes in four domains (relative clause sentences, music, mathematics, and structured descriptions of actions), and consistently showed priming within and across domains. These findings provide clear evidence for domain-general structural processing mechanisms. Copyright © 2015 Elsevier B.V. All rights reserved.
Time-of-flight depth image enhancement using variable integration time
NASA Astrophysics Data System (ADS)
Kim, Sun Kwon; Choi, Ouk; Kang, Byongmin; Kim, James Dokyoon; Kim, Chang-Yeong
2013-03-01
Time-of-Flight (ToF) cameras are used for a variety of applications because they deliver depth information at a high frame rate. These cameras, however, suffer from challenging problems such as noise and motion artifacts. To increase the signal-to-noise ratio (SNR), the camera should calculate a distance based on a large amount of infrared light, which needs to be integrated over a long time. On the other hand, the integration time should be short enough to suppress motion artifacts. We propose a ToF depth imaging method that combines the advantages of short and long integration times, exploiting an image fusion scheme proposed for color imaging. To calibrate depth differences due to the change of integration times, a depth transfer function is estimated by analyzing the joint histogram of depths in the two images of different integration times. The depth images are then transformed into wavelet domains and fused into a depth image with suppressed noise and low motion artifacts. To evaluate the proposed method, we captured a moving bar of a metronome with different integration times. The experiment shows the proposed method can effectively remove the motion artifacts while preserving an SNR comparable to that of depth images acquired with a long integration time.
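The depth transfer function estimation can be sketched with synthetic data (an illustrative toy with assumed bias and noise values, not the paper's calibration): for each short-integration depth bin, the mean of the corresponding long-integration depths in the joint histogram gives a lookup table mapping one depth scale onto the other.

```python
import numpy as np

rng = np.random.default_rng(1)
d_long = rng.uniform(0.5, 5.0, size=(64, 64))        # "long integration" depths
# hypothetical short-integration image: biased scale/offset, extra noise
d_short = 1.05 * d_long + 0.1 + 0.02 * rng.standard_normal((64, 64))

# Joint histogram: per short-depth bin, the conditional mean of the long
# depths approximates the depth transfer function as a lookup table.
bins = np.linspace(d_short.min(), d_short.max(), 33)
idx = np.clip(np.digitize(d_short.ravel(), bins) - 1, 0, len(bins) - 2)
transfer = np.array([d_long.ravel()[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(len(bins) - 1)])

# Apply: map every short-integration depth through the estimated transfer
d_corrected = transfer[idx].reshape(d_short.shape)
print(np.nanmean(np.abs(d_corrected - d_long)))      # small residual error
```

The residual error is bounded by the bin width plus the sensor noise; the paper then fuses the calibrated images in the wavelet domain, which this sketch omits.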
A time-domain finite element boundary integral approach for elastic wave scattering
NASA Astrophysics Data System (ADS)
Shi, F.; Lowe, M. J. S.; Skelton, E. A.; Craster, R. V.
2018-04-01
The response of complex scatterers, such as rough or branched cracks, to incident elastic waves is required in many areas of industrial importance such as those in non-destructive evaluation and related fields; we develop an approach to generate accurate and rapid simulations. To achieve this we develop, in the time domain, an implementation to efficiently couple the finite element (FE) method within a small local region, and the boundary integral (BI) globally. The FE explicit scheme is run in a local box to compute the surface displacement of the scatterer, by giving forcing signals to excitation nodes, which can lie on the scatterer itself. The required input forces on the excitation nodes are obtained with a reformulated FE equation, according to the incident displacement field. The surface displacements computed by the local FE are then projected, through time-domain BI formulae, to calculate the scattering signals with different modes. This new method yields huge improvements in the efficiency of FE simulations for scattering from complex scatterers. We present results using different shapes and boundary conditions, all simulated using this approach in both 2D and 3D, and then compare with full FE models and theoretical solutions to demonstrate the efficiency and accuracy of this numerical approach.
NASA Astrophysics Data System (ADS)
Qarib, Hossein; Adeli, Hojjat
2015-12-01
In this paper, the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative three-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as multiple signal classification, matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of the primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method is successful in estimating the frequencies accurately, and it further estimates the damping exponents. The proposed adaptive filtration method does not include any frequency-domain manipulation; consequently, the time-domain signal is not affected by frequency-domain and inverse transformations.
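The matrix pencil stage can be illustrated on a synthetic damped sinusoid (a schematic single-stage sketch with assumed parameters, not the paper's three-stage pipeline): the signal poles are the dominant eigenvalues of a rank-truncated pencil built from two shifted Hankel data matrices, and frequency and damping follow from the pole's angle and magnitude.

```python
import numpy as np

# Synthetic exponentially damped sinusoid y[k] = exp(-d k) cos(w k) + noise
rng = np.random.default_rng(2)
n, L = 200, 80                        # samples, pencil parameter
d_true, w_true = 0.01, 0.5
k = np.arange(n)
y = np.exp(-d_true * k) * np.cos(w_true * k) + 1e-3 * rng.standard_normal(n)

# Shifted Hankel data matrices
Y = np.array([y[i:i + L] for i in range(n - L)])
Y0, Y1 = Y[:-1], Y[1:]

# Rank-2 truncation (one real damped cosine = two complex exponentials),
# then the pencil eigenvalues give the signal poles z = exp(-d +/- i w)
U, s, Vt = np.linalg.svd(Y0, full_matrices=False)
r = 2
P = np.diag(1.0 / s[:r]) @ U[:, :r].T @ Y1 @ Vt[:r].T
lam = np.linalg.eigvals(P)
z = lam[np.argmax(np.abs(lam))]
d_est, w_est = -np.log(np.abs(z)), abs(np.angle(z))
print(d_est, w_est)                   # close to 0.01 and 0.5
```

The SVD truncation is what makes the estimate robust to noise; without it, noise eigenvalues of the pencil can masquerade as poles.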
A spectral multi-domain technique applied to high-speed chemically reacting flows
NASA Technical Reports Server (NTRS)
Macaraeg, Michele G.; Streett, Craig L.; Hussaini, M. Yousuff
1989-01-01
The first application of a spectral multidomain method to viscous compressible flow is presented. The method imposes a global flux balance condition at the interface so that high-order continuity of the solution is preserved. The global flux balance is imposed in terms of a spectral integral of the discrete equations across adjoining domains. Since the discretized equations interior to each domain are uncoupled from each other, and since the interface relation has a block structure, the solution scheme can be adapted to the particular requirements of each subdomain. The spectral multidomain technique presented is well suited for the multiple scales associated with the chemically reacting and transition flows in hypersonic research. A nonstaggered multidomain discretization is used for the chemically reacting flow calculation, and the first implementation of a staggered multidomain mesh is presented for accurately solving the stability equation for a viscous compressible fluid.
Streamline integration as a method for two-dimensional elliptic grid generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiesenberger, M., E-mail: Matthias.Wiesenberger@uibk.ac.at; Held, M.; Einkemmer, L.
We propose a new numerical algorithm to construct a structured numerical elliptic grid of a doubly connected domain. Our method is applicable to domains with boundaries defined by two contour lines of a two-dimensional function. Furthermore, we can adapt any analytically given boundary-aligned structured grid, which specifically includes polar and Cartesian grids. The resulting coordinate lines are orthogonal to the boundary. Grid points as well as the elements of the Jacobian matrix can be computed efficiently and up to machine precision. In the simplest case we construct conformal grids, yet with the help of weight functions and monitor metrics we can control the distribution of cells across the domain. Our algorithm is parallelizable and easy to implement with elementary numerical methods. We assess the quality of grids by considering both the distribution of cell sizes and the accuracy of the solution to elliptic problems. Among the tested grids these key properties are best fulfilled by the grid constructed with the monitor metric approach. Highlights: construct structured, elliptic numerical grids with elementary numerical methods; align coordinate lines with, or make them orthogonal to, the domain boundary; compute grid points and metric elements up to machine precision; control cell distribution by adaption functions or monitor metrics.
Photonic crystal ring resonator based optical filters for photonic integrated circuits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, S., E-mail: mail2robinson@gmail.com
In this paper, two-dimensional (2D) Photonic Crystal Ring Resonator (PCRR) based optical filters, namely an Add Drop Filter, a Bandpass Filter, and a Bandstop Filter, are designed for Photonic Integrated Circuits (PICs). The normalized output response of the filters is obtained using the 2D Finite Difference Time Domain (FDTD) method, and the band diagram of the periodic and non-periodic structure is attained by the Plane Wave Expansion (PWE) method. The size of the device is minimized from a scale of a few tens of millimeters to the order of micrometers. The overall size of the filters is around 11.4 μm × 11.4 μm, which is highly suitable for photonic integrated circuits.
Object-oriented integrated approach for the design of scalable ECG systems.
Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija
2009-01-01
The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) systems. The purpose of this methodology is to preserve real-world structure and relations with the aim of minimizing information loss during the process of modeling, especially for Real-Time (RT) systems. We report on a case study of a design that uses the integration of OO and RT methods and the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The experience gained, based on the strongly defined semantics of the object model, is discussed and related problems are analyzed.
Optimizing Cubature for Efficient Integration of Subspace Deformations
An, Steven S.; Kim, Theodore; James, Doug L.
2009-01-01
We propose an efficient scheme for evaluating nonlinear subspace forces (and Jacobians) associated with subspace deformations. The core problem we address is efficient integration of the subspace force density over the 3D spatial domain. Similar to Gaussian quadrature schemes that efficiently integrate functions that lie in particular polynomial subspaces, we propose cubature schemes (multi-dimensional quadrature) optimized for efficient integration of force densities associated with particular subspace deformations, particular materials, and particular geometric domains. We support generic subspace deformation kinematics, and nonlinear hyperelastic materials. For an r-dimensional deformation subspace with O(r) cubature points, our method is able to evaluate subspace forces at O(r2) cost. We also describe composite cubature rules for runtime error estimation. Results are provided for various subspace deformation models, several hyperelastic materials (St.Venant-Kirchhoff, Mooney-Rivlin, Arruda-Boyce), and multimodal (graphics, haptics, sound) applications. We show dramatically better efficiency than traditional Monte Carlo integration. CR Categories: I.6.8 [Simulation and Modeling]: Types of Simulation—Animation, I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling—Physically based modeling G.1.4 [Mathematics of Computing]: Numerical Analysis—Quadrature and Numerical Differentiation PMID:19956777
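The core idea of cubature optimization, fitting nonnegative quadrature weights so that a small point set reproduces the integrals of a set of training integrands, can be sketched with a nonnegative least-squares (NNLS) fit (an illustrative 1D toy, not the paper's greedy cubature selection over deformation subspaces):

```python
import numpy as np
from scipy.optimize import nnls

# Candidate cubature points: a uniform scan plus 3-point Gauss-Legendre
# nodes mapped to [0, 1], so an exact nonnegative solution is known to exist.
g, _ = np.polynomial.legendre.leggauss(3)
xs = np.concatenate([np.linspace(0.05, 0.95, 19), (g + 1) / 2])

# "Training" integrands: monomials x^k with known integrals on [0, 1]
deg = 6
A = np.vander(xs, deg, increasing=True).T    # A[k, j] = xs[j]**k
b = 1.0 / np.arange(1, deg + 1)              # integral of x^k is 1/(k+1)

w, res = nnls(A, b)                          # nonnegative cubature weights
print((w > 1e-12).sum(), "points selected, residual", res)

# A rule exact on the training set generalizes to smooth integrands:
approx = float(np.dot(w, np.exp(xs)))        # approximates e - 1
print(abs(approx - (np.e - 1)))
```

The NNLS solution is automatically sparse (at most as many active points as training equations), which mirrors why optimized cubature needs only O(r) points for an r-dimensional subspace.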
An improved local radial point interpolation method for transient heat conduction analysis
NASA Astrophysics Data System (ADS)
Wang, Feng; Lin, Gao; Zheng, Bao-Jing; Hu, Zhi-Qiang
2013-06-01
The smoothing thin plate spline (STPS) interpolation using the penalty function method according to the optimization theory is presented to deal with transient heat conduction problems. The smooth conditions of the shape functions and derivatives can be satisfied so that the distortions hardly occur. Local weak forms are developed using the weighted residual method locally from the partial differential equations of the transient heat conduction. Here the Heaviside step function is used as the test function in each sub-domain to avoid the need for a domain integral. Essential boundary conditions can be implemented like the finite element method (FEM) as the shape functions possess the Kronecker delta property. The traditional two-point difference method is selected for the time discretization scheme. Three selected numerical examples are presented in this paper to demonstrate the availability and accuracy of the present approach comparing with the traditional thin plate spline (TPS) radial basis functions.
Determination of the Fracture Parameters in a Stiffened Composite Panel
NASA Technical Reports Server (NTRS)
Lin, Chung-Yi
2000-01-01
A modified J-integral, namely the equivalent domain integral, is derived for a three-dimensional anisotropic cracked solid to evaluate the stress intensity factor along the crack front using the finite element method. Based on the equivalent domain integral method with auxiliary fields, an interaction integral is also derived to extract the second fracture parameter, the T-stress, from the finite element results. The auxiliary fields are the two-dimensional plane strain solutions of monoclinic materials with the plane of symmetry at x(sub 3) = 0 under point loads applied at the crack tip. These solutions are expressed in a compact form based on the Stroh formalism. Both integrals can be implemented into a single numerical procedure to determine the distributions of stress intensity factor and T-stress components, T11, T13, and thus T33, along a three-dimensional crack front. The effects of plate thickness and crack length on the variation of the stress intensity factor and T-stresses through the thickness are investigated in detail for through-thickness center-cracked plates (isotropic and orthotropic) and orthotropic stiffened panels under pure mode-I loading conditions. For all the cases studied, T11 remains negative. For plates with the same dimensions, a larger size of crack yields larger magnitude of the normalized stress intensity factor and normalized T-stresses. The results in orthotropic stiffened panels exhibit an opposite trend in general. As expected, for the thicker panels, the fracture parameters evaluated through the thickness, except the region near the free surfaces, approach two-dimensional plane strain solutions. In summary, the numerical methods presented in this research demonstrate their high computational effectiveness and good numerical accuracy in extracting these fracture parameters from the finite element results in three-dimensional cracked solids.
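For reference, the standard form of the equivalent domain integral and the interaction integral can be written as follows (a sketch of the commonly used two-dimensional expressions for a crack along the x1 axis with traction-free faces; the thesis generalizes these to three-dimensional anisotropic solids):

```latex
% Equivalent domain integral: q is a weight function equal to 1 on the
% inner contour around the crack tip and 0 on the outer contour.
J = \int_A \left( \sigma_{ij}\,\frac{\partial u_j}{\partial x_1}
      - W\,\delta_{1i} \right) \frac{\partial q}{\partial x_i}\,\mathrm{d}A,
\qquad W = \tfrac{1}{2}\,\sigma_{ij}\,\varepsilon_{ij}.

% Interaction (two-state) integral with auxiliary fields (aux), used to
% extract individual fracture parameters from the superposed state:
I = \int_A \left( \sigma_{ij}\,\frac{\partial u_j^{\mathrm{aux}}}{\partial x_1}
      + \sigma_{ij}^{\mathrm{aux}}\,\frac{\partial u_j}{\partial x_1}
      - \sigma_{jk}\,\varepsilon_{jk}^{\mathrm{aux}}\,\delta_{1i} \right)
      \frac{\partial q}{\partial x_i}\,\mathrm{d}A.
```

Choosing the auxiliary fields as the point-load (or singular) crack-tip solutions turns I into a linear functional of the desired parameter, which is how the T-stress components are separated from the finite element results.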
Remote Sensing of Soils for Environmental Assessment and Management.
NASA Technical Reports Server (NTRS)
DeGloria, Stephen D.; Irons, James R.; West, Larry T.
2014-01-01
The next generation of imaging systems integrated with complex analytical methods will revolutionize the way we inventory and manage soil resources across a wide range of scientific disciplines and application domains. This special issue highlights those systems and methods for the direct benefit of environmental professionals and students who employ imaging and geospatial information for improved understanding, management, and monitoring of soil resources.
A framework for simultaneous aerodynamic design optimization in the presence of chaos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Stefanie, E-mail: stefanie.guenther@scicomp.uni-kl.de; Gauger, Nicolas R.; Wang, Qiqi
Integrating existing solvers for unsteady partial differential equations into a simultaneous optimization method is challenging due to the forward-in-time information propagation of classical time-stepping methods. This paper applies the simultaneous single-step one-shot optimization method to a reformulated unsteady constraint that allows for both forward- and backward-in-time information propagation. Especially in the presence of chaotic and turbulent flow, solving the initial value problem simultaneously with the optimization problem often scales poorly with the time domain length. The new formulation relaxes the initial condition and instead solves a least squares problem for the discrete partial differential equations. This enables efficient one-shot optimization that is independent of the time domain length, even in the presence of chaos.
NASA Astrophysics Data System (ADS)
Ren, Xiaodong; Xu, Kun; Shyy, Wei
2016-07-01
This paper presents a multi-dimensional high-order discontinuous Galerkin (DG) method in an arbitrary Lagrangian-Eulerian (ALE) formulation to simulate flows over variable domains with moving and deforming meshes. It is an extension of the gas-kinetic DG method proposed by the authors for static domains (X. Ren et al., 2015 [22]). A moving-mesh gas-kinetic DG method is proposed for both inviscid and viscous flow computations. A flux integration method across a translating and deforming cell interface has been constructed. Unlike the previous ALE-type gas-kinetic method with piecewise constant mesh velocity at each cell interface within each time step, the mesh velocity variation inside a cell and the mesh moving and rotating at a cell interface have been accounted for in the finite element framework. As a result, the current scheme is applicable to any kind of mesh movement, such as translation, rotation, and deformation. The accuracy and robustness of the scheme have been improved significantly in the oscillating airfoil calculations. All computations are conducted in a physical domain rather than in a reference domain, and the basis functions move with the grid movement. Therefore, the numerical scheme can preserve a uniform flow automatically and satisfy the geometric conservation law (GCL). The numerical accuracy can be maintained even for a largely moving and deforming mesh. Several test cases are presented to demonstrate the performance of the gas-kinetic DG-ALE method.
Chen, I L; Chen, J T; Kuo, S R; Liang, M T
2001-03-01
Integral equation methods have been widely used to solve interior eigenproblems and exterior acoustic problems (radiation and scattering). It was recently found that the real-part boundary element method (BEM) for the interior problem results in spurious eigensolutions if the singular (UT) or the hypersingular (LM) equation is used alone. The real-part BEM results in spurious solutions for interior problems in a similar way that the singular integral equation (UT method) results in fictitious solutions for the exterior problem. To solve this problem, a Combined Helmholtz Exterior integral Equation Formulation (CHEEF) method is proposed. Based on the CHEEF method, the spurious solutions can be filtered out if additional constraints from the exterior points are chosen carefully. Finally, two examples for the eigensolutions of circular and rectangular cavities are considered. The optimum numbers and proper positions for selecting the points in the exterior domain are analytically studied. Also, numerical experiments were designed to verify the analytical results. It is worth pointing out that the nodal line of a radiation mode of the circular cavity can rotate due to symmetry, while the nodal line of the rectangular cavity remains in a fixed position.
Computer simulation of solutions of polyharmonic equations in plane domain
NASA Astrophysics Data System (ADS)
Kazakova, A. O.
2018-05-01
A systematic study of plane problems of the theory of polyharmonic functions is presented. A method of reducing boundary value problems for polyharmonic functions to a system of integral equations on the boundary of the domain is given, and a numerical algorithm for simulating solutions of this system is suggested. Particular attention is paid to the numerical solution of the main boundary value problems, in which the values of the function and its derivatives are given. Test examples are considered that confirm the effectiveness and accuracy of the suggested algorithm.
NASA Astrophysics Data System (ADS)
Lorin, E.; Yang, X.; Antoine, X.
2016-06-01
The paper is devoted to develop efficient domain decomposition methods for the linear Schrödinger equation beyond the semiclassical regime, which does not carry a small enough rescaled Planck constant for asymptotic methods (e.g. geometric optics) to produce a good accuracy, but which is too computationally expensive if direct methods (e.g. finite difference) are applied. This belongs to the category of computing middle-frequency wave propagation, where neither asymptotic nor direct methods can be directly used with both efficiency and accuracy. Motivated by recent works of the authors on absorbing boundary conditions (Antoine et al. (2014) [13] and Yang and Zhang (2014) [43]), we introduce Semiclassical Schwarz Waveform Relaxation methods (SSWR), which are seamless integrations of semiclassical approximation to Schwarz Waveform Relaxation methods. Two versions are proposed respectively based on Herman-Kluk propagation and geometric optics, and we prove the convergence and provide numerical evidence of efficiency and accuracy of these methods.
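The Schwarz waveform relaxation family builds on the classical alternating Schwarz idea, which is easy to demonstrate on a steady 1D Poisson problem (a generic textbook sketch, not the semiclassical propagators of the paper): each subdomain is solved with boundary data taken from the other subdomain's latest iterate, and the overlap drives geometric convergence.

```python
import numpy as np

# Alternating Schwarz for -u'' = 1 on (0,1), u(0) = u(1) = 0,
# with two overlapping subdomains (nodes 0..60 and 40..100).
n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
u = np.zeros(n)
l_end, r_start = 60, 40                      # overlap region: nodes 40..60

def solve_local(i0, i1, left_bc, right_bc):
    """Direct solve of the tridiagonal Poisson system on nodes i0..i1."""
    m = i1 - i0 - 1                          # number of interior unknowns
    A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1))
    rhs = h * h * np.ones(m)
    rhs[0] += left_bc                        # Dirichlet data from neighbor
    rhs[-1] += right_bc
    return np.linalg.solve(A, rhs)

for _ in range(20):                          # Schwarz sweeps
    u[1:l_end] = solve_local(0, l_end, u[0], u[l_end])
    u[r_start + 1:n - 1] = solve_local(r_start, n - 1, u[r_start], u[n - 1])

exact = 0.5 * x * (1.0 - x)                  # discretization is exact here
print(np.max(np.abs(u - exact)))             # converges geometrically
```

Waveform relaxation applies the same exchange to space-time subdomains, and the semiclassical variants in the paper replace the local solves with asymptotic propagators.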
Classification and Lineage Tracing of SH2 Domains Throughout Eukaryotes.
Liu, Bernard A
2017-01-01
The number of sequenced genomes is expanding rapidly. Cataloging protein interaction domains such as the Src Homology 2 (SH2) domain across these genomes can be accomplished with ease thanks to existing algorithms and prediction models. An evolutionary analysis of SH2 domains provides a step towards understanding how SH2 proteins integrated with existing signaling networks to position phosphotyrosine signaling as a crucial driver of robust cellular communication networks in metazoans. However, organizing and tracing SH2 domains across organisms and understanding their evolutionary trajectories remains a challenge. This chapter describes several methodologies for analyzing the evolutionary trajectory of SH2 domains, including a global SH2 domain classification system that facilitates annotation of new SH2 sequences, which is essential for tracing the lineage of SH2 domains throughout eukaryote evolution. This classification utilizes a combination of sequence homology, protein domain architecture, and the boundary positions between introns and exons within the SH2 domain or the genes encoding these domains. Discrete SH2 families can then be traced across various genomes to provide insight into their origins. Furthermore, additional methods for examining potential mechanisms of SH2 domain divergence, from structural changes to alterations in protein domain content and genome duplication, are discussed. A better understanding of SH2 domain evolution may thus enhance our insight into the emergence of phosphotyrosine signaling and the expansion of protein interaction domains.
Designing Caregiver-Implemented Shared-Reading Interventions to Overcome Implementation Barriers
ERIC Educational Resources Information Center
Justice, Laura M.; Logan, Jessica R.; Damschroder, Laura
2015-01-01
Purpose: This study presents an application of the theoretical domains framework (TDF; Michie et al., 2005), an integrative framework drawing on behavior-change theories, to speech-language pathology. Methods: A multistep procedure was used to identify barriers affecting caregivers' implementation of shared-reading interventions with their…
Mobilizing Knowledge via Documentary Filmmaking--Is the Academy Ready?
ERIC Educational Resources Information Center
Petrarca, Diana M.; Hughes, Janette M.
2014-01-01
The predominant form of research dissemination resides in the scholar's domain, namely academic conferences and peer-reviewed journals. This paper describes how two colleagues and researchers integrated documentary filmmaking with research methods in their respective scholarly work, supporting the case for documentary film as an alternative form…
Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.
2016-12-01
Integrating semantic information into legacy metadata catalogs is a challenging issue and so far has been mostly done on a limited scale. We present the experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels including surveys and domain resource inventories. The pipeline examines available metadata descriptions using the text parsing, vocabulary management, semantic annotation, and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via a CINERGI Annotator user interface.
We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality and completeness. The inventory is accessible at http://cinergi.sdsc.edu, and the CINERGI project web page is http://earthcube.org/group/cinergi
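The keyword-augmentation step can be sketched as controlled-vocabulary matching that groups hits by facet. This is a minimal toy, assuming a hypothetical four-term vocabulary; the real pipeline uses GeoSciGraph's ontology services, not a flat dictionary.

```python
# Hypothetical mini-vocabulary mapping terms to facets (invented entries; the
# CINERGI pipeline draws these from SWEET, ENVO, GCMD, etc. via GeoSciGraph).
VOCAB = {
    "seismometer": "equipment",
    "temperature": "measured property",
    "erosion": "process",
    "river delta": "geospatial feature",
}

def augment_keywords(metadata_text: str) -> dict:
    """Match controlled-vocabulary terms in free-text metadata, grouped by facet."""
    text = metadata_text.lower()
    facets: dict = {}
    for term, facet in VOCAB.items():
        if term in text:
            facets.setdefault(facet, []).append(term)
    return facets

record = "Seismometer deployment measuring temperature and erosion near a river delta."
result = augment_keywords(record)
```

Faceted keywords like these are what make cross-domain discovery queries ("all datasets measuring temperature with in-situ equipment") possible over heterogeneous catalogs.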
Huang, Qiuhua; Vittal, Vijay
2018-05-09
Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.
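The efficiency argument behind mode switching can be illustrated with a toy loop (hedged: real hybrid EMT/phasor co-simulation involves network interfaces and sequence-domain models far beyond this sketch): a first-order system is stepped with a small "EMT" step during the transient, then the step is coarsened to a "phasor" step once the state settles.

```python
# Toy mode-switching integrator for dx/dt = (u - x)/tau with explicit Euler.
# Numbers are invented; the point is that the small step is only paid for
# during the fast transient.
tau, u = 0.05, 1.0
x, t, t_end = 0.0, 0.0, 1.0
dt_emt, dt_phasor = 1e-4, 1e-2     # detailed step vs coarse step
mode, steps = "emt", 0
while t < t_end:
    dxdt = (u - x) / tau
    if mode == "emt" and abs(dxdt) < 0.01 * abs(u) / tau:
        mode = "phasor"            # transient has decayed: coarsen the step
    dt = dt_emt if mode == "emt" else dt_phasor
    x += dt * dxdt
    t += dt
    steps += 1
```

Running the whole window at the EMT step would cost 10,000 steps; switching once the derivative is small cuts this to a few thousand while the final state is unchanged, which mirrors the paper's reported speedup mechanism.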
On-Chip Magnetic Platform for Single-Particle Manipulation with Integrated Electrical Feedback.
Monticelli, Marco; Torti, Andrea; Cantoni, Matteo; Petti, Daniela; Albisetti, Edoardo; Manzin, Alessandra; Guerriero, Erica; Sordan, Roman; Gervasoni, Giacomo; Carminati, Marco; Ferrari, Giorgio; Sampietro, Marco; Bertacco, Riccardo
2016-02-17
Methods for the manipulation of single magnetic particles have attracted great interest, particularly for in vitro biological studies. Most of these studies require an external microscope to provide the operator with feedback for controlling the particle motion, which prevents the use of magnetic particles in high-throughput experiments. In this paper, a simple and compact system with integrated electrical feedback is presented, implementing in the very same device both the manipulation and the detection of the transit of single particles. The proposed platform is based on zig-zag shaped magnetic nanostructures, in which transverse magnetic domain walls are pinned at the corners and attract magnetic particles in suspension. By applying suitable external magnetic fields, the domain walls move to the nearest corner, causing the step-by-step displacement of the particles along the nanostructure. The very same structure is also employed for detecting the bead transit: the presence of a magnetic particle in suspension over a domain wall affects the depinning field required for its displacement. This characteristic field can be monitored through anisotropic magnetoresistance measurements, thus implementing an integrated electrical feedback on the bead transit. In particular, the individual manipulation and detection of single 1-μm sized beads is demonstrated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Frequency-domain method for discrete frequency noise prediction of rotors in arbitrary steady motion
NASA Astrophysics Data System (ADS)
Gennaretti, M.; Testa, C.; Bernardini, G.
2012-12-01
A novel frequency-domain formulation for the prediction of the tonal noise emitted by rotors in arbitrary steady motion is presented. It is derived from Farassat's 'Formulation 1A', that is a time-domain boundary integral representation for the solution of the Ffowcs-Williams and Hawkings equation, and represents noise as harmonic response to body kinematics and aerodynamic loads via frequency-response-function matrices. The proposed frequency-domain solver is applicable to rotor configurations for which sound pressure levels of discrete tones are much higher than those of broadband noise. The numerical investigation concerns the analysis of noise produced by an advancing helicopter rotor in blade-vortex interaction conditions, as well as the examination of pressure disturbances radiated by the interaction of a marine propeller with a non-uniform inflow.
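The core of the formulation, noise harmonics obtained as products of frequency-response-function values with loading harmonics and then synthesized into a time history, can be sketched as follows. The H and q values below are made-up complex numbers, not aeroacoustic FRFs from the paper.

```python
import math
import cmath

# Tonal pressure as a harmonic superposition p(t) = Re sum_n H_n q_n e^{i n Omega t}.
# All numbers are illustrative placeholders.
Omega = 2.0 * math.pi                   # assumed blade-passage angular frequency
H = {1: 0.5 + 0.1j, 2: 0.2 - 0.05j}     # frequency-response function per harmonic
q = {1: 1.0 + 0.0j, 2: 0.3 + 0.3j}      # loading harmonics

def pressure(t: float) -> float:
    """Real part of the frequency-domain harmonic superposition at time t."""
    return sum((H[n] * q[n] * cmath.exp(1j * n * Omega * t)).real for n in H)
```

Because the response is assembled harmonic-by-harmonic, no time marching is needed, which is exactly why such solvers suit discrete-tone-dominated rotor noise.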
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, Vincent J.
The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
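The offset-subtracting integrator can be sketched numerically (hypothetical voltages and gate time; the disclosure describes analog hardware, not this digital emulation): the accumulated quantity is the integral of (photodiode − offset), which cancels the constant background term before digitization.

```python
# Trapezoidal emulation of the offset-subtracting integrator.  Numbers are
# invented for illustration.
def integrate(signal, offset, dt):
    """Trapezoidal integral of (signal[i] - offset), samples spaced dt apart."""
    diff = [s - offset for s in signal]
    return dt * (sum(diff) - 0.5 * (diff[0] + diff[-1]))

# constant 2.0 V fluorescence with a 0.5 V background over 1 s -> 1.5 V*s
samples = [2.0] * 101
area = integrate(samples, 0.5, 0.01)
```

Subtracting the offset before integrating, rather than after, keeps the integrator from saturating on the background level, which is the practical reason for the architecture.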
BUSCA: an integrative web server to predict subcellular localization of proteins.
Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Profiti, Giuseppe; Casadio, Rita
2018-04-30
Here, we present BUSCA (http://busca.biocomp.unibo.it), a novel web server that integrates different computational tools for predicting protein subcellular localization. BUSCA combines methods for identifying signal and transit peptides (DeepSig and TPpred3), GPI-anchors (PredGPI) and transmembrane domains (ENSEMBLE3.0 and BetAware) with tools for discriminating subcellular localization of both globular and membrane proteins (BaCelLo, MemLoci and SChloro). Outcomes from the different tools are processed and integrated for annotating subcellular localization of both eukaryotic and bacterial protein sequences. We benchmark BUSCA against protein targets derived from recent CAFA experiments and other specific data sets, reporting performance at the state-of-the-art. BUSCA scores better than all other evaluated methods on 2732 targets from CAFA2, with a F1 value equal to 0.49 and among the best methods when predicting targets from CAFA3. We propose BUSCA as an integrated and accurate resource for the annotation of protein subcellular localization.
The DTIC Review. Volume 2, Number 3: Optical and Infrared Detection and Countermeasures
1996-10-01
are different from those encountered in designing wavelets for other applications. For use in time-frequency analysis of signals, for example, it ... view within the field of regard, and for high-fidelity simulation of optical blurring and temporal effects such as jitter. The real-time CLDWSG method ... integration methods or, for near spatially invariant FOV regions, by convolution methods or by way of the convolution theorem using OTF frequency-domain
NASA Technical Reports Server (NTRS)
Gomez, Fernando
1989-01-01
It is shown how certain kinds of domain independent expert systems based on classification problem-solving methods can be constructed directly from natural language descriptions by a human expert. The expert knowledge is not translated into production rules. Rather, it is mapped into conceptual structures which are integrated into long-term memory (LTM). The resulting system is one in which problem-solving, retrieval and memory organization are integrated processes. In other words, the same algorithm and knowledge representation structures are shared by these processes. As a result of this, the system can answer questions, solve problems or reorganize LTM.
Hromadka, T.V.; Guymon, G.L.
1985-01-01
An algorithm is presented for the numerical solution of the Laplace equation boundary-value problem, which is assumed to apply to soil freezing or thawing. The Laplace equation is numerically approximated by the complex-variable boundary-element method. The algorithm aids in reducing integrated relative error by providing a true measure of modeling error along the solution domain boundary. This measure of error can be used to select locations for adding, removing, or relocating nodal points on the boundary or to provide bounds for the integrated relative error of unknown nodal variable values along the boundary.
Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations
NASA Astrophysics Data System (ADS)
Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.
2014-12-01
Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. 
We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
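The common characteristics named above (time, location, provenance, methods, units) suggest a minimal record shape. The dataclass below is a hypothetical illustration with invented field names, not the paper's actual information model or any of its functional schemas.

```python
from dataclasses import dataclass, field

# Hypothetical minimal record for a spatially discrete, feature-based
# observation; field names are illustrative only.
@dataclass
class Observation:
    value: float
    units: str            # e.g. "mg/L"
    time: str             # ISO 8601 timestamp
    feature: str          # sampling feature, e.g. a site or a soil-core interval
    location: tuple       # (latitude, longitude)
    method: str           # procedure or sensor used
    provenance: dict = field(default_factory=dict)   # who/how/when collected

obs = Observation(0.42, "mg/L", "2014-07-01T12:00:00Z",
                  "soil core A, 10-20 cm depth", (41.74, -111.83),
                  "ICP-MS analysis of extracted sample")
```

Making units, method, and provenance mandatory parts of the record, rather than free-text afterthoughts, is what lets data from different repositories be integrated without lossy manual translation.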
Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard
2008-01-01
Background With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Methods Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. Conclusions As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way. PMID:18460173
AERIS: An Integrated Domain Information System for Aerospace Science and Technology
ERIC Educational Resources Information Center
Hatua, Sudip Ranjan; Madalli, Devika P.
2011-01-01
Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…
Yang, S A
2002-10-01
This paper presents an effective solution method for predicting acoustic radiation and scattering fields in two dimensions. The difficulty of the fictitious characteristic frequency is overcome by incorporating into the body surface an auxiliary interior surface that satisfies a certain boundary condition. This process gives rise to a set of uniquely solvable boundary integral equations. Distributing monopoles with unknown strengths over the body and interior surfaces yields the simple-source formulation. The modified boundary integral equations are further transformed to ordinary ones that contain only nonsingular kernels. This implementation allows direct application of standard quadrature formulas over the entire integration domain; that is, the collocation points are exactly the positions at which the integration points are located. Selecting the interior surface is an easy task. Moreover, only a few corresponding interior nodal points are sufficient for the computation. Numerical calculations consist of the acoustic radiation and scattering by acoustically hard elliptic and rectangular cylinders. Comparisons with analytical solutions are made. Numerical results demonstrate the efficiency and accuracy of the current solution method.
García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J
2010-01-01
Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.
NASA Astrophysics Data System (ADS)
Dodig, H.
2017-11-01
This contribution presents a boundary integral formulation for the numerical computation of the time-harmonic radar cross section of 3D targets. The method relies on a hybrid edge-element BEM/FEM to compute the near-field edge-element coefficients associated with the near electric and magnetic fields at the boundary of the computational domain. A special boundary integral formulation is presented that computes the radar cross section directly from these edge-element coefficients, so there is no need for the near-to-far-field transformation (NTFFT) that is a common step in RCS computations. The paper demonstrates that the formulation yields accurate results for canonical models such as spheres, cubes, cones, and pyramids. The method remains accurate even for a dielectrically coated PEC sphere at an interior resonance frequency, a common problem for computational electromagnetics codes.
Chasin, Rachel; Rumshisky, Anna; Uzuner, Ozlem; Szolovits, Peter
2014-01-01
Objective To evaluate state-of-the-art unsupervised methods on the word sense disambiguation (WSD) task in the clinical domain. In particular, to compare graph-based approaches relying on a clinical knowledge base with bottom-up topic-modeling-based approaches. We investigate several enhancements to the topic-modeling techniques that use domain-specific knowledge sources. Materials and methods The graph-based methods use variations of PageRank and distance-based similarity metrics, operating over the Unified Medical Language System (UMLS). Topic-modeling methods use unlabeled data from the Multiparameter Intelligent Monitoring in Intensive Care (MIMIC II) database to derive models for each ambiguous word. We investigate the impact of using different linguistic features for topic models, including UMLS-based and syntactic features. We use a sense-tagged clinical dataset from the Mayo Clinic for evaluation. Results The topic-modeling methods achieve 66.9% accuracy on a subset of the Mayo Clinic's data, while the graph-based methods only reach the 40–50% range, with a most-frequent-sense baseline of 56.5%. Features derived from the UMLS semantic type and concept hierarchies do not produce a gain over bag-of-words features in the topic models, but identifying phrases from UMLS and using syntax does help. Discussion Although topic models outperform graph-based methods, semantic features derived from the UMLS prove too noisy to improve performance beyond bag-of-words. Conclusions Topic modeling for WSD provides superior results in the clinical domain; however, integration of knowledge remains to be effectively exploited. PMID:24441986
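The 56.5% most-frequent-sense (MFS) baseline mentioned above is computed by always predicting the majority sense observed in training. The snippet below shows the mechanics on a toy sense-tagged sample (invented counts, not the Mayo data).

```python
from collections import Counter

# Most-frequent-sense baseline for word sense disambiguation -- toy data.
def mfs_accuracy(train_senses, test_senses):
    """Predict every test instance with the majority sense seen in training."""
    majority = Counter(train_senses).most_common(1)[0][0]
    return sum(s == majority for s in test_senses) / len(test_senses)

train = ["cold:temperature"] * 6 + ["cold:illness"] * 4
test  = ["cold:temperature"] * 3 + ["cold:illness"] * 2
acc = mfs_accuracy(train, test)   # majority sense is "cold:temperature"
```

A WSD method that cannot beat this baseline (as the graph-based methods here do not) adds no value over simply memorizing sense frequencies, which is why MFS is the standard reference point.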
Slave finite elements: The temporal element approach to nonlinear analysis
NASA Technical Reports Server (NTRS)
Gellin, S.
1984-01-01
A formulation method for finite elements in space and time incorporating nonlinear geometric and material behavior is presented. The method uses interpolation polynomials for approximating the behavior of various quantities over the element domain, and only explicit integration over space and time. While applications are general, the plate and shell elements that are currently being programmed are appropriate to model turbine blades, vanes, and combustor liners.
Chevrette, Marc G; Aicheler, Fabian; Kohlbacher, Oliver; Currie, Cameron R; Medema, Marnix H
2017-10-15
Nonribosomally synthesized peptides (NRPs) are natural products with widespread applications in medicine and biotechnology. Many algorithms have been developed to predict the substrate specificities of nonribosomal peptide synthetase adenylation (A) domains from DNA sequences, which enables prioritization and dereplication, and integration with other data types in discovery efforts. However, insufficient training data and a lack of clarity regarding prediction quality have impeded optimal use. Here, we introduce prediCAT, a new phylogenetics-inspired algorithm, which quantitatively estimates the degree of predictability of each A-domain. We then systematically benchmarked all algorithms on a newly gathered, independent test set of 434 A-domain sequences, showing that active-site-motif-based algorithms outperform whole-domain-based methods. Subsequently, we developed SANDPUMA, a powerful ensemble algorithm, based on newly trained versions of all high-performing algorithms, which significantly outperforms individual methods. Finally, we deployed SANDPUMA in a systematic investigation of 7635 Actinobacteria genomes, suggesting that NRP chemical diversity is much higher than previously estimated. SANDPUMA has been integrated into the widely used antiSMASH biosynthetic gene cluster analysis pipeline and is also available as an open-source, standalone tool. SANDPUMA is freely available at https://bitbucket.org/chevrm/sandpuma and as a docker image at https://hub.docker.com/r/chevrm/sandpuma/ under the GNU Public License 3 (GPL3). chevrette@wisc.edu or marnix.medema@wur.nl. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
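An ensemble over per-method specificity calls can be sketched as a thresholded majority vote. This is a toy in the spirit of SANDPUMA only: the real tool weighs newly trained versions of several predictors, and the method names and votes below are fabricated.

```python
from collections import Counter

# Thresholded majority vote over A-domain substrate calls (toy example).
def ensemble_call(predictions: dict, min_agreement: float = 0.5) -> str:
    """Return the majority substrate call, or "no_call" without enough agreement."""
    votes = Counter(p for p in predictions.values() if p != "no_call")
    if not votes:
        return "no_call"
    best, count = votes.most_common(1)[0]
    return best if count / len(predictions) >= min_agreement else "no_call"

calls = {"prediCAT": "val", "active_site_motif": "val",
         "whole_domain_hmm": "leu", "svm": "val"}
consensus = ensemble_call(calls)   # 3 of 4 methods agree on valine
```

Abstaining below a minimum agreement is the safe behavior for downstream dereplication: a wrong substrate call is costlier than no call.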
A domain-specific compiler for a parallel multiresolution adaptive numerical simulation environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram
This paper describes the design and implementation of a layered domain-specific compiler to support MADNESS---Multiresolution ADaptive Numerical Environment for Scientific Simulation. MADNESS is a high-level software environment for the solution of integral and differential equations in many dimensions, using adaptive and fast harmonic analysis methods with guaranteed precision. MADNESS uses k-d trees to represent spatial functions and implements operators like addition, multiplication, differentiation, and integration on the numerical representation of functions. The MADNESS runtime system provides global namespace support and a task-based execution model including futures. MADNESS is currently deployed on massively parallel supercomputers and has enabled many science advances. Due to the highly irregular and statically unpredictable structure of the k-d trees representing the spatial functions encountered in MADNESS applications, only purely runtime approaches to optimization have previously been implemented in the MADNESS framework. This paper describes a layered domain-specific compiler developed to address some performance bottlenecks in MADNESS. The newly developed static compile-time optimizations, in conjunction with the MADNESS runtime support, enable significant performance improvement for the MADNESS framework.
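The "adaptive tree of spatial functions" idea can be conveyed with a deliberately minimal 1D analogue (MADNESS itself uses k-d trees and high-order polynomial bases in many dimensions; this sketch uses piecewise-linear leaves): intervals are split recursively until local interpolation matches the function at the midpoint, so resolution concentrates only where the function demands it.

```python
# Minimal 1D adaptive function representation: refine an interval until linear
# interpolation reproduces f at the midpoint to within tol.
def build(f, a, b, tol):
    """Return a leaf ("leaf", a, b, f(a), f(b)) or a split node."""
    m = 0.5 * (a + b)
    if abs(f(m) - 0.5 * (f(a) + f(b))) <= tol:
        return ("leaf", a, b, f(a), f(b))
    return ("node", m, build(f, a, m, tol), build(f, m, b, tol))

def evaluate(tree, x):
    """Descend to the leaf containing x and interpolate linearly."""
    while tree[0] == "node":
        _, m, left, right = tree
        tree = left if x <= m else right
    _, a, b, fa, fb = tree
    return fa + (fb - fa) * (x - a) / (b - a)

tree = build(lambda x: x * x, 0.0, 1.0, 1e-3)
```

The statically unpredictable shape of such trees, known only once f is sampled at run time, is exactly why purely static optimization is hard and a compile-time/runtime layered approach is needed.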
Cheng, Ching-Min; Hwang, Sheue-Ling
2015-03-01
This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. The integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid for developing safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Aligning, Bonding, and Testing Mirrors for Lightweight X-ray Telescopes
NASA Technical Reports Server (NTRS)
Chan, Kai-Wing; Zhang, William W.; Saha, Timo T.; McClelland, Ryan S.; Biskach, Michael P.; Niemeyer, Jason; Schofield, Mark J.; Mazzarella, James R.; Kolos, Linette D.; Hong, Melinda M.;
2015-01-01
High-resolution, high-throughput optics for x-ray astronomy entails fabrication of well-formed mirror segments and their integration with arc-second precision. In this paper, we address issues of aligning and bonding thin glass mirrors with negligible additional distortion. The stability of the bonded mirrors and the curing of the epoxy used in bonding them were tested extensively. We present results from tests of bonding mirrors onto experimental modules and on the stability of the bonded mirrors tested in x-rays. These results demonstrate the fundamental validity of the methods used in integrating mirrors into a telescope module and reveal areas for further investigation. The alignment and integration methods are applicable to astronomical mission concepts such as STAR-X, the Survey and Time-domain Astronomical Research Explorer.
Sahoo, Satya S.; Ogbuji, Chimezie; Luo, Lingyun; Dong, Xiao; Cui, Licong; Redline, Susan S.; Zhang, Guo-Qiang
2011-01-01
Clinical studies often use data dictionaries with controlled sets of terms to facilitate data collection, limited interoperability and sharing at a local site. Multi-center retrospective clinical studies require that these data dictionaries, originating from individual participating centers, be harmonized in preparation for the integration of the corresponding clinical research data. Domain ontologies are often used to facilitate multi-center data integration by modeling terms from data dictionaries in a logic-based language, but interoperability among domain ontologies (using automated techniques) is an unresolved issue. Although many upper-level reference ontologies have been proposed to address this challenge, our experience in integrating multi-center sleep medicine data highlights the need for an upper level ontology that models a common set of terms at multiple-levels of abstraction, which is not covered by the existing upper-level ontologies. We introduce a methodology underpinned by a Minimal Domain of Discourse (MiDas) algorithm to automatically extract a minimal common domain of discourse (upper-domain ontology) from an existing domain ontology. Using the Multi-Modality, Multi-Resource Environment for Physiological and Clinical Research (Physio-MIMI) multi-center project in sleep medicine as a use case, we demonstrate the use of MiDas in extracting a minimal domain of discourse for sleep medicine, from Physio-MIMI’s Sleep Domain Ontology (SDO). We then extend the resulting domain of discourse with terms from the data dictionary of the Sleep Heart and Health Study (SHHS) to validate MiDas. To illustrate the wider applicability of MiDas, we automatically extract the respective domains of discourse from 6 sample domain ontologies from the National Center for Biomedical Ontologies (NCBO) and the OBO Foundry. PMID:22195180
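The idea of extracting a shared upper-level vocabulary can be sketched as intersecting ancestor closures of terms from two hierarchies. This is a toy flavor of the approach only: MiDas operates on OWL domain ontologies, not flat parent maps, and the sleep-medicine terms below are invented.

```python
# Extract shared upper-level terms from two tiny is-a hierarchies (toy MiDas).
def ancestors(term, parent):
    """Walk the is-a chain from a term up to the root."""
    out = set()
    while term in parent:
        term = parent[term]
        out.add(term)
    return out

def upper_domain(terms_a, parent_a, terms_b, parent_b):
    """Terms in both ancestor closures form the shared domain of discourse."""
    closure_a = set().union(*(ancestors(t, parent_a) | {t} for t in terms_a))
    closure_b = set().union(*(ancestors(t, parent_b) | {t} for t in terms_b))
    return closure_a & closure_b

# two invented sleep-medicine-flavored hierarchies (child -> parent)
parent_a = {"apnea event": "respiratory event", "respiratory event": "clinical finding"}
parent_b = {"hypopnea": "respiratory event", "respiratory event": "clinical finding"}
shared = upper_domain({"apnea event"}, parent_a, {"hypopnea"}, parent_b)
```

The shared upper terms ("respiratory event", "clinical finding" in this toy) are the multi-level common vocabulary against which center-specific data dictionaries can then be harmonized.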
NASA Astrophysics Data System (ADS)
Narock, T.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Finin, T.; Hitzler, P.; Krisnadhi, A.; Raymond, L. M.; Shepherd, A.; Wiebe, P. H.
2014-12-01
A wide spectrum of maturing methods and tools, collectively characterized as the Semantic Web, is helping to vastly improve the dissemination of scientific research. Creating semantic integration requires input from both domain and cyberinfrastructure scientists. OceanLink, an NSF EarthCube Building Block, is demonstrating semantic technologies through the integration of geoscience data repositories, library holdings, conference abstracts, and funded research awards. Meeting project objectives involves applying semantic technologies to support data representation, discovery, sharing and integration. Our semantic cyberinfrastructure components include ontology design patterns, Linked Data collections, semantic provenance, and associated services to enhance data and knowledge discovery, interoperation, and integration. We discuss how these components are integrated, the continued automated and semi-automated creation of semantic metadata, and techniques we have developed to integrate ontologies, link resources, and preserve provenance and attribution.
Trofimov, Vyacheslav A.; Varentsova, Svetlana A.; Zakharova, Irina G.; Zagursky, Dmitry Yu.
2017-01-01
Using an experiment with thin paper layers and computer simulation, we demonstrate the principal limitations of standard time-domain spectroscopy (TDS) based on a broadband THz pulse for the detection and identification of a substance placed inside a disordered structure. We demonstrate the spectrum broadening of both transmitted and reflected pulses due to the cascade mechanism of high energy level excitation, considering, for example, a three-energy-level medium. The pulse spectrum in the range of high frequencies remains undisturbed in the presence of a disordered structure. To avoid the detection of false absorption frequencies, we apply the spectral dynamics analysis method (SDA-method) together with certain integral correlation criteria (ICC). PMID:29186849
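The spectral side of THz-TDS rests on Fourier-transforming the recorded time-domain pulse. A minimal sketch, assuming a synthetic Gaussian-windowed carrier rather than measured data (the carrier bin of 20, the envelope width, and all other parameters are invented):

```python
import cmath, math

def dft_magnitude(signal):
    """Naive O(N^2) discrete Fourier transform magnitudes."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

# Synthetic "THz pulse": Gaussian envelope times a carrier at bin 20.
N = 256
pulse = [math.exp(-((t - N // 2) / 12.0) ** 2) *
         math.cos(2 * math.pi * 20 * t / N) for t in range(N)]

mag = dft_magnitude(pulse)
peak_bin = max(range(1, N // 2), key=lambda k: mag[k])
print(peak_bin)  # -> 20, the carrier frequency bin
```

The peak of the magnitude spectrum recovers the carrier bin; identifying a substance's absorption lines in a real TDS trace requires the further analysis the abstract describes.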
The Practice Integration Profile: Rationale, development, method, and research.
Macchi, C R; Kessler, Rodger; Auxier, Andrea; Hitt, Juvena R; Mullin, Daniel; van Eeghen, Constance; Littenberg, Benjamin
2016-12-01
Insufficient knowledge exists regarding how to measure the presence and degree of integrated care. Prior estimates of integration levels are neither grounded in theory nor psychometrically validated. They provide scant guidance to inform improvement activities, compare integration efforts, discriminate among practices by degree of integration, measure the effect of integration on quadruple aim outcomes, or address the needs of clinicians, regulators, and policymakers seeking new models of health care delivery and funding. We describe the development of the Practice Integration Profile (PIP), a novel instrument designed to measure levels of integrated behavioral health care within a primary care clinic. The PIP draws upon the Agency for Healthcare Research and Quality's (AHRQ) Lexicon of Collaborative Care, which provides theoretic justification for a paradigm case of collaborative care. We used the key clauses of the Lexicon to derive domains of integration and generate measures corresponding to those key clauses. After reviewing currently used methods for identifying collaborative care, or integration, and identifying the need to improve on them, we describe a national collaboration to describe and evaluate the PIP. We also describe its potential use in practice improvement, research, responsiveness to multiple stakeholder needs, and other future directions. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Solution of Grad-Shafranov equation by the method of fundamental solutions
NASA Astrophysics Data System (ADS)
Nath, D.; Kalra, M. S.
2014-06-01
In this paper we have used the Method of Fundamental Solutions (MFS) to solve the Grad-Shafranov (GS) equation for the axisymmetric equilibria of tokamak plasmas with monomial sources. These monomials are the individual terms appearing on the right-hand side of the GS equation if one expands the nonlinear terms into polynomials. Unlike the Boundary Element Method (BEM), the MFS does not involve any singular integrals and is a meshless boundary-alone method. Its basic idea is to create a fictitious boundary around the actual physical boundary of the computational domain. This automatically removes the involvement of singular integrals. The results obtained by the MFS match well with the earlier results obtained using the BEM. The method is also applied to Solov'ev profiles and it is found that the results are in good agreement with analytical results.
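The MFS idea described above (fictitious sources placed outside the physical boundary, no singular integrals) can be sketched for the plain 2D Laplace equation rather than the full Grad-Shafranov operator. The fictitious-circle radius, collocation count, and test boundary data below are illustrative choices, not the paper's setup:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a square system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][j] * x[j] for j in range(k + 1, n))) / M[k][k]
    return x

def mfs_laplace(n=16, r_src=2.0, boundary_value=lambda x, y: x * x - y * y):
    """Fit c_j so that u(p) = sum_j c_j * ln|p - s_j| matches the boundary
    data on the unit circle; the sources s_j sit on a fictitious circle."""
    bnd = [(math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n))
           for i in range(n)]
    # offset sources by half a step to break exact angular alignment
    src = [(r_src * math.cos(2 * math.pi * (i + 0.5) / n),
            r_src * math.sin(2 * math.pi * (i + 0.5) / n)) for i in range(n)]
    A = [[math.log(math.hypot(px - sx, py - sy)) for (sx, sy) in src]
         for (px, py) in bnd]
    c = solve(A, [boundary_value(px, py) for (px, py) in bnd])
    def u(x, y):
        return sum(cj * math.log(math.hypot(x - sx, y - sy))
                   for cj, (sx, sy) in zip(c, src))
    return u

u = mfs_laplace()
print(abs(u(0.3, 0.2) - (0.3 ** 2 - 0.2 ** 2)))  # small interior error
```

Since x^2 - y^2 is harmonic, the interior value should match the exact solution closely; the Grad-Shafranov case additionally needs a particular solution for each monomial source term.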
NASA Astrophysics Data System (ADS)
Zhou, Yi; Tang, Yan; Deng, Qinyuan; Zhao, Lixin; Hu, Song
2017-08-01
Three-dimensional measurement and inspection is an area with growing needs and interests in many domains, such as integrated circuits (IC), medicine, and chemistry. Among the available methods, broadband light interferometry is widely utilized due to its large measurement range, noncontact operation, and high precision. In this paper, we propose a spatial modulation depth-based method to retrieve the surface topography by analyzing the characteristics of both the frequency and spatial domains in the interferogram. Due to the characteristics of spatial modulation depth, the technique can effectively suppress the negative influences caused by light fluctuations and external disturbance. Both theory and experiments are elaborated to confirm that the proposed method can greatly improve the measurement stability and sensitivity with high precision. This technique can achieve superior robustness with the potential to be applied in online topography measurement.
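The paper's spatial-modulation-depth analysis for broadband interferograms is more involved than can be reproduced from the abstract, but how a modulation depth is estimated from fringe samples can be illustrated with a generic four-step phase-shifting sketch; all values here are synthetic:

```python
import math

def modulation_depth(i0, i1, i2, i3):
    """Four-step phase-shift estimate of fringe modulation depth B/A
    for samples I_k = A + B*cos(phi + k*pi/2)."""
    a = (i0 + i1 + i2 + i3) / 4.0            # mean intensity A
    b = 0.5 * math.hypot(i0 - i2, i3 - i1)   # fringe amplitude B
    return b / a

# Synthetic pixel: A = 2.0, B = 1.2, arbitrary phase phi = 0.7
A, B, phi = 2.0, 1.2, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(modulation_depth(*frames), 6))  # -> 0.6
```

The estimate B/A is independent of the unknown phase phi, which is why modulation depth is a robust per-pixel quantity to track.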
A novel partitioning method for block-structured adaptive meshes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Lin, E-mail: lin.fu@tum.de; Litvinov, Sergej, E-mail: sergej.litvinov@aer.mw.tum.de; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de
We propose a novel partitioning method for block-structured adaptive meshes utilizing the meshless Lagrangian particle concept. With the observation that an optimum partitioning has high analogy to the relaxation of a multi-phase fluid to steady state, physically motivated model equations are developed to characterize the background mesh topology and are solved by multi-phase smoothed-particle hydrodynamics. In contrast to well established partitioning approaches, all optimization objectives are implicitly incorporated and achieved during the particle relaxation to stationary state. Distinct partitioning sub-domains are represented by colored particles and separated by a sharp interface with a surface tension model. In order to obtain the particle relaxation, special viscous and skin friction models, coupled with a tailored time integration algorithm are proposed. Numerical experiments show that the present method has several important properties: generation of approximately equal-sized partitions without dependence on the mesh-element type, optimized interface communication between distinct partitioning sub-domains, continuous domain decomposition which is physically localized and implicitly incremental. Therefore it is particularly suitable for load-balancing of high-performance CFD simulations.
2013-01-01
Background It is important to quickly and efficiently identify policies that are effective at changing behavior; therefore, we must be able to quantify and evaluate the effect of those policies and of changes to those policies. The purpose of this study was to develop state-level physical education (PE) and physical activity (PA) policy domain scores at the high-school level. Policy domain scores were developed with a focus on measuring policy change. Methods Exploratory factor analysis was used to group items from the state-level School Health Policies and Programs Study (SHPPS) into policy domains. Items that related to PA or PE at the high-school level were identified from the 7 SHPPS health program surveys. Data from 2000 and 2006 were used in the factor analysis. Results From the 98 items identified, 17 policy domains were extracted. Average policy domain change scores were positive for 12 policy domains, with the largest increases for “Discouraging PA as Punishment”, “Collaboration”, and “Staff Development Opportunities”. On average, states increased scores in 4.94 ± 2.76 policy domains, decreased in 3.53 ± 2.03, and had no change in 7.69 ± 2.09 policy domains. Significant correlations were found between several policy domain scores. Conclusions Quantifying policy change and its impact is integral to the policy making and revision process. Our results build on previous research offering a way to examine changes in state-level policies related to PE and PA of high-school students and the faculty and staff who serve them. This work provides methods for combining state-level policies relevant to PE or PA in youth for studies of their impact. PMID:23815860
ERIC Educational Resources Information Center
Ross, Jennifer Gunberg
2011-01-01
Simulation is a teaching method that closely replicates reality by integrating all three learning domains: cognitive, affective, and psychomotor. Despite the widespread use of simulation in nursing education today, there is a dearth of empirical evidence supporting the use of simulation to teach psychomotor skills. Furthermore, there is no…
DOT National Transportation Integrated Search
2011-02-01
A new method of cable installation using a heavy-duty Cone Penetration Test : (CPT) truck was developed and practiced successfully in this study. The coaxial and fiber : optic cables were pushed along with the cone rods by the hydraulic system integr...
ERIC Educational Resources Information Center
Connelly, Brian S.; Ones, Deniz S.
2010-01-01
The bulk of personality research has been built from self-report measures of personality. However, collecting personality ratings from other-raters, such as family, friends, and even strangers, is a dramatically underutilized method that allows better explanation and prediction of personality's role in many domains of psychology. Drawing…
Efficient visibility encoding for dynamic illumination in direct volume rendering.
Kronander, Joel; Jönsson, Daniel; Löw, Joakim; Ljung, Patric; Ynnerman, Anders; Unger, Jonas
2012-03-01
We present an algorithm that enables real-time dynamic shading in direct volume rendering using general lighting, including directional lights, point lights, and environment maps. Real-time performance is achieved by encoding local and global volumetric visibility using spherical harmonic (SH) basis functions stored in an efficient multiresolution grid over the extent of the volume. Our method enables high-frequency shadows in the spatial domain, but is limited to a low-frequency approximation of visibility and illumination in the angular domain. In a first pass, level of detail (LOD) selection in the grid is based on the current transfer function setting. This enables rapid online computation and SH projection of the local spherical distribution of visibility information. Using a piecewise integration of the SH coefficients over the local regions, the global visibility within the volume is then computed. By representing the light sources using their SH projections, the integral over lighting, visibility, and isotropic phase functions can be efficiently computed during rendering. The utility of our method is demonstrated in several examples showing the generality and interactive performance of the approach.
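The SH projection step itself can be sketched with the first two real spherical-harmonic bands and Monte Carlo integration over directions. The clamped-cosine "visibility" function, the sample count, and the band limit are illustrative choices, not the paper's multiresolution-grid encoding:

```python
import math, random

def sh_basis(x, y, z):
    """First two real spherical-harmonic bands (l = 0, 1)."""
    return [0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x]

def project_sh(f, n=20000, seed=0):
    """Monte Carlo SH projection: c_i = integral of f * Y_i over the sphere."""
    rng = random.Random(seed)
    coeffs = [0.0] * 4
    for _ in range(n):
        z = 2.0 * rng.random() - 1.0           # uniform direction sampling
        t = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(t), r * math.sin(t)
        for i, yi in enumerate(sh_basis(x, y, z)):
            coeffs[i] += f(x, y, z) * yi
    return [c * 4.0 * math.pi / n for c in coeffs]

# Toy "visibility": an unoccluded clamped cosine about +z.
c = project_sh(lambda x, y, z: max(0.0, z))
print([round(v, 3) for v in c])   # analytic values: [0.886, 0, 1.023, 0]
```

Once lighting and visibility are both expressed as SH coefficient vectors, the shading integral collapses to a dot product, which is the efficiency the paper exploits.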
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shlivinski, A., E-mail: amirshli@ee.bgu.ac.il; Lomakin, V., E-mail: vlomakin@eng.ucsd.edu
2016-03-01
Scattering or coupling of electromagnetic beam-field at a surface discontinuity separating two homogeneous or inhomogeneous media with different propagation characteristics is formulated using surface integral equations, which are solved by the Method of Moments with the aid of the Gabor-based Gaussian window frame set of basis and testing functions. The application of the Gaussian window frame provides (i) a mathematically exact and robust tool for spatial-spectral phase-space formulation and analysis of the problem; (ii) a system of linear equations in a transmission-line-like form relating mode-like wave objects of one medium with mode-like wave objects of the second medium; (iii) furthermore, an appropriate setting of the frame parameters yields mode-like wave objects that blend plane-wave properties (as if solving in the spectral domain) with Green's function properties (as if solving in the spatial domain); and (iv) a representation of the scattered field with Gaussian-beam propagators that may be used in many large (in terms of wavelengths) systems.
Chai, Xin; Wang, Qisong; Zhao, Yongping; Li, Yongqiang; Liu, Dan; Liu, Xin; Bai, Ou
2017-01-01
Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis for patients. However, the underlying EEG sensor signals are always non-stationary if they are sampled from different experimental sessions or subjects. This results in the deterioration of the classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy of marginal distribution. However, for EEG sensor signals, both marginal and conditional distributions may be mismatched. In addition, the existing domain adaptation strategies always require a high level of additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper in order to integrate both the marginal and conditional distributions within a unified framework (without any labeled samples from target subjects). Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term. This significantly decreases the time complexity of our domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and unlabeled target domain can be reduced, and logistic regression (LR) can be applied to the new source domain in order to train a classifier for use in the target domain, since the aligned source domain follows a distribution which is similar to that of the target domain. We compare our ASFM method with six typical approaches using a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed. 
The subject-to-subject offline experimental results demonstrate that our approach achieves a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, as compared with a state-of-the-art method, the subspace alignment auto-encoder (SAAE), which achieves values of 77.88% and 7.33% on average, respectively. For the online analysis, the average classification accuracy and standard deviation of ASFM in the subject-to-subject evaluation for all 15 subjects in the dataset were 75.11% and 7.65%, respectively, a significant performance improvement compared to the best baseline, LR, which achieves 56.38% and 7.48%, respectively. The experimental results confirm the effectiveness of the proposed method relative to state-of-the-art methods. Moreover, the computational efficiency of the proposed ASFM method is much better than that of standard domain adaptation; if the numbers of training samples and test samples are controlled within a certain range, it is suitable for real-time classification. It can be concluded that ASFM is a useful and effective tool for decreasing domain discrepancy and reducing performance degradation across subjects and sessions in the field of EEG-based emotion recognition. PMID:28467371
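ASFM's subspace transformation is not reproducible from the abstract alone. The following much-simplified sketch only illustrates marginal-distribution matching, by giving each source feature the target's mean and standard deviation; this per-feature linear map is an assumption standing in for the authors' transformation:

```python
import math

def mean_std(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, math.sqrt(v)

def align_marginals(source, target):
    """Per-feature linear map that gives the source data the target's
    mean and standard deviation (a crude marginal-distribution match)."""
    maps = [(mean_std(c), mean_std(t))
            for c, t in zip(zip(*source), zip(*target))]
    def transform(row):
        return [(x - ms[0]) / (ms[1] or 1.0) * ts[1] + ts[0]
                for x, (ms, ts) in zip(row, maps)]
    return [transform(r) for r in source]

src = [[1.0, 10.0], [2.0, 12.0], [3.0, 14.0]]   # "source subject" features
tgt = [[5.0, 0.0], [6.0, 2.0], [7.0, 4.0]]      # "target subject" features
aligned = align_marginals(src, tgt)
print([round(v, 3) for v in aligned[0]])  # -> [5.0, 0.0]
```

After such a map, a classifier trained on the aligned source data sees feature statistics matching the target, which is the intuition behind training LR on the transformed source domain.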
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
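A weighted-sum score is the simplest MCDM instrument for such a trade study. The criteria, weights, and 0-1 scores below are invented for illustration and are not taken from the dissertation:

```python
# Hypothetical MCDM trade study: weights and scores are illustrative only.
weights = {"mission_performance": 0.4, "resilience": 0.35, "cost": 0.25}

scores = {  # higher is better; the cost score is inverted expense
    "monolithic": {"mission_performance": 0.9, "resilience": 0.4, "cost": 0.8},
    "distributed": {"mission_performance": 0.8, "resilience": 0.9, "cost": 0.5},
}

def weighted_sum(alt):
    """Aggregate an alternative's criterion scores by the global weights."""
    return sum(weights[c] * scores[alt][c] for c in weights)

ranked = sorted(scores, key=weighted_sum, reverse=True)
for alt in ranked:
    print(alt, round(weighted_sum(alt), 3))
```

With these invented numbers the distributed architecture wins on the strength of its resilience score; a real trade study would also vary the weights to test the ranking's sensitivity.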
A nodal domain theorem for integrable billiards in two dimensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samajdar, Rhine; Jain, Sudhir R., E-mail: srjain@barc.gov.in
Eigenfunctions of integrable planar billiards are studied — in particular, the number of nodal domains, ν, of the eigenfunctions with Dirichlet boundary conditions is considered. The billiards for which the time-independent Schrödinger equation (Helmholtz equation) is separable admit trivial expressions for the number of domains. Here, we discover that for all separable and non-separable integrable billiards, ν satisfies certain difference equations. This has been possible because the eigenfunctions can be classified in families labelled by the same value of m mod kn, given a particular k, for a set of quantum numbers, m, n. Further, we observe that the patterns in a family are similar and the algebraic representation of the geometrical nodal patterns is found. Instances of this representation are explained in detail to understand the beauty of the patterns. This paper therefore presents a mathematical connection between integrable systems and difference equations. - Highlights: • We find that the number of nodal domains of eigenfunctions of integrable, planar billiards satisfies a class of difference equations. • The eigenfunctions labelled by quantum numbers (m, n) can be classified in terms of m mod kn. • A theorem is presented, realising algebraic representations of geometrical patterns exhibited by the domains. • This work presents a connection between integrable systems and difference equations.
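For the simplest separable case, the square billiard, the eigenfunction sin(mπx)·sin(nπy) has exactly m·n nodal domains, which a flood fill over same-sign grid cells can verify numerically. The grid resolution below is an illustrative choice; it only needs to resolve the nodal lines:

```python
import math
from collections import deque

def nodal_domains(m, n, grid=60):
    """Count nodal domains of sin(m*pi*x)*sin(n*pi*y) on the unit square
    by flood-filling same-sign cells (cell centres avoid the nodal lines)."""
    sign = [[1 if math.sin(m * math.pi * (i + 0.5) / grid) *
                  math.sin(n * math.pi * (j + 0.5) / grid) > 0 else -1
             for j in range(grid)] for i in range(grid)]
    seen = [[False] * grid for _ in range(grid)]
    count = 0
    for i in range(grid):
        for j in range(grid):
            if seen[i][j]:
                continue
            count += 1                      # new connected sign region
            q = deque([(i, j)])
            seen[i][j] = True
            while q:
                a, b = q.popleft()
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    x, y = a + da, b + db
                    if 0 <= x < grid and 0 <= y < grid and not seen[x][y] \
                            and sign[x][y] == sign[a][b]:
                        seen[x][y] = True
                        q.append((x, y))
    return count

print(nodal_domains(3, 4))  # -> 12 (= m * n for the square billiard)
```

The non-trivial content of the paper concerns billiards where no such closed-form count exists and ν instead obeys difference equations in the quantum numbers.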
NASA Astrophysics Data System (ADS)
Vainshtein, Sergey N.; Duan, Guoyong; Mikhnev, Valeri A.; Zemlyakov, Valery E.; Egorkin, Vladimir I.; Kalyuzhnyy, Nikolay A.; Maleev, Nikolai A.; Näpänkangas, Juha; Sequeiros, Roberto Blanco; Kostamovaara, Juha T.
2018-05-01
Progress in terahertz spectroscopy and imaging is mostly associated with femtosecond laser-driven systems, while solid-state sources, mainly sub-millimetre integrated circuits, are still in an early development phase. As simple and cost-efficient an emitter as a Gunn oscillator could cause a breakthrough in the field, provided its frequency limitations could be overcome. Proposed here is an application of the recently discovered collapsing field domains effect that permits sub-THz oscillations in sub-micron semiconductor layers thanks to nanometer-scale powerfully ionizing domains arising due to negative differential mobility in extreme fields. This shifts the frequency limit by an order of magnitude relative to the conventional Gunn effect. Our first miniature picosecond pulsed sources cover the 100-200 GHz band and promise milliwatts up to ~500 GHz. Thanks to the method of interferometrically enhanced time-domain imaging proposed here and the low single-shot jitter of ~1 ps, our simple imaging system provides sufficient time-domain imaging contrast for fresh-tissue terahertz histology.
Zhang, Shuangyue; Han, Dong; Politte, David G; Williamson, Jeffrey F; O'Sullivan, Joseph A
2018-05-01
The purpose of this study was to assess the performance of a novel dual-energy CT (DECT) approach for proton stopping power ratio (SPR) mapping that integrates image reconstruction and material characterization using a joint statistical image reconstruction (JSIR) method based on a linear basis vector model (BVM). A systematic comparison between the JSIR-BVM method and previously described DECT image- and sinogram-domain decomposition approaches is also carried out on synthetic data. The JSIR-BVM method was implemented to estimate the electron densities and mean excitation energies (I-values) required by the Bethe equation for SPR mapping. In addition, image- and sinogram-domain DECT methods based on three available SPR models, including BVM, were implemented for comparison. The intrinsic SPR modeling accuracy of the three models was first validated. Synthetic DECT transmission sinograms of two 330 mm diameter phantoms, each containing 17 soft and bony tissues (for a total of 34) of known composition, were then generated with spectra of 90 and 140 kVp. The estimation accuracy of the reconstructed SPR images was evaluated for the seven investigated methods. The impact of phantom size and insert location on SPR estimation accuracy was also investigated. All three selected DECT-SPR models predict the SPR of all tissue types with less than 0.2% RMS errors under idealized conditions with no reconstruction uncertainties. When applied to synthetic sinograms, the JSIR-BVM method achieves the best performance with mean and RMS-average errors of less than 0.05% and 0.3%, respectively, for all noise levels, while the image- and sinogram-domain decomposition methods show increasing mean and RMS-average errors with increasing noise level. The JSIR-BVM method also reduces statistical SPR variation by sixfold compared to other methods. 
A 25% phantom diameter change causes up to 4% SPR differences for the image-domain decomposition approach, while the JSIR-BVM method and sinogram-domain decomposition methods are insensitive to size change. Among all the investigated methods, the JSIR-BVM method achieves the best performance for SPR estimation in our simulation phantom study. This novel method is robust with respect to sinogram noise and residual beam-hardening effects, yielding SPR estimation errors comparable to intrinsic BVM modeling error. In contrast, the achievable SPR estimation accuracy of the image- and sinogram-domain decomposition methods is dominated by the CT image intensity uncertainties introduced by the reconstruction and decomposition processes. © 2018 American Association of Physicists in Medicine.
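The step that turns an electron density and I-value into an SPR can be sketched directly from the Bethe equation (no shell or density-effect corrections, a single proton energy). The tissue parameters below are illustrative, not the study's calibration:

```python
import math

ME_C2_EV = 0.511e6      # electron rest energy [eV]
I_WATER_EV = 75.0       # mean excitation energy of water [eV]

def beta_squared(kinetic_mev, rest_mev=938.272):
    """Relativistic beta^2 for a proton of the given kinetic energy."""
    gamma = 1.0 + kinetic_mev / rest_mev
    return 1.0 - 1.0 / (gamma * gamma)

def stopping_number(i_ev, b2):
    """Bethe stopping number L = ln(2*me*c^2*b^2 / (I*(1-b^2))) - b^2."""
    return math.log(2.0 * ME_C2_EV * b2 / (i_ev * (1.0 - b2))) - b2

def spr(rho_e_rel, i_ev, kinetic_mev=150.0):
    """Stopping power ratio to water via the uncorrected Bethe formula."""
    b2 = beta_squared(kinetic_mev)
    return rho_e_rel * stopping_number(i_ev, b2) / stopping_number(I_WATER_EV, b2)

# Cortical-bone-like tissue: illustrative rho_e ~ 1.78, I ~ 112 eV
print(round(spr(1.78, 112.0), 4))
```

The SPR is dominated by the relative electron density, with the I-value entering only logarithmically, which is why accurate electron-density estimation carries most of the weight in DECT-based SPR mapping.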
Ontology-guided data preparation for discovering genotype-phenotype relationships.
Coulet, Adrien; Smaïl-Tabbone, Malika; Benlian, Pascale; Napoli, Amedeo; Devignes, Marie-Dominique
2008-04-25
Complexity and amount of post-genomic data constitute two major factors limiting the application of Knowledge Discovery in Databases (KDD) methods in the life sciences. Bio-ontologies may nowadays play key roles in knowledge discovery in the life sciences by providing semantics to data and to extracted units, taking advantage of the progress of Semantic Web technologies concerning the understanding and availability of tools for knowledge representation, extraction, and reasoning. This paper presents a method that exploits bio-ontologies for guiding data selection within the preparation step of the KDD process. We propose three scenarios in which domain knowledge and ontology elements such as subsumption, properties, and class descriptions are taken into account for data selection, before the data mining step. Each of these scenarios is illustrated within a case study relative to the search for genotype-phenotype relationships in a familial hypercholesterolemia dataset. The guiding of data selection based on domain knowledge is analysed and shows a direct influence on the volume and significance of the data mining results. The method proposed in this paper is an efficient alternative to numerical methods for data selection based on domain knowledge. In turn, the results of this study may be reused in ontology modelling and data integration.
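Subsumption-guided row selection, the kind of ontology-driven filtering the scenarios describe, can be sketched with a toy is-a hierarchy. All class names and the single-inheritance structure are illustrative assumptions:

```python
PARENT = {  # toy is-a hierarchy; class names are illustrative only
    "familial_hypercholesterolemia": "dyslipidemia",
    "hypertriglyceridemia": "dyslipidemia",
    "dyslipidemia": "metabolic_disorder",
    "type_2_diabetes": "metabolic_disorder",
}

def is_a(term, ancestor):
    """True if term equals ancestor or lies below it in the hierarchy."""
    while term is not None:
        if term == ancestor:
            return True
        term = PARENT.get(term)
    return False

def select_rows(rows, target_class):
    """Ontology-guided selection: keep rows annotated with any subclass
    of the target class (subsumption replaces hand-written value lists)."""
    return [r for r in rows if is_a(r["diagnosis"], target_class)]

rows = [
    {"id": 1, "diagnosis": "familial_hypercholesterolemia"},
    {"id": 2, "diagnosis": "type_2_diabetes"},
    {"id": 3, "diagnosis": "hypertriglyceridemia"},
]
print([r["id"] for r in select_rows(rows, "dyslipidemia")])  # -> [1, 3]
```

Selecting by a class rather than by an enumerated value list is exactly what makes the subsequent mining step sensitive to the chosen level of abstraction.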
A point-value enhanced finite volume method based on approximate delta functions
NASA Astrophysics Data System (ADS)
Xuan, Li-Jun; Majdalani, Joseph
2018-02-01
We revisit the concept of an approximate delta function (ADF), introduced by Huynh (2011) [1], in the form of a finite-order polynomial that holds identical integral properties to the Dirac delta function when used in conjunction with a finite-order polynomial integrand over a finite domain. We show that the use of generic ADF polynomials can be effective at recovering and generalizing several high-order methods, including Taylor-based and nodal-based Discontinuous Galerkin methods, as well as the Correction Procedure via Reconstruction. Based on the ADF concept, we then proceed to formulate a Point-value enhanced Finite Volume (PFV) method, which stores and updates the cell-averaged values inside each element as well as the unknown quantities and, if needed, their derivatives on nodal points. The sharing of nodal information with surrounding elements saves the number of degrees of freedom compared to other compact methods at the same order. To ensure conservation, cell-averaged values are updated using an identical approach to that adopted in the finite volume method. Here, the updating of nodal values and their derivatives is achieved through an ADF concept that leverages all of the elements within the domain of integration that share the same nodal point. The resulting scheme is shown to be very stable at successively increasing orders. Both accuracy and stability of the PFV method are verified using a Fourier analysis and through applications to the linear wave and nonlinear Burgers' equations in one-dimensional space.
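The defining property of an ADF, reproducing the sifting action of the Dirac delta against polynomial integrands up to a fixed degree, can be checked by solving a small moment system on [-1, 1]. The degree, evaluation point, and test polynomial below are arbitrary choices for illustration:

```python
def moments(n):
    """M_i = integral of x^i over [-1, 1]."""
    return [2.0 / (i + 1) if i % 2 == 0 else 0.0 for i in range(n)]

def solve(A, b):
    """Gaussian elimination with partial pivoting for a square system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][j] * x[j] for j in range(k + 1, n))) / M[k][k]
    return x

def adf(x0, degree):
    """Coefficients a_j (low to high) of a polynomial p with
    integral of p(x) * x^k over [-1, 1] equal to x0**k for k <= degree."""
    mom = moments(2 * degree + 1)
    A = [[mom[k + j] for j in range(degree + 1)] for k in range(degree + 1)]
    return solve(A, [x0 ** k for k in range(degree + 1)])

a = adf(0.3, 3)
# Sifting check: integral of p * q recovers q(0.3) for q = 1 + 2x - x^2
b = [1.0, 2.0, -1.0, 0.0]
mom = moments(7)
val = sum(a[j] * b[i] * mom[i + j] for j in range(4) for i in range(4))
print(round(val, 6))  # q(0.3) = 1 + 0.6 - 0.09 = 1.51
```

By construction the integral of p times any polynomial of degree at most 3 equals that polynomial's value at x0, which is the integral property the PFV scheme leans on when updating nodal values.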
Integrated care: a comprehensive bibliometric analysis and literature review
Sun, Xiaowei; Tang, Wenxi; Ye, Ting; Zhang, Yan; Wen, Bo; Zhang, Liang
2014-01-01
Introduction Integrated care can not only remedy fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how ‘integrated care’ has developed. A systematic bibliometric analysis of the important features of the integrated care literature is therefore needed. Aim To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods We searched Medline/PubMed using the search strategy ‘(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]’ without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis by the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering by SPSS were used. Results As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) the documents were published in 1646 different journals, of which 28 were core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains including: the definition/models of integrated care, interdisciplinary patient care team, disease management for chronically ill patients, types of health care organizations and policy, information system integration and legislation/jurisprudence. Discussion and conclusion Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provides researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care. PMID:24987322
GROWING UP IS HARD TO DO: AN EMPIRICAL EVALUATION OF MATURATION AND DESISTANCE
Rocque, Michael; Posick, Chad; White, Helene R.
2016-01-01
Purpose With an increase in longitudinal datasets and analyses, scholars have made theoretical advances toward understanding desistance, using biological, social, and psychological factors. In an effort to integrate the theoretical views on desistance, some scholars have argued that each of these views represents a piece of adult maturation. Yet to date, research has not empirically examined an integrated perspective. The purpose of this study is to conduct an exploratory examination of various “domains” of maturation to determine whether they explain desistance from crime separately and as a whole. Methods Using the Rutgers Health and Human Development Project, a longitudinal study spanning ages 12–31, we develop exploratory measures of maturation in five domains: 1) adult social roles, 2) identity/cognitive, 3) psychosocial, 4) civic, and 5) neurocognitive. We then utilize growth curve models to examine the relationship between these domains and crime over time. Results Although each of the domains is associated with crime at the bivariate level, only three (i.e., psychosocial, identity/cognitive transformation, and adult social role) remain significant in the growth curve models (2 in within-individual analyses). In addition, a combined measure of maturation is related to crime, indicating that greater maturation through emerging adulthood has a negative effect on criminal behavior and is, therefore, a factor influencing desistance. Conclusions Maturation emerges as a promising approach to integrating the multiple theoretical views that characterize the literature on desistance from crime. Further research should develop additional domains and determine the best approach for measurement. PMID:28580234
V S, Unni; Mishra, Deepak; Subrahmanyam, G R K S
2016-12-01
The need for image fusion in current image processing systems is increasing mainly due to the increased number and variety of image acquisition techniques. Image fusion is the process of combining substantial information from several sensors using mathematical techniques in order to create a single composite image that will be more comprehensive and thus more useful for a human operator or other computer vision tasks. This paper presents a new approach to multifocus image fusion based on sparse signal representation. Block-based compressive sensing integrated with a projection-driven compressive sensing (CS) recovery that encourages sparsity in the wavelet domain is used as a method to get the focused image from a set of out-of-focus images. Compression is achieved during the image acquisition process using a block compressive sensing method. An adaptive thresholding technique within the smoothed projected Landweber recovery process reconstructs high-resolution focused images from low-dimensional CS measurements of out-of-focus images. Discrete wavelet transform and dual-tree complex wavelet transform are used as the sparsifying bases for the proposed fusion. The main finding lies in the fact that sparsification enables a better selection of the fusion coefficients and hence better fusion. A Laplacian mixture model is fitted in the wavelet domain, and estimation of the probability density function (pdf) parameters by expectation maximization leads to the proper selection of the coefficients of the fused image. Compared with the same fusion scheme without the projected Landweber (PL) step and with other existing CS-based fusion approaches, the proposed method achieves better fusion from fewer samples.
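The thresholding step inside such a recovery can be sketched in its simplest form: a one-level Haar transform followed by soft-thresholding of the detail coefficients. This is a simplified stand-in for the adaptive thresholding inside the smoothed projected Landweber recovery described above, not the paper's algorithm; all signals are illustrative:

```python
import numpy as np

def haar1(x):
    # one-level Haar transform: approximation and detail coefficients
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar1(a, d):
    # inverse one-level Haar transform (perfect reconstruction)
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(c, t):
    # soft-threshold operator: shrinks coefficients toward zero by t
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

# a piecewise-constant signal is sparse in the Haar detail band,
# so thresholding the details leaves it essentially untouched
x = np.array([4.0, 4.0, 1.0, 1.0, 8.0, 8.0, 2.0, 2.0])
a, d = haar1(x)
denoised = ihaar1(a, soft(d, 0.5))
```

For this signal the detail coefficients are already zero, so the thresholded reconstruction equals the input; on noisy inputs the same operator suppresses small, noise-dominated coefficients.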
Discretization analysis of bifurcation based nonlinear amplifiers
NASA Astrophysics Data System (ADS)
Feldkord, Sven; Reit, Marco; Mathis, Wolfgang
2017-09-01
Recently, for modeling biological amplification processes, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have been widely analyzed analytically. For technical realizations, digital systems have become the most relevant systems in signal processing applications. The underlying continuous-time systems are transferred to the discrete-time domain using numerical integration methods. Within this contribution, the effects of numerical integration methods on the qualitative behavior of Andronov-Hopf bifurcation based systems are analyzed. It is shown exemplarily that explicit Runge-Kutta methods transform the truncated normalform equation of the Andronov-Hopf bifurcation into the normalform equation of the Neimark-Sacker bifurcation. Depending on the order of the integration method, higher order terms are added during this transformation. A rescaled normalform equation of the Neimark-Sacker bifurcation is introduced that allows a parametric design of a discrete-time system which corresponds to the rescaled Andronov-Hopf system. This system approximates the characteristics of the rescaled Hopf-type amplifier for a large range of parameters. The natural frequency and the peak amplitude are preserved for every set of parameters. The Neimark-Sacker bifurcation based systems avoid the large computational effort that would be caused by applying higher order integration methods to the continuous-time normalform equations.
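The effect described above appears already with the simplest explicit scheme: forward-Euler discretization of the truncated Hopf normal form z' = (mu + i*omega)z - |z|^2 z yields a map whose attractor is an invariant circle (the Neimark-Sacker picture) with a step-size-dependent radius, approximately sqrt(mu + h*omega^2/2) instead of the continuous-time sqrt(mu). A sketch with illustrative parameters (not the paper's rescaled design):

```python
mu, omega, h = 0.04, 1.0, 0.01   # bifurcation parameter, frequency, step size

def euler_step(z):
    # forward Euler applied to the truncated Andronov-Hopf normal form
    return z + h * ((mu + 1j * omega) * z - abs(z)**2 * z)

z = 0.1 + 0.0j
for _ in range(200000):          # iterate long enough to settle on the circle
    z = euler_step(z)
r = abs(z)                       # radius of the invariant circle of the map
```

Here r converges to about 0.2121, noticeably above sqrt(mu) = 0.2, illustrating how the discretization shifts the amplitude of the bifurcated solution.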
Development and validation of a competency framework for veterinarians.
Bok, Harold G J; Jaarsma, Debbie A D C; Teunissen, Pim W; van der Vleuten, Cees P M; van Beukelen, Peter
2011-01-01
Changing demands from society and the veterinary profession call for veterinary medical curricula that can deliver veterinarians who are able to integrate specific and generic competencies in their professional practice. This requires educational innovation directed by an integrative veterinary competency framework to guide curriculum development. Given the paucity of relevant information from the veterinary literature, a qualitative multi-method study was conducted to develop and validate such a framework. A competency framework was developed based on the analysis of focus group interviews with 54 recently graduated veterinarians and clients and subsequently validated in a Delphi procedure with a panel of 29 experts, representing the full range and diversity of the veterinary profession. The study resulted in an integrated competency framework for veterinary professionals, which consists of 16 competencies organized in seven domains: veterinary expertise, communication, collaboration, entrepreneurship, health and welfare, scholarship, and personal development. Training veterinarians who are able to use and integrate the seven domains in their professional practice is an important challenge for today's veterinary medical schools. The Veterinary Professional (VetPro) framework provides a sound empirical basis for the ongoing debate about the direction of veterinary education and curriculum development.
NASA Astrophysics Data System (ADS)
Geng, Lin; Bi, Chuan-Xing; Xie, Feng; Zhang, Xiao-Zheng
2018-07-01
Interpolated time-domain equivalent source method is extended to reconstruct the instantaneous surface normal velocity of a vibrating structure by using the time-evolving particle velocity as the input, which provides a non-contact way to characterize the overall instantaneous vibration behavior of the structure. In this method, the time-evolving particle velocity in the near field is first modeled by a set of equivalent sources positioned inside the vibrating structure, and then the integrals of equivalent source strengths are solved by an iterative solving process and are further used to calculate the instantaneous surface normal velocity. An experiment of a semi-cylindrical steel plate impacted by a steel ball is investigated to examine the ability of the extended method, where the time-evolving normal particle velocity and pressure on the hologram surface measured by a Microflown pressure-velocity probe are used as the inputs of the extended method and the method based on pressure measurements, respectively, and the instantaneous surface normal velocity of the plate measured by a laser Doppler vibrometer is used as the reference for comparison. The experimental results demonstrate that the extended method is a powerful tool to visualize the instantaneous surface normal velocity of a vibrating structure in both time and space domains and can obtain more accurate results than the method based on pressure measurements.
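The core inverse step of an equivalent source method can be sketched in a much-simplified single-frequency form: fit the strengths of interior monopoles to field measurements by least squares, then use the fitted strengths to predict the field elsewhere. This is not the paper's time-domain iterative solver; the geometry, frequency, and free-space Green's function below are all illustrative assumptions:

```python
import numpy as np

k = 2 * np.pi * 500 / 343.0                    # wavenumber at 500 Hz in air

def G(r_field, r_src):
    # free-space monopole Green's function e^{ikr} / (4*pi*r)
    d = np.linalg.norm(r_field[:, None, :] - r_src[None, :, :], axis=2)
    return np.exp(1j * k * d) / (4 * np.pi * d)

rng = np.random.default_rng(0)
src = rng.uniform(-0.1, 0.1, (8, 3))           # equivalent sources inside the body
mic = rng.uniform(0.3, 0.5, (20, 3))           # hologram (measurement) points
chk = rng.uniform(0.6, 0.8, (10, 3))           # validation points

q_true = np.zeros(8, complex)
q_true[3] = 1.0                                # "true" field from one interior source
p_meas = G(mic, src) @ q_true                  # simulated hologram measurements

q, *_ = np.linalg.lstsq(G(mic, src), p_meas, rcond=None)  # fit source strengths
p_pred = G(chk, src) @ q                       # predict the field elsewhere
p_ref = G(chk, src) @ q_true
err = np.linalg.norm(p_pred - p_ref) / np.linalg.norm(p_ref)
```

Because the simulated field lies exactly in the span of the equivalent sources, the fit recovers the strengths and the predicted field matches the reference to numerical precision; real measurements would require regularization of the least-squares step.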
NASA Astrophysics Data System (ADS)
Li, Jing Xia; Xu, Hang; Liu, Li; Su, Peng Cheng; Zhang, Jian Guo
2015-05-01
We report a chaotic optical time-domain reflectometry for fiber fault location, where a chaotic probe signal is generated by driving a distributed feedback laser diode with an improved Colpitts chaotic oscillator. The results show that the unterminated fiber end, the loose connector, and the mismatch connector can be precisely located. A measurement range of approximately 91 km and a range-independent resolution of 6 cm are achieved. This implementation method is easy to integrate and is cost-effective, which gives it great potential for commercial applications.
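Fault location with a chaotic, noise-like probe rests on cross-correlating the probe with the backscattered return: the correlation peak gives the round-trip delay and hence the distance to the fault. A toy sketch with a synthetic white-noise stand-in for the chaotic waveform (sampling rate, reflection strength, and fiber velocity are illustrative, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1e9                               # sampling rate: 1 GS/s
v = 2e8                                # group velocity in fiber, roughly 2/3 c
probe = rng.standard_normal(100000)    # noise-like stand-in for the chaotic probe

true_delay = 12345                     # round-trip delay of the reflection, in samples
echo = np.zeros_like(probe)
echo[true_delay:] = 0.05 * probe[:-true_delay]   # weak reflection from the fault
echo += 0.01 * rng.standard_normal(echo.size)    # detector noise

# cross-correlate via FFT (zero-padded, so the result is a linear correlation);
# the peak lag is the round-trip delay
n = probe.size
X = np.fft.rfft(probe, 2 * n)
Y = np.fft.rfft(echo, 2 * n)
xc = np.fft.irfft(Y * X.conj(), 2 * n)
lag = int(xc[:n].argmax())
distance_m = lag / fs * v / 2          # one-way distance to the fault
```

The sharp, thumbtack-like autocorrelation of the noise-like probe is what makes the resolution independent of range, since the peak width depends on the probe bandwidth rather than on pulse round-trip broadening.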
Prediction of non-cavitation propeller noise in time domain
NASA Astrophysics Data System (ADS)
Ye, Jin-Ming; Xiong, Ying; Xiao, Chang-Run; Bi, Yi
2011-09-01
The blade frequency noise of a non-cavitation propeller in a uniform flow is analyzed in the time domain. The unsteady loading (dipole source) on the blade surface is calculated by a potential-based surface panel method. Then the time-dependent pressure data is used as the input for the Ffowcs Williams-Hawkings formulation to predict the acoustic pressure. The integration of the noise source is performed over the true blade surface rather than the no-thickness blade surface, and the effect of the hub can be considered. The noise characteristics of the non-cavitation propeller and the numerical discretization forms are discussed.
ben-Avraham, D; Fokas, A S
2001-07-01
A new transform method for solving boundary value problems for linear and integrable nonlinear partial differential equations recently introduced in the literature is used here to obtain the solution of the modified Helmholtz equation q_xx(x,y) + q_yy(x,y) − 4β²q(x,y) = 0 in the triangular domain 0 ≤ x ≤ L − y ≤ L, with mixed boundary conditions. This solution is applied to the problem of diffusion-limited coalescence, A + A ⇌ A, in the segment (−L/2, L/2), with traps at the edges.
NASA Astrophysics Data System (ADS)
Poursartip, B.
2015-12-01
Seismic hazard assessment to predict the behavior of infrastructures subjected to earthquake relies on ground motion numerical simulation because the analytical solution of seismic waves is limited to only a few simple geometries. Recent advances in numerical methods and computer architectures make it ever more practical to reliably and quickly obtain the near-surface response to seismic events. The key motivation stems from the need to assess the performance of sensitive components of the civil infrastructure (nuclear power plants, bridges, lifelines, etc), when subjected to realistic scenarios of seismic events. We discuss an integrated approach that deploys best-practice tools for simulating seismic events in arbitrarily heterogeneous formations, while also accounting for topography. Specifically, we describe an explicit forward wave solver based on a hybrid formulation that couples a single-field formulation for the computational domain with an unsplit mixed-field formulation for Perfectly-Matched-Layers (PMLs and/or M-PMLs) used to limit the computational domain. Due to the material heterogeneity and the contrasting discretization needs it imposes, an adaptive time solver is adopted. We use a Runge-Kutta-Fehlberg time-marching scheme that adjusts optimally the time step such that the local truncation error rests below a predefined tolerance. We use spectral elements for spatial discretization, and the Domain Reduction Method in accordance with the double couple method to allow for the efficient prescription of the input seismic motion. Of particular interest to this development is the study of the effects idealized topographic features have on the surface motion when compared against motion results that are based on a flat-surface assumption.
We discuss the components of the integrated approach we followed, and report the results of parametric studies in two and three dimensions, for various idealized topographic features, which show motion amplification that depends, as expected, on the relation between the topographic feature's characteristics and the dominant wavelength. Lastly, we report results involving three-dimensional simulations.
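The adaptive Runge-Kutta-Fehlberg idea above, comparing two embedded estimates of different order and adjusting the step so the local truncation error stays below a tolerance, can be sketched with the simplest embedded pair (Heun-Euler, orders 2(1)); RKF45 follows the same pattern with a higher-order tableau. All parameters are illustrative, not the authors' solver:

```python
import math

def heun_euler_adaptive(f, t0, y0, t_end, tol=1e-6):
    # embedded RK pair: Euler (order 1) embedded in Heun (order 2);
    # the difference of the two estimates drives the step-size control
    t, y, h = t0, y0, 1e-3
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_low = y + h * k1                 # first-order estimate
        y_high = y + h * (k1 + k2) / 2     # second-order estimate
        err = abs(y_high - y_low)          # local error estimate
        if err <= tol:                     # accept the step
            t, y = t + h, y_high
        # classical step-size update with safety factor 0.9, clamped growth
        h *= min(4.0, max(0.1, 0.9 * (tol / max(err, 1e-16)) ** 0.5))
    return y

# test problem y' = -y, y(0) = 1, integrated to t = 2
y_end = heun_euler_adaptive(lambda t, y: -y, 0.0, 1.0, 2.0)
```

Rejected steps simply shrink h and retry, so the tolerance bounds the error estimate on every accepted step, which is exactly the mechanism that lets the solver take large steps in smooth regions and small steps where the solution varies rapidly.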
NASA Technical Reports Server (NTRS)
Navon, I. M.
1984-01-01
A Lagrange multiplier method using techniques developed by Bertsekas (1982) was applied to solving the problem of enforcing simultaneous conservation of the nonlinear integral invariants of the shallow water equations on a limited area domain. This application of nonlinear constrained optimization is of the large dimensional type and the conjugate gradient method was found to be the only computationally viable method for the unconstrained minimization. Several conjugate-gradient codes were tested and compared for increasing accuracy requirements. Robustness and computational efficiency were the principal criteria.
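The multiplier (augmented Lagrangian) iteration of Bertsekas for enforcing an integral invariant can be sketched on a toy problem: find the field closest to an analysis x0 subject to conservation of a discrete "invariant", here the domain sum. The inner unconstrained minimization uses plain gradient descent as a stand-in for the conjugate gradient codes tested in the paper; all names and parameters are illustrative:

```python
import numpy as np

def project_invariant(x0, target, c=10.0, iters=50):
    # augmented Lagrangian for: min ||x - x0||^2  s.t.  g(x) = sum(x) - target = 0
    lam, x = 0.0, x0.copy()
    for _ in range(iters):
        # inner minimization of ||x - x0||^2 + lam*g(x) + (c/2)*g(x)^2
        # (gradient descent here; a conjugate-gradient code in practice)
        for _ in range(200):
            g = np.sum(x) - target
            grad = 2 * (x - x0) + (lam + c * g) * np.ones_like(x)
            x -= 0.01 * grad
        lam += c * (np.sum(x) - target)    # multiplier update
    return x

x0 = np.array([1.0, 2.0, 3.0])
x = project_invariant(x0, target=5.0)
```

For this quadratic problem the answer is known in closed form, x = x0 - (sum(x0) - target)/n, so the iteration can be checked directly; the same outer multiplier update applies unchanged when the inner solver is a conjugate-gradient code and the invariants are nonlinear.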
A numerical method for the dynamics of non-spherical cavitation bubbles
NASA Technical Reports Server (NTRS)
Lucca, G.; Prosperetti, A.
1982-01-01
A boundary integral numerical method for the dynamics of nonspherical cavitation bubbles in inviscid incompressible liquids is described. Only surface values of the velocity potential and its first derivatives are involved. The problem of solving the Laplace equation in the entire domain occupied by the liquid is thus avoided. The collapse of a bubble in the vicinity of a solid wall and the collapse of three bubbles with collinear centers are considered.
2.5-D frequency-domain viscoelastic wave modelling using finite-element method
NASA Astrophysics Data System (ADS)
Zhao, Jian-guo; Huang, Xing-xing; Liu, Wei-fang; Zhao, Wei-jun; Song, Jian-yong; Xiong, Bin; Wang, Shang-xu
2017-10-01
2-D seismic modelling has notable dynamic information discrepancies with field data because of the implicit line-source assumption, whereas 3-D modelling suffers from a huge computational burden. The 2.5-D approach is able to overcome both of the aforementioned limitations. In general, the earth model is treated as an elastic material, but real media are viscous. In this study, we develop an accurate and efficient frequency-domain finite-element method (FEM) for modelling 2.5-D viscoelastic wave propagation. To perform the 2.5-D approach, we assume that the 2-D viscoelastic media are based on the Kelvin-Voigt rheological model and a 3-D point source. The viscoelastic wave equation is temporally and spatially Fourier transformed into the frequency-wavenumber domain. Then, we systematically derive the weak form and its spatial discretization of 2.5-D viscoelastic wave equations in the frequency-wavenumber domain through the Galerkin weighted residual method for FEM. Fixing a frequency, the 2-D problem for each wavenumber is solved by FEM. Subsequently, a composite Simpson formula is adopted to estimate the inverse Fourier integration to obtain the 3-D wavefield. We implement the stiffness reduction method (SRM) to suppress artificial boundary reflections. The results show that this absorbing boundary condition is valid and efficient in the frequency-wavenumber domain. Finally, three numerical models, an unbounded homogeneous medium, a half-space layered medium and an undulating topography medium, are established. Numerical results validate the accuracy and stability of 2.5-D solutions and demonstrate the adaptability of the finite-element method to complicated geographic conditions. The proposed 2.5-D modelling strategy has the potential to address modelling studies on wave propagation in real earth media in an accurate and efficient way.
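The final step above, recovering the 3-D field by a composite Simpson estimate of the inverse Fourier integral over wavenumber, can be checked on a transform pair with a known answer. A sketch using a Gaussian pair (the spectrum, truncation limit, and sampling below are illustrative, not the paper's wavefields):

```python
import numpy as np

def simpson(f_vals, h):
    # composite Simpson rule on an odd number of equally spaced samples
    return h / 3 * (f_vals[0] + f_vals[-1]
                    + 4 * f_vals[1:-1:2].sum() + 2 * f_vals[2:-2:2].sum())

# Gaussian pair: F(k) = sqrt(2*pi) * exp(-k^2/2)  <->  f(y) = exp(-y^2/2);
# the spectrum is even, so f(y) = (1/pi) * Integral_0^inf F(k) cos(k y) dk
k = np.linspace(0.0, 10.0, 2001)           # truncated wavenumber axis
h = k[1] - k[0]
F = np.sqrt(2 * np.pi) * np.exp(-k**2 / 2)

y = 0.7
f_y = simpson(F * np.cos(k * y), h) / np.pi
```

With this sampling the rule reproduces exp(-y^2/2) to roughly ten digits; in the 2.5-D setting the same quadrature is applied to the per-wavenumber FEM solutions, so the truncation limit and sampling of the wavenumber axis control the accuracy of the reconstructed 3-D wavefield.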
Global boundary flattening transforms for acoustic propagation under rough sea surfaces.
Oba, Roger M
2010-07-01
This paper introduces a conformal transform of an acoustic domain under a one-dimensional, rough sea surface onto a domain with a flat top. This non-perturbative transform can include many hundreds of wavelengths of the surface variation. The resulting two-dimensional, flat-topped domain allows direct application of any existing, acoustic propagation model of the Helmholtz or wave equation using transformed sound speeds. Such a transform-model combination applies where the surface particle velocity is much slower than sound speed, such that the boundary motion can be neglected. Once the acoustic field is computed, the bijective (one-to-one and onto) mapping permits the field interpolation in terms of the original coordinates. The Bergstrom method for inverse Riemann maps determines the transform by iterated solution of an integral equation for a surface matching term. Rough sea surface forward scatter test cases provide verification of the method using a particular parabolic equation model of the Helmholtz equation.
Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains
NASA Astrophysics Data System (ADS)
Koulouri, Alexandra; Brookes, Mike; Rimpiläinen, Ville
2017-01-01
In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
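A longitudinal measurement, the integral of the field's tangential projection along a line, is easy to approximate by quadrature; for a conservative field F = grad(phi) it reduces to the potential difference between the endpoints, which gives a convenient correctness check. A sketch (the field and geometry are illustrative, not the EEG setting of the paper):

```python
import numpy as np

def longitudinal_integral(F, p0, p1, n=2001):
    # integral over the segment p0 -> p1 of F . t ds  (t = unit tangent),
    # approximated with the composite trapezoidal rule
    t = np.linspace(0.0, 1.0, n)
    pts = p0[None, :] + t[:, None] * (p1 - p0)[None, :]
    tangent = (p1 - p0) / np.linalg.norm(p1 - p0)
    vals = np.array([F(p) @ tangent for p in pts])
    ds = np.linalg.norm(p1 - p0) / (n - 1)
    return ds * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

phi = lambda p: p[0]**2 * p[1] + np.sin(p[1])                 # scalar potential
gradphi = lambda p: np.array([2 * p[0] * p[1], p[0]**2 + np.cos(p[1])])

p0, p1 = np.array([0.0, 0.0]), np.array([1.0, 2.0])
m = longitudinal_integral(gradphi, p0, p1)   # should equal phi(p1) - phi(p0)
```

For a general field with non-zero divergence the longitudinal integrals no longer collapse to endpoint differences, which is exactly why the reconstruction needs the additional constraints described in the abstract.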
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fokicheva, V V
2015-10-31
A new class of integrable billiard systems, called generalized billiards, is discovered. These are billiards in domains formed by gluing classical billiard domains along pieces of their boundaries. (A classical billiard domain is a part of the plane bounded by arcs of confocal quadrics.) On the basis of the Fomenko-Zieschang theory of invariants of integrable systems, a full topological classification of generalized billiards is obtained, up to Liouville equivalence. Bibliography: 18 titles.
Cholesterol Bilayer Domains in the Eye Lens Health: A Review.
Widomska, Justyna; Subczynski, Witold K; Mainali, Laxman; Raguz, Marija
2017-12-01
The most unique biochemical characteristic of the eye lens fiber cell plasma membrane is its extremely high cholesterol content, the need for which is still unclear. It is evident, however, that the disturbance of Chol homeostasis may result in damage associated with cataracts. Electron paramagnetic resonance methods allow discrimination of two types of lipid domains in model membranes overloaded with Chol, namely, phospholipid-cholesterol domains and pure Chol bilayer domains. These domains are also detected in human lens lipid membranes prepared from the total lipids extracted from lens cortices and nuclei of donors from different age groups. Independent of the age-related changes in phospholipid composition, the physical properties of phospholipid-Chol domains remain the same for all age groups and are practically identical for cortical and nuclear membranes. The presence of Chol bilayer domains in these membranes provides a buffering capacity for cholesterol concentration in the surrounding phospholipid-Chol domains, keeping it at a constant saturating level and thus keeping the physical properties of the membrane consistent with and independent of changes in phospholipid composition. It seems that the presence of Chol bilayer domains plays an integral role in the regulation of cholesterol-dependent processes in fiber cell plasma membranes and in the maintenance of fiber cell membrane homeostasis.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
NASA Astrophysics Data System (ADS)
Markic, Silvija; Eilks, Ingo
2012-03-01
The study presented in this paper integrates data from four combined research studies, which are both qualitative and quantitative in nature. The studies describe freshman science student teachers' beliefs about teaching and learning. These freshmen intend to become teachers in Germany in one of four science teaching domains (secondary biology, chemistry, and physics, respectively, as well as primary school science). The qualitative data from the first study are based on student teachers' drawings of themselves in teaching situations. These drawings were analysed using Grounded Theory, yielding three scales: Beliefs about Classroom Organisation, Beliefs about Teaching Objectives, and Epistemological Beliefs. Three further quantitative studies give insight into student teachers' curricular beliefs, their beliefs about the nature of science itself, and about the student- and/or teacher-centredness of science teaching. This paper describes a design to integrate all these data within a mixed methods framework. The aim of the current study is to describe a broad, triangulated picture of freshman science student teachers' beliefs about teaching and learning within their respective science teaching domain. The study reveals clear tendencies between the sub-groups. The results suggest that freshman chemistry and, even more pronouncedly, freshman physics student teachers profess quite traditional beliefs about science teaching and learning. Biology and primary school student teachers express beliefs about their subjects which are more in line with modern educational theory. The mixed methods approach towards the student teachers' beliefs is reflected upon and implications for science education and science teacher education are discussed.
Microwave dielectric properties of BNT-BT0.08 thin films prepared by sol-gel technique
NASA Astrophysics Data System (ADS)
Huitema, L.; Cernea, M.; Crunteanu, A.; Trupina, L.; Nedelcu, L.; Banciu, M. G.; Ghalem, A.; Rammal, M.; Madrangeas, V.; Passerieux, D.; Dutheil, P.; Dumas-Bouchiat, F.; Marchet, P.; Champeaux, C.
2016-04-01
We report for the first time the microwave characterization of 0.92(Bi0.5Na0.5)TiO3-0.08BaTiO3 (BNT-BT0.08) ferroelectric thin films fabricated by the sol-gel method and integrated in both planar and out-of-plane tunable capacitors for agile high-frequency applications, particularly on the WiFi frequency band from 2.4 GHz to 2.49 GHz. The permittivity and loss tangent of the realized BNT-BT0.08 layers were first measured by a resonant cavity method working at 12.5 GHz. Then, we integrated the ferroelectric material in planar inter-digitated capacitors (IDC) and in out-of-plane metal-insulator-metal (MIM) devices and investigated their specific properties (dielectric tunability and losses) over the whole 100 MHz-15 GHz frequency domain. The 3D finite-element electromagnetic simulations of the IDC capacitances fit their measured responses very well and confirm the dielectric properties determined with the cavity method. While the IDCs do not exhibit an optimal tunability, the MIM capacitor devices with optimized Ir/MgO(100) bottom electrodes demonstrate a high dielectric tunability of 30% at 2.45 GHz under applied voltages as low as 10 V, reaching 50% under a 20 V bias at the same frequency. These high-frequency properties of the MIM devices integrating the BNT-BT0.08 films, combining high tunability with low applied voltages, indicate a wide integration potential for tunable devices in the microwave domain, particularly at 2.45 GHz, corresponding to the widely used industrial, scientific, and medical frequency band.
Rodriguez, Blanca; Carusi, Annamaria; Abi-Gerges, Najah; Ariga, Rina; Britton, Oliver; Bub, Gil; Bueno-Orovio, Alfonso; Burton, Rebecca A B; Carapella, Valentina; Cardone-Noott, Louie; Daniels, Matthew J; Davies, Mark R; Dutta, Sara; Ghetti, Andre; Grau, Vicente; Harmer, Stephen; Kopljar, Ivan; Lambiase, Pier; Lu, Hua Rong; Lyon, Aurore; Minchole, Ana; Muszkiewicz, Anna; Oster, Julien; Paci, Michelangelo; Passini, Elisa; Severi, Stefano; Taggart, Peter; Tinker, Andy; Valentin, Jean-Pierre; Varro, Andras; Wallman, Mikael; Zhou, Xin
2016-09-01
Both biomedical research and clinical practice rely on complex datasets for the physiological and genetic characterization of human hearts in health and disease. Given the complexity and variety of approaches and recordings, there is now growing recognition of the need to embed computational methods in cardiovascular medicine and science for analysis, integration and prediction. This paper describes a Workshop on Computational Cardiovascular Science that created an international, interdisciplinary and inter-sectorial forum to define the next steps for a human-based approach to disease supported by computational methodologies. The main ideas highlighted were (i) a shift towards human-based methodologies, spurred by advances in new in silico, in vivo, in vitro, and ex vivo techniques and the increasing acknowledgement of the limitations of animal models. (ii) Computational approaches complement, expand, bridge, and integrate in vitro, in vivo, and ex vivo experimental and clinical data and methods, and as such they are an integral part of human-based methodologies in pharmacology and medicine. (iii) The effective implementation of multi- and interdisciplinary approaches, teams, and training combining and integrating computational methods with experimental and clinical approaches across academia, industry, and healthcare settings is a priority. (iv) The human-based cross-disciplinary approach requires experts in specific methodologies and domains, who also have the capacity to communicate and collaborate across disciplines and cross-sector environments. (v) This new translational domain for human-based cardiology and pharmacology requires new partnerships supported financially and institutionally across sectors. Institutional, organizational, and social barriers must be identified, understood and overcome in each specific setting. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.
Integrity and security in an Ada runtime environment
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security. The input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of such safe subsets include those used in the Army Secure Operating System and the Penelope Ada verification tool. A conservative attitude is recommended when writing Ada code for life- and property-critical systems.
Canards and black swans in a model of a 3-D autocatalator
NASA Astrophysics Data System (ADS)
Shchepakina, E.
2005-01-01
The mathematical model of a 3-D autocatalator is studied using the geometric theory of singular perturbations, namely, the black swan and canard techniques. Critical regimes are modeled by canards (one-dimensional stable-unstable slow integral manifolds). The meaning of criticality here is as follows. The critical regime corresponds to a chemical reaction which separates the domain of self-accelerating reactions from the domain of slow reactions. A two-dimensional stable-unstable slow integral manifold (black swan) consisting entirely of canards, which simulate the critical phenomena for different initial data of the dynamical system, is constructed. It is shown that this procedure leads to the phenomenon of auto-oscillations in the chemical system. The geometric approach combined with asymptotic and numerical methods permits us to explain the strong parametric sensitivity and to obtain asymptotic representations of the critical behavior of the chemical system.
Computer vision for general purpose visual inspection: a fuzzy logic approach
NASA Astrophysics Data System (ADS)
Chen, Y. H.
Computer vision systems are widely used in automatic industrial visual inspection. Such systems are often application specific and therefore require domain knowledge for a successful implementation. Since visual inspection can be viewed as a decision-making process, it is argued that integrating fuzzy logic analysis with computer vision provides a practical approach to general-purpose visual inspection applications. This paper describes the development of an integrated fuzzy-rule-based automatic visual inspection system. Domain knowledge about a particular application is represented as a set of fuzzy rules. From the status of predefined fuzzy variables, the set of fuzzy rules is defuzzified to give the inspection results. A practical application, the inspection of IC marks (often English characters and a company logo), is demonstrated; it shows more consistent results than a conventional thresholding method.
Vector intensity reconstruction using the data completion method.
Langrenne, Christophe; Garcia, Alexandre
2013-04-01
This paper presents an application of the data completion method (DCM) for vector intensity reconstructions. A mobile array of 36 pressure-pressure probes (72 microphones) is used to perform measurements near a planar surface. Nevertheless, since the proposed method is based on integral formulations, DCM can be applied with any kind of geometry. This method requires the knowledge of Cauchy data (pressure and velocity) on a part of the boundary of an empty domain in order to evaluate pressure and velocity on the remaining part of the boundary. Intensity vectors are calculated in the interior domain surrounded by the measurement array. This inverse acoustic problem requires the use of a regularization method to obtain a realistic solution. An experiment in a closed wooden car trunk mock-up excited by a shaker and two loudspeakers is presented. In this case, where the volume of the mock-up is small (0.61 m³), standing waves and fluid-structure interactions appear and show that DCM is a powerful tool to identify sources in a confined space.
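The regularization step is central to the inverse problem above. As a hedged illustration (the abstract does not specify which regularization scheme the authors use; the Hilbert test matrix, noise level, and λ below are invented for the demo), Tikhonov regularization via the SVD looks like this:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    # Solve min_x ||A x - b||^2 + lam^2 ||x||^2 using the SVD;
    # the filter factors damp the small singular values that
    # would otherwise amplify measurement noise.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    f = s / (s**2 + lam**2)
    return Vt.T @ (f * (U.T @ b))

# Ill-conditioned toy "transfer" matrix (a Hilbert matrix) standing in
# for the acoustic propagation operator -- an assumption for the demo.
n = 8
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true + 1e-6 * np.random.default_rng(0).standard_normal(n)

x_naive = np.linalg.solve(A, b)         # noise blown up: cond(A) ~ 1e10
x_reg = tikhonov_solve(A, b, lam=1e-4)  # stays close to x_true
```

The parameter λ trades data fidelity against noise amplification; in practice it is often chosen by an L-curve or cross-validation criterion.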
Fast time- and frequency-domain finite-element methods for electromagnetic analysis
NASA Astrophysics Data System (ADS)
Lee, Woochan
Fast electromagnetic analysis in the time and frequency domains is of critical importance to the design of integrated circuits (IC) and other advanced engineering products and systems. Many IC structures constitute a very large scale problem in modeling and simulation, the size of which also continuously grows with the advancement of the processing technology. This results in numerical problems beyond the reach of even the most powerful existing computational resources. Different from many other engineering problems, the structure of most ICs is special in the sense that its geometry is of Manhattan type and its dielectrics are layered. Hence, it is important to develop structure-aware algorithms that take advantage of these structural specialties to speed up the computation. In addition, among existing time-domain methods, explicit methods can avoid solving a matrix equation. However, their time step is traditionally restricted by the space step to ensure the stability of a time-domain simulation. Therefore, making explicit time-domain methods unconditionally stable is important to accelerate the computation. Beyond time-domain methods, frequency-domain methods have suffered from an indefinite system that makes an iterative solution slow to converge. The first contribution of this work is a fast time-domain finite-element algorithm for the analysis and design of very large-scale on-chip circuits. The structure specialty of on-chip circuits such as Manhattan geometry and layered permittivity is preserved in the proposed algorithm. As a result, the large-scale matrix solution encountered in the 3-D circuit analysis is turned into a simple scaling of the solution of a small 1-D matrix, which can be obtained in linear (optimal) complexity with negligible cost. Furthermore, the time step size is not sacrificed, and the total number of time steps to be simulated is also significantly reduced, thus achieving a total cost reduction in CPU time. 
The second contribution is a new method for making an explicit time-domain finite-element method (TDFEM) unconditionally stable for general electromagnetic analysis. In this method, for a given time step, we find the unstable modes that are the root cause of instability, and deduct them directly from the system matrix resulting from a TDFEM-based analysis. As a result, an explicit TDFEM simulation is made stable for an arbitrarily large time step irrespective of the space step. The third contribution is a new method for full-wave applications from low to very high frequencies in a TDFEM based on matrix exponential. In this method, we directly deduct the eigenmodes having large eigenvalues from the system matrix, thus achieving a significantly increased time step in the matrix-exponential-based TDFEM. The fourth contribution is a new method for transforming the indefinite system matrix of a frequency-domain FEM to a symmetric positive definite one. We deduct the non-positive definite component directly from the system matrix resulting from a frequency-domain FEM-based analysis. The resulting new representation of the finite-element operator ensures that an iterative solution converges in a small number of iterations. We then add back the non-positive definite component to synthesize the original solution with negligible cost.
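The "deduct the unstable modes" idea in the second contribution can be illustrated on a toy problem. The sketch below is assumption-laden: a 1-D scaled Laplacian replaces the TDFEM system matrix, leapfrog (central differences) replaces the actual update, and the deflation uses a dense eigendecomposition for clarity. Removing the eigenmodes that violate the explicit stability bound dt²·λ ≤ 4 keeps the simulation bounded at a time step beyond the CFL limit:

```python
import numpy as np

def deflate_unstable_modes(S, dt, bound=4.0):
    # Remove the eigenmodes of the symmetric system matrix S whose
    # eigenvalues violate the explicit (leapfrog) stability criterion
    # dt^2 * lambda <= bound, leaving a matrix that can be stepped
    # explicitly at this dt. At FEM scale only the few offending
    # eigenpairs would be computed, not the full spectrum.
    w, V = np.linalg.eigh(S)
    keep = w < bound / dt**2
    return (V[:, keep] * w[keep]) @ V[:, keep].T

# 1-D scaled Laplacian as a stand-in for the FEM system matrix (assumed).
n = 50
S = ((n + 1) ** 2) * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
dt = 0.05                       # ~2.5x beyond the CFL limit of S
S_stable = deflate_unstable_modes(S, dt)

rng = np.random.default_rng(0)
u0 = rng.standard_normal(n)

u, u_prev = u0.copy(), u0.copy()    # leapfrog on the deflated matrix
for _ in range(2000):
    u, u_prev = 2 * u - u_prev - dt**2 * (S_stable @ u), u

v, v_prev = u0.copy(), u0.copy()    # same scheme on the raw matrix diverges
for _ in range(60):
    v, v_prev = 2 * v - v_prev - dt**2 * (S @ v), v
```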
NASA Astrophysics Data System (ADS)
Rahmouni, Lyes; Mitharwal, Rajendra; Andriulli, Francesco P.
2017-11-01
This work presents two new volume integral equations for the Electroencephalography (EEG) forward problem which, unlike standard integral approaches in this domain, can handle heterogeneities and anisotropies of the head/brain conductivity profiles. The new formulations translate to the quasi-static regime some volume integral equation strategies that have been successfully applied to high-frequency electromagnetic scattering problems. This has been obtained by extending, to the volume case, the two classical surface integral formulations used in EEG imaging and by introducing an extra surface equation, in addition to the volume ones, to properly handle boundary conditions. Numerical results corroborate the theoretical treatment, showing the competitiveness of our new schemes over existing techniques and qualifying them as a valid alternative to differential equation based methods.
NASA Technical Reports Server (NTRS)
Park, Han G.; Cannon, Howard; Bajwa, Anupa; Mackey, Ryan; James, Mark; Maul, William
2004-01-01
This paper describes the initial integration of a hybrid reasoning system utilizing a continuous domain feature-based detector, Beacon-based Exceptions Analysis for Multimissions (BEAM), and a discrete domain model-based reasoner, Livingstone.
CONNJUR Workflow Builder: A software integration environment for spectral reconstruction
Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.
2015-01-01
CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803
DOE R&D Accomplishments Database
Chandonia, John-Marc; Hon, Gary; Walker, Nigel S.; Lo Conte, Loredana; Koehl, Patrice; Levitt, Michael; Brenner, Steven E.
2003-09-15
The ASTRAL compendium provides several databases and tools to aid in the analysis of protein structures, particularly through the use of their sequences. Partially derived from the SCOP database of protein structure domains, it includes sequences for each domain and other resources useful for studying these sequences and domain structures. The current release of ASTRAL contains 54,745 domains, more than three times as many as the initial release four years ago. ASTRAL has undergone major transformations in the past two years. In addition to several complete updates each year, ASTRAL is now updated on a weekly basis with preliminary classifications of domains from newly released PDB structures. These classifications are available as a stand-alone database and are also integrated into other ASTRAL databases such as representative subsets. To enhance the utility of ASTRAL to structural biologists, all SCOP domains are now made available as PDB-style coordinate files as well as sequences. In addition to sequences and representative subsets based on SCOP domains, sequences and subsets based on PDB chains are newly included in ASTRAL. Several search tools have been added to ASTRAL to facilitate retrieval of data by individual users and automated methods.
Integral approximations to classical diffusion and smoothed particle hydrodynamics
Du, Qiang; Lehoucq, R. B.; Tartakovsky, A. M.
2014-12-31
The contribution of the paper is the approximation of a classical diffusion operator by an integral equation with a volume constraint. A particular focus is on classical diffusion problems associated with Neumann boundary conditions. By exploiting this approximation, we can also approximate other quantities such as the flux out of a domain. Our analysis of the model equation on the continuum level is closely related to the recent work on nonlocal diffusion and peridynamic mechanics. In particular, we elucidate the role of a volumetric constraint as an approximation to a classical Neumann boundary condition in the presence of a physical boundary. The volume-constrained integral equation then provides the basis for accurate and robust discretization methods. As a result, an immediate application is to the understanding and improvement of the Smoothed Particle Hydrodynamics (SPH) method.
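The continuum statement above can be checked numerically in one dimension. Below is a minimal sketch (the compactly supported constant kernel and its 3/δ³ scaling are one common choice, not necessarily the paper's, and the volume-constraint boundary treatment is skipped by evaluating only interior points):

```python
import numpy as np

def nonlocal_diffusion(u, h, m):
    # Discrete nonlocal diffusion operator with horizon delta = m*h:
    #   L u(x_i) = sum_{0 < |j| <= m} w_j * (u_{i+j} - u_i) * gamma * h,
    # with trapezoidal weights w_j (1/2 at the horizon ends) and kernel
    # gamma = 3/delta^3, scaled so that L -> d^2u/dx^2 as delta -> 0.
    delta = m * h
    gamma = 3.0 / delta**3
    n = len(u)
    Lu = np.zeros(n - 2 * m)            # interior points only
    for j in range(-m, m + 1):
        if j == 0:
            continue
        w = 0.5 if abs(j) == m else 1.0
        Lu += w * (u[m + j : n - m + j] - u[m : n - m]) * gamma * h
    return Lu

# Check on u = x^2, whose classical Laplacian is exactly 2.
h, m = 0.01, 10                          # horizon delta = 0.1
x = np.arange(0.0, 1.0 + h / 2, h)
Lu = nonlocal_diffusion(x**2, h, m)      # ~2 everywhere in the interior
```

For this quadratic the discrete operator returns 2 + 1/m², so refining the quadrature inside a fixed horizon recovers the classical value.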
Lagrangian Particle Tracking Simulation for Warm-Rain Processes in Quasi-One-Dimensional Domain
NASA Astrophysics Data System (ADS)
Kunishima, Y.; Onishi, R.
2017-12-01
Conventional cloud simulations are based on the Euler method and compute each microphysics process in a stochastic way, assuming infinite numbers of particles within each numerical grid cell. They therefore cannot provide the Lagrangian statistics of individual particles in cloud microphysics (i.e., aerosol particles, cloud particles, and rain drops), nor can they capture the statistical fluctuations due to the finite number of particles. Here we simulate the entire warm-rain precipitation process while tracking individual particles. We use the Lagrangian Cloud Simulator (LCS), which is based on the Euler-Lagrangian framework. In that framework, flow motion and scalar transport are computed with the Euler method, and particle motion with the Lagrangian one. The LCS tracks particle motions and collision events individually, accounting for the hydrodynamic interaction between approaching particles with a superposition method; that is, it can directly represent the collisional growth of cloud particles. Taking account of the hydrodynamic interaction is essential for trustworthy collision detection. In this study, we newly developed a stochastic model based on Twomey cloud condensation nuclei (CCN) activation for the Lagrangian tracking simulation and integrated it into the LCS. Coupled with the Euler computation of the water vapour and temperature fields, the initiation and condensational growth of water droplets were computed in the Lagrangian way. We applied the integrated LCS to a kinematic simulation of warm-rain processes in a vertically elongated domain of, at largest, 0.03 × 0.03 × 3000 m³ with horizontal periodicity. Aerosol particles with a realistic number density, 5×10⁷ m⁻³, were evenly distributed over the domain at the initial state. A prescribed updraft at the early stage initiated the development of a precipitating cloud. 
We have confirmed that the obtained bulk statistics agree fairly well with those from a conventional spectral-bin scheme for a vertical column domain. The centre of the discussion will be the Lagrangian statistics, which are collected from the individual behaviour of the tracked particles.
A Discussion on the Substitution Method for Trigonometric Rational Functions
ERIC Educational Resources Information Center
Ponce-Campuzano, Juan Carlos; Rivera-Figueroa, Antonio
2011-01-01
It is common to see, in the books on calculus, primitives of functions (some authors use the word "antiderivative" instead of primitive). However, the majority of authors pay scant attention to the domains over which the primitives are valid, which could lead to errors in the evaluation of definite integrals. In the teaching of calculus, in…
ERIC Educational Resources Information Center
Nzeadibe, Augustina Chinyere; Uchem, Rose Nkechi; Nzeadibe, Thaddeus Chidi
2018-01-01
This study utilized qualitative methods and the urban political ecology (UPE) framework to situate changes in scope and content of undergraduate geography curriculum in Nigeria within the domain of education for sustainability. It was stimulated by significant curriculum-related events in the geography department of the University of Nigeria, and…
Detection of Road Surface States from Tire Noise Using Neural Network Analysis
NASA Astrophysics Data System (ADS)
Kongrattanaprasert, Wuttiwat; Nomura, Hideyuki; Kamakura, Tomoo; Ueda, Koji
This report proposes a new processing method for automatically detecting the state of road surfaces from the tire noise of passing vehicles. In addition to multiple indicators of the signal features in the frequency domain, we propose a few feature indicators in the time domain to successfully classify the road states into four categories: snowy, slushy, wet, and dry. The method is based on artificial neural networks. The proposed classification is carried out in multiple neural networks using learning vector quantization. The outcomes of the networks are then integrated by a voting decision-making scheme. Experimental results obtained from signals recorded over ten days in the snowy season demonstrate that an accuracy of approximately 90% can be attained in predicting road surface states using only tire noise data.
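The final voting step can be sketched generically (the network outputs below are invented for illustration; the paper's LVQ networks and acoustic features are not reproduced):

```python
from collections import Counter

ROAD_STATES = ("snowy", "slushy", "wet", "dry")

def vote(network_labels):
    # Majority vote over the per-network classifications. Ties are
    # broken by the order the labels were cast (Counter preserves
    # insertion order), one simple convention among several.
    return Counter(network_labels).most_common(1)[0][0]

# Three hypothetical LVQ networks disagree on one road segment:
decision = vote(["wet", "dry", "wet"])
```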
The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.
Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L
2017-06-01
To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NHs in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least-squares means and Tukey's method were used for multiple comparisons. The methods yielded 815/1,799 surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.
NASA Technical Reports Server (NTRS)
Rummel, R.
1975-01-01
Integral formulas in the parameter domain are used instead of a representation by spherical harmonics. The neglected regions will cause a truncation error. The application of the discrete form of the integral equations connecting the satellite observations with surface gravity anomalies is discussed in comparison with the least squares prediction method. One critical point of downward continuation is the proper choice of the boundary surface. Practical feasibilities are in conflict with theoretical considerations. The properties of different approaches for this question are analyzed.
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
The Crank Nicolson Time Integrator for EMPHASIS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGregor, Duncan Alisdair Odum; Love, Edward; Kramer, Richard Michael Jack
2018-03-01
We investigate the use of implicit time integrators for finite element time domain approximations of Maxwell's equations in vacuum. We discretize Maxwell's equations in time using Crank-Nicolson and in 3D space using compatible finite elements. We solve the system by taking a single step of Newton's method and inverting the Eddy-Current Schur complement allowing for the use of standard preconditioning techniques. This approach also generalizes to more complex material models that can include the Unsplit PML. We present verification results and demonstrate performance at CFL numbers up to 1000.
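For reference, one Crank-Nicolson step for a linear semi-discrete system du/dt = A u has the form below. This is a dense-matrix toy with an assumed 1-D diffusion operator standing in for the field equations; EMPHASIS instead uses compatible finite elements for Maxwell's equations in 3-D and inverts an Eddy-Current Schur complement:

```python
import numpy as np

def crank_nicolson_step(A, u, dt):
    # One Crank-Nicolson step for du/dt = A u:
    #   (I - dt/2 A) u_new = (I + dt/2 A) u_old
    # Second-order accurate in time and unconditionally stable
    # when A is (symmetric) negative definite.
    I = np.eye(len(u))
    return np.linalg.solve(I - 0.5 * dt * A, (I + 0.5 * dt * A) @ u)

# Stiff 1-D diffusion stand-in for the semi-discrete system (assumed).
n = 30
A = -((n + 1) ** 2) * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
u0 = np.random.default_rng(1).standard_normal(n)

dt = 1.0                     # far above the explicit stability limit
u = u0.copy()
for _ in range(50):
    u = crank_nicolson_step(A, u, dt)

v = u0.copy()                # forward Euler at the same dt blows up
for _ in range(10):
    v = v + dt * (A @ v)
```

The price of unconditional stability is one linear solve per step, which is why the report pairs the integrator with a Schur-complement solve and standard preconditioning.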
Hahn, Paul; Migacz, Justin; O'Donnell, Rachelle; Day, Shelley; Lee, Annie; Lin, Phoebe; Vann, Robin; Kuo, Anthony; Fekrat, Sharon; Mruthyunjaya, Prithvi; Postel, Eric A; Izatt, Joseph A; Toth, Cynthia A
2013-01-01
The authors have recently developed a high-resolution microscope-integrated spectral domain optical coherence tomography (MIOCT) device designed to enable OCT acquisition simultaneous with surgical maneuvers. The purpose of this report is to describe translation of this device from preclinical testing into human intraoperative imaging. Before human imaging, surgical conditions were fully simulated for extensive preclinical MIOCT evaluation in a custom model eye system. Microscope-integrated spectral domain OCT images were then acquired in normal human volunteers and during vitreoretinal surgery in patients who consented to participate in a prospective institutional review board-approved study. Microscope-integrated spectral domain OCT images were obtained before and at pauses in surgical maneuvers and were compared based on predetermined diagnostic criteria to images obtained with a high-resolution spectral domain research handheld OCT system (HHOCT; Bioptigen, Inc) at the same time point. Cohorts of five consecutive patients were imaged. Successful end points were predefined, including ≥80% correlation in identification of pathology between MIOCT and HHOCT in ≥80% of the patients. Microscope-integrated spectral domain OCT was favorably evaluated by study surgeons and scrub nurses, all of whom responded that they would consider participating in human intraoperative imaging trials. The preclinical evaluation identified significant improvements that were made before MIOCT use during human surgery. The MIOCT transition into clinical human research was smooth. Microscope-integrated spectral domain OCT imaging in normal human volunteers demonstrated high resolution comparable to tabletop scanners. In the operating room, after an initial learning curve, surgeons successfully acquired human macular MIOCT images before and after surgical maneuvers. 
Microscope-integrated spectral domain OCT imaging confirmed preoperative diagnoses, such as full-thickness macular hole and vitreomacular traction, and demonstrated postsurgical changes in retinal morphology. Two cohorts of five patients were imaged. In the second cohort, the predefined end points were exceeded with ≥80% correlation between microscope-mounted OCT and HHOCT imaging in 100% of the patients. This report describes high-resolution MIOCT imaging using the prototype device in human eyes during vitreoretinal surgery, with successful achievement of predefined end points for imaging. Further refinements and investigations will be directed toward fully integrating MIOCT with vitreoretinal and other ocular surgery to image surgical maneuvers in real time.
Eshkuvatov, Z K; Zulkarnain, F S; Nik Long, N M A; Muminov, Z
2016-01-01
A modified homotopy perturbation method (HPM) was used to solve hypersingular integral equations (HSIEs) of the first kind on the interval [-1,1], with the assumption that the kernel of the hypersingular integral is constant on the diagonal of the domain. Existence of the inverse of the hypersingular integral operator leads to the convergence of HPM in certain cases. The modified HPM and its norm convergence are obtained in Hilbert space. Comparisons between the modified HPM, the standard HPM, the Bernstein polynomial approach of Mandal and Bhattacharya (Appl Math Comput 190:1707-1716, 2007), the Chebyshev expansion method of Mahiub et al. (Int J Pure Appl Math 69(3):265-274, 2011), and the reproducing kernel method of Chen and Zhou (Appl Math Lett 24:636-641, 2011) are made by solving five examples. Theoretical and practical examples revealed that the modified HPM dominates the standard HPM and the others. Finally, it is found that the modified HPM is exact if the solution of the problem is a product of weights and polynomial functions. For rational solutions, the absolute error decreases very fast as the number of collocation points increases.
Integrating art into science education: a survey of science teachers' practices
NASA Astrophysics Data System (ADS)
Turkka, Jaakko; Haatainen, Outi; Aksela, Maija
2017-07-01
Numerous case studies suggest that integrating art and science education could engage students with creative projects and encourage students to express science in a multitude of ways. However, little is known about art integration practices in everyday science teaching. With a qualitative e-survey, this study explores the art integration of science teachers (n = 66). A pedagogical model for science teachers' art integration emerged from a qualitative content analysis conducted on examples of art integration. In the model, art integration is characterised as integration through content and activities. Whilst the links in the content were facilitated either directly between concepts and ideas or indirectly through themes or artefacts, the integration through activity often connected an activity in one domain and a concept, idea or artefact in the other domain, with the exception of some activities that could belong to both domains. Moreover, the examples of art integration in the everyday classroom did not include the expression of emotions often associated with art. In addition, the quantitative part of the survey confirmed that integration is infrequent in all mapped areas. The findings of this study have implications for science teacher education, which should offer opportunities for more consistent art integration.
NASA Astrophysics Data System (ADS)
Liu, Youshan; Teng, Jiwen; Xu, Tao; Badal, José
2017-05-01
The mass-lumped method avoids the cost of inverting the mass matrix and simultaneously maintains spatial accuracy by adopting additional interior integration points, known as cubature points. To date, such points are only known analytically in tensor domains, such as quadrilateral or hexahedral elements. Thus, the diagonal-mass-matrix spectral element method (SEM) in non-tensor domains always relies on numerically computed interpolation points or quadrature points. However, only the cubature points for degrees 1 to 6 are known, which is why we have developed a p-norm-based optimization algorithm to obtain higher-order cubature points. In this way, we obtain and tabulate new cubature points with all positive integration weights for degrees 7 to 9. The dispersion analysis illustrates that the dispersion relation determined from the new optimized cubature points is comparable to that of the mass and stiffness matrices obtained by exact integration. Simultaneously, the Lebesgue constant for the new optimized cubature points indicates their surprisingly good interpolation properties. As a result, such points provide both good interpolation properties and integration accuracy. The Courant-Friedrichs-Lewy (CFL) numbers are tabulated for the conventional Fekete-based triangular spectral element method (TSEM), the TSEM with exact integration, and the optimized cubature-based TSEM (OTSEM). A complementary study demonstrates the spectral convergence of the OTSEM. A numerical example conducted on a half-space model demonstrates that the OTSEM improves the accuracy by approximately one order of magnitude compared to the conventional Fekete-based TSEM. In particular, the accuracy of the 7th-order OTSEM is even higher than that of the 14th-order Fekete-based TSEM. Furthermore, the OTSEM produces a result that can compete in accuracy with the quadrilateral SEM (QSEM). The high accuracy of the OTSEM is also tested with a non-flat topography model. 
In terms of computational efficiency, the OTSEM is more efficient than the Fekete-based TSEM, although it is slightly costlier than the QSEM when a comparable numerical accuracy is required.
Zhang, Meng; Liu, Zhigang; Zhu, Yu; Bu, Mingfan; Hong, Jun
2017-07-01
In this paper, a hybrid control system is developed by integrating the closed-loop force feedback and input shaping method to overcome the problem of the hysteresis and dynamic behavior in piezo-based scanning systems and increase the scanning speed of tunable external cavity diode lasers. The flexible hinge and piezoelectric actuators are analyzed, and a dynamic model of the scanning systems is established. A force sensor and an integral controller are utilized in integral force feedback (IFF) to directly augment the damping of the piezoelectric scanning systems. Hysteresis has been effectively eliminated, but the mechanical resonance is still evident. Noticeable residual vibration occurred after the inflection points and then gradually disappeared. For the further control of mechanical resonance, based on the theory of minimum-acceleration trajectory planning, the time-domain input shaping method was developed. The turning sections of a scanning trajectory are replaced by smooth curves, while the linear sections are retained. The IFF method is combined with the input shaping method to control the non-linearity and mechanical resonance in high-speed piezo-based scanning systems. Experiments are conducted, and the results demonstrate the effectiveness of the proposed control approach.
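The residual-vibration problem that the trajectory shaping addresses can be seen on a toy resonance. The sketch below uses a classical zero-vibration (ZV) input shaper, a related technique, rather than the paper's minimum-acceleration trajectory smoothing, and the 5 Hz resonance is an invented parameter:

```python
import numpy as np

def simulate(u_of_t, omega, dt, n_steps):
    # Exact update of the undamped resonance x'' + omega^2 (x - u) = 0
    # under a piecewise-constant command u (sampled at step starts).
    x, v = 0.0, 0.0
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    for k in range(n_steps):
        u = u_of_t(k * dt)
        dx = x - u
        x, v = u + dx * c + (v / omega) * s, -dx * omega * s + v * c
    return x, v

omega = 2 * np.pi * 5.0              # 5 Hz mechanical resonance (assumed)
T = 2 * np.pi / omega                # resonance period
dt, n_steps = 1e-4, 4000

step = lambda t: 1.0                            # raw step command rings
shaped = lambda t: 0.5 if t < T / 2 else 1.0    # ZV: half now, half T/2 later

residual = lambda x, v: np.hypot(x - 1.0, v / omega)  # vibration amplitude
x1, v1 = simulate(step, omega, dt, n_steps)           # residual ~1
x2, v2 = simulate(shaped, omega, dt, n_steps)         # residual ~0
```

The paper's approach instead retains the linear scan sections and replaces the turnarounds with smooth curves, which suppresses the same resonant excitation while preserving the useful part of the trajectory.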
Burt, Kate Gardner; Koch, Pamela; Contento, Isobel
2017-10-01
Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each school's garden was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building, and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components.
These patterns are best described by the GREEN Tool, the first framework to identify how to operationalize school gardening components and describe an evidence-based strategy of successful school garden integration. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Classification of billiard motions in domains bounded by confocal parabolas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fokicheva, V V
2014-08-01
We consider the billiard dynamical system in a domain bounded by confocal parabolas. We describe such domains in which the billiard problem can be correctly stated. In each such domain we prove the integrability of the system, analyse the arising Liouville foliation, and calculate the invariant of Liouville equivalence, the so-called marked molecule. It turns out that billiard systems in certain parabolic domains have the same closures of solutions (integral trajectories) as the systems of Goryachev-Chaplygin-Sretenskii and Joukowski at suitable energy levels. We also describe the billiard motion in noncompact domains bounded by confocal parabolas, namely, we describe the topology of the Liouville foliation in terms of rough molecules. Bibliography: 16 titles.
Padwa, Howard; Teruya, Cheryl; Tran, Elise; Lovinger, Katherine; Antonini, Valerie P; Overholt, Colleen; Urada, Darren
2016-03-01
The majority of adults with mental health (MH) and substance use (SU) disorders in the United States do not receive treatment. The Affordable Care Act will create incentives for primary care centers to begin providing behavioral health (MH and SU) services, thus promising to address the MH and SU treatment gaps. This paper examines the implementation of integrated care protocols by three primary care organizations. The Behavioral Health Integration in Medical Care (BHIMC) tool was used to evaluate the integrated care capacity of primary care organizations that chose to participate in the Kern County (California) Mental Health Department's Project Care annually for 3 years. For a subsample of clinics, change over time was measured. Informed by the Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors, inner and outer contextual factors impacting implementation were identified and analyzed using multiple data sources and qualitative analytic methods. The primary care organizations all offered partially integrated (PI) services throughout the study period. At baseline, organizations offered minimally integrated/partially integrated (MI/PI) services in the Program Milieu, Clinical Process - Treatment, and Staffing domains of the BHIMC, and scores on all domains were at the partially integrated (PI) level or higher in the first and second follow-ups. Integrated care services emphasized the identification and management of MH more than SU in 52.2% of evaluated domains, but did not emphasize SU more than MH in any of them. Many of the gaps between MH and SU emphases were associated with limited capacities related to SU medications. Several outer (socio-political context, funding, leadership) and inner (organizational characteristics, individual adopter characteristics, leadership, innovation-values fit) contextual factors impacted the development of integrated care capacity.
This study of a small sample of primary care organizations showed that it is possible to improve their integrated care capacity as measured by the BHIMC, though it may be difficult or unfeasible for them to provide fully integrated behavioral health services. Integrated services emphasized MH more than SU, and enhancing primary care clinic capacities related to SU medications may help close this gap. Both inner and outer contextual factors may impact integrated service capacity development in primary care clinics. Study findings may be used to inform future research on integrated care and inform the implementation of efforts to enhance integrated care capacity in primary care clinics. Copyright © 2015 Elsevier Inc. All rights reserved.
Ma, Liyan; Qiu, Bo; Cui, Mingyue; Ding, Jianwei
2017-01-01
Depth image-based rendering (DIBR), which is used to render virtual views from a color image and the corresponding depth map, is one of the key techniques in the 2D to 3D conversion process. Due to the absence of knowledge about the 3D structure of a scene and its corresponding texture, DIBR in the 2D to 3D conversion process inevitably leads to holes in the resulting 3D image at newly exposed areas. In this paper, we propose a structure-aided depth map preprocessing framework in the transformed domain, inspired by the recently proposed domain transform for its low complexity and high efficiency. Firstly, our framework integrates hybrid constraints, including scene structure, edge consistency, and visual saliency information, in the transformed domain to implicitly improve the performance of depth map preprocessing. Then, adaptive smoothing localization is incorporated into the proposed framework to further reduce over-smoothing and enhance optimization in the non-hole regions. Unlike other similar methods, the proposed method simultaneously achieves hole filling, edge correction, and local smoothing for typical depth maps in a unified framework. Thanks to these advantages, it can yield visually satisfactory results with less computational complexity for high-quality 2D to 3D conversion. Numerical experimental results demonstrate the excellent performance of the proposed method. PMID:28407027
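The domain transform underlying this framework can be sketched in one dimension: distances between pixels are stretched wherever the guide image has strong gradients, and a recursive exponential filter then smooths within regions without crossing edges. The sketch follows the recursive filter of Gastal and Oliveira (2011), which the abstract's "recently proposed domain transform" refers to; the parameter values and the depth-row/guide-row pairing are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def domain_transform_filter_1d(signal, guide, sigma_s=10.0, sigma_r=0.1, iterations=3):
    """Edge-aware smoothing of `signal` (e.g. a depth-map row) guided by
    `guide` (e.g. the corresponding colour/intensity row)."""
    J = np.asarray(signal, dtype=float).copy()
    dIdx = np.abs(np.diff(np.asarray(guide, dtype=float)))
    # distances in the transformed domain: large guide gradients -> large distance
    dt = 1.0 + (sigma_s / sigma_r) * dIdx
    for i in range(iterations):
        # halve the spatial sigma each pass, as in Gastal & Oliveira (2011)
        sigma_i = sigma_s * np.sqrt(3.0) * 2.0**(iterations - i - 1) / np.sqrt(4.0**iterations - 1.0)
        a = np.exp(-np.sqrt(2.0) / sigma_i)
        w = a ** dt                          # per-gap feedback weight
        for k in range(1, len(J)):           # left-to-right recursive pass
            J[k] += w[k - 1] * (J[k - 1] - J[k])
        for k in range(len(J) - 2, -1, -1):  # right-to-left recursive pass
            J[k] += w[k] * (J[k + 1] - J[k])
    return J
```

Because the weight collapses to nearly zero across a strong guide edge, a depth discontinuity aligned with a colour edge survives the smoothing.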
Computational prediction of host-pathogen protein-protein interactions.
Dyer, Matthew D; Murali, T M; Sobral, Bruno W
2007-07-01
Infectious diseases such as malaria result in millions of deaths each year. An important aspect of any host-pathogen system is the mechanism by which a pathogen can infect its host. One method of infection is via protein-protein interactions (PPIs) where pathogen proteins target host proteins. Developing computational methods that identify which PPIs enable a pathogen to infect a host has great implications in identifying potential targets for therapeutics. We present a method that integrates known intra-species PPIs with protein-domain profiles to predict PPIs between host and pathogen proteins. Given a set of intra-species PPIs, we identify the functional domains in each of the interacting proteins. For every pair of functional domains, we use Bayesian statistics to assess the probability that two proteins with that pair of domains will interact. We apply our method to the Homo sapiens-Plasmodium falciparum host-pathogen system. Our system predicts 516 PPIs between proteins from these two organisms. We show that pairs of human proteins we predict to interact with the same Plasmodium protein are close to each other in the human PPI network and that Plasmodium pairs predicted to interact with the same human protein are co-expressed in DNA microarray datasets measured during various stages of the Plasmodium life cycle. Finally, we identify functionally enriched sub-networks spanned by the predicted interactions and discuss the plausibility of our predictions. Supplementary data are available at http://staff.vbi.vt.edu/dyermd/publications/dyer2007a.html. Supplementary data are available at Bioinformatics online.
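A minimal sketch of the domain-pair idea: per-domain-pair interaction frequencies are estimated from intra-species training PPIs, and a candidate host-pathogen pair is then scored by combining its domain-pair probabilities with a noisy-OR. The abstract does not spell out the exact Bayesian estimator, so the plain frequency estimate and the independence assumption below are simplifications, and all function names and toy data are hypothetical.

```python
from collections import Counter
from itertools import product

def domain_pair_scores(ppis, all_pairs, domains):
    """For each domain pair {d, e}, estimate the probability that two proteins
    carrying d and e interact, from intra-species training data.
    `ppis`: interacting pairs; `all_pairs`: all examined pairs;
    `domains`: protein -> list of functional domains."""
    interacting, total = Counter(), Counter()
    ppi_set = {frozenset(p) for p in ppis}
    for p, q in all_pairs:
        for d, e in product(domains[p], domains[q]):
            key = frozenset((d, e))
            total[key] += 1
            if frozenset((p, q)) in ppi_set:
                interacting[key] += 1
    return {k: interacting[k] / total[k] for k in total}

def interaction_score(host, pathogen, domains, scores):
    """Probability that at least one domain pair mediates an interaction,
    assuming independence across domain pairs (noisy-OR)."""
    prob_none = 1.0
    for d, e in product(domains[host], domains[pathogen]):
        prob_none *= 1.0 - scores.get(frozenset((d, e)), 0.0)
    return 1.0 - prob_none
```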
Chin, Jessie; Payne, Brennan; Gao, Xuefei; Conner-Garcia, Thembi; Graumlich, James F.; Murray, Michael D.; Morrow, Daniel G.; Stine-Morrow, Elizabeth A.L.
2014-01-01
While there is evidence that knowledge influences understanding of health information, less is known about the processing mechanisms underlying this effect and its impact on memory. We used the moving window paradigm to examine how older adults varying in domain-general crystallized ability (verbal ability) and health knowledge allocate attention to understand health and domain-general texts. Participants (n=107, aged 60 to 88 yrs) read and recalled single sentences about hypertension and about non-health topics. Mixed-effects modeling of word-by-word reading times suggested that domain-general crystallized ability increased conceptual integration regardless of text domain, while health knowledge selectively increased resource allocation to conceptual integration at clause boundaries in health texts. These patterns of attentional allocation were related to subsequent recall performance. Although older adults with lower levels of crystallized ability were less likely to engage in integrative processing, when they did, this strategy had a compensatory effect in improving recall. These findings suggest that semantic integration during reading is an important comprehension process that supports the construction of the memory representation and is engendered by knowledge. Implications of the findings for theories of text processing and memory as well as for designing patient education materials are discussed. PMID:24787361
Chin, Jessie; Payne, Brennan; Gao, Xuefei; Conner-Garcia, Thembi; Graumlich, James F; Murray, Michael D; Morrow, Daniel G; Stine-Morrow, Elizabeth A L
2015-01-01
While there is evidence that knowledge influences understanding of health information, less is known about the processing mechanisms underlying this effect and its impact on memory. We used the moving window paradigm to examine how older adults varying in domain-general crystallised ability (verbal ability) and health knowledge allocate attention to understand health and domain-general texts. Participants (n = 107, age: 60-88 years) read and recalled single sentences about hypertension and about non-health topics. Mixed-effects modelling of word-by-word reading times suggested that domain-general crystallised ability increased conceptual integration regardless of text domain, while health knowledge selectively increased resource allocation to conceptual integration at clause boundaries in health texts. These patterns of attentional allocation were related to subsequent recall performance. Although older adults with lower levels of crystallised ability were less likely to engage in integrative processing, when they did, this strategy had a compensatory effect in improving recall. These findings suggest that semantic integration during reading is an important comprehension process that supports the construction of the memory representation and is engendered by knowledge. Implications of the findings for theories of text processing and memory as well as for designing patient education materials are discussed.
Macmillan, Donna S; Canipa, Steven J; Chilton, Martyn L; Williams, Richard V; Barber, Christopher G
2016-04-01
There is a pressing need for non-animal methods to predict skin sensitisation potential and a number of in chemico and in vitro assays have been designed with this in mind. However, some compounds can fall outside the applicability domain of these in chemico/in vitro assays and may not be predicted accurately. Rule-based in silico models such as Derek Nexus are expert-derived from animal and/or human data and the mechanism-based alert domain can take a number of factors into account (e.g. abiotic/biotic activation). Therefore, Derek Nexus may be able to predict for compounds outside the applicability domain of in chemico/in vitro assays. To this end, an integrated testing strategy (ITS) decision tree using Derek Nexus and a maximum of two assays (from DPRA, KeratinoSens, LuSens, h-CLAT and U-SENS) was developed. Generally, the decision tree improved upon other ITS evaluated in this study with positive and negative predictivity calculated as 86% and 81%, respectively. Our results demonstrate that an ITS using an in silico model such as Derek Nexus with a maximum of two in chemico/in vitro assays can predict the sensitising potential of a number of chemicals, including those outside the applicability domain of existing non-animal assays. Copyright © 2016 Elsevier Inc. All rights reserved.
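The positive and negative predictivity figures quoted for the decision tree (86% and 81%) are standard confusion-matrix ratios; the sketch below shows how they are computed. The counts used in the usage check are illustrative, not the study's data.

```python
def predictivity(tp, fp, tn, fn):
    """Positive predictivity (PPV) = TP / (TP + FP) and negative
    predictivity (NPV) = TN / (TN + FN) of a classification strategy."""
    return tp / (tp + fp), tn / (tn + fn)

def sensitivity_specificity(tp, fp, tn, fn):
    """Companion metrics: sensitivity = TP / (TP + FN),
    specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```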
Ferraro, Jeffrey P; Daumé, Hal; Duvall, Scott L; Chapman, Wendy W; Harkema, Henk; Haug, Peter J
2013-01-01
Natural language processing (NLP) tasks are commonly decomposed into subtasks, chained together to form processing pipelines. The residual error produced in these subtasks propagates, adversely affecting the end objectives. Limited availability of annotated clinical data remains a barrier to reaching state-of-the-art operating characteristics using statistically based NLP tools in the clinical domain. Here we explore the unique linguistic constructions of clinical texts and demonstrate the loss in operating characteristics when out-of-the-box part-of-speech (POS) tagging tools are applied to the clinical domain. We test a domain adaptation approach integrating a novel lexical-generation probability rule used in a transformation-based learner to boost POS performance on clinical narratives. Two target corpora from independent healthcare institutions were constructed from high frequency clinical narratives. Four leading POS taggers with their out-of-the-box models trained from general English and biomedical abstracts were evaluated against these clinical corpora. A high performing domain adaptation method, Easy Adapt, was compared to our newly proposed method ClinAdapt. The evaluated POS taggers drop in accuracy by 8.5-15% when tested on clinical narratives. The highest performing tagger reports an accuracy of 88.6%. Domain adaptation with Easy Adapt reports accuracies of 88.3-91.0% on clinical texts. ClinAdapt reports 93.2-93.9%. ClinAdapt successfully boosts POS tagging performance through domain adaptation requiring a modest amount of annotated clinical data. Improving the performance of critical NLP subtasks is expected to reduce pipeline error propagation leading to better overall results on complex processing tasks.
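Easy Adapt, the baseline that ClinAdapt is compared against, is Daumé's "frustratingly easy" feature-augmentation trick: every feature is copied into a shared version and a domain-specific version, so a linear tagger can weight general-English and clinical evidence separately. A minimal sketch using a sparse feature dictionary (the feature names are illustrative):

```python
def easy_adapt(features, domain):
    """Augment a sparse feature vector for Easy Adapt domain adaptation:
    each feature appears once in a shared namespace and once in the
    namespace of the domain the example came from."""
    augmented = {}
    for name, value in features.items():
        augmented['shared:' + name] = value        # weight shared across domains
        augmented[domain + ':' + name] = value     # domain-specific weight
    return augmented
```

Training simply runs the usual learner on the augmented vectors pooled from both domains.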
A Galerkin Boundary Element Method for two-dimensional nonlinear magnetostatics
NASA Astrophysics Data System (ADS)
Brovont, Aaron D.
The Boundary Element Method (BEM) is a numerical technique for solving partial differential equations that is used broadly among the engineering disciplines. The main advantage of this method is that one needs only to mesh the boundary of a solution domain. A key drawback is the myriad of integrals that must be evaluated to populate the full system matrix. To this day these integrals have been evaluated using numerical quadrature. In this research, a Galerkin formulation of the BEM is derived and implemented to solve two-dimensional magnetostatic problems with a focus on accurate, rapid computation. To this end, exact, closed-form solutions have been derived for all the integrals comprising the system matrix as well as those required to compute fields in post-processing; the need for numerical integration has been eliminated. It is shown that calculation of the system matrix elements using analytical solutions is 15-20 times faster than with numerical integration of similar accuracy. Furthermore, through the example analysis of a c-core inductor, it is demonstrated that the present BEM formulation is a competitive alternative to the Finite Element Method (FEM) for linear magnetostatic analysis. Finally, the BEM formulation is extended to analyze nonlinear magnetostatic problems via the Dual Reciprocity Method (DRBEM). It is shown that a coarse, meshless analysis using the DRBEM is able to achieve RMS error of 3-6% compared to a commercial FEM package in lightly saturated conditions.
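The central claim above, that the BEM influence integrals admit closed forms that beat quadrature, can be illustrated on the simplest weakly singular kernel arising in 2-D potential problems, I(x) = ∫₀^L ln|x − s| ds. The closed form below is elementary calculus, shown against a deliberately naive midpoint rule; it is an illustration of the idea, not the paper's actual element integrals.

```python
import math

def log_kernel_exact(x, L):
    """Closed form of I(x) = integral_0^L ln|x - s| ds for 0 < x < L,
    from the antiderivative u ln|u| - u of ln|u|."""
    return (L - x) * math.log(L - x) + x * math.log(x) - L

def log_kernel_quadrature(x, L, n=100000):
    """Naive midpoint-rule comparison; slow and least accurate near the
    logarithmic singularity at s = x."""
    h = L / n
    return sum(math.log(abs(x - (k + 0.5) * h)) * h for k in range(n))
```

The exact evaluation costs two logarithms; the quadrature needs ~10^5 kernel evaluations to reach even three digits near the singularity, which mirrors the reported 15-20x speedup for the full matrix.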
TDR method for determining IC parameters
NASA Astrophysics Data System (ADS)
Timoshenkov, V.; Rodionov, D.; Khlybov, A.
2016-12-01
Frequency-domain simulation is a widely used approach for determining integrated circuit parameters and can be found in most software tools used in the IC industry. Time-domain simulation has seen increasing use in recent years due to several advantages; in particular, it is applicable to the analysis of nonlinear and nonstationary systems, where the frequency-domain approach is not. The resolution of time-domain systems makes it possible to detect heterogeneities at distances of about 1 mm and to determine their parameters and properties. The authors used an approach based on detecting signals reflected from heterogeneities: time-domain reflectometry (TDR). Field-effect transistor technology scaled to 30-60 nm gate lengths and 10 nm gate dielectrics, and heterojunction bipolar transistors with 10-30 nm base widths, allow the fabrication of digital ICs with 20 GHz clock frequencies and RF ICs with bandwidths of tens of GHz. Such devices and operating speeds require signals to be carried over microwave lines. Local heterogeneities can be found inside the signal path at connections between different parts of the signal lines (stripline to RF-connector pin, stripline to IC package pin). These heterogeneities distort signals, which decreases the bandwidth of RF devices. Time-domain study of transmitted and reflected signals makes it possible to locate heterogeneities, determine their properties and parameters, and build equivalent circuits. Experimental results are provided and show the possibility of inductance and capacitance measurement up to 25 GHz. The measurements include results from studying the signal path on an IC and on a printed circuit board (PCB) used for 12 GHz RF chips. The dielectric constant versus frequency was also measured up to 35 GHz.
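TDR characterization rests on two elementary relations: the reflection coefficient of a local impedance discontinuity and the round-trip delay of its echo. A minimal sketch (the 50 Ω reference impedance and 0.66 velocity factor are illustrative assumptions):

```python
import math

def reflection_coefficient(z_load, z0=50.0):
    """Gamma = (Z_L - Z_0) / (Z_L + Z_0): the quantity a TDR instrument
    measures at each discontinuity along the signal path."""
    return (z_load - z0) / (z_load + z0)

def impedance_from_reflection(gamma, z0=50.0):
    """Invert the measured step reflection back to the local impedance."""
    return z0 * (1.0 + gamma) / (1.0 - gamma)

def fault_distance(delay_s, velocity_factor=0.66, c=2.998e8):
    """Distance to a heterogeneity from the round-trip delay of its echo."""
    return 0.5 * delay_s * velocity_factor * c
```

A matched line gives zero reflection; the sign and magnitude of Γ distinguish inductive (high-impedance) from capacitive (low-impedance) heterogeneities.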
Maximum-entropy reconstruction method for moment-based solution of the Boltzmann equation
NASA Astrophysics Data System (ADS)
Summy, Dustin; Pullin, Dale
2013-11-01
We describe a method for a moment-based solution of the Boltzmann equation. This starts with moment equations for a (10 + 9N)-moment representation, N = 0, 1, 2, .... The partial differential equations (PDEs) for these moments are unclosed, containing both higher-order moments and molecular-collision terms. These are evaluated using a maximum-entropy reconstruction of the velocity distribution function f(c, x, t), using the known moments, within a finite-box domain of single-particle-velocity (c) space. Use of a finite domain alleviates known problems (Junk and Unterreiter, Continuum Mech. Thermodyn., 2002) concerning existence and uniqueness of the reconstruction. Unclosed moments are evaluated with quadrature, while collision terms are calculated using a Monte Carlo method. This allows integration of the moment PDEs in time. Illustrative examples will include zero-space-dimensional relaxation of f(c, t) from a Mott-Smith-like initial condition toward equilibrium and one-space-dimensional, finite-Knudsen-number, planar Couette flow. Comparison with results using the direct-simulation Monte Carlo method will be presented.
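The closure step, reconstructing a maximum-entropy f on a finite velocity interval from known moments, can be sketched as Newton iteration on the convex dual problem: find multipliers λ so that f(c) = exp(Σ_k λ_k c^k) reproduces the target moments under the same quadrature. This is a generic one-dimensional sketch under stated assumptions, not the authors' implementation.

```python
import numpy as np

def maxent_reconstruct(moments, c_grid, n_iter=100):
    """Reconstruct f(c) = exp(sum_k lam_k c^k) on a finite velocity interval
    from its first len(moments) power moments, via damped Newton iteration
    on the convex dual D(lam) = int exp(lam.phi) dc - lam.mu."""
    mu = np.asarray(moments, dtype=float)
    n = len(mu)
    V = np.vander(c_grid, n, increasing=True)   # columns: c^0, c^1, c^2, ...
    dc = c_grid[1] - c_grid[0]

    def dual(lam):
        return np.sum(np.exp(np.clip(V @ lam, -500.0, 500.0))) * dc - lam @ mu

    lam = np.zeros(n)
    lam[0] = np.log(mu[0] / ((c_grid[-1] - c_grid[0]) + dc))  # start near uniform
    for _ in range(n_iter):
        f = np.exp(V @ lam)
        err = V.T @ f * dc - mu                 # dual gradient: current - target moments
        if np.max(np.abs(err)) < 1e-12:
            break
        H = (V.T * f) @ V * dc                  # dual Hessian (moment matrix)
        step = np.linalg.solve(H, err)
        t = 1.0
        while dual(lam - t * step) > dual(lam) and t > 1e-6:
            t *= 0.5                            # backtrack if the full step overshoots
        lam -= t * step
    return np.exp(V @ lam)
```

Keeping the quadrature for the moments identical to that used in the Newton loop is what makes the recovery exact for distributions inside the exponential family.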
Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.
Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali
2017-01-01
With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about a particular product, policy, or event. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users prefer online blogs and review sites when purchasing products. Therefore, user reviews are considered an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews, available through online forums, to analyze the semantic orientation of words by categorizing them into positive and negative classes in order to identify and classify the emoticons, modifiers, and general-purpose and domain-specific words expressed in the public's feedback about products. However, the unsupervised learning approach employed in previous studies is becoming less efficient due to data sparseness and low accuracy, owing to the non-consideration of emoticons and modifiers and the presence of domain-specific words, which may result in inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving the sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers, and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods, and the performance of the sentiment analysis is improved after considering emoticons, modifiers, negations, and domain-specific terms, when compared to baseline methods.
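A minimal sketch of the rule-based scheme described above: general-purpose and domain-specific lexicons are merged with emoticon scores, while modifiers scale and negations flip the polarity of the next sentiment-bearing word. The tiny lexicons and the one-word negation window below are illustrative simplifications, not the paper's rule set.

```python
LEXICON = {'good': 1, 'great': 2, 'bad': -1, 'poor': -2}    # general-purpose terms
DOMAIN_TERMS = {'lag': -2, 'crisp': 2}                      # e.g. electronics reviews
EMOTICONS = {':)': 2, ':(': -2}
MODIFIERS = {'very': 1.5, 'slightly': 0.5}
NEGATIONS = {'not', 'never', "don't"}

def review_score(tokens):
    """Sum lexicon scores over a tokenized review; modifiers scale and
    negations invert the polarity of the following sentiment word."""
    score, scale, flip = 0.0, 1.0, False
    for tok in tokens:
        t = tok.lower()
        if t in NEGATIONS:
            flip = True
        elif t in MODIFIERS:
            scale = MODIFIERS[t]
        else:
            s = LEXICON.get(t, DOMAIN_TERMS.get(t, EMOTICONS.get(tok, 0)))
            if s:
                score += (-s if flip else s) * scale
            scale, flip = 1.0, False            # rules apply to the next word only
    return 'positive' if score > 0 else 'negative' if score < 0 else 'neutral'
```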
Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koulouri, Alexandra, E-mail: koulouri@uni-muenster.de; Department of Electrical and Electronic Engineering, Imperial College London, Exhibition Road, London SW7 2BT; Brookes, Mike
In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
Highlights: • Vector tomography is used to reconstruct electric fields generated by dipole sources. • Inverse solutions are based on longitudinal and transverse line integral measurements. • Transverse line integral measurements are used as a sparsity constraint. • Numerical procedure to approximate the line integrals is described in detail. • Patterns of the studied electric fields are correctly estimated.
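The longitudinal measurement the method relies on is the line integral of the field's tangential projection along each integration line. A minimal midpoint-rule sketch (the straight-line geometry and function names are illustrative assumptions):

```python
import numpy as np

def longitudinal_measurement(field, p0, p1, n=1000):
    """Approximate the longitudinal VT measurement int F(x(s)) . t ds along
    the straight line from p0 to p1, where t is the unit tangent, using the
    midpoint rule with n sample points."""
    p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
    length = np.linalg.norm(p1 - p0)
    tangent = (p1 - p0) / length
    s = (np.arange(n) + 0.5) / n                         # midpoints in [0, 1]
    points = p0[None, :] + s[:, None] * (p1 - p0)[None, :]
    values = np.array([field(p) for p in points])        # shape (n, 2)
    return (values @ tangent).sum() * length / n
```

For a conservative field F = ∇φ the measurement reduces to the potential difference φ(p1) − φ(p0), which the check below exploits.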
NASA Astrophysics Data System (ADS)
Ikuta, Nobuaki; Takeda, Akihide
2017-12-01
Research on the flight behavior of electrons and ions in a gas under an electric field has recently moved in a direction of clarifying the mechanism of the spatiotemporal development of a swarm, but the symbolic unknown state function f(r,c,t) of the Boltzmann equation has not been obtained in an explicit form. However, a few papers on the spatiotemporal development of an electron swarm using the Monte Carlo simulation have been published. On the other hand, a new simulation procedure for obtaining the lifelong state function FfT(t,x,ɛ) and local transport quantities J(t,x,ɛ) of electrons in the three domains of time t, one-dimensional position x, and energy ɛ under arbitrary initial and boundary conditions has been developed by extending the flight-time-integral (FTI) methods previously reported and is named the 3D-FTI method. A preliminary calculation has shown that this method can extensively provide the flight behavior of individual electrons in a swarm and local transport quantities consistent in the three domains with reasonable accuracy and career dependences.
NASA Astrophysics Data System (ADS)
Tuganbaev, A. A.
1982-04-01
This paper studies integrally closed rings. It is shown that a semiprime integrally closed Goldie ring is the direct product of a semisimple artinian ring and a finite number of integrally closed invariant domains that are classically integrally closed in their (division) rings of fractions. It is shown also that an integrally closed ring has a classical ring of fractions and is classically integrally closed in it. Next, integrally closed noetherian rings are considered. It is shown that an integrally closed noetherian ring all of whose nonzero prime ideals are maximal is either a quasi-Frobenius ring or a hereditary invariant domain. Finally, those noetherian rings all of whose factor rings are invariant are described, and the connection between integrally closed rings and distributive rings is examined. Bibliography: 13 titles.
Putting people on the map through an approach that integrates social data in conservation planning.
Stephanson, Sheri L; Mascia, Michael B
2014-10-01
Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. © 2014 Society for Conservation Biology.
The Research on Automatic Construction of Domain Model Based on Deep Web Query Interfaces
NASA Astrophysics Data System (ADS)
JianPing, Gu
The integration of services is transparent: users no longer face millions of Web services, need not care about where the required data are stored, and need not learn how to obtain these data. In this paper, we analyze the uncertainty of schema matching and then propose a series of similarity measures. To reduce the cost of execution, we propose a type-based optimization method and a schema-matching pruning method for numeric data. Based on the above analysis, we propose an uncertain schema matching method. The experiments prove the effectiveness and efficiency of our method.
Integration of a photonic crystal polarization beam splitter and waveguide bend.
Zheng, Wanhua; Xing, Mingxin; Ren, Gang; Johnson, Steven G; Zhou, Wenjun; Chen, Wei; Chen, Lianghui
2009-05-11
In this work, we present the design of an integrated photonic-crystal polarization beam splitter (PC-PBS) and a low-loss photonic-crystal 60 degree waveguide bend. Firstly, the modal properties of the PC-PBS and the mechanism of the low-loss waveguide bend are investigated by the two-dimensional finite-difference time-domain (FDTD) method, and then the integration of the two devices is studied. We show that, although the individual devices perform well separately, the performance of the integrated circuit is poor due to the multi-mode property of the PC-PBS. By introducing deformed air-hole structures, a single-mode PC-PBS is proposed, which significantly enhances the performance of the circuit, with extinction ratios remaining above 20 dB for both transverse-electric (TE) and transverse-magnetic (TM) polarizations. Both the specific result and the general idea of the integration design are promising for future photonic-crystal integrated circuits.
A square wave is the most efficient and reliable waveform for resonant actuation of micro switches
NASA Astrophysics Data System (ADS)
Ben Sassi, S.; Khater, M. E.; Najar, F.; Abdel-Rahman, E. M.
2018-05-01
This paper investigates efficient actuation methods of shunt MEMS switches and other parallel-plate actuators. We start by formulating a multi-physics model of the micro switch, coupling the nonlinear Euler-Bernoulli beam theory with the nonlinear Reynolds equation to describe the structural and fluidic domains, respectively. The model takes into account fringing field effects as well as mid-plane stretching and squeeze film damping nonlinearities. Static analysis is undertaken using the differential quadrature method (DQM) to obtain the pull-in voltage, which is verified by means of the finite element model and validated experimentally. We develop a reduced order model employing the Galerkin method for the structural domain and DQM for the fluidic domain. The proposed waveforms are intended to be more suitable for integrated circuit standards. The dynamic response of the micro switch to harmonic, square and triangular waveforms are evaluated and compared experimentally and analytically. Low voltage actuation is obtained using dynamic pull-in with the proposed waveforms. In addition, global stability analysis carried out for the three signals shows advantages of employing the square signal as the actuation method in enhancing the performance of the micro switch in terms of actuation voltage, switching time, and sensitivity to initial conditions.
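For context on the static analysis, the lumped parallel-plate model gives the classic pull-in voltage V_pi = sqrt(8 k d^3 / (27 ε0 A)): at a deflection of d/3 the electrostatic force gradient overtakes the spring and the plates snap together. The paper itself uses a distributed Euler-Bernoulli beam model solved by DQM, so the lumped formula below is a textbook approximation, and the parameter values in the check are illustrative.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Static pull-in voltage of a lumped parallel-plate actuator with
    spring constant k (N/m), initial gap (m), and electrode area (m^2):
    V_pi = sqrt(8 k d^3 / (27 eps0 A))."""
    return math.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))
```

For representative MEMS values (k = 10 N/m, 2 um gap, 100 um x 100 um plate) this lands in the tens of volts, the range where the dynamic waveform choice studied above matters.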
Pharmacist perceptions of new competency standards
Maitreemit, Pagamas; Pongcharoensuk, Petcharat; Kapol, Nattiya; Armstrong, Edward P.
2008-01-01
Objective To suggest revisions to the Thai pharmacy competency standards and determine the perceptions of Thai pharmacy practitioners and faculty about the proposed standards. Methods The current competency standards were revised in a brainstorming session with nine Thai pharmacy experts according to their perceptions of society's pharmacy needs. The revised standards were then proposed to and validated by 574 pharmacy practitioners and faculty members using a written questionnaire. The respondents were classified by practice setting. Results The proposed revision integrated and added to the current competencies. Of 830 distributed questionnaires, 574 completed questionnaires were received (69.2% response rate). The proposed new competency standards contained 7 domains and 46 competencies. The majority of the respondents were supportive of all 46 proposed competencies. The highest ranked domain was Domain 1 (Practice Pharmacy within Laws, Professional Standards, and Ethics). The second and third highest expectations of pharmacy graduates were Domain 4 (Provide pharmaceutical care) and Domain 3 (Communicate and disseminate knowledge effectively). Conclusion Expectations for pharmacy graduates' competencies were high, and respondents encouraged additional growth in multidisciplinary efforts to improve patient care. PMID:25177401
ERIC Educational Resources Information Center
Pellas, Nikolaos
2016-01-01
The contemporary era provides several challenges which extend from the reconstitution of an innovative knowledge domain and curricula to candidate learning platforms that support online course delivery methods. Educators and scholars on these demands have recently started to rethink alternative ways for the assimilation of the experiential…
Integrated Photonics Research Topical Meeting (1993)
1994-06-01
[Proceedings front matter, partially garbled in extraction. Recoverable content: table-of-contents entries including "DMD Time Domain Methods" (p. 107) and "IME Photonic Circuits and Lightwave Reception"; text fragments on measuring refractive-index change near the band edge with a small interference-ellipsometry bridge at photon energies near Eg (especially E > Eg), compared with previous theories; partial reference to Manning, R. Olshansky, and C. B. Su.]
ERIC Educational Resources Information Center
Buss, Ray R.; Wetzel, Keith; Foulger, Teresa S.; Lindsey, LeeAnn
2015-01-01
We compared the effectiveness of learning technological, pedagogical, and content knowledge (TPACK) domain knowledge in a new technology-infused approach for teaching technology to teacher candidates with a more traditional, stand-alone course. In the new approach, learning to use technology is infused into program methods courses. Candidates all…
ERIC Educational Resources Information Center
Smith-Christmas, Cassie; Armstrong, Timothy Currie
2014-01-01
Heritage learners of minority languages can play a lynchpin role in reversing language shift (RLS) in their families; however, in order to enact this role, they must first overcome certain barriers to re-integrate the minority language into the home domain. Using a combination of conversation and narrative analysis methods, we demonstrate how both…
Unconventional Dentistry in India – An Insight into the Traditional Methods
Boloor, Vinita Ashutosh; Hosadurga, Rajesh; Rao, Anupama; Jenifer, Haziel; Pratap, Sruthy
2014-01-01
Unconventional medicine (UM) has been known and practised since the recorded history of civilization. Some unconventional practices may be viewed as “the continuity of traditions, religious beliefs, and even quackery that non-specialists practice.” These practices have been associated with religious beliefs and the spiritual domain as well as with the physical domain. In ancient Old World civilizations, UM was performed by skilled experts or wise men; in today's Western civilization, practitioners may or may not be licensed, and some are charlatans. Dentistry, like medicine, is a traditional, science-based, highly regulated healthcare profession that serves increasingly sophisticated and demanding clients. Today, traditional dental practice is dealing with an array of challenges to the established professional system; these challenges are generally termed “alternative” (or complementary, unconventional, or integrative). Genuine alternatives are comparable methods of equal value that have met scientific and regulatory criteria for safety and effectiveness. Because “alternative care” has become politicized and is often a misnomer – referring to practices that are not alternative to, complementary to, or integrating with conventional health care – the more accurate term “unconventional” is used. PMID:25161919
Time-dependent spectral renormalization method
NASA Astrophysics Data System (ADS)
Cole, Justin T.; Musslimani, Ziad H.
2017-11-01
The spectral renormalization method was introduced by Ablowitz and Musslimani (2005) as an effective way to numerically compute (time-independent) bound states for certain nonlinear boundary value problems. In this paper, we extend those ideas to the time domain and introduce a time-dependent spectral renormalization method as a numerical means to simulate linear and nonlinear evolution equations. The essence of the method is to convert the underlying evolution equation from its partial or ordinary differential form (using Duhamel's principle) into an integral equation. The solution sought is then viewed as a fixed point in both space and time. The resulting integral equation is numerically solved using a simple renormalized fixed-point iteration method. Convergence is achieved by introducing a time-dependent renormalization factor which is numerically computed from the physical properties of the governing evolution equation. The proposed method has the ability to incorporate physics into the simulations in the form of conservation laws or dissipation rates. This novel scheme is implemented on benchmark evolution equations: the classical nonlinear Schrödinger (NLS), the integrable PT-symmetric nonlocal NLS, and the viscous Burgers' equations, prototypical examples of conservative and dissipative dynamical systems, respectively. Numerical implementation and algorithm performance are also discussed.
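As background for the extension described above, the original (time-independent) spectral renormalization idea can be sketched for the stationary NLS soliton. The sketch below is illustrative, not the authors' implementation: it uses the closely related Petviashvili-type renormalized fixed point, iterating in Fourier space with a renormalization factor S that prevents the iterates from collapsing to zero or diverging. The equation solved is -mu*u + u_xx + 2u^3 = 0, whose exact solution is sqrt(mu)*sech(sqrt(mu)*x).

```python
import numpy as np

def petviashvili_soliton(mu=1.0, L=40.0, n=512, iters=200):
    """Renormalized fixed-point iteration for -mu*u + u_xx + 2*u**3 = 0."""
    x = np.linspace(-L/2, L/2, n, endpoint=False)
    k = 2*np.pi*np.fft.fftfreq(n, d=L/n)      # spectral wavenumbers
    Lhat = mu + k**2                          # linear operator in Fourier space
    u = np.exp(-x**2)                         # arbitrary localized initial guess
    for _ in range(iters):
        uhat = np.fft.fft(u)
        N = np.fft.fft(2*u**3)                # nonlinearity, transformed
        # renormalization factor: S -> 1 at the fixed point; the exponent
        # 3/2 = p/(p-1) for a cubic (p = 3) nonlinearity
        S = np.sum(Lhat*np.abs(uhat)**2) / np.real(np.sum(np.conj(uhat)*N))
        u = np.real(np.fft.ifft(S**1.5 * N / Lhat))
    return x, u
```

For mu = 1 the iteration converges to sech(x); the time-dependent method of the paper replaces this stationary fixed point with a fixed point in both space and time, with S computed from conservation or dissipation laws.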
NASA Astrophysics Data System (ADS)
Stetter, R.; Simundsson, A.
2015-11-01
This paper is concerned with the integration of control and diagnosis functionalities into the development of complete systems comprising mechanical, electrical and electronic subsystems. For the development of such systems, the strategies, methods and tools of integrated product development have attracted significant attention during the last decades. Today, it is generally observed that product development processes of complex systems can only be successful if the activities in the different domains are well connected and synchronised and if an ongoing communication is present - one spanning the technical domains and also including functions such as production planning, marketing/distribution, quality assurance, service and project planning. Numerous approaches to tackling this challenge are present in the scientific literature and in industrial practice. Today, the functionality and safety of most products depend to a large degree on control and diagnosis functionalities, yet there is comparatively little research concentrating on the integration of the development of these functionalities into the overall product development process. The main source of insight for the presented research is the product development process of an Automated Guided Vehicle (AGV) intended for use on rough terrain. The paper starts with a background describing Integrated Product Development. The second section deals with the product development of the sample product. The third part summarizes some insights and formulates first hypotheses concerning control and diagnosis in Integrated Product Development.
The prediction of the noise of supersonic propellers in time domain - New theoretical results
NASA Technical Reports Server (NTRS)
Farassat, F.
1983-01-01
In this paper, a new formula for the prediction of the noise of supersonic propellers is derived in the time domain which is superior to the previous formulations in several respects. The governing equation is based on the Ffowcs Williams-Hawkings (FW-H) equation with the thickness source term replaced by an equivalent loading source term derived by Isom (1975). Using some results of generalized function theory and simple four-dimensional space-time geometry, the formal solution of the governing equation is manipulated into a form requiring only the knowledge of blade surface pressure data and geometry. The final form of the main result of this paper consists of some surface and line integrals. The surface integrals depend on the surface pressure, time rate of change of surface pressure, and surface pressure gradient. These integrals also involve blade surface curvatures. The line integrals, which depend on local surface pressure, are along the trailing edge, the shock traces on the blade, and the perimeter of the airfoil section at the inner radius of the blade. The new formulation is for the full blade surface and does not involve any numerical observer time differentiation. The method of implementation on a computer for numerical work is also discussed.
Data surrounding the needs of human disease and toxicity modeling are largely siloed limiting the ability to extend and reuse modules across knowledge domains. Using an infrastructure that supports integration across knowledge domains (animal toxicology, high-throughput screening...
NASA Astrophysics Data System (ADS)
Costantini, Mario; Malvarosa, Fabio; Minati, Federico
2010-03-01
Phase unwrapping and integration of finite differences are key problems in several technical fields. In SAR interferometry and in differential and persistent scatterer interferometry, digital elevation models and displacement measurements can be obtained after unambiguously determining the phase values and reconstructing the mean velocities and elevations of the observed targets, which can be performed by integrating differential estimates of these quantities (finite differences between neighboring points). In this paper we propose a general formulation for robust and efficient integration of finite differences and phase unwrapping, which includes standard techniques as sub-cases. The proposed approach allows obtaining more reliable and accurate solutions by exploiting redundant differential estimates (not only between nearest neighboring points) and multi-dimensional information (e.g. multi-temporal, multi-frequency, multi-baseline observations), or external data (e.g. GPS measurements). The proposed approach requires the solution of linear or quadratic programming problems, for which computationally efficient algorithms exist. Validation tests on real SAR data confirm the validity of the method, which was integrated in our production chain and has also been used successfully in massive production.
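The core computational step the abstract describes, recovering point values from redundant differential estimates, reduces in its simplest form to a sparse linear least-squares problem. The sketch below covers only this L2 sub-case (the paper's robust formulation uses linear or quadratic programming, and a production solver would use sparse matrices); edge weights show where redundant or external observations would enter.

```python
import numpy as np

def integrate_differences(n, edges, d, weights=None):
    """Recover n values v from redundant estimates d[e] ~ v[j] - v[i], e = (i, j)."""
    m = len(edges)
    w = np.ones(m) if weights is None else np.asarray(weights, float)
    A = np.zeros((m + 1, n))
    b = np.zeros(m + 1)
    for row, ((i, j), dij) in enumerate(zip(edges, d)):
        A[row, i], A[row, j], b[row] = -w[row], w[row], w[row] * dij
    # gauge condition: differences determine v only up to a constant, so pin v[0] = 0
    A[m, 0] = 1.0
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v
```

Edges between non-nearest neighbors (and weights reflecting estimate quality) make the system over-determined, which is exactly how the redundancy described in the abstract improves reliability.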
Ebina, Hirotaka; Chatterjee, Atreyi Ghatak; Judson, Robert L.; Levin, Henry L.
2008-01-01
Integrases (INs) of retroviruses and long terminal repeat retrotransposons possess a C-terminal domain with DNA binding activity. Other than this binding activity, little is known about how the C-terminal domain contributes to integration. A stretch of conserved amino acids called the GP(Y/F) domain has been identified within the C-terminal IN domains of two distantly related families, the γ-retroviruses and the metavirus retrotransposons. To enhance understanding of the C-terminal domain, we examined the function of the GP(Y/F) domain in the IN of Tf1, a long terminal repeat retrotransposon of Schizosaccharomyces pombe. The activities of recombinant IN were measured with an assay that modeled the reverse of integration called disintegration. Although deletion of the entire C-terminal domain disrupted disintegration activity, an alanine substitution (P365A) in a conserved amino acid of the GP(Y/F) domain did not significantly reduce disintegration. When assayed for the ability to join two molecules of DNA in a reaction that modeled forward integration, the P365A substitution disrupted activity. UV cross-linking experiments detected DNA binding activity in the C-terminal domain and found that this activity was not reduced by substitutions in two conserved amino acids of the GP(Y/F) domain, G364A and P365A. Gel filtration and cross-linking of a 71-amino acid fragment containing the GP(Y/F) domain revealed a surprising ability to form dimers, trimers, and tetramers that was disrupted by the G364A and P365A substitutions. These results suggest that the GP(Y/F) residues may play roles in promoting multimerization and intermolecular strand joining. PMID:18397885
Carquin, Mélanie; D'Auria, Ludovic; Pollet, Hélène; Bongarzone, Ernesto R.; Tyteca, Donatienne
2016-01-01
The concept of transient nanometric domains known as lipid rafts has prompted a reassessment of the validity of the Singer-Nicolson fluid-bilayer model of cell membranes. However, this new view is still insufficient to explain the cellular control of surface lipid diversity or membrane deformability. During the past decade, the hypothesis that some lipids form large (submicrometric/mesoscale vs nanometric rafts) and stable (minutes vs seconds) membrane domains has emerged, largely based on indirect methods. Morphological evidence for stable submicrometric lipid domains, well accepted for artificial and highly specialized biological membranes, was further reported for a variety of living cells from prokaryotes to yeast and mammalian cells. However, results remained questioned based on limitations of available fluorescent tools, use of poor lipid fixatives, and imaging artifacts due to non-resolved membrane projections. In this review, we discuss recent evidence generated using powerful and innovative approaches, such as lipid-specific toxin fragments, that support the existence of submicrometric domains. We integrate documented mechanisms involved in the formation and maintenance of these domains, and provide a perspective on their relevance to membrane deformability and the regulation of membrane protein distribution. PMID:26738447
Convergence issues in domain decomposition parallel computation of hovering rotor
NASA Astrophysics Data System (ADS)
Xiao, Zhongyun; Liu, Gang; Mou, Bin; Jiang, Xiong
2018-05-01
The implicit LU-SGS time integration algorithm has been widely used in parallel computation in spite of its lack of information from adjacent domains. When applied to parallel computation of hovering rotor flows in a rotating frame, it gives rise to convergence issues. To remedy the problem, three LU factorization-based implicit schemes (LU-SGS, DP-LUR and HLU-SGS) are investigated comparatively. A test case of pure grid rotation is designed to verify these algorithms, and it shows that the LU-SGS algorithm introduces errors on boundary cells. When partition boundaries are circumferential, errors arise in proportion to grid speed, accumulate with the rotation, and ultimately lead to computational failure. Meanwhile, the DP-LUR and HLU-SGS methods show good convergence owing to their boundary treatment, which makes them desirable for domain-decomposition parallel computations.
2006-03-01
[Garbled extraction of references and equations. Recoverable content: a NASA Glenn WIND validation tutorial (http://www.grc.nasa.gov/WWW/wind/valid/tutorial/spatconv.html, 2000); Toro, Eleuterio F., Riemann Solvers and Numerical Methods for Fluid Dynamics; Riemann invariants along the characteristics (Toro, 1999:120); a generalized pressure function f(p*, ξ, W_ξ), where ξ indicates the appropriate side; and the integral form of the conservation laws, ∮ [U dx − F(U) dt] = 0 (4.9), where the line integration is performed counter-clockwise along the boundary of the domain (Toro, 1999:62).]
Experiments In Characterizing Vibrations Of A Structure
NASA Technical Reports Server (NTRS)
Yam, Yeung; Hadaegh, Fred Y.; Bayard, David S.
1993-01-01
Report discusses experiments conducted to test methods of identification of vibrational and coupled rotational/vibrational modes of flexible structure. Report is one in series chronicling development of integrated system of methods, sensors, actuators, analog and digital signal-processing equipment, and algorithms to suppress vibrations in large, flexible structure even when dynamics of structure partly unknown and/or changing. Two prior articles describe aspects of research: "Autonomous Frequency-Domain Identification" (NPO-18099) and "Automated Characterization Of Vibrations Of A Structure" (NPO-18141).
Synergistic Instance-Level Subspace Alignment for Fine-Grained Sketch-Based Image Retrieval.
Li, Ke; Pang, Kaiyue; Song, Yi-Zhe; Hospedales, Timothy M; Xiang, Tao; Zhang, Honggang
2017-08-25
We study the problem of fine-grained sketch-based image retrieval. By performing instance-level (rather than category-level) retrieval, it embodies a timely and practical application, particularly with the ubiquitous availability of touchscreens. Three factors contribute to the challenging nature of the problem: (i) free-hand sketches are inherently abstract and iconic, making visual comparisons with photos difficult, (ii) sketches and photos are in two different visual domains, i.e. black and white lines vs. color pixels, and (iii) fine-grained distinctions are especially challenging when executed across domain and abstraction-level. To address these challenges, we propose to bridge the image-sketch gap both at the high-level via parts and attributes, as well as at the low-level, via introducing a new domain alignment method. More specifically, (i) we contribute a dataset with 304 photos and 912 sketches, where each sketch and image is annotated with its semantic parts and associated part-level attributes. With the help of this dataset, we investigate (ii) how strongly-supervised deformable part-based models can be learned that subsequently enable automatic detection of part-level attributes, and provide pose-aligned sketch-image comparisons. To reduce the sketch-image gap when comparing low-level features, we also (iii) propose a novel method for instance-level domain-alignment, that exploits both subspace and instance-level cues to better align the domains. Finally (iv) these are combined in a matching framework integrating aligned low-level features, mid-level geometric structure and high-level semantic attributes. Extensive experiments conducted on our new dataset demonstrate effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Zednik, S.
2015-12-01
Recent data publication practices have made increasing amounts of diverse datasets available online for the general research community to explore and integrate. Even with the abundance of data online, relevant data discovery and successful integration is still highly dependent upon the data being published with well-formed and understandable metadata. Tagging a dataset with well-known or controlled community terms is a common mechanism to indicate the intended purpose, subject matter, or other relevant facts of a dataset, however controlled domain terminology can be difficult for cross-domain researchers to interpret and leverage. It is also a challenge for integration portals to successfully provide cross-domain search capabilities over data holdings described using many different controlled vocabularies. Mappings between controlled vocabularies can be challenging because communities frequently develop specialized terminologies and have highly specific and contextual usages of common words. Despite this specificity it is highly desirable to produce cross-domain mappings to support data integration. In this contribution we evaluate the applicability of several data analytic techniques for the purpose of generating mappings between hierarchies of controlled science terms. We hope our efforts initiate more discussion on the topic and encourage future mapping efforts.
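The abstract does not name the specific analytic techniques evaluated, but a common baseline for mapping between two controlled vocabularies is lexical string similarity. The following illustrative sketch uses only the Python standard library; the function name and cutoff are assumptions, not part of the described work.

```python
import difflib

def map_terms(src_terms, dst_terms, cutoff=0.6):
    """Map each source term to its closest target term by string similarity, else None."""
    by_lower = {t.lower(): t for t in dst_terms}  # compare case-insensitively
    mapping = {}
    for s in src_terms:
        hits = difflib.get_close_matches(s.lower(), list(by_lower), n=1, cutoff=cutoff)
        mapping[s] = by_lower[hits[0]] if hits else None
    return mapping
```

Purely lexical matching misses synonyms and context-dependent usages (e.g. the same common word meaning different things in two communities), which is precisely the difficulty the contribution highlights and why richer analytic techniques are needed.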
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make this integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success in terms of efficiency and level of automation. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
Exploring MEDLINE Space with Random Indexing and Pathfinder Networks
Cohen, Trevor
2008-01-01
The integration of disparate research domains is a prerequisite for the success of the translational science initiative. MEDLINE abstracts contain content from a broad range of disciplines, presenting an opportunity for the development of methods able to integrate the knowledge they contain. Latent Semantic Analysis (LSA) and related methods learn human-like associations between terms from unannotated text. However, their computational and memory demands limit their ability to address a corpus of this size. Furthermore, visualization methods previously used in conjunction with LSA have limited ability to define the local structure of the associative networks LSA learns. This paper explores these issues by (1) processing the entire MEDLINE corpus using Random Indexing, a variant of LSA, and (2) exploring learned associations using Pathfinder Networks. Meaningful associations are inferred from MEDLINE, including a drug-disease association undetected by PubMed search. PMID:18999236
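Random Indexing, the LSA variant named in the abstract, sidesteps factorizing a huge term-document matrix: each context (here, each document) is assigned a sparse random "index vector", and a term's vector is the running sum of the index vectors of the contexts it occurs in. A toy sketch, with illustrative dimensions and sparsity rather than the paper's settings:

```python
import numpy as np

def random_indexing(docs, dim=300, nnz=10, seed=0):
    """Build term vectors by accumulating sparse ternary index vectors of contexts."""
    rng = np.random.default_rng(seed)
    vocab = sorted({w for doc in docs for w in doc})
    term_vecs = {w: np.zeros(dim) for w in vocab}
    for doc in docs:
        ctx = np.zeros(dim)                          # one sparse index vector per document
        pos = rng.choice(dim, size=nnz, replace=False)
        ctx[pos] = rng.choice([-1.0, 1.0], size=nnz)
        for w in doc:
            term_vecs[w] += ctx                      # terms inherit their contexts
    return term_vecs

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Terms that share many contexts end up with high cosine similarity while unrelated terms stay near-orthogonal with high probability, which is the mechanism behind the "human-like associations" mined from MEDLINE at full-corpus scale.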
A New Method for Conceptual Modelling of Information Systems
NASA Astrophysics Data System (ADS)
Gustas, Remigijus; Gustiene, Prima
Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, the conventional information system analysis and design methods cover just a part of required modelling notations for engineering of service architectures. They do not provide effective support to maintain semantic integrity between business processes and data. Service orientation is a paradigm that can be applied for conceptual modelling of information systems. The concept of service is rather well understood in different domains. It can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. Service-oriented method is used for semantic integration of information system static and dynamic aspects.
A new aerodynamic integral equation based on an acoustic formula in the time domain
NASA Technical Reports Server (NTRS)
Farassat, F.
1984-01-01
An aerodynamic integral equation for bodies moving at transonic and supersonic speeds is presented. It is based on a time-dependent acoustic formula for calculating the noise emanating from the outer portion of a propeller blade travelling at high speed (the Ffowcs Williams-Hawkings formulation); the loading terms and a conventional thickness source term are retained. Two surface and three line integrals are employed to solve an equation for the loading noise. The near-field term is regularized using the collapsing-sphere approach to obtain semiconvergence on the blade surface. A singular integral equation is thereby derived for the unknown surface pressure, amenable to numerical solution using Galerkin or collocation methods. The technique is useful for studying nonuniform inflow to the propeller.
Individual differences and their measurement: A review of 100 years of research.
Sackett, Paul R; Lievens, Filip; Van Iddekinge, Chad H; Kuncel, Nathan R
2017-03-01
This article reviews 100 years of research on individual differences and their measurement, with a focus on research published in the Journal of Applied Psychology. We focus on 3 major individual differences domains: (a) knowledge, skill, and ability, including both the cognitive and physical domains; (b) personality, including integrity, emotional intelligence, stable motivational attributes (e.g., achievement motivation, core self-evaluations), and creativity; and (c) vocational interests. For each domain, we describe the evolution of the domain across the years and highlight major theoretical, empirical, and methodological developments, including relationships between individual differences and variables such as job performance, job satisfaction, and career development. We conclude by discussing future directions for individual differences research. Trends in the literature include a growing focus on substantive issues rather than on the measurement of individual differences, a differentiation between constructs and measurement methods, and the use of innovative ways of assessing individual differences, such as simulations, other-reports, and implicit measures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Monte Carlo simulation of non-invasive glucose measurement based on FMCW LIDAR
NASA Astrophysics Data System (ADS)
Xiong, Bing; Wei, Wenxiong; Liu, Nan; He, Jian-Jun
2010-11-01
Continuous non-invasive glucose monitoring is a powerful tool for the treatment and management of diabetes. A glucose measurement method based on frequency modulated continuous wave (FMCW) LIDAR technology, with the potential advantage of miniaturizability with no moving parts, is proposed and investigated. The system mainly consists of an integrated near-infrared tunable semiconductor laser and a detector, using heterodyne technology to convert the signal from the time domain to the frequency domain. To investigate the feasibility of the method, Monte Carlo simulations have been performed on tissue phantoms with optical parameters similar to those of human interstitial fluid. The simulations showed that the sensitivity of the FMCW LIDAR system to glucose concentration can reach 0.2 mM. Our analysis suggests that the FMCW LIDAR technique has good potential for noninvasive blood glucose monitoring.
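The heterodyne step described above, mixing the received chirp with the transmitted one so that a propagation delay appears as a beat frequency f_b = (B/T)*tau, can be sketched as follows. All parameters are illustrative, not those of the actual system, and the signal is idealized complex baseband:

```python
import numpy as np

def fmcw_estimate_delay(tau, B=1.0e9, T=1.0e-3, fs=20.0e6):
    """Recover a round-trip delay tau from the FMCW beat frequency f_b = (B/T)*tau."""
    t = np.arange(int(T * fs)) / fs
    slope = B / T                                  # chirp rate in Hz/s
    # heterodyne mixing tx * conj(rx): the quadratic chirp phases cancel,
    # leaving a single tone whose frequency is proportional to the delay
    beat = np.exp(1j * np.pi * slope * (t**2 - (t - tau)**2))
    spectrum = np.abs(np.fft.fft(beat))
    freqs = np.fft.fftfreq(t.size, 1.0 / fs)
    f_beat = abs(freqs[np.argmax(spectrum)])
    return f_beat / slope
```

In the glucose application the recovered delay maps to optical path length in tissue rather than free-space range, and the Monte Carlo simulations model how scattering in the phantom spreads that delay.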
Flow Charts: Visualization of Vector Fields on Arbitrary Surfaces
Li, Guo-Shi; Tricoche, Xavier; Weiskopf, Daniel; Hansen, Charles
2009-01-01
We introduce a novel flow visualization method called Flow Charts, which uses a texture atlas approach for the visualization of flows defined over curved surfaces. In this scheme, the surface and its associated flow are segmented into overlapping patches, which are then parameterized and packed in the texture domain. This scheme allows accurate particle advection across multiple charts in the texture domain, providing a flexible framework that supports various flow visualization techniques. The use of surface parameterization enables flow visualization techniques requiring the global view of the surface over long time spans, such as Unsteady Flow LIC (UFLIC), particle-based Unsteady Flow Advection Convolution (UFAC), or dye advection. It also prevents visual artifacts normally associated with view-dependent methods. Represented as textures, Flow Charts can be naturally integrated into hardware accelerated flow visualization techniques for interactive performance. PMID:18599918
Quantitative Frequency-Domain Passive Cavitation Imaging
Haworth, Kevin J.; Bader, Kenneth B.; Rich, Kyle T.; Holland, Christy K.; Mast, T. Douglas
2017-01-01
Passive cavitation detection has been an instrumental technique for measuring cavitation dynamics, elucidating concomitant bioeffects, and guiding ultrasound therapies. Recently, techniques have been developed to create images of cavitation activity to provide investigators with a more complete set of information. These techniques use arrays to record and subsequently beamform received cavitation emissions, rather than processing emissions received on a single-element transducer. In this paper, the methods for performing frequency-domain delay, sum, and integrate passive imaging are outlined. The method can be applied to any passively acquired acoustic scattering or emissions, including cavitation emissions. In order to compare data across different systems, techniques for normalizing Fourier transformed data and converting the data to the acoustic energy received by the array are described. A discussion of hardware requirements and alternative imaging approaches are additionally outlined. Examples are provided in MATLAB. PMID:27992331
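The paper's examples are in MATLAB; below is an illustrative Python sketch of the frequency-domain delay, sum, and integrate step only (array geometry, band, and sound speed are assumptions, and the normalization to physical acoustic energy described in the paper is omitted):

```python
import numpy as np

def passive_image(rf, fs, elem_x, pix_x, pix_z, c=1540.0, band=(0.5e6, 4.0e6)):
    """Frequency-domain delay-sum-and-integrate beamforming.

    rf: (n_elements, n_samples) array of passively received emissions."""
    F = np.fft.rfft(rf, axis=1)
    f = np.fft.rfftfreq(rf.shape[1], 1.0 / fs)
    sel = (f >= band[0]) & (f <= band[1])          # integrate energy over this band
    img = np.zeros((len(pix_z), len(pix_x)))
    for iz, z in enumerate(pix_z):
        for ix, xp in enumerate(pix_x):
            dist = np.hypot(elem_x - xp, z)        # pixel-to-element distances
            # advance each channel by its propagation delay (a phase ramp in
            # the frequency domain), then sum across the array
            steer = np.exp(2j * np.pi * np.outer(f[sel], dist) / c)
            summed = np.sum(F[:, sel].T * steer, axis=1)
            img[iz, ix] = np.sum(np.abs(summed)**2)
    return img
```

Working per frequency bin makes band selection trivial (e.g. imaging only harmonics or broadband emissions), which is one practical motivation for the frequency-domain formulation.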
Bulger, Carrie A; Matthews, Russell A; Hoffman, Mark E
2007-10-01
While researchers are increasingly interested in understanding the boundaries surrounding the work and personal life domains, few have tested the propositions set forth by theory. Boundary theory proposes that individuals manage the boundaries between work and personal life through processes of segmenting and/or integrating the domains. The authors investigated the boundary management profiles of 332 workers along the segmentation-integration continuum. Cluster analysis indicated consistent clusters of boundary management practices related to varying segmentation and integration of the work and personal life domains. However, the authors suggest that the segmentation-integration continuum may be more complicated. Results also indicated relationships between boundary management practices and work-personal life interference and work-personal life enhancement. Less flexible and more permeable boundaries were related to more interference, while more flexible and more permeable boundaries were related to more enhancement.
Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.
Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle
2017-02-01
To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data had good fit with the specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.
Searching Across the International Space Station Databases
NASA Technical Reports Server (NTRS)
Maluf, David A.; McDermott, William J.; Smith, Ernest E.; Bell, David G.; Gurram, Mohana
2007-01-01
Data access in the enterprise generally requires combining data from different sources and different formats. It is thus advantageous to focus on the intersection of the knowledge across sources and domains; keeping irrelevant knowledge around only serves to make the integration more unwieldy and more complicated than necessary. This paper proposes a context search over multiple domains that uses context-sensitive queries to support disciplined manipulation of domain knowledge resources. The objective of a context search is to provide the capability for interrogating many domain knowledge resources, which are largely semantically disjoint. The search formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper demonstrates a new paradigm in composition of information for enterprise applications. In particular, it discusses an approach to achieving data integration across multiple sources in a manner that does not require heavy investment in database and middleware maintenance. This lean approach to integration leads to cost-effectiveness and scalability of data integration with an underlying schemaless object-relational database management system. This highly scalable, information-on-demand system framework, called NX-Search, is an implementation of an information system built on NETMARK. NETMARK is a flexible, high-throughput open database integration framework for managing, storing, and searching unstructured or semi-structured arbitrary XML and HTML that is used widely at the National Aeronautics and Space Administration (NASA) and in industry.
NASA Astrophysics Data System (ADS)
Jerome, Diane C.
This study explored how science teachers and school administrators perceive the use of the affective domain during science instruction situated within a high-stakes testing environment. Through a multimethodological inquiry using phenomenology and critical ethnography, the researcher conducted semi-structured interviews with six fifth-grade science teachers and two administrators from two Texas school districts. Data reconstructions from interviews formed a bricolage of diagrams that trace the researcher's steps through a reflective exploration of these phenomena. This study addressed the following research questions: (a) What are the attitudes, interests, and values (affective domain) that fifth-grade science teachers integrate into science instruction? (b) How do fifth-grade science teachers attempt to integrate attitudes, interests and values (affective domain) in science instruction? and (c) How do fifth-grade science teachers manage to balance the tension from the seeming pressures caused by a high-stakes testing environment and the integration of attitudes, interests and values (affective domain) in science instruction? The findings from this study indicate that as teachers tried to integrate the affective domain during science instruction, (a) their work was set within a framework of institutional values, (b) teaching science for understanding looked different before and after the onset of the science Texas Assessment of Knowledge and Skills (TAKS), and (c) upon administration of the science TAKS, teachers broadened their aim, raised their expectations, and furthered their professional development. The integration of the affective domain fell into two distinct categories: 1) teachers targeted student affect and 2) teachers modeled affective behavior.
Kerrissey, Michaela J; Clark, Jonathan R; Friedberg, Mark W; Jiang, Wei; Fryer, Ashley K; Frean, Molly; Shortell, Stephen M; Ramsay, Patricia P; Casalino, Lawrence P; Singer, Sara J
2017-05-01
Structural integration is increasing among medical groups, but whether these changes yield care that is more integrated remains unclear. We explored the relationships between structural integration characteristics of 144 medical groups and perceptions of integrated care among their patients. Patients' perceptions were measured by a validated national survey of 3,067 Medicare beneficiaries with multiple chronic conditions across six domains that reflect knowledge and support of, and communication with, the patient. Medical groups' structural characteristics were taken from the National Study of Physician Organizations and included practice size, specialty mix, technological capabilities, and care management processes. Patients' survey responses were most favorable for the domain of test result communication and least favorable for the domain of provider support for medication and home health management. Medical groups' characteristics were not consistently associated with patients' perceptions of integrated care. However, compared to patients of primary care groups, patients of multispecialty groups had strongly favorable perceptions of medical group staff knowledge of patients' medical histories. Opportunities exist to improve patient care, but structural integration of medical groups might not be sufficient for delivering care that patients perceive as integrated. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
Liu, Qimao
2018-02-01
This paper proposes the assumption that the fibre is an elastic material and the polymer matrix is a viscoelastic material, so that energy dissipation in the dynamic response process depends only on the polymer matrix. The damping force vectors of FRP (fibre-reinforced polymer matrix) laminated composite plates are derived in the frequency and time domains based on this assumption. The governing equations of FRP laminated composite plates are formulated in both the frequency and time domains. The direct inversion method and the direct time integration method for nonviscously damped systems are employed to solve the governing equations and obtain the dynamic responses in the frequency and time domains, respectively. The computational procedure is given in detail. Finally, dynamic responses (frequency responses with nonzero and zero initial conditions, free vibration, forced vibrations with nonzero and zero initial conditions) of an FRP laminated composite plate are computed using the proposed methodology. The proposed methodology is easy to insert into commercial finite element analysis software. The proposed assumption, based on the theory of material mechanics, needs to be further validated by experiment in the future.
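The direct inversion method in the frequency domain amounts to solving the dynamic stiffness system at each excitation frequency. A minimal NumPy sketch, assuming small dense matrices from a hypothetical finite element model (the function name `frequency_response` and the constant damping callable in the example are illustrative, not the paper's implementation):

```python
import numpy as np

def frequency_response(M, K, C_of_omega, F, omegas):
    """Direct-inversion frequency response of a nonviscously damped system.

    Solves [K + i*w*C(w) - w^2*M] X(w) = F at each circular frequency w,
    where C_of_omega(w) returns the frequency-dependent damping matrix
    (here attributed to the viscoelastic polymer matrix). A sketch: in
    practice M, K, C would come from an FE model of the laminated plate.
    """
    X = np.empty((len(omegas), len(F)), dtype=complex)
    for k, w in enumerate(omegas):
        D = K + 1j * w * C_of_omega(w) - w**2 * M   # dynamic stiffness
        X[k] = np.linalg.solve(D, F)
    return X
```

For a single degree of freedom with m = 1, k = 4, and C(w) = 0.1, the static limit w = 0 recovers X = F/k and the response peaks near the undamped natural frequency w = 2.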
Using robust Bayesian network to estimate the residuals of fluoroquinolone antibiotic in soil.
Li, Xuewen; Xie, Yunfeng; Li, Lianfa; Yang, Xunfeng; Wang, Ning; Wang, Jinfeng
2015-11-01
Prediction of antibiotic pollution and its consequences is difficult, due to the uncertainties and complexities associated with multiple related factors. This article employed domain knowledge and spatial data to construct a Bayesian network (BN) model to assess fluoroquinolone antibiotic (FQs) pollution in the soil of an intensive vegetable cultivation area. The results show: (1) The relationships between FQs pollution and contributory factors: three factors (cultivation methods, crop rotations, and chicken manure types) were consistently identified as predictors in the topological structures of three FQs, indicating their importance in FQs pollution; as deduced from domain knowledge, the cultivation methods are determined by the crop rotations, which require different nutrients (derived from the manure) according to different plant biomass. (2) The performance of the BN model: the integrative robust Bayesian network model achieved the highest detection probability (pd) of high-risk areas and the largest receiver operating characteristic (ROC) area, since it incorporates domain knowledge and model uncertainty. Our encouraging findings have implications for the use of BN as a robust approach to assessment of FQs pollution and for informing decisions on appropriate remedial measures.
2012-01-01
Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986
Mueller, Daniel R.; Schmidt, Stefanie J.; Roder, Volker
2015-01-01
Objective: Cognitive remediation (CR) approaches have been demonstrated to be effective in improving cognitive functions in schizophrenia. However, there is a lack of integrated CR approaches that target multiple neuro- and social-cognitive domains with a special focus on the generalization of therapy effects to functional outcome. Method: This 8-site randomized controlled trial evaluated the efficacy of a novel CR group therapy approach called integrated neurocognitive therapy (INT). INT includes well-defined exercises to improve all neuro- and social-cognitive domains as defined by the Measurement And Treatment Research to Improve Cognition in Schizophrenia (MATRICS) initiative by compensation and restitution. One hundred and fifty-six outpatients with a diagnosis of schizophrenia or schizoaffective disorder according to DSM-IV-TR or ICD-10 were randomly assigned to receive 15 weeks of INT or treatment as usual (TAU). INT patients received 30 bi-weekly therapy sessions. Each session lasted 90 min. Mixed models were applied to assess changes in neurocognition, social cognition, symptoms, and functional outcome at post-treatment and at 9-month follow-up. Results: In comparison to TAU, INT patients showed significant improvements in several neuro- and social-cognitive domains, negative symptoms, and functional outcome after therapy and at 9-month follow-up. Number-needed-to-treat analyses indicate that only 5 INT patients are necessary to produce durable and meaningful improvements in functional outcome. Conclusions: Integrated interventions on neurocognition and social cognition have the potential to improve not only cognitive performance but also functional outcome. These findings are important as treatment guidelines for schizophrenia have criticized CR for its poor generalization effects. PMID:25713462
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tzuang, C.K.C.
1986-01-01
Various MMIC (monolithic microwave integrated circuit) planar waveguides have shown the possible existence of a slow-wave propagation. In many practical applications of these slow-wave circuits, the semiconductor devices have nonuniform material properties that may affect the slow-wave propagation. In the first part of the dissertation, the effects of the nonuniform material properties are studied by a finite-element method. In addition, the transient pulse excitations of these slow-wave circuits are also of great theoretical and practical interest. In the second part, the time-domain analysis of a slow-wave coplanar waveguide is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manjunath, Naren; Samajdar, Rhine; Jain, Sudhir R., E-mail: srjain@barc.gov.in
Recently, the nodal domain counts of planar, integrable billiards with Dirichlet boundary conditions were shown to satisfy certain difference equations in Samajdar and Jain (2014). The exact solutions of these equations give the number of domains explicitly. For complete generality, we demonstrate this novel formulation for three additional separable systems and thus extend the statement to all integrable billiards.
The Adapted Dance Process: Planning, Partnering, and Performing
ERIC Educational Resources Information Center
Block, Betty A.; Johnson, Peggy V.
2011-01-01
This article contains specific planning, partnering, and performing techniques for fully integrating dancers with special needs into a dance pedagogy program. Each aspect is discussed within the context of the domains of learning. Fundamental partnering strategies are related to each domain as part of the integration process. The authors recommend…
Nam, Sung-Ki; Kim, Jung-Kyun; Cho, Sung-Cheon; Lee, Sun-Kyu
2010-01-01
A complementary metal-oxide semiconductor-compatible process was used in the design and fabrication of a suspended membrane microfluidic heat flux sensor with a thermopile for the purpose of measuring the heat flow rate. The combination of a thirty-junction gold and nickel thermoelectric sensor with an ultralow noise preamplifier, a low pass filter, and a lock-in amplifier can yield a resolution of 20 nW with a sensitivity of 461 V/W. The thermal modulation method is used to eliminate low-frequency noise from the sensor output, and various amounts of fluidic heat were applied to the sensor to investigate its suitability for microfluidic applications. For sensor design and analysis of signal output, a method of modeling and simulating electro-thermal behavior in a microfluidic heat flux sensor with an integrated electronic circuit is presented and validated. The electro-thermal domain model was constructed by using system dynamics, particularly the bond graph. The electro-thermal domain system model, in which the thermal and the electrical domains are coupled, expresses the heat generation of samples and converts thermal input to electrical output. The proposed electro-thermal domain system model is in good agreement with the measured output voltage response in both the transient and the steady state. PMID:22163568
Integrating different perspectives on socialization theory and research: a domain-specific approach.
Grusec, Joan E; Davidov, Maayan
2010-01-01
There are several different theoretical and research approaches to the study of socialization, characterized by frequently competing basic tenets and apparently contradictory evidence. As a way of integrating approaches and understanding discrepancies, it is proposed that socialization processes be viewed from a domain perspective, with each domain characterized by a particular form of social interaction between the object and agent of socialization and by specific socialization mechanisms and outcomes. It is argued that this approach requires researchers to identify the domain of social interaction they are investigating, to understand that phenotypically similar behaviors may belong to different domains, and to acknowledge that caregivers who are effective in one type of interaction may not be effective in another.
Bian, Xu; Li, Yibo; Feng, Hao; Wang, Jiaqiang; Qi, Lei; Jin, Shijiu
2015-01-01
This paper proposes a continuous leakage location method based on an ultrasonic array sensor, which is specific to continuous gas leakage in a pressure container with an integral stiffener. This method collects the ultrasonic signals generated from the leakage hole through the piezoelectric ultrasonic sensor array, and analyzes the space-time correlation of every collected signal in the array. Meanwhile, it combines this with the method of frequency compensation and superposition in the time domain (SITD), based on the acoustic characteristics of the stiffener, to obtain a high-accuracy location result on the stiffener wall. According to the experimental results, the method successfully solves the orientation problem concerning continuous ultrasonic signals generated from leakage sources, and acquires high-accuracy location information on the leakage source using a combination of multiple sets of orienting results. The mean value of the location absolute error is 13.51 mm on the one-square-meter plate with an integral stiffener (4 mm width; 20 mm height; 197 mm spacing), and the maximum location absolute error is generally within a ±25 mm interval. PMID:26404316
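The space-time correlation step underlying such array methods can be sketched as a time-delay estimate between two array elements: the lag of the cross-correlation peak gives the arrival-time difference, and the set of pairwise delays together with the sensor geometry yields a direction estimate. This NumPy sketch is an illustrative assumption (the helper name `delay_estimate` is hypothetical and the paper's SITD processing is not reproduced here):

```python
import numpy as np

def delay_estimate(sig_a, sig_b, fs):
    """Estimate the arrival-time difference of a continuous leak signal at
    two array elements from the peak of their cross-correlation.
    Positive result: sig_a arrives later than sig_b. fs is the sampling
    rate in Hz."""
    n = len(sig_a)
    corr = np.correlate(sig_a, sig_b, mode="full")   # lags -(n-1)..(n-1)
    lag = np.argmax(corr) - (n - 1)
    return lag / fs
```

For broadband leak noise, the correlation peak is sharp; for narrowband signals, ambiguity at multiples of the period is why combining multiple sensor pairs and orienting results, as the paper does, improves robustness.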
Feature generation and representations for protein-protein interaction classification.
Lan, Man; Tan, Chew Lim; Su, Jian
2009-10-01
Automatically detecting articles relevant to protein-protein interactions (PPI) is a crucial step for large-scale biological database curation. Previous work adopted POS tagging, shallow parsing, and sentence splitting techniques, but achieved worse performance than the simple bag-of-words representation. In this paper, we generated and investigated multiple types of feature representations in order to further improve the performance of the PPI text classification task. Besides the traditional domain-independent bag-of-words approach and the term weighting methods, we also explored other domain-dependent features, i.e. protein-protein interaction trigger keywords, protein named entities, and advanced ways of incorporating Natural Language Processing (NLP) output. The integration of these multiple features has been evaluated on the BioCreAtIvE II corpus. The experimental results showed that both the advanced way of using NLP output and the integration of bag-of-words and NLP output improved the performance of text classification. Specifically, in comparison with the best performance achieved in the BioCreAtIvE II IAS, the feature-level and classifier-level integration of multiple features improved the performance of classification by 2.71% and 3.95%, respectively.
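Feature-level integration of the kind described above simply concatenates the domain-independent and domain-dependent representations so a single classifier sees both. A minimal sketch, where the trigger keyword list `TRIGGERS` is a hypothetical stand-in for the paper's curated PPI trigger lexicon:

```python
import numpy as np

# Hypothetical PPI trigger keyword stems, not the paper's actual lexicon.
TRIGGERS = ("interact", "bind", "phosphorylate")

def featurize(doc, vocab):
    """Feature-level integration (sketch): a bag-of-words count vector over
    `vocab` is concatenated with domain-dependent counts of tokens that
    start with a PPI trigger stem, yielding one combined feature vector."""
    tokens = doc.lower().split()
    bow = np.array([tokens.count(w) for w in vocab], dtype=float)
    trig = np.array([sum(t.startswith(k) for t in tokens) for k in TRIGGERS],
                    dtype=float)
    return np.concatenate([bow, trig])
```

Classifier-level integration, by contrast, would train separate classifiers on each representation and combine their scores.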
NASA Astrophysics Data System (ADS)
McKnight, Timothy E.; Melechko, Anatoli V.; Griffin, Guy D.; Guillorn, Michael A.; Merkulov, Vladimir I.; Serna, Francisco; Hensley, Dale K.; Doktycz, Mitchel J.; Lowndes, Douglas H.; Simpson, Michael L.
2003-05-01
We demonstrate the integration of vertically aligned carbon nanofibre (VACNF) elements with the intracellular domains of viable cells for controlled biochemical manipulation. Deterministically synthesized VACNFs were modified with either adsorbed or covalently-linked plasmid DNA and were subsequently inserted into cells. Post insertion viability of the cells was demonstrated by continued proliferation of the interfaced cells and long-term (> 22 day) expression of the introduced plasmid. Adsorbed plasmids were typically desorbed in the intracellular domain and segregated to progeny cells. Covalently bound plasmids remained tethered to nanofibres and were expressed in interfaced cells but were not partitioned into progeny, and gene expression ceased when the nanofibre was no longer retained. This provides a method for achieving a genetic modification that is non-inheritable and whose extent in time can be directly and precisely controlled. These results demonstrate the potential of VACNF arrays as an intracellular interface for monitoring and controlling subcellular and molecular phenomena within viable cells for applications including biosensors, in vivo diagnostics, and in vivo logic devices.
NASA Astrophysics Data System (ADS)
Ling, Shenglong; Wang, Wei; Yu, Lu; Peng, Junhui; Cai, Xiaoying; Xiong, Ying; Hayati, Zahra; Zhang, Longhua; Zhang, Zhiyong; Song, Likai; Tian, Changlin
2016-01-01
Electron paramagnetic resonance (EPR)-based hybrid experimental and computational approaches were applied to determine the structure of a full-length E. coli integral membrane sulfurtransferase, dimeric YgaP, and its structural and dynamic changes upon ligand binding. The solution NMR structures of the YgaP transmembrane domain (TMD) and cytosolic catalytic rhodanese domain were reported recently, but the tertiary fold of full-length YgaP was not yet available. Here, systematic site-specific EPR analysis defined a helix-loop-helix secondary structure of the YgaP-TMD monomers using mobility, accessibility and membrane immersion measurements. The tertiary folds of dimeric YgaP-TMD and full-length YgaP in detergent micelles were determined through inter- and intra-monomer distance mapping and rigid-body computation. Further EPR analysis demonstrated the tight packing of the two YgaP second transmembrane helices upon binding of the catalytic product SCN-, which provides insight into the thiocyanate exportation mechanism of YgaP in the E. coli membrane.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2015-01-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671
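One simple way to integrate heterogeneous quality assessment methods, sketched below, is to z-score each method's outputs across the model pool (so methods on different scales are comparable) and average them into a consensus ranking. This is an illustrative assumption about the combination step, not MULTICOM's actual integration scheme:

```python
import numpy as np

def consensus_rank(scores):
    """Combine several model-quality-assessment methods (sketch).

    scores: (n_methods, n_models) array, one row per QA method, where
    larger values mean better models. Each row is z-scored and the rows
    are averaged; returns model indices ordered best-first.
    """
    s = np.asarray(scores, dtype=float)
    mu = s.mean(axis=1, keepdims=True)
    sd = s.std(axis=1, keepdims=True)
    sd[sd == 0] = 1.0                      # guard against constant rows
    z = (s - mu) / sd
    consensus = z.mean(axis=0)
    return np.argsort(-consensus)
```

Methods whose scores must be minimized (e.g. energy-like scores) would be negated before stacking.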
Linear-scaling explicitly correlated treatment of solids: periodic local MP2-F12 method.
Usvyat, Denis
2013-11-21
Theory and implementation of the periodic local MP2-F12 method in the 3*A fixed-amplitude ansatz is presented. The method is formulated in the direct space, employing local representation for the occupied, virtual, and auxiliary orbitals in the form of Wannier functions (WFs), projected atomic orbitals (PAOs), and atom-centered Gaussian-type orbitals, respectively. Local approximations are introduced, restricting the list of the explicitly correlated pairs, as well as occupied, virtual, and auxiliary spaces in the strong orthogonality projector to the pair-specific domains on the basis of spatial proximity of respective orbitals. The 4-index two-electron integrals appearing in the formalism are approximated via the direct-space density fitting technique. In this procedure, the fitting orbital spaces are also restricted to local fit-domains surrounding the fitted densities. The formulation of the method and its implementation exploits the translational symmetry and the site-group symmetries of the WFs. Test calculations are performed on LiH crystal. The results show that the periodic LMP2-F12 method substantially accelerates basis set convergence of the total correlation energy, and even more so the correlation energy differences. The resulting energies are quite insensitive to the resolution-of-the-identity domain sizes and the quality of the auxiliary basis sets. The convergence with the orbital domain size is somewhat slower, but still acceptable. Moreover, inclusion of slightly more diffuse functions, than those usually used in the periodic calculations, improves the convergence of the LMP2-F12 correlation energy with respect to both the size of the PAO-domains and the quality of the orbital basis set. At the same time, the essentially diffuse atomic orbitals from standard molecular basis sets, commonly utilized in molecular MP2-F12 calculations, but problematic in the periodic context, are not necessary for LMP2-F12 treatment of crystals.
Pan, Xiaoyong; Shen, Hong-Bin
2017-02-28
RNAs play key roles in cells through interactions with proteins known as RNA-binding proteins (RBPs), and their binding motifs enable crucial understanding of the post-transcriptional regulation of RNAs. How RBPs correctly recognize target RNAs and why they bind specific positions is still far from clear. Machine learning-based algorithms are widely acknowledged to be capable of speeding up this process. Although many automatic tools have been developed to predict RNA-protein binding sites from the rapidly growing multi-resource data, e.g. sequence and structure, their domain-specific features and formats have posed significant computational challenges. One of the current difficulties is that the cross-source shared common knowledge is at a higher abstraction level beyond the observed data, resulting in a low efficiency of direct integration of observed data across domains. The other difficulty is how to interpret the prediction results. Existing approaches tend to terminate after outputting the potential discrete binding sites on the sequences, but how to assemble them into meaningful binding motifs is a topic worthy of further investigation. In view of these challenges, we propose a deep learning-based framework (iDeep) that uses a novel hybrid convolutional neural network and deep belief network to predict the RBP interaction sites and motifs on RNAs. This new protocol is featured by transforming the original observed data into a high-level abstraction feature space using multiple layers of learning blocks, where the shared representations across different domains are integrated.
To validate our iDeep method, we performed experiments on 31 large-scale CLIP-seq datasets, and our results show that by integrating multiple sources of data, the average AUC can be improved by 8% compared to the best single-source-based predictor; and through cross-domain knowledge integration at an abstraction level, it outperforms the state-of-the-art predictors by 6%. Besides the overall enhanced prediction performance, the convolutional neural network module embedded in iDeep is also able to automatically capture the interpretable binding motifs for RBPs. Large-scale experiments demonstrate that these mined binding motifs agree well with the experimentally verified results, suggesting iDeep is a promising approach in real-world applications. The iDeep framework not only achieves better performance than the state-of-the-art predictors, but also easily captures interpretable binding motifs. iDeep is available at http://www.csbio.sjtu.edu.cn/bioinf/iDeep.
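The way a convolutional filter can be read out as a binding motif, as described above, can be illustrated in NumPy: the filter weights act like a position weight matrix slid along a one-hot-encoded sequence, and the window with the maximum activation marks the most motif-like site. This is a toy sketch of the idea, not iDeep's actual network (the helper names `one_hot` and `motif_scan` are hypothetical):

```python
import numpy as np

BASES = "ACGU"

def one_hot(seq):
    """One-hot encode an RNA sequence into an (L, 4) array."""
    return np.array([[float(b == c) for c in BASES] for b in seq])

def motif_scan(seq, pwm):
    """Slide a (k, 4) weight matrix (a conv filter / PWM stand-in) along
    the one-hot sequence and return (best position, max activation).
    In a trained CNN, the learned filter weights play the role of pwm,
    which is how such filters can be interpreted as binding motifs."""
    x = one_hot(seq)
    k = pwm.shape[0]
    acts = np.array([(x[i:i + k] * pwm).sum()
                     for i in range(len(seq) - k + 1)])
    return int(np.argmax(acts)), float(acts.max())
```

With a filter matching "GCAU", the scan peaks exactly where that subsequence occurs.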
B. Tyler Wilson; Andrew J. Lister; Rachel I. Riemann
2012-01-01
The paper describes an efficient approach for mapping multiple individual tree species over large spatial domains. The method integrates vegetation phenology derived from MODIS imagery and raster data describing relevant environmental parameters with extensive field plot data of tree species basal area to create maps of tree species abundance and distribution at a 250-...
Applied Cognitive Models of Behavior and Errors Patterns
2017-09-01
methods offer an opportunity to deliver good, effective introductory and basic training, thus potentially enabling a single human instructor to train ...emergency medical technician (EMT) domain, which offers a standardized curriculum on which we can create training scenarios. 2. Develop...complexity of software integration and limited access to physical devices can result in commitment to a design that turns out to not offer many training
1993-10-29
natural logarithm of the ratio of two maxima a period apart. Both methods are based on the results from the numerical integration. The details of this...check and okay member functions are for software handshaking between the client and server process. Finally, the Forward function is used to initiate a
NASA Astrophysics Data System (ADS)
Godoy, William F.; DesJardin, Paul E.
2010-05-01
The application of flux limiters to the discrete ordinates method (DOM), SN, for radiative transfer calculations is discussed and analyzed for 3D enclosures for cases in which the intensities are strongly coupled to each other, such as radiative equilibrium and scattering media. A Newton-Krylov iterative method (GMRES) solves the final systems of linear equations, along with a domain decomposition strategy for parallel computation using message passing libraries in a distributed memory system. Ray effects due to angular discretization and errors due to domain decomposition are minimized so that only small variations are introduced by these effects, in order to focus on the influence of flux limiters on errors due to spatial discretization, known as numerical diffusion, smearing, or false scattering. Results are presented for the DOM-integrated quantities such as heat flux, irradiation, and emission. A variety of flux limiters are compared to "exact" solutions available in the literature, such as the integral solution of the RTE for pure absorbing-emitting media and isotropic scattering cases and a Monte Carlo solution for a forward scattering case. Additionally, a non-homogeneous 3D enclosure is included to extend the use of flux limiters to more practical cases. The overall balance of convergence, accuracy, speed and stability using flux limiters is shown to be superior compared to step schemes for any test case.
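The linear-solver step described above can be sketched with SciPy's GMRES. The matrix below is a stand-in diagonally dominant sparse system, not the actual discretized radiative-transfer operator; the point is only the shape of the Krylov solve, which returns the solution together with a convergence flag:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Stand-in sparse system A x = b; in the paper's setting, A couples the
# discrete ordinates after spatial discretization with a flux limiter.
n = 200
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = gmres(A, b)                 # info == 0 signals convergence
residual = np.linalg.norm(A @ x - b)  # check the achieved residual
```

In a parallel domain-decomposition setting, the matrix-vector product inside GMRES is what gets distributed; the Krylov iteration itself is unchanged.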
Source imaging of potential fields through a matrix space-domain algorithm
NASA Astrophysics Data System (ADS)
Baniamerian, Jamaledin; Oskooi, Behrooz; Fedi, Maurizio
2017-01-01
Imaging of potential fields yields a fast 3D representation of the source distribution of potential fields. Imaging methods are all based on multiscale methods allowing the source parameters of potential fields to be estimated from a simultaneous analysis of the field at various scales or, in other words, at many altitudes. Accuracy in performing upward continuation and differentiation of the field therefore has a key role for this class of methods. Here we describe an accurate method for performing upward continuation and vertical differentiation in the space domain. We perform a direct discretization of the integral equations for upward continuation and the Hilbert transform; from these equations we then define matrix operators performing the transformation, which are symmetric (upward continuation) and anti-symmetric (differentiation), respectively. Thanks to these properties, only the first row of each matrix needs to be computed, dramatically decreasing the computational cost. Our approach allows a simple procedure, with the advantage of not involving the large data extension or tapering required for Fourier-domain computation. It also allows level-to-drape upward continuation and a stable differentiation at high frequencies; finally, the upward continuation and differentiation kernels may be merged into a single kernel. The accuracy of our approach is shown to be important for multiscale algorithms, such as the continuous wavelet transform or the DEXP (depth from extreme points) method, because border errors, which tend to propagate strongly at the largest scales, are radically reduced. The application of our algorithm to synthetic and real-case gravity and magnetic data sets confirms the accuracy of our space-domain strategy over FFT algorithms and standard convolution procedures.
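The storage saving described above follows because a shift-invariant kernel sampled on a regular grid yields a Toeplitz operator, fully determined by its first row, with symmetric or anti-symmetric structure. A minimal Python sketch of that idea for a 1D profile (illustrative only; the paper works with 2D grids and its own continuation and Hilbert-transform kernels):

```python
import numpy as np

def operator_from_first_row(first_row, antisymmetric=False):
    """Build a shift-invariant (Toeplitz) operator from its first row.
    The symmetric case models an upward-continuation kernel; the
    anti-symmetric case models a differentiation (Hilbert) kernel."""
    n = len(first_row)
    A = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            v = first_row[abs(i - j)]
            A[i, j] = -v if (antisymmetric and i > j) else v
    return A

row = np.array([1.0, 0.5, 0.2, 0.1])
U = operator_from_first_row(row)                     # symmetric
row_h = np.array([0.0, 0.5, 0.2, 0.1])               # zero diagonal
H = operator_from_first_row(row_h, antisymmetric=True)
```

Only the first row is ever stored; applying U or H to a field profile is then a matrix-vector product, with no data extension or tapering.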
Resolution enhancement of robust Bayesian pre-stack inversion in the frequency domain
NASA Astrophysics Data System (ADS)
Yin, Xingyao; Li, Kun; Zong, Zhaoyun
2016-10-01
AVO/AVA (amplitude variation with offset or angle) inversion is one of the most practical and useful approaches to estimating model parameters. So far, publications on AVO inversion in the Fourier domain have been quite limited, owing to its poor stability and sensitivity to noise compared with time-domain inversion. To improve the resolution and stability of AVO inversion in the Fourier domain, a novel robust Bayesian pre-stack AVO inversion based on a mixed-domain formulation of stationary convolution is proposed, which resolves the instability and achieves superior resolution. The Fourier operator is integrated into the objective equation, which avoids the inverse Fourier transform in the inversion process. Furthermore, background constraints on the model parameters are taken into consideration to improve the stability and reliability of the inversion, compensating for the low-frequency components of seismic signals. In addition, the different frequency components of the seismic signals decouple automatically, which allows the inverse problem to be solved by multi-component successive iterations and improves its convergence precision. Thus, superior resolution compared with conventional time-domain pre-stack inversion can be achieved easily. Synthetic tests illustrate that the proposed method achieves high-resolution results in close agreement with the theoretical model and verifies its robustness to noise. Finally, application to a field data case demonstrates that the proposed method obtains stable inversion results for elastic parameters from pre-stack seismic data, in conformity with the real logging data.
Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems
NASA Astrophysics Data System (ADS)
Igaki, Hiroshi; Nakamura, Masahide
This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
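As a toy illustration of the appliance-interaction definition above (two method invocations conflicting on the same appliance), one might model each service as a list of method invocations that set appliance properties and flag pairs that write different values to the same property. The names and data layout here are hypothetical, not the paper's formal model:

```python
def appliance_interactions(service_a, service_b):
    """Each service is a list of (appliance, property, value) effects
    of its method invocations. Returns the (appliance, property)
    pairs on which the two services write conflicting values."""
    conflicts = []
    for (app_a, prop_a, val_a) in service_a:
        for (app_b, prop_b, val_b) in service_b:
            if app_a == app_b and prop_a == prop_b and val_a != val_b:
                conflicts.append((app_a, prop_a))
    return conflicts

# hypothetical HNS services: air cleaning opens the window,
# an energy-saving HVAC service closes it
air_cleaning = [("window", "open", True), ("fan", "power", "on")]
hvac_saving = [("window", "open", False)]
print(appliance_interactions(air_cleaning, hvac_saving))
```

An environment interaction, by contrast, would require tracking indirect effects on shared environment properties (e.g. temperature), which this sketch omits.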
Integral method for transient He II heat transfer in a semi-infinite domain
NASA Astrophysics Data System (ADS)
Baudouy, B.
2002-05-01
Integral methods are suited to solving a non-linear system of differential equations where the non-linearity appears either in the differential equations or in the boundary conditions. Though they are approximate methods, they have proven to give simple solutions with acceptable accuracy for transient heat transfer in He II. Taking into account the temperature dependence of the thermal properties, direct solutions are found without the need to adjust a parameter. Previously, we presented a solution for the clamped-heat-flux problem, and in the present study this method is extended to accommodate the clamped-temperature problem. In the case of constant thermal properties, this method yields results that are within a few percent of the exact solution for the heat flux at the origin. We applied this solution to analyze recovery from burnout and find agreement within 10% at low heat flux, whereas at high heat flux the model deviates from the experimental data, suggesting the need for a more refined thermal model.
Numerical conversion of transient to harmonic response functions for linear viscoelastic materials.
Buschmann, M D
1997-02-01
Viscoelastic material behavior is often characterized using one of three measurements: creep, stress-relaxation or dynamic sinusoidal tests. A two-stage numerical method was developed to allow representation of data from creep and stress-relaxation tests on the Fourier axis in the Laplace domain. The method assumes linear behavior and is theoretically applicable to any transient test which attains an equilibrium state. The first stage numerically resolves the Laplace integral to convert temporal stress and strain data, from creep or stress-relaxation, to the stiffness function, G(s), evaluated on the positive real axis in the Laplace domain. This numerical integration alone allows the direct comparison of data from transient experiments which attain a final equilibrium state, such as creep and stress relaxation, and allows such data to be fitted to models expressed in the Laplace domain. The second stage of this numerical procedure maps the stiffness function, G(s), from the positive real axis to the positive imaginary axis to reveal the harmonic response function, or dynamic stiffness, G(jω). The mapping for each angular frequency, ω, is accomplished by fitting a polynomial to a subset of G(s) centered around a particular value of s, substituting js for s and thereby evaluating G(jω). This two-stage transformation circumvents previous numerical difficulties associated with obtaining Fourier transforms of the stress and strain time-domain signals. The accuracy of these transforms is verified using model functions from poroelasticity, corresponding to uniaxial confined compression of an isotropic material and uniaxial unconfined compression of a transversely isotropic material. The addition of noise to the model data does not significantly deteriorate the transformed results, and data points need not be equally spaced in time.
To exemplify its potential utility, this two-stage transform is applied to experimental stress relaxation data to obtain the dynamic stiffness which is then compared to direct measurements of dynamic stiffness using steady-state sinusoidal tests of the same cartilage disk in confined compression. In addition to allowing calculation of the dynamic stiffness from transient tests and the direct comparison of experimental data from different tests, these numerical methods should aid in the experimental analysis of linear and nonlinear material behavior, and increase the speed of curve-fitting routines by fitting creep or stress relaxation data to models expressed in the Laplace domain.
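The first stage described above amounts to numerically evaluating the Laplace integral of sampled transient data at real s. A minimal Python sketch, using a simple exponential relaxation signal so the result can be checked against the closed-form transform; the test signal and the trapezoidal rule are illustrative, not the authors' exact quadrature:

```python
import numpy as np

def laplace_at(t, f, s):
    """Trapezoidal evaluation of  integral_0^T f(t) exp(-s t) dt
    over the sampled window; valid when the integrand has decayed
    to ~0 by the end of the record."""
    g = f * np.exp(-s * t)
    dt = t[1] - t[0]                      # assumes uniform sampling
    return dt * (g.sum() - 0.5 * (g[0] + g[-1]))

t = np.linspace(0.0, 50.0, 5001)          # sampled transient record
tau = 2.0
f = np.exp(-t / tau)                      # model relaxation signal
s = 1.0
F = laplace_at(t, f, s)                   # analytic value: 1/(s + 1/tau)
```

The second stage (local polynomial fit in s, then substituting js) would then be applied to F evaluated on a grid of real s values.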
Strasser, T; Peters, T; Jagle, H; Zrenner, E; Wilke, R
2010-01-01
Electrophysiology of vision - especially the electroretinogram (ERG) - is used as a non-invasive way of functionally testing the visual system. The ERG is a combined electrical response generated by neuronal and non-neuronal cells in the retina in response to light stimulation. This response can be recorded and used for the diagnosis of numerous disorders. For both clinical practice and clinical trials it is important to process these signals in an accurate and fast way and to provide the results as structured, consistent reports. Therefore, we developed a freely available and open-source framework in Java (http://www.eye.uni-tuebingen.de/project/idsI4sigproc). The framework is focused on easy integration with existing applications. By leveraging well-established software patterns like pipes-and-filters and fluent interfaces, as well as by designing the application programming interfaces (APIs) as an integrated domain-specific language (DSL), the overall framework provides a smooth learning curve. Additionally, it already contains several processing methods and visualization features and can be extended easily by implementing the provided interfaces. In this way, not only can new processing methods be added but the framework can also be adopted for other areas of signal processing. This article describes in detail the structure and implementation of the framework and demonstrates its application through the software package used in clinical practice and clinical trials at the University Eye Hospital Tuebingen, one of the largest departments in the field of visual electrophysiology in Europe.
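The framework itself is written in Java; as a language-neutral illustration of the pipes-and-filters pattern behind its fluent API, a few lines of Python (all names hypothetical) show how filters chain into a signal-processing pipeline:

```python
class Pipeline:
    """Minimal pipes-and-filters chain with a fluent interface."""
    def __init__(self, data):
        self._data = data

    def pipe(self, filter_fn, *args):
        # each filter consumes the current data and yields new data
        self._data = filter_fn(self._data, *args)
        return self           # returning self enables fluent chaining

    def result(self):
        return self._data

# hypothetical filters for a sampled signal
def detrend(xs):
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def scale(xs, k):
    return [k * x for x in xs]

out = Pipeline([1.0, 2.0, 3.0]).pipe(detrend).pipe(scale, 10.0).result()
```

The fluent chain reads like a small DSL ("take the signal, detrend it, scale it"), which is the learning-curve benefit the abstract refers to.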
Rheological Models in the Time-Domain Modeling of Seismic Motion
NASA Astrophysics Data System (ADS)
Moczo, P.; Kristek, J.
2004-12-01
The time-domain stress-strain relation in a viscoelastic medium takes the form of a convolution integral, which is numerically intractable. This was the reason for the oversimplified models of attenuation in time-domain seismic wave propagation and earthquake motion modeling. In their pioneering work, Day and Minster (1984) showed how to convert the integral into a numerically tractable differential form in the case of a general viscoelastic modulus. In response to the work of Day and Minster, Emmerich and Korn (1987) suggested using the rheology of their generalized Maxwell body (GMB), while Carcione et al. (1988) suggested using the generalized Zener body (GZB). The viscoelastic moduli of both rheological models have the form of a rational function, and thus the differential form of the stress-strain relation is rather easy to obtain. After the papers by Emmerich and Korn and by Carcione et al., numerical modelers adopted either the GMB or the GZB rheology and developed 'non-communicating' algorithms: in many subsequent papers, authors using the GMB never commented on the GZB rheology and the corresponding algorithms, and authors using the GZB never related their methods to the GMB rheology and algorithms. We analyze and compare both rheologies and the corresponding incorporations of realistic attenuation into time-domain computations. We then focus on the most recent staggered-grid finite-difference modeling, mainly on accounting for material heterogeneity in viscoelastic media and on the computational efficiency of the finite-difference algorithms.
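The differential forms mentioned above replace the convolution by memory variables obeying first-order ODEs. The Python fragment below is a schematic single-mechanism illustration (one relaxation time; the coefficients and the exact GMB/GZB bookkeeping differ in the cited papers and are not reproduced here):

```python
import math

def update_memory(zeta, eps, dt, tau):
    """Exact exponential step of  d(zeta)/dt = (eps - zeta)/tau
    for strain eps held constant over the step."""
    return eps + (zeta - eps) * math.exp(-dt / tau)

# schematic single-mechanism stress:  sigma = M_u*eps - dM*zeta,
# so as zeta relaxes toward eps, sigma relaxes from the unrelaxed
# modulus M_u to the relaxed modulus M_u - dM (times eps).
zeta, eps, tau, dt = 0.0, 1.0, 0.5, 0.01
for _ in range(2000):          # integrate to t = 20 >> tau
    zeta = update_memory(zeta, eps, dt, tau)
```

The point of the reformulation is visible here: each time step needs only the current zeta, not the entire strain history that the convolution integral would require.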
III-nitride integration on ferroelectric materials of lithium niobate by molecular beam epitaxy
NASA Astrophysics Data System (ADS)
Namkoong, Gon; Lee, Kyoung-Keun; Madison, Shannon M.; Henderson, Walter; Ralph, Stephen E.; Doolittle, W. Alan
2005-10-01
Integration of III-nitride electrical devices on the ferroelectric material lithium niobate (LiNbO3) has been demonstrated. As a ferroelectric material, lithium niobate has a polarization which may provide excellent control of the polarity of III-nitrides. However, while high-temperature (1000°C) thermal treatments produce atomically smooth surfaces, improving adhesion of GaN epitaxial layers on lithium niobate, repolarization of the substrate in local domains occurs. These effects result in multiple domains of mixed polarization in LiNbO3, producing inversion domains in subsequent GaN epilayers. It is found, however, that AlN buffer layers suppress inversion domains of III-nitrides. As a result, two-dimensional electron gases in AlGaN/GaN heterojunction structures are obtained. Herein, the demonstration of monolithic integration of high-power devices with ferroelectric materials presents possibilities for controlling LiNbO3 modulators on compact optoelectronic/electronic chips.
Simplified computational methods for elastic and elastic-plastic fracture problems
NASA Technical Reports Server (NTRS)
Atluri, Satya N.
1992-01-01
An overview is given of some of the recent (1984-1991) developments in computational/analytical methods in the mechanics of fractures. Topics covered include analytical solutions for elliptical or circular cracks embedded in isotropic or transversely isotropic solids, with crack faces being subjected to arbitrary tractions; finite element or boundary element alternating methods for two or three dimensional crack problems; a 'direct stiffness' method for stiffened panels with flexible fasteners and with multiple cracks; multiple site damage near a row of fastener holes; an analysis of cracks with bonded repair patches; methods for the generation of weight functions for two and three dimensional crack problems; and domain-integral methods for elastic-plastic or inelastic crack mechanics.
Hutcherson, Cendri A
2018-01-01
Are some people generally more successful using cognitive regulation or does it depend on the choice domain? Why? We combined behavioral computational modeling and multivariate decoding of fMRI responses to identify neural loci of regulation-related shifts in value representations across goals and domains (dietary or altruistic choice). Surprisingly, regulatory goals did not alter integrative value representations in the ventromedial prefrontal cortex, which represented all choice-relevant attributes across goals and domains. Instead, the dorsolateral prefrontal cortex (DLPFC) flexibly encoded goal-consistent values and predicted regulatory success for the majority of choice-relevant attributes, using attribute-specific neural codes. We also identified domain-specific exceptions: goal-dependent encoding of prosocial attributes localized to precuneus and temporo-parietal junction (not DLPFC). Our results suggest that cognitive regulation operated by changing specific attribute representations (not integrated values). Evidence of domain-general and domain-specific neural loci reveals important divisions of labor, explaining when and why regulatory success generalizes (or doesn’t) across contexts and domains. PMID:29813018
Ge, Liang; Sotiropoulos, Fotis
2007-08-01
A novel numerical method is developed that integrates boundary-conforming grids with a sharp interface, immersed boundary methodology. The method is intended for simulating internal flows containing complex, moving immersed boundaries such as those encountered in several cardiovascular applications. The background domain (e.g the empty aorta) is discretized efficiently with a curvilinear boundary-fitted mesh while the complex moving immersed boundary (say a prosthetic heart valve) is treated with the sharp-interface, hybrid Cartesian/immersed-boundary approach of Gilmanov and Sotiropoulos [1]. To facilitate the implementation of this novel modeling paradigm in complex flow simulations, an accurate and efficient numerical method is developed for solving the unsteady, incompressible Navier-Stokes equations in generalized curvilinear coordinates. The method employs a novel, fully-curvilinear staggered grid discretization approach, which does not require either the explicit evaluation of the Christoffel symbols or the discretization of all three momentum equations at cell interfaces as done in previous formulations. The equations are integrated in time using an efficient, second-order accurate fractional step methodology coupled with a Jacobian-free, Newton-Krylov solver for the momentum equations and a GMRES solver enhanced with multigrid as preconditioner for the Poisson equation. Several numerical experiments are carried out on fine computational meshes to demonstrate the accuracy and efficiency of the proposed method for standard benchmark problems as well as for unsteady, pulsatile flow through a curved, pipe bend. To demonstrate the ability of the method to simulate flows with complex, moving immersed boundaries we apply it to calculate pulsatile, physiological flow through a mechanical, bileaflet heart valve mounted in a model straight aorta with an anatomical-like triple sinus.
Ge, Liang; Sotiropoulos, Fotis
2008-01-01
A novel numerical method is developed that integrates boundary-conforming grids with a sharp interface, immersed boundary methodology. The method is intended for simulating internal flows containing complex, moving immersed boundaries such as those encountered in several cardiovascular applications. The background domain (e.g the empty aorta) is discretized efficiently with a curvilinear boundary-fitted mesh while the complex moving immersed boundary (say a prosthetic heart valve) is treated with the sharp-interface, hybrid Cartesian/immersed-boundary approach of Gilmanov and Sotiropoulos [1]. To facilitate the implementation of this novel modeling paradigm in complex flow simulations, an accurate and efficient numerical method is developed for solving the unsteady, incompressible Navier-Stokes equations in generalized curvilinear coordinates. The method employs a novel, fully-curvilinear staggered grid discretization approach, which does not require either the explicit evaluation of the Christoffel symbols or the discretization of all three momentum equations at cell interfaces as done in previous formulations. The equations are integrated in time using an efficient, second-order accurate fractional step methodology coupled with a Jacobian-free, Newton-Krylov solver for the momentum equations and a GMRES solver enhanced with multigrid as preconditioner for the Poisson equation. Several numerical experiments are carried out on fine computational meshes to demonstrate the accuracy and efficiency of the proposed method for standard benchmark problems as well as for unsteady, pulsatile flow through a curved, pipe bend. To demonstrate the ability of the method to simulate flows with complex, moving immersed boundaries we apply it to calculate pulsatile, physiological flow through a mechanical, bileaflet heart valve mounted in a model straight aorta with an anatomical-like triple sinus. PMID:19194533
Ding, Yi S; He, Yang
2017-08-21
An isotropic impedance sheet model is proposed for a loop-type hexagonal periodic metasurface. Both frequency and wave-vector dispersion are considered near the resonance frequency. Therefore both the angle and polarization dependences of the metasurface impedance can be properly and simultaneously described in our model. The constitutive relation of this model is transformed into auxiliary differential equations which are integrated into the finite-difference time-domain algorithm. Finally, a finite large metasurface sample under oblique illumination is used to test the model and the algorithm. Our model and algorithm can significantly increase the accuracy of the homogenization methods for modeling periodic metasurfaces.
A swash-backwash model of the single epidemic wave
NASA Astrophysics Data System (ADS)
Cliff, Andrew D.; Haggett, Peter
2006-09-01
While there is a large literature on the form of epidemic waves in the time domain, models of their structure and shape in the spatial domain remain poorly developed. This paper concentrates on the changing spatial distribution of an epidemic wave over time and presents a simple method for identifying the leading and trailing edges of the spatial advance and retreat of such waves. Analysis of edge characteristics is used to (a) disaggregate waves into ‘swash’ and ‘backwash’ stages, (b) measure the phase transitions of areas from susceptible, S, through infective, I, to recovered, R, status (S → I → R) as dimensionless integrals and (c) estimate a spatial version of the basic reproduction number, R0. The methods used are illustrated by application to measles waves in Iceland over a 60-year period from 1915 to 1974. Extensions of the methods for use with more complex waves are possible through modifying the threshold values used to define the start and end points of an event.
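The leading and trailing edges used to split a wave into swash and backwash stages can be read off directly from an area-by-time incidence table. A minimal Python sketch (the data layout is hypothetical; the paper's dimensionless integrals then average these edge times over areas):

```python
import numpy as np

def wave_edges(incidence):
    """incidence: (n_areas, n_weeks) array of nonnegative case counts.
    Returns per-area leading-edge (first infected week) and
    trailing-edge (last infected week) indices. Assumes every area
    reports at least one case during the wave."""
    active = incidence > 0
    leading = np.argmax(active, axis=1)          # first True per row
    trailing = incidence.shape[1] - 1 - np.argmax(active[:, ::-1], axis=1)
    return leading, trailing

# toy wave: 3 areas observed over 5 weeks
cases = np.array([[0, 3, 1, 0, 0],
                  [0, 0, 2, 4, 0],
                  [1, 1, 0, 0, 2]])
lead, trail = wave_edges(cases)
```

An area is in its swash stage between its leading edge and its infection peak, and in backwash from the peak to its trailing edge.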
Zhang, T; Yang, M; Xiao, X; Feng, Z; Li, C; Zhou, Z; Ren, Q; Li, X
2014-03-01
Many infectious diseases exhibit repetitive or regular behaviour over time. Time-domain approaches, such as the seasonal autoregressive integrated moving average model, are often utilized to examine the cyclical behaviour of such diseases. The limitations of time-domain approaches include over-differencing and over-fitting; furthermore, the use of these approaches is inappropriate when the assumption of linearity may not hold. In this study, we implemented a simple and efficient procedure based on the fast Fourier transformation (FFT) approach to evaluate the epidemic dynamics of scarlet fever incidence (2004-2010) in China. This method demonstrated good internal and external validity and overcame some shortcomings of time-domain approaches. The procedure also elucidated the cycling behaviour in terms of environmental factors. We concluded that, under appropriate circumstances of data structure, spectral analysis based on the FFT approach may be applicable for the study of oscillating diseases.
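A spectral analysis of the kind described reduces, at its simplest, to locating the dominant peak of the FFT power spectrum of the incidence series. A minimal Python sketch (the monthly sampling and synthetic annual cycle are illustrative, not the study's data):

```python
import numpy as np

def dominant_period(series, dt=1.0):
    """Return the period (in units of dt) of the strongest
    nonzero-frequency component of a uniformly sampled series."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                      # suppress the DC component
    power = np.abs(np.fft.rfft(x)) ** 2   # one-sided power spectrum
    freqs = np.fft.rfftfreq(x.size, d=dt)
    k = 1 + np.argmax(power[1:])          # skip the zero-frequency bin
    return 1.0 / freqs[k]

months = np.arange(120)                   # 10 years of monthly counts
incidence = 50 + 20 * np.sin(2 * np.pi * months / 12)  # annual cycle
period = dominant_period(incidence)
```

For real surveillance data one would typically detrend and inspect several peaks rather than only the largest, but the dominant-peak idea is the core of the approach.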
Cane, James; O'Connor, Denise; Michie, Susan
2012-04-24
An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
ERIC Educational Resources Information Center
Nucci, Larry; Creane, Michael W.; Powers, Deborah W.
2015-01-01
Eleven teachers and 254 urban middle-school students comprised the sample of this study examining the social and moral development outcomes of the integration of social cognitive domain theory within regular classroom instruction. Participating teachers were trained to construct and implement history lessons that stimulated students' moral…
Lin, Nan; Yang, Xiaohong; Li, Jing; Wang, Shaonan; Hua, Huimin; Ma, Yujun; Li, Xingshan
2018-04-01
Neuroimaging studies have found that theory of mind (ToM) and discourse comprehension involve similar brain regions. These brain regions may be associated with three cognitive components that are necessarily or frequently involved in ToM and discourse comprehension, including social concept representation and retrieval, domain-general semantic integration, and domain-specific integration of social semantic contents. Using fMRI, we investigated the neural correlates of these three cognitive components by exploring how discourse topic (social/nonsocial) and discourse processing period (ending/beginning) modulate brain activation in a discourse comprehension (and also ToM) task. Different sets of brain areas showed sensitivity to discourse topic, discourse processing period, and the interaction between them, respectively. The most novel finding was that the right temporoparietal junction and middle temporal gyrus showed sensitivity to discourse processing period only during social discourse comprehension, indicating that they selectively contribute to domain-specific semantic integration. Our finding indicates how different domains of semantic information are processed and integrated in the brain and provides new insights into the neural correlates of ToM and discourse comprehension.
Kim, Taehyung; Tyndel, Marc S; Huang, Haiming; Sidhu, Sachdev S; Bader, Gary D; Gfeller, David; Kim, Philip M
2012-03-01
Peptide recognition domains and transcription factors play crucial roles in cellular signaling. They bind linear stretches of amino acids or nucleotides, respectively, with high specificity. Experimental techniques that assess the binding specificity of these domains, such as microarrays or phage display, can retrieve thousands of distinct ligands, providing detailed insight into binding specificity. In particular, the advent of next-generation sequencing has recently increased the throughput of such methods by several orders of magnitude. These advances have helped reveal the presence of distinct binding specificity classes that co-exist within a set of ligands interacting with the same target. Here, we introduce a software system called MUSI that can rapidly analyze very large data sets of binding sequences to determine the relevant binding specificity patterns. Our pipeline provides two major advances. First, it can detect previously unrecognized multiple specificity patterns in any data set. Second, it offers integrated processing of very large data sets from next-generation sequencing machines. The results are visualized as multiple sequence logos describing the different binding preferences of the protein under investigation. We demonstrate the performance of MUSI by analyzing recent phage display data for human SH3 domains as well as microarray data for mouse transcription factors.
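Letter heights in sequence logos like those MUSI produces are conventionally scaled by per-position information content. A small Python sketch of that standard computation over aligned, equal-length peptides (no pseudocounts or background correction, unlike a full implementation; the ligand set is made up):

```python
import math
from collections import Counter

def position_information(peptides, alphabet_size=20):
    """Information content (bits) at each position of aligned,
    equal-length peptide ligands: log2(|alphabet|) minus the
    per-column Shannon entropy."""
    n = len(peptides)
    info = []
    for i in range(len(peptides[0])):
        counts = Counter(p[i] for p in peptides)
        entropy = -sum((c / n) * math.log2(c / n)
                       for c in counts.values())
        info.append(math.log2(alphabet_size) - entropy)
    return info

# toy SH3-like proline-rich ligands (hypothetical sequences)
ligands = ["PPLPAR", "PPLPSR", "APLPPR"]
bits = position_information(ligands)   # conserved columns score highest
```

Detecting multiple specificity classes, as MUSI does, would first cluster the ligands and then compute one such profile per cluster.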
NASA Astrophysics Data System (ADS)
Wang, Jun-Wei; Liu, Ya-Qiang; Hu, Yan-Yan; Sun, Chang-Yin
2017-12-01
This paper discusses the design problem of distributed H∞ Luenberger-type partial differential equation (PDE) observer for state estimation of a linear unstable parabolic distributed parameter system (DPS) with external disturbance and measurement disturbance. Both pointwise measurement in space and local piecewise uniform measurement in space are considered; that is, sensors are only active at some specified points or applied at part thereof of the spatial domain. The spatial domain is decomposed into multiple subdomains according to the location of the sensors such that only one sensor is located at each subdomain. By using Lyapunov technique, Wirtinger's inequality at each subdomain, and integration by parts, a Lyapunov-based design of Luenberger-type PDE observer is developed such that the resulting estimation error system is exponentially stable with an H∞ performance constraint, and presented in terms of standard linear matrix inequalities (LMIs). For the case of local piecewise uniform measurement in space, the first mean value theorem for integrals is utilised in the observer design development. Moreover, the problem of optimal H∞ observer design is also addressed in the sense of minimising the attenuation level. Numerical simulation results are presented to show the satisfactory performance of the proposed design method.
Patient similarity for precision medicine: a systematic review.
Parimbelli, E; Marini, S; Sacchi, L; Bellazzi, R
2018-06-01
Evidence-based medicine is the most prevalent paradigm adopted by physicians. Clinical practice guidelines typically define a set of recommendations together with eligibility criteria that restrict their applicability to a specific group of patients. The ever-growing size and availability of health-related data is currently challenging the broad definitions of guideline-defined patient groups. Precision medicine leverages genetic, phenotypic, or psychosocial characteristics to provide precise identification of patient subsets for treatment targeting. Defining a patient similarity measure is thus an essential step to allow stratification of patients into clinically meaningful subgroups. The present review investigates the use of patient similarity as a tool to enable precision medicine. 279 articles were analyzed along four dimensions: data types considered, clinical domains of application, data analysis methods, and translational stage of findings. Cancer-related research employing molecular profiling and standard data analysis techniques such as clustering constitutes the majority of the retrieved studies. Chronic and psychiatric diseases follow as the second most represented clinical domains. Interestingly, almost one quarter of the studies analyzed presented a novel methodology, with the most advanced employing data integration strategies and being portable to different clinical domains. Integration of such techniques into decision support systems constitutes an interesting trend for future research. Copyright © 2018. Published by Elsevier Inc.
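As a minimal illustration of a patient similarity measure of the kind this review surveys, one can compute cosine similarity between standardized feature vectors; real systems add mixed-type handling (e.g. Gower distance) and clinical-domain weighting. The features and values below are hypothetical:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# hypothetical standardized (z-scored) features: [age, HbA1c, BMI]
patient_a = [0.5, 1.2, -0.3]
patient_b = [0.6, 1.0, -0.2]
patient_c = [-1.5, -0.8, 2.0]

sim_ab = cosine_similarity(patient_a, patient_b)
sim_ac = cosine_similarity(patient_a, patient_c)
```

Ranking candidate patients by such a score against an index patient is the basic operation behind similarity-based stratification and treatment targeting.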
Yakhforoshha, Afsaneh; Emami, Seyed Amir Hossein; Shahi, Farhad; Shahsavari, Saeed; Cheraghi, Mohammadali; Mojtahedzadeh, Rita; Mahmoodi-Bakhtiari, Behrooz; Shirazi, Mandana
2018-02-21
The task of breaking bad news (BBN) may be improved by incorporating simulation with art-based teaching methods. The aim of the present study was to assess the effect of integrating simulation with art-based teaching strategies on fellows' performance regarding BBN in Iran. The study was carried out using a quasi-experimental, interrupted time-series design. The participants were selected from medical oncology fellows at two teaching hospitals of Tehran University of Medical Sciences (TUMS), Iran. Participants were trained through a workshop, followed by engaging them with different types of art-based teaching methods. In order to assess the effectiveness of the integrated model, fellows' performance was rated by two independent raters (standardized patients (SPs) and faculty members) using the BBN assessment checklist. This assessment tool measures seven different domains of BBN skill. Segmented regression was used to analyze the results of the study. The performance of all oncology fellows (n = 19) was assessed at 228 time points during the study, with three time points rated before and three after the intervention by each of the two raters. Based on SP ratings, fellows' post-training performance scores showed significant level changes in three domains of the BBN checklist (B = 1.126, F = 3.221, G = 2.241; p < 0.05). Similarly, the significant level change in fellows' scores rated by faculty members post-training was B = 1.091, F = 3.273, G = 1.724; p < 0.05. There was no significant change in the trend of fellows' performance after the intervention. Our results showed that integrating simulation with art-based teaching strategies may help oncology fellows to improve their communication skills in different facets of BBN performance. Iranian Registry of Clinical Trials ID: IRCT2016011626039N1.
2016-01-01
Background Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. Objectives The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. Methods A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. Results The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one’s choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Conclusions Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. 
This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other disciplines. PMID:26912288
Parallel computation of three-dimensional aeroelastic fluid-structure interaction
NASA Astrophysics Data System (ADS)
Sadeghi, Mani
This dissertation presents a numerical method for the parallel computation of aeroelasticity (ParCAE). A flow solver is coupled to a structural solver by use of a fluid-structure interface method. The integration of the three-dimensional unsteady Navier-Stokes equations is performed in the time domain, simultaneously with the integration of a modal three-dimensional structural model. The flow solution is accelerated by using a multigrid method and a parallel multiblock approach. Fluid-structure coupling is achieved by subiteration. A grid-deformation algorithm is developed to interpolate the deformation of the structural boundaries onto the flow grid. The code is formulated to allow application to general, three-dimensional, complex configurations with multiple independent structures. Computational results are presented for various configurations, such as turbomachinery blade rows and aircraft wings. Investigations are performed on vortex-induced vibrations, effects of cascade mistuning on flutter, and cases of nonlinear cascade and wing flutter.
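The subiteration coupling described above amounts to a fixed-point iteration between the two solvers within each time step. A minimal sketch with scalar surrogate solvers and an assumed under-relaxation factor (all functions and coefficients are illustrative, not ParCAE's):

```python
# Minimal sketch of fluid-structure subiteration within one time step
# (toy scalar surrogates for the flow and modal structural solvers).

def fluid_load(displacement):
    # surrogate aerodynamic load responding to surface deformation
    return 1.0 - 0.5 * displacement

def structure_response(load):
    # surrogate structural solve: response to the applied load
    return load / 2.0

d = 0.0                 # interface displacement
omega = 0.7             # under-relaxation factor (assumed)
for k in range(100):    # subiterations within the time step
    f = fluid_load(d)
    d_new = structure_response(f)
    if abs(d_new - d) < 1e-12:
        break
    d = (1 - omega) * d + omega * d_new

# Fixed point of this toy: d = (1 - 0.5 d) / 2, i.e. d = 0.4
print(d)
```

Under-relaxation stabilizes the exchange when the bare fixed-point map would diverge; in a real solver each surrogate call is a full flow or structural solve.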
Infrared and visible image fusion method based on saliency detection in sparse domain
NASA Astrophysics Data System (ADS)
Liu, C. H.; Qi, Y.; Ding, W. R.
2017-06-01
Infrared and visible image fusion is a key problem in the field of multi-sensor image fusion. To better preserve the significant information of the infrared and visible images in the final fused image, the saliency maps of the source images are introduced into the fusion procedure. Firstly, under the framework of the joint sparse representation (JSR) model, the global and local saliency maps of the source images are obtained from the sparse coefficients. Then, a saliency detection model is proposed, which combines the global and local saliency maps to generate an integrated saliency map. Finally, a weighted fusion algorithm based on the integrated saliency map is developed to complete the fusion process. The experimental results show that our method is superior to state-of-the-art methods in terms of several universal quality evaluation indexes, as well as in visual quality.
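The final weighted-fusion step can be sketched as a per-pixel convex combination driven by a saliency map. The paper derives its saliency maps from joint sparse representation coefficients; the sketch below substitutes a simple local-contrast surrogate, and the random images are stand-ins for real infrared/visible data.

```python
import numpy as np

def local_contrast(img):
    # crude saliency surrogate: normalized deviation from the mean
    s = np.abs(img - img.mean())
    return s / (s.max() + 1e-12)

rng = np.random.default_rng(1)
ir = rng.random((64, 64))       # stand-in infrared image
vis = rng.random((64, 64))      # stand-in visible image

s_ir, s_vis = local_contrast(ir), local_contrast(vis)
w = s_ir / (s_ir + s_vis + 1e-12)   # per-pixel fusion weight

# pixels where the infrared image is more salient take more of its value
fused = w * ir + (1 - w) * vis
print(fused.shape)
```

Because the weights sum to one at each pixel, the fused value always lies between the two source values, which helps preserve the dynamic range of both inputs.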
Analysis of 3D poroelastodynamics using BEM based on modified time-step scheme
NASA Astrophysics Data System (ADS)
Igumnov, L. A.; Petrov, A. N.; Vorobtsov, I. V.
2017-10-01
The development of 3D boundary-element modeling of dynamic partially saturated poroelastic media using a stepping scheme is presented in this paper. The Boundary Element Method (BEM) in the Laplace domain and a time-stepping scheme for numerical inversion of the Laplace transform are used to solve the boundary value problem. A modified stepping scheme with a variable integration step was applied, exploiting the symmetry of the integrand and integral formulas for strongly oscillating functions when calculating the quadrature coefficients. The problem of a force acting on the end of a poroelastic prismatic cantilever was solved using the developed method. A comparison of the results obtained by the traditional stepping scheme with the solutions obtained by this modified scheme shows that the combined formulas improve computational efficiency.
Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko
2017-07-10
This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transform of the kernel function involved in this convolution integral is performed analytically using a Bessel function expansion. The analytical solution drastically reduces the calculation time and the memory usage at no additional cost, compared with the numerical method that uses the fast Fourier transform to transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of the Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.
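The core computational pattern here, evaluating a propagation convolution by multiplying with a transfer function in the Fourier domain, can be sketched in 1D on a planar surface. The paper's contribution is the analytic Bessel-series transform of the cylindrical kernel, which this toy does not reproduce; all numeric parameters below are assumed values.

```python
import numpy as np

n, dx = 1024, 0.5e-6            # samples and pitch (assumed)
wavelength = 0.5e-6
k = 2 * np.pi / wavelength
z = 100e-6                      # propagation distance (assumed)

src = np.zeros(n, complex)
src[n // 2 - 8 : n // 2 + 8] = 1.0   # small aperture source

# transfer function: the (here analytically known) Fourier transform
# of the free-space propagation kernel
fx = np.fft.fftfreq(n, dx)
kz = np.emath.sqrt(k**2 - (2 * np.pi * fx) ** 2)  # complex for evanescent waves
H = np.exp(1j * kz * z)

# convolution evaluated as multiplication in the Fourier domain
field = np.fft.ifft(np.fft.fft(src) * H)
print(np.abs(field).max())
```

Having the kernel's transform in closed form means `H` never has to be sampled in real space and transformed numerically, which is what saves the time and memory the abstract refers to.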
A Framework for Integrating Oceanographic Data Repositories
NASA Astrophysics Data System (ADS)
Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.
2010-12-01
Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. 
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool
Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.
2011-01-01
Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150
A class of Fourier integrals based on the electric potential of an elongated dipole.
Skianis, Georgios Aim
2014-01-01
In the present paper, the closed-form expressions of a class of non-tabulated Fourier integrals are derived. These integrals are associated with a group of functions in the space domain which represent the electric potential of a distribution of elongated dipoles perpendicular to a flat surface. It is shown that the Fourier integrals are produced by the Fourier transform of the Green's function of the potential of the dipole distribution, times a definite integral in which the distribution of the polarization is involved. Therefore, the form of this distribution controls the expression of the Fourier integral. Introducing various dipole distributions, the respective Fourier integrals are derived. These integrals may be useful in the quantitative interpretation, in the spatial frequency domain, of electric potential anomalies produced by elongated dipole distributions.
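Schematically, the factorization described above can be written as follows; this is a hedged sketch of the structure, not the paper's exact notation, with \(\hat{G}\) the transform of the single-dipole Green's function, \(P\) the polarization distribution, and \(K\) a distribution-dependent kernel:

```latex
\hat{U}(\omega) \;=\; \hat{G}(\omega)\,\int_{0}^{L} P(\zeta)\, K(\omega,\zeta)\,\mathrm{d}\zeta
```

so that each choice of the polarization distribution \(P\) yields a different closed-form Fourier integral, as the abstract states.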
Sockalingam, Sanjeev; Tehrani, Hedieh; Lin, Elizabeth; Lieff, Susan; Harris, Ilene; Soklaridis, Sophie
2016-04-01
To explore the perspectives of leaders in psychiatry and continuing professional development (CPD) regarding the relationship, opportunities, and challenges in integrating quality improvement (QI) and CPD. In 2013-2014, the authors interviewed 18 participants in Canada: 10 psychiatrists-in-chief, 6 CPD leaders in psychiatry, and 2 individuals with experience integrating these domains in psychiatry who were identified through snowball sampling. Questions were designed to identify participants' perspectives about the definition, relationship, and integration of QI and CPD in psychiatry. Interviews were recorded and transcribed. An iterative, inductive method was used to thematically analyze the transcripts. To ensure the rigor of the analysis, the authors performed member checking and sampling until theoretical saturation was achieved. Participants defined QI as a concept measured at the individual, hospital, and health care system levels and CPD as a concept measured predominantly at the individual and hospital levels. Four themes related to the relationship between QI and CPD were identified: challenges with QI training, adoption of QI into the mental health care system, implementation of QI in CPD, and practice improvement outcomes. Despite participants describing QI and CPD as mutually beneficial, they expressed uncertainty about the appropriateness of aligning these domains within a mental health care context because of the identified challenges. This study identified challenges with aligning QI and CPD in psychiatry and yielded a framework to inform future integration efforts. Further research is needed to determine the generalizability of this framework to other specialties and health care professions.
Aldridge, Melissa D; Hasselaar, Jeroen; Garralda, Eduardo; van der Eerden, Marlieke; Stevenson, David; McKendrick, Karen; Centeno, Carlos; Meier, Diane E
2016-03-01
Early integration of palliative care into the management of patients with serious disease has the potential to both improve quality of life of patients and families and reduce healthcare costs. Despite these benefits, significant barriers exist in the United States to the early integration of palliative care in the disease trajectory of individuals with serious illness. To provide an overview of the barriers to more widespread palliative care integration in the United States. A literature review using PubMed from 2005 to March 2015 augmented by primary data collected from 405 hospitals included in the Center to Advance Palliative Care's National Palliative Care Registry for years 2012 and 2013. We use the World Health Organization's Public Health Strategy for Palliative Care as a framework for analyzing barriers to palliative care integration. We identified key barriers to palliative care integration across three World Health Organization domains: (1) education domain: lack of adequate education/training and perception of palliative care as end-of-life care; (2) implementation domain: inadequate size of palliative medicine-trained workforce, challenge of identifying patients appropriate for palliative care referral, and need for culture change across settings; (3) policy domain: fragmented healthcare system, need for greater funding for research, lack of adequate reimbursement for palliative care, and regulatory barriers. We describe the key policy and educational opportunities in the United States to address and potentially overcome the barriers to greater integration of palliative care into the healthcare of Americans with serious illness. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Demasi, L.; Livne, E.
2009-07-01
Two different time-domain formulations are presented for integrating commonly used modal-based frequency-domain unsteady aerodynamic models with full-order finite element models for structures with geometric nonlinearities. Both approaches are tailored to flight vehicle configurations where geometric stiffness effects are important but where deformations are moderate, flow is attached, and linear unsteady aerodynamic modeling is adequate, such as low-aspect-ratio wings or joined-wing and strut-braced wings at small to moderate angles of attack. Results obtained using the two approaches are compared using both planar and non-planar wing configurations. Sub-critical and post-flutter speeds are considered. It is demonstrated that the two methods lead to the same steady solution for the sub-critical case after the transients subside. It is also shown that the two methods predict the amplitude and frequency of limit cycle oscillation (when present) with the same accuracy.
Nondestructive methods of integrating energy harvesting systems with structures
NASA Astrophysics Data System (ADS)
Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan
2012-04-01
Designing an attachment structure that is both novel and meets the system requirements can be a difficult task, especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study is presented of generating concepts for a bridge-mounted wind energy harvester using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions.
Exponential convergence through linear finite element discretization of stratified subdomains
NASA Astrophysics Data System (ADS)
Guddati, Murthy N.; Druskin, Vladimir; Vaziri Astaneh, Ali
2016-10-01
Motivated by problems where the response is needed at select localized regions in a large computational domain, we devise a novel finite element discretization that results in exponential convergence at pre-selected points. The key features of the discretization are (a) use of midpoint integration to evaluate the contribution matrices, and (b) an unconventional mapping of the mesh into complex space. Named complex-length finite element method (CFEM), the technique is linked to Padé approximants that provide exponential convergence of the Dirichlet-to-Neumann maps and thus the solution at specified points in the domain. Exponential convergence facilitates drastic reduction in the number of elements. This, combined with sparse computation associated with linear finite elements, results in significant reduction in the computational cost. The paper presents the basic ideas of the method as well as illustration of its effectiveness for a variety of problems involving Laplace, Helmholtz and elastodynamics equations.
An Improved Neutron Transport Algorithm for HZETRN
NASA Technical Reports Server (NTRS)
Slaba, Tony C.; Blattnig, Steve R.; Clowdsley, Martha S.; Walker, Steven A.; Badavi, Francis F.
2010-01-01
Long-term human presence in space requires the inclusion of radiation constraints in mission planning and the design of shielding materials, structures, and vehicles. In this paper, the numerical error associated with energy discretization in HZETRN is addressed. An inadequate numerical integration scheme in the transport algorithm is shown to produce large errors in the low-energy portion of the neutron and light ion fluence spectra. It is further shown that the errors result from the narrow energy domain of the neutron elastic cross section spectral distributions, and that an extremely fine energy grid is required to resolve the problem under the current formulation. Two numerical methods are developed to provide adequate resolution in the energy domain and more accurately resolve the neutron elastic interactions. Convergence testing is completed by running the code for various environments and shielding materials with various energy grids to ensure stability of the newly implemented method.
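The grid-resolution failure described above is easy to reproduce: trapezoidal integration of a narrow spectral peak loses essentially all of its mass once the grid spacing exceeds the peak width. The peak location, width, and grid sizes below are illustrative, not HZETRN's.

```python
import numpy as np

def narrow_peak(E, E0=103.0, width=0.5):
    # normalized Gaussian peak; integrates to ~1 over a wide range
    return np.exp(-0.5 * ((E - E0) / width) ** 2) / (width * np.sqrt(2 * np.pi))

def trapezoid(y, x):
    # composite trapezoidal rule
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

exact = 1.0
errors = {}
for n_points in (21, 2001):            # coarse vs fine energy grid
    E = np.linspace(0.0, 200.0, n_points)
    errors[n_points] = abs(trapezoid(narrow_peak(E), E) - exact)

print(errors)   # the coarse grid misses nearly all of the peak
```

With 21 points the spacing is 10 energy units against a peak half a unit wide, so the quadrature sees almost nothing; refining the grid until several points fall inside the peak recovers the integral, which is the essence of the convergence testing the abstract describes.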
NASA Technical Reports Server (NTRS)
Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.
1996-01-01
The Integrated Force Method has been developed in recent years for the analysis of structural mechanics problems. This method treats all independent internal forces as unknown variables that can be calculated by simultaneously imposing equations of equilibrium and compatibility conditions. In this paper a finite element library for analyzing two-dimensional problems by the Integrated Force Method is presented. Triangular- and quadrilateral-shaped elements capable of modeling arbitrary domain configurations are presented. The element equilibrium and flexibility matrices are derived by discretizing the expressions for potential and complementary energies, respectively. The displacement and stress fields within the finite elements are independently approximated. The displacement field is interpolated as it is in the standard displacement method, and the stress field is approximated by using complete polynomials of the correct order. A procedure that uses the definitions of stress components in terms of an Airy stress function is developed to derive the stress interpolation polynomials. Such derived stress fields identically satisfy the equations of equilibrium. Moreover, the resulting element matrices are insensitive to the orientation of local coordinate systems. A method is devised to calculate the number of rigid body modes, and the present elements are shown to be free of spurious zero-energy modes. A number of example problems are solved by using the present library, and the results are compared with corresponding analytical solutions and with results from the standard displacement finite element method. The Integrated Force Method not only gives results that agree well with analytical and displacement method results but also outperforms the displacement method in stress calculations.
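The Airy-stress-function construction mentioned above guarantees equilibrium by design: with stresses defined from a potential \(\Phi\),

```latex
\sigma_{xx} = \frac{\partial^{2}\Phi}{\partial y^{2}}, \qquad
\sigma_{yy} = \frac{\partial^{2}\Phi}{\partial x^{2}}, \qquad
\sigma_{xy} = -\frac{\partial^{2}\Phi}{\partial x\,\partial y},
```

the two-dimensional equilibrium equations with zero body forces, \(\partial_x \sigma_{xx} + \partial_y \sigma_{xy} = 0\) and \(\partial_x \sigma_{xy} + \partial_y \sigma_{yy} = 0\), are satisfied identically because mixed partial derivatives commute. Choosing complete polynomials for \(\Phi\) therefore yields stress interpolants that satisfy equilibrium exactly, as the abstract states.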
Transactivation domain of p53 regulates DNA repair and integrity in human iPS cells.
Kannappan, Ramaswamy; Mattapally, Saidulu; Wagle, Pooja A; Zhang, Jianyi
2018-05-18
The role of the p53 transactivation domain (p53-TAD), a multifunctional and dynamic domain, in DNA repair and the maintenance of DNA integrity in human iPS cells has never been studied. p53-TAD was knocked out in iPS cells using CRISPR/Cas9, as confirmed by DNA sequencing. p53-TAD KO cells were characterized by accelerated proliferation, decreased population doubling time, unaltered Bcl2, BBC3, IGF1R, and Bax transcript expression, and altered Mdm2, p21, and PIDD transcript expression. In p53-TAD KO cells, expression of the p53-regulated DNA repair proteins XPA, DNA polH, and DDB2 was found to be reduced compared with p53-WT cells. Exposure to a low dose of doxorubicin (Doxo) induced similar DNA damage and DNA damage response (DDR), measured by RAD50 and MRE11 expression, Checkpoint kinase 2 activation, and γH2A.X recruitment at DNA strand breaks, in both cell groups, indicating that silencing p53-TAD does not affect the DDR mechanism upstream of p53. Following removal of Doxo, p53-WT hiPS cells underwent DNA repair, corrected their damaged DNA, and restored DNA integrity. Conversely, p53-TAD KO hiPS cells did not undergo complete DNA repair and failed to restore DNA integrity. More importantly, in continuous culture, p53-TAD KO hiPS cells underwent G2/M cell cycle arrest and expressed the cellular senescence marker p16INK4a. Our data clearly show that silencing the transactivation domain of p53 did not affect the DDR but did affect the DNA repair process, implying a crucial role for the p53 transactivation domain in maintaining DNA integrity. Therefore, activating the p53-TAD domain using small molecules may promote DNA repair and the integrity of cells and prevent senescence.
NASA Astrophysics Data System (ADS)
Maione, F.; De Pietri, R.; Feo, A.; Löffler, F.
2016-09-01
We present results from three-dimensional general relativistic simulations of binary neutron star coalescences and mergers using public codes. We considered equal-mass models where the baryon mass of the two neutron stars is 1.4 M⊙, described by four different equations of state (EOS) for the cold nuclear matter (APR4, SLy, H4, and MS1; all parametrized as piecewise polytropes). We started the simulations from four different initial interbinary distances (40, 44.3, 50, and 60 km), including up to the last 16 orbits before merger. That allows us to show the effects on the gravitational wave (GW) phase evolution, radiated energy, and angular momentum due to: the use of different EOS, the orbital eccentricity present in the initial data, and the initial separation (in the simulation) between the two stars. Our results show that eccentricity has a major role in the discrepancy between numerical and analytical waveforms until the very last few orbits, where 'tidal' effects and missing high-order post-Newtonian coefficients also play a significant role. We test different methods for extrapolating the GW signal extracted at finite radii to null infinity. We show that an effective procedure for integrating the Newman-Penrose ψ4 signal to obtain the GW strain h is to apply a simple high-pass digital filter to h after a time-domain integration, where only the two physically motivated integration constants are introduced. That should be preferred to the more common procedures of introducing additional integration constants, integrating in the frequency domain, or filtering ψ4 before integration.
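The recommended strain reconstruction, double time-domain integration of ψ4 followed by a simple high-pass filter to remove the spurious low-frequency drift, can be sketched on a toy monochromatic signal. The signal frequency, filter order, and cutoff below are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.signal import butter, filtfilt

# Toy psi4 whose exact double time integral is sin(omega * t)
dt = 0.01
t = np.arange(0.0, 200.0, dt)
f_gw = 0.5                                  # signal frequency (assumed)
omega = 2.0 * np.pi * f_gw
psi4 = -omega**2 * np.sin(omega * t)

# two time-domain integrations; the zero initial conditions
# introduce a large linear drift
h_raw = cumulative_trapezoid(
    cumulative_trapezoid(psi4, t, initial=0.0), t, initial=0.0)

# zero-phase high-pass well below the signal frequency kills the drift
b, a = butter(4, 0.1 * f_gw, btype="highpass", fs=1.0 / dt)
h = filtfilt(b, a, h_raw)

# compare with the true strain away from the filter's edge transients
true_h = np.sin(omega * t)
core = (t > 50.0) & (t < 150.0)
err = np.max(np.abs(h[core] - true_h[core]))
print(err)
```

The filter's zeros at DC annihilate the linear drift while leaving the oscillation at `f_gw` essentially untouched, which is why filtering after integration works without introducing extra, unphysical integration constants.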
Evolving Postmortems as Teams Evolve Through TxP
2014-12-01
Instead of waiting for SEI to compile enough data to repeat this kind of analysis for the system integration test domain, a system integration test team...and stand up their Team Test Process (TTP). Some abilities, like planning on how many mistakes will be made by the team in producing a test procedure...can only be performed after the team has determined a) which mistakes count in the domain of system integration testing, b) what units to use to
Event extraction of bacteria biotopes: a knowledge-intensive NLP-based approach
2012-01-01
Background Bacteria biotopes cover a wide range of diverse habitats including animal and plant hosts, natural, medical and industrial environments. The high volume of publications in the microbiology domain provides a rich source of up-to-date information on bacteria biotopes. This information, as found in scientific articles, is expressed in natural language and is rarely available in a structured format, such as a database. This information is of great importance for fundamental research and microbiology applications (e.g., medicine, agronomy, food, bioenergy). The automatic extraction of this information from texts will provide a great benefit to the field. Methods We present a new method for extracting relationships between bacteria and their locations using the Alvis framework. Recognition of bacteria and their locations was achieved using a pattern-based approach and domain lexical resources. For the detection of environment locations, we propose a new approach that combines lexical information and the syntactic-semantic analysis of corpus terms to overcome the incompleteness of lexical resources. Bacteria location relations extend over sentence borders, and we developed domain-specific rules for dealing with bacteria anaphors. Results We participated in the BioNLP 2011 Bacteria Biotope (BB) task with the Alvis system. Official evaluation results show that it achieves the best performance of participating systems. New developments since then have increased the F-score by 4.1 points. Conclusions We have shown that the combination of semantic analysis and domain-adapted resources is both effective and efficient for event information extraction in the bacteria biotope domain. We plan to adapt the method to deal with a larger set of location types and a large-scale scientific article corpus to enable microbiologists to integrate and use the extracted knowledge in combination with experimental data. PMID:22759462
FVCOM one-way and two-way nesting using ESMF: Development and validation
NASA Astrophysics Data System (ADS)
Qi, Jianhua; Chen, Changsheng; Beardsley, Robert C.
2018-04-01
Built on the Earth System Modeling Framework (ESMF), one-way and two-way nesting methods were implemented in the unstructured-grid Finite-Volume Community Ocean Model (FVCOM). These methods enable unstructured-grid multi-domain nesting in FVCOM, with the aim of resolving multi-scale physical and ecosystem processes. The procedures for implementing FVCOM within ESMF are described in detail. Experiments were conducted to validate and evaluate the performance of the nested-grid FVCOM system. The first addressed a wave-current interaction case with two-domain nesting, emphasizing the critical need for nesting to resolve high-resolution features near the coast and harbor with little loss in computational efficiency. The second considered pseudo river plume cases to examine the differences in model-simulated salinity between the one-way and two-way nesting approaches and to evaluate the performance of the mass-conservative two-way nesting method. The third addressed a river plume case in the realistic geometric domain of Mass Bay, supporting the importance of two-way nesting for integrated coastal-estuarine modeling. The nesting method described in this paper has been used in the Northeast Coastal Ocean Forecast System (NECOFS), a global-regional-coastal nested FVCOM system that has been in end-to-end forecast and hindcast operation since 2007.
Physiomodel - an integrative physiology in Modelica.
Matejak, Marek; Kofranek, Jiri
2015-08-01
Physiomodel (http://www.physiomodel.org) is our reimplementation and extension of an integrative physiological model called HumMod 1.6 (http://www.hummod.org) using our Physiolibrary (http://www.physiolibrary.org). The computer language Modelica is well suited to exactly formalize integrative physiology. Modelica is an equation-based, object-oriented language for hybrid ordinary differential equations (http://www.modelica.org). Almost every physiological term can be defined as a class in this language and can be instantiated as many times as it occurs in the body. Each class has a graphical icon for use in diagrams. These diagrams are self-describing; the Modelica code generated from them is the full representation of the underlying mathematical model. Special Modelica constructs of physical connectors from Physiolibrary allow us to create diagrams that are analogies of electrical circuits obeying Kirchhoff's laws. As electric currents and electric potentials are connected in the electrical domain, so are molar flows and concentrations in the chemical domain; volumetric flows and pressures in the hydraulic domain; flows of heat energy and temperatures in the thermal domain; and changes and amounts of members in the population domain.
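The connector semantics above pair an "across" variable with a "through" variable in each domain, with the through variables summing to zero at a connection. A minimal sketch of the hydraulic analogy in Python, where pressures play the role of voltages and volumetric flows of currents (the compliances and conductance are illustrative numbers, not Physiomodel parameters):

```python
from scipy.integrate import solve_ivp

C1, C2 = 1.0, 2.0      # compliances of two compartments (assumed)
G = 0.5                # conductance of the connecting path (assumed)

def rhs(t, v):
    v1, v2 = v
    p1, p2 = v1 / C1, v2 / C2          # pressures (across variables)
    q = G * (p1 - p2)                  # flow (through variable)
    return [-q, q]                     # flows at the node sum to zero

sol = solve_ivp(rhs, (0.0, 50.0), [3.0, 1.0], rtol=1e-9, atol=1e-12)
v1, v2 = sol.y[:, -1]
# total volume is conserved and the pressures equilibrate
print(v1 + v2, v1 / C1, v2 / C2)
```

The Kirchhoff-like balance (`-q`, `+q`) is exactly what a Modelica connector enforces automatically when two hydraulic components are wired together in a diagram.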
NASA Astrophysics Data System (ADS)
Bigeon, John; Huby, Nolwenn; Duvail, Jean-Luc; Bêche, Bruno
2014-04-01
We report photonic concepts related to injection and sub-wavelength propagation in nanotubes, an unusual but promising geometry for highly integrated photonic devices. Theoretical simulation by the finite-difference time-domain (FDTD) method was first used to determine the features of the direct light injection and sub-wavelength propagation regime within nanotubes. Then, the injection into nanotubes of SU8, a photoresist used for integrated photonics, was successfully achieved by using polymer microlensed fibers with a sub-micronic radius of curvature, as theoretically expected from FDTD simulations. The propagation losses in a single SU8 nanotube were determined by using a comprehensive set-up and a protocol for optical characterization. The attenuation coefficient has been evaluated at 1.25 dB mm⁻¹ by a cut-back method transposed to such nanostructures. The mechanisms responsible for losses in nanotubes were identified with FDTD theoretical support. Both injection and cut-back methods developed here are compatible with any sub-micronic structures. This work on SU8 nanotubes suggests broader perspectives for future nanophotonics.
Windowed Green function method for the Helmholtz equation in the presence of multiply layered media
NASA Astrophysics Data System (ADS)
Bruno, O. P.; Pérez-Arancibia, C.
2017-06-01
This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
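The effect of a slow-rise window can be illustrated on a model oscillatory integral with a known closed form. The window below has the C-infinity bump shape typical of such methods, while the test integrand and all parameters are our own illustrative choices, not the scattering integrals of the paper:

```python
import numpy as np

# Smooth truncation of an oscillatory tail integral versus a sharp cutoff.
# Model problem (assumption): ∫_0^∞ cos(kx)/(1+x²) dx = (π/2) e^{-k}.

def smooth_window(x, A, c=0.5):
    """1 on [0, c*A], smooth C-infinity decay to 0 on [c*A, A], 0 beyond."""
    u = (x - c * A) / (A - c * A)
    w = np.ones_like(x)
    rise = (u > 0) & (u < 1)
    w[rise] = np.exp(2 * np.exp(-1 / u[rise]) / (u[rise] - 1))
    w[u >= 1] = 0.0
    return w

def trapezoid(y, x):
    h = x[1] - x[0]
    return h * (y.sum() - 0.5 * (y[0] + y[-1]))

k, A = 2.0, 20.0
exact = 0.5 * np.pi * np.exp(-k)
x = np.linspace(0.0, A, 200001)
f = np.cos(k * x) / (1.0 + x * x)

err_sharp = abs(trapezoid(f, x) - exact)                      # hard truncation at A
err_win = abs(trapezoid(f * smooth_window(x, A), x) - exact)  # windowed truncation
print(err_win < err_sharp)   # the smooth window beats the sharp cutoff
```

The sharp cutoff leaves an error set by the untruncated oscillatory tail, while the windowed error decreases faster than any power of the window size, which is the behaviour the paper reports for its scattering integrals.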
Sonar Imaging of Elastic Fluid-Filled Cylindrical Shells.
NASA Astrophysics Data System (ADS)
Dodd, Stirling Scott
1995-01-01
Previously a method of describing spherical acoustic waves in cylindrical coordinates was applied to the problem of point source scattering by an elastic infinite fluid-filled cylindrical shell (S. Dodd and C. Loeffler, J. Acoust. Soc. Am. 97, 3284(A) (1995)). This method is applied to numerically model monostatic oblique incidence scattering from a truncated cylinder by a narrow-beam high-frequency imaging sonar. The narrow beam solution results from integrating the point source solution over the spatial extent of a line source and line receiver. The cylinder truncation is treated by the method of images, and assumes that the reflection coefficient at the truncation is unity. The scattering form functions, calculated using this method, are applied as filters to a narrow bandwidth, high ka pulse to find the time domain scattering response. The time domain pulses are further processed and displayed in the form of a sonar image. These images compare favorably to experimentally obtained images (G. Kaduchak and C. Loeffler, J. Acoust. Soc. Am. 97, 3289(A) (1995)). The impact of the s₀ and a₀ Lamb waves is vividly apparent in the images.
Generalized Fourier analyses of the advection-diffusion equation - Part II: two-dimensional domains
NASA Astrophysics Data System (ADS)
Voth, Thomas E.; Martinez, Mario J.; Christon, Mark A.
2004-07-01
Part I of this work presents a detailed multi-methods comparison of the spatial errors associated with the one-dimensional finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. In Part II we extend the analysis to two-dimensional domains and also consider the effects of wave propagation direction and grid aspect ratio on the phase speed, and the discrete and artificial diffusivities. The observed dependence of dispersive and diffusive behaviour on propagation direction makes comparison of methods more difficult relative to the one-dimensional results. For this reason, integrated (over propagation direction and wave number) error and anisotropy metrics are introduced to facilitate comparison among the various methods. With respect to these metrics, the consistent mass Galerkin and consistent mass control-volume finite element methods, and their streamline upwind derivatives, exhibit comparable accuracy, and generally out-perform their lumped mass counterparts and finite-difference based schemes. While this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common mathematical framework. Published in 2004 by John Wiley & Sons, Ltd.
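The one-dimensional building block of such a Fourier analysis is easy to reproduce. Assuming second-order centered differences for pure advection u_t + a u_x = 0 (our own sketch, not the paper's multi-method comparison), substituting the plane wave exp(i k x_j) gives the modified wavenumber k* = sin(kh)/h, and hence the discrete-to-exact phase-speed ratio:

```python
import numpy as np

# Semi-discrete dispersion analysis of centered-difference advection:
# (u_{j+1} - u_{j-1}) / (2h) applied to exp(i k x_j) yields i sin(kh)/h,
# so the discrete phase speed is a * sin(kh) / (kh).

kh = np.linspace(0.01, np.pi, 50)       # nondimensional wavenumber up to the grid limit
phase_speed_ratio = np.sin(kh) / kh     # c_discrete / c_exact

print(phase_speed_ratio[0])    # ≈ 1: well-resolved waves advect at nearly the exact speed
print(phase_speed_ratio[-1])   # ≈ 0: the two-points-per-wavelength mode does not propagate
```

In two dimensions the same substitution is made with exp(i(k_x x + k_y y)), which is where the propagation-direction and aspect-ratio dependence discussed above enters.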
A Numerical Model of Unsteady, Subsonic Aeroelastic Behavior. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Strganac, Thomas W.
1987-01-01
A method for predicting unsteady, subsonic aeroelastic responses was developed. The technique accounts for aerodynamic nonlinearities associated with angles of attack, vortex-dominated flow, static deformations, and unsteady behavior. The fluid and the wing together are treated as a single dynamical system, and the equations of motion for the structure and flow field are integrated simultaneously and interactively in the time domain. The method employs an iterative scheme based on a predictor-corrector technique. The aerodynamic loads are computed by the general unsteady vortex-lattice method and are determined simultaneously with the motion of the wing. Because the unsteady vortex-lattice method predicts the wake as part of the solution, the history of the motion is taken into account; hysteresis is predicted. Two models are used to demonstrate the technique: a rigid wing on an elastic support experiencing plunge and pitch about the elastic axis, and an elastic wing rigidly supported at the root chord experiencing spanwise bending and twisting. The method can be readily extended to account for structural nonlinearities and/or substitute aerodynamic load models. The time domain solution coupled with the unsteady vortex-lattice method provides the capability of graphically depicting wing and wake motion.
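The iterative predictor-corrector coupling can be sketched on a single-degree-of-freedom stand-in. Everything below is an illustrative assumption: the quasi-steady load model is a placeholder, not the unsteady vortex-lattice method, and the parameters are toys:

```python
# One-DOF sketch of simultaneous, interactive structure/flow integration:
# m*x'' + c*x' + k*x = L(x, x'), with the load re-evaluated inside each step.

m, c, k, dt = 1.0, 0.1, 4.0, 0.01

def aero_load(x, v):
    return -0.5 * v - 0.2 * x            # made-up motion-dependent load (assumption)

def step(x, v):
    # Predictor: advance using the load evaluated at the old state
    a0 = (aero_load(x, v) - c * v - k * x) / m
    xp, vp = x + dt * v, v + dt * a0
    # Corrector: re-evaluate the load with the predicted motion and iterate
    for _ in range(3):
        a1 = (aero_load(xp, vp) - c * vp - k * xp) / m
        xp = x + 0.5 * dt * (v + vp)     # trapezoidal update of position
        vp = v + 0.5 * dt * (a0 + a1)    # trapezoidal update of velocity
    return xp, vp

x, v = 1.0, 0.0                          # initial plunge displacement
for _ in range(2000):
    x, v = step(x, v)
print(x, v)   # the coupled response has decayed: this toy system is stable
```

In the thesis the load routine is the vortex-lattice solver and the state includes the wake history, but the predict/re-evaluate/correct loop has the same shape.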
Barriers to the Integration of Care in Inter-Organisational Settings: A Literature Review
2018-01-01
Introduction: In recent years, inter-organisational collaboration between healthcare organisations has become of increasingly vital importance in order to improve the integration of health service delivery. However, different barriers reported in academic literature seem to hinder the formation and development of such collaboration. Theory and methods: This systematic literature review of forty studies summarises and categorises the barriers to integrated care in inter-organisational settings as reported in previous studies. It analyses how these barriers operate. Results: Within these studies, twenty types of barriers have been identified and then categorised in six groups (barriers related to administration and regulation, barriers related to funding, barriers related to the inter-organisational domain, barriers related to the organisational domain, barriers related to service delivery, and barriers related to clinical practices). Not all of these barriers emerge passively, some are set up intentionally. They are not only context-specific, but are also often related and influence each other. Discussion and conclusion: The compilation of these results allows for a better understanding of the characteristics and reasons for the occurrence of barriers that impede collaboration aiming for the integration of care, not only for researchers but also for practitioners. It can help to explain and counteract the slow progress and limited efficiency and effectiveness of some of the inter-organisational collaboration in healthcare settings. PMID:29632455
Musi, Valeria; Birdsall, Berry; Fernandez-Ballester, Gregorio; Guerrini, Remo; Salvatori, Severo; Serrano, Luis; Pastore, Annalisa
2006-04-01
SH3 domains are small protein modules that are involved in protein-protein interactions in several essential metabolic pathways. The availability of the complete genome and the limited number of clearly identifiable SH3 domains make the yeast Saccharomyces cerevisiae an ideal proteomic-based model system in which to investigate the structural rules dictating SH3-mediated protein interactions and to develop new tools to assist these studies. In the present work, we have determined the solution structure of the SH3 domain from Myo3 and modeled by homology that of the highly homologous Myo5, two myosins implicated in actin polymerization. We have then implemented an integrated approach that makes use of experimental and computational methods to characterize their binding properties. While accommodating their targets in the classical groove, the two domains show selectivity in both the orientation and the sequence specificity of the target peptides. From our study, we propose a consensus sequence that may provide a useful guideline to identify new natural partners, and we suggest a strategy of more general applicability that may be of use in other structural proteomic studies.
Scheerhagen, Marisja; van Stel, Henk F.; Birnie, Erwin; Franx, Arie; Bonsel, Gouke J.
2015-01-01
Background Maternity care is an integrated care process, which consists of different services, involves different professionals and covers different time windows. To measure the performance of maternity care on the basis of clients' experiences, we developed and validated a questionnaire. Methods and Findings We used the 8-domain WHO Responsiveness model and previous materials to develop a self-report questionnaire. A dual study design was used for development and validation. Content validity of the ReproQ-version-0 was determined through structured interviews with 11 pregnant women (≥28 weeks), 10 women who had recently given birth (≤12 weeks), and 19 maternity care professionals. The structured interviews established the relevance of the domains to the women; all items were commented on separately. All Responsiveness domains were judged relevant, with Dignity and Communication ranking highest. The main missing topic was the assigned expertise of the health professional. After a first adaptation, construct validity of the ReproQ-version-1 was determined through a web-based survey. Respondents were approached by maternity care organizations with different levels of integration of the services of midwives and obstetricians. We sent questionnaires to 605 third-trimester pregnant women (response 65%) and 810 women 6 weeks after delivery (response 55%). Construct validity was based on response patterns; exploratory factor analysis; association of the overall score with a Visual Analogue Scale (VAS); and known-group comparisons. The median overall ReproQ score was 3.70 (range 1–4), showing good responsiveness. The exploratory factor analysis supported the assumed domain structure and suggested several adaptations. Correlation of the VAS rating with the overall ReproQ score supported validity (Spearman's r = 0.56 antepartum and 0.59 postpartum; p<0.001). Pre-stated group comparisons confirmed the expected difference following a good vs. adverse birth outcome.
Fully integrated organizations performed slightly better (median = 3.78) than less integrated organizations (median = 3.63; p<0.001). Participation rate of women with a low educational level and/or a non-western origin was low. Conclusions The ReproQ appears suitable for assessing quality of maternity care from the clients' perspective. Recruitment of disadvantaged groups requires additional non-digital approaches. PMID:25671310
Chabalier, Julie; Capponi, Cécile; Quentin, Yves; Fichant, Gwennaele
2005-04-01
Complex biological functions emerge from interactions between proteins in stable supra-molecular assemblies and/or through transitory contacts. Most of the time, the protein partners of these assemblies are composed of one or several domains which exhibit different biochemical functions. Thus the study of cellular processes requires the identification of the different functional units and their integration into an interaction network; such complexes are referred to as integrated systems. In order to exploit the growing volume of released data with optimum efficiency, automated bioinformatics strategies are needed to identify, reconstruct and model such systems. For that purpose, we have developed a knowledge warehouse dedicated to the representation and acquisition of bacterial integrated systems involved in the exchanges of the bacterial cell with its environment. ISYMOD is a knowledge warehouse that consistently integrates in the same environment the data and the methods used for their acquisition. This is achieved through the construction of (1) a domain knowledge base (DKB) devoted to the storage of knowledge about the systems, their functional specificities, their partners and how they are related, and (2) a methodological knowledge base (MKB) which depicts the task layout used to identify and reconstruct functional integrated systems. Instantiation of the DKB is obtained by solving the tasks of the MKB, whereas some tasks need instances of the DKB to be solved. AROM, an object-based knowledge representation system, has been used to design the DKB, and its task manager, AROMTasks, to develop the MKB. In this study two integrated systems, ABC transporters and two-component systems, both involved in the adaptation of a bacterial cell to its biotope, have been used to evaluate the feasibility of the approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yin, Zhiqi; Shi, Ke; Banerjee, Surajit
Integration of the reverse-transcribed viral DNA into the host genome is an essential step in the life cycle of retroviruses. Retrovirus integrase catalyses insertion of both ends of the linear viral DNA into a host chromosome. Integrases from HIV-1 and closely related retroviruses share a three-domain organization, consisting of a catalytic core domain flanked by amino- and carboxy-terminal domains essential for the concerted integration reaction. Although structures of tetrameric integrase–DNA complexes have been reported for the integrase from prototype foamy virus, which features an additional DNA-binding domain and longer interdomain linkers, the architecture of a canonical three-domain integrase bound to DNA remained elusive. In this paper, we report a crystal structure of the three-domain integrase from Rous sarcoma virus in complex with viral and target DNAs. The structure shows an octameric assembly of integrase, in which a pair of integrase dimers engage viral DNA ends for catalysis while another pair of non-catalytic integrase dimers bridge between the two viral DNA molecules and help capture target DNA. The individual domains of the eight integrase molecules play varying roles to hold the complex together, making an extensive network of protein–DNA and protein–protein contacts that show both conserved and distinct features compared with those observed for prototype foamy virus integrase. Finally, our work highlights the diversity of retrovirus intasome assembly and provides insights into the mechanisms of integration by HIV-1 and related retroviruses.
An Approach to Formalizing Ontology Driven Semantic Integration: Concepts, Dimensions and Framework
ERIC Educational Resources Information Center
Gao, Wenlong
2012-01-01
The ontology approach has been accepted as a very promising approach to semantic integration today. However, because of the diversity of focuses and its various connections to other research domains, the core concepts, theoretical and technical approaches, and research areas of this domain still remain unclear. Such ambiguity makes it difficult to…
Real-time contingency handling in MAESTRO
NASA Technical Reports Server (NTRS)
Britt, Daniel L.; Geoffroy, Amy L.
1992-01-01
A scheduling and resource management system named MAESTRO was interfaced with a Space Station Module Power Management and Distribution (SSM/PMAD) breadboard at MSFC. The combined system serves to illustrate the integration of planning, scheduling, and control in a realistic, complex domain. This paper briefly describes the functional elements of the combined system, including normal and contingency operational scenarios, and then focuses on the method used by the scheduler to handle real-time contingencies.
Low-dimensional, morphologically accurate models of subthreshold membrane potential
Kellems, Anthony R.; Roos, Derrick; Xiao, Nan; Cox, Steven J.
2009-01-01
The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations: to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (ℋ2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speed-up in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. PMID:19172386
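Balanced truncation, the time-domain method named above, follows a standard square-root recipe: compute the controllability and observability Gramians, balance them, and discard states with small Hankel singular values. The sketch below is a generic textbook implementation on a toy discretized-cable (diffusion) system in the spirit of quasi-active dendrite models, not the authors' code:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Square-root balanced truncation of a stable LTI system x' = Ax + Bu, y = Cx.
# Toy system (assumption): a 30-state discretized cable with current injected
# at one end and voltage recorded at the other, reduced to 6 states.

n, r = 30, 6
A = (n + 1) ** 2 * (np.diag(-2.0 * np.ones(n)) +
                    np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
B = np.zeros((n, 1)); B[0, 0] = 1.0     # input at one end
C = np.zeros((1, n)); C[0, -1] = 1.0    # output at the far end

def psd_factor(M):
    """Return L with M ≈ L @ L.T for a numerically positive-semidefinite M."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return V * np.sqrt(np.clip(w, 0.0, None))

# Gramians: A P + P A^T + B B^T = 0 and A^T Q + Q A + C^T C = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)
Lp, Lq = psd_factor(P), psd_factor(Q)

U, s, Vt = np.linalg.svd(Lq.T @ Lp)     # s holds the Hankel singular values
S = np.diag(s[:r] ** -0.5)
T = Lp @ Vt[:r].T @ S                   # reduction:  x ≈ T z
W = Lq @ U[:, :r] @ S                   # projection: z = W.T x
Ar, Br, Cr = W.T @ A @ T, W.T @ B, C @ T

g_full = (C @ np.linalg.solve(-A, B)).item()
g_red = (Cr @ np.linalg.solve(-Ar, Br)).item()
print(abs(g_red - g_full) / abs(g_full))   # small relative error at DC
```

Because the Hankel singular values of diffusion-like systems decay rapidly, a handful of balanced states reproduces the input-output map closely, which is the effect exploited for the membrane-potential reductions above.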
Guidelines for managing data and processes in bone and cartilage tissue engineering.
Viti, Federica; Scaglione, Silvia; Orro, Alessandro; Milanesi, Luciano
2014-01-01
In recent decades, many researchers and clinicians involved in the tissue engineering field have published works on the possibility of inducing tissue regeneration guided by the use of biomaterials. To this aim, different scaffolds have been proposed, and their effectiveness tested through in vitro and/or in vivo experiments. In this context, integration and meta-analysis approaches are gaining importance for the analysis and reuse of data such as those concerning bone and cartilage biomarkers, the biomolecular factors intervening in cell differentiation and growth, the morphology and biomechanical performance of a neo-formed tissue, and, in general, the ability of scaffolds to promote tissue regeneration. Standards and ontologies are therefore becoming crucial to provide a unifying knowledge framework for annotating data and supporting the semantic integration and unambiguous interpretation of novel experimental results. In this paper a conceptual framework has been designed for the bone/cartilage tissue engineering domain, which until now has lacked standardized methods. A set of guidelines is provided, defining the minimum information set necessary for describing an experimental study in the bone and cartilage regenerative medicine field. In addition, a Bone/Cartilage Tissue Engineering Ontology (BCTEO) has been developed to provide a representation of the domain's concepts, specifically oriented to cells and to the chemical composition, morphology and physical characterization of the biomaterials involved in bone/cartilage tissue engineering research. Considering that tissue engineering is a discipline that traverses different semantic fields and employs many data types, the proposed instruments represent a first attempt to standardize the domain knowledge and can provide a suitable means to integrate data across the field.
Lim, Kelvin O.; Ardekani, Babak A.; Nierenberg, Jay; Butler, Pamela D.; Javitt, Daniel C.; Hoptman, Matthew J.
2007-01-01
Patients with schizophrenia show deficits in several neurocognitive domains. However, the relationship between white matter integrity and performance in these domains is poorly understood. The authors conducted neurocognitive testing and diffusion tensor imaging in 25 patients with schizophrenia. Performance was examined for tests of verbal declarative memory, attention, and executive function. Relationships between fractional anisotropy and cognitive performance were examined by using voxelwise correlational analyses. In each case, better performance on these tasks was associated with higher levels of fractional anisotropy in task-relevant regions. PMID:17074956
Mechanisms for integration of information models across related domains
NASA Astrophysics Data System (ADS)
Atkinson, Rob
2010-05-01
It is well recognised that there are opportunities and challenges in cross-disciplinary data integration. A significant barrier, however, is creating a conceptual model of the combined domains and the area of integration. For example, a groundwater domain application may require information from several related domains: geology, hydrology, water policy, etc. Each domain may have its own data holdings and conceptual models, but these will share various common concepts (e.g. the concept of an aquifer). These areas of semantic overlap present significant challenges: first to choose a single representation (model) of a concept that appears in multiple disparate models, and then to harmonise the other models with that single representation. In addition, models may exist at different levels of abstraction depending on how closely aligned they are with a particular implementation. This makes it hard for modellers in one domain to introduce elements from another domain without either introducing a specific style of implementation or, conversely, dealing with a set of abstract patterns that are hard to integrate with existing implementations. Models are easier to integrate if they are broken down into small units, with common concepts implemented using common models from well-known and predictably managed shared libraries. This vision, however, requires the development of a set of mechanisms (tools and procedures) for implementing and exploiting libraries of model components. These mechanisms need to handle publication, discovery, subscription, versioning and implementation of models in different forms. In this presentation a coherent suite of such mechanisms is proposed, using a scenario based on the re-use of geosciences models. This approach forms the basis of a comprehensive strategy to empower domain modellers to create more interoperable systems.
The strategy addresses a range of concerns and practices, and includes methodologies, an accessible toolkit, improvements to available modelling software, a community of practice and the design of model registries. These mechanisms have been used to decouple the generation of simplified data products from a data and metadata maintenance environment, where the simplified products conform to implementation styles, and the data maintenance environment is a modular, extensible implementation of a more complete set of related domain models. Another case study is the provisioning of authoritative place names (a gazetteer) from more complex multi-lingual and historical archives of related place name usage.
Lim, Kwang-il; Klimczak, Ryan; Yu, Julie H.; Schaffer, David V.
2010-01-01
Retroviral vectors offer benefits of efficient delivery and stable gene expression; however, their clinical use raises the concerns of insertional mutagenesis and potential oncogenesis due to genomic integration preferences in transcriptional start sites (TSS). We have shifted the integration preferences of retroviral vectors by generating a library of viral variants with a DNA-binding domain inserted at random positions throughout murine leukemia virus Gag-Pol, then selecting for variants that are viable and exhibit altered integration properties. We found seven permissive zinc finger domain (ZFD) insertion sites throughout Gag-Pol, including within p12, reverse transcriptase, and integrase. Comprehensive genome integration analysis showed that several ZFD insertions yielded retroviral vector variants with shifted integration patterns that did not favor TSS. Furthermore, integration site analysis revealed selective integration for numerous mutants. For example, two retroviral variants with a given ZFD at appropriate positions in Gag-Pol strikingly integrated primarily into four common sites out of 3.1 × 10⁹ possible human genome locations (P = 4.6 × 10⁻²⁹). Our findings demonstrate that insertion of DNA-binding motifs into multiple locations in Gag-Pol can make considerable progress toward engineering safer retroviral vectors that integrate into a significantly narrowed pool of sites on the human genome and overcome the preference for TSS. PMID:20616052
NASA Astrophysics Data System (ADS)
Hanan, Lu; Qiushi, Li; Shaobin, Li
2016-12-01
This paper presents an integrated optimization design method in which uniform design, response surface methodology and a genetic algorithm are used in combination. In detail, uniform design is used to select the sampling points in the experimental domain, and the system performance at these points is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to find the optimal solution subject to the constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The method performs well in improving the duct's aerodynamic performance; it can also be applied to wider fields of mechanical design and can serve as a useful tool for engineering designers by reducing design time and computational cost.
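The three-stage workflow (sampling, response surface, genetic algorithm) can be sketched on a toy objective. Everything below is an illustrative assumption, not the duct case: the quadratic "model" stands in for the CFD evaluation, a plain grid stands in for uniform design, and the GA settings are arbitrary:

```python
import numpy as np

# Sample -> fit quadratic response surface -> optimize the surrogate with a GA.

rng = np.random.default_rng(1)

def expensive_model(x, y):               # stand-in for the CFD evaluation
    return (x - 0.3) ** 2 + 2.0 * (y - 0.7) ** 2 + 0.5

# 1) sample the design space and build the database
g = np.linspace(0.0, 1.0, 6)
X, Y = np.meshgrid(g, g)
xs, ys = X.ravel(), Y.ravel()
zs = expensive_model(xs, ys)

# 2) quadratic response surface z ≈ w · [1, x, y, x², y², xy] by least squares
basis = np.column_stack([np.ones_like(xs), xs, ys, xs**2, ys**2, xs * ys])
w, *_ = np.linalg.lstsq(basis, zs, rcond=None)
def surrogate(p):
    return w @ np.array([1.0, p[0], p[1], p[0]**2, p[1]**2, p[0] * p[1]])

# 3) minimal real-coded genetic algorithm on the cheap surrogate
pop = rng.random((40, 2))
for _ in range(60):
    fitness = np.array([surrogate(p) for p in pop])
    parents = pop[np.argsort(fitness)[:20]]              # truncation selection
    kids = (parents[rng.integers(0, 20, 40)] +
            parents[rng.integers(0, 20, 40)]) / 2.0      # blend crossover
    kids += 0.02 * rng.standard_normal(kids.shape)       # mutation
    pop = np.clip(kids, 0.0, 1.0)

best = pop[np.argmin([surrogate(p) for p in pop])]
print(best)   # close to (0.3, 0.7), the optimum of the underlying model
```

The key economy is that the expensive model is evaluated only at the sampling stage; the GA then runs entirely on the cheap surrogate.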
Co-simulation coupling spectral/finite elements for 3D soil/structure interaction problems
NASA Astrophysics Data System (ADS)
Zuchowski, Loïc; Brun, Michael; De Martin, Florent
2018-05-01
The coupling between an implicit finite element (FE) code and an explicit spectral element (SE) code has been explored for solving elastic wave propagation in a soil/structure interaction problem. The coupling approach is based on domain decomposition methods in transient dynamics. The spatial coupling at the interface is managed by a standard mortar approach, whereas the time integration is handled by a hybrid asynchronous time integrator. An external coupling software package, handling the interface problem, has been set up in order to couple the FE software Code_Aster with the SE software EFISPEC3D.
Radakovics, Katharina; Smith, Terry K.; Bobik, Nina; Round, Adam; Djinović-Carugo, Kristina; Usón, Isabel
2016-01-01
Vaccinia virus interferes with early events of the activation pathway of the transcription factor NF-κB by binding to numerous host TIR-domain-containing adaptor proteins. We have previously determined the X-ray structure of the A46 C-terminal domain; however, the structure and function of the A46 N-terminal domain and its relationship to the C-terminal domain have remained unclear. Here, we biophysically characterize residues 1–83 of the N-terminal domain of A46 and present the X-ray structure at 1.55 Å. Crystallographic phases were obtained by a recently developed ab initio method named ARCIMBOLDO_BORGES that employs tertiary structure libraries extracted from the Protein Data Bank; data analysis revealed an all-β-sheet structure. This is the first such structure solved by this method, which should be applicable to any protein composed entirely of β-sheets. The A46(1–83) structure itself is a β-sandwich containing a co-purified molecule of myristic acid inside a hydrophobic pocket and represents a previously unknown lipid-binding fold. Mass spectrometry analysis confirmed the presence of long-chain fatty acids in both N-terminal and full-length A46; mutation of the hydrophobic pocket reduced the lipid content. Using a combination of high resolution X-ray structures of the N- and C-terminal domains and SAXS analysis of the full-length protein A46(1–240), we present here a structural model of A46 in a tetrameric assembly. Integrating affinity measurements and structural data, we propose how A46 simultaneously interferes with several TIR-domain-containing proteins to inhibit NF-κB activation and postulate that A46 employs a bipartite binding arrangement to sequester the host immune adaptors TRAM and MyD88. PMID:27973613
De novo identification of replication-timing domains in the human genome by deep learning.
Liu, Feng; Ren, Chao; Li, Hao; Zhou, Pingkun; Bo, Xiaochen; Shu, Wenjie
2016-03-01
The de novo identification of the initiation and termination zones (regions that replicate earlier or later than their upstream and downstream neighbours, respectively) remains a key challenge in DNA replication. Building on advances in deep learning, we developed a novel hybrid architecture combining a pre-trained, deep neural network and a hidden Markov model (DNN-HMM) for the de novo identification of replication domains using replication timing profiles. Our results demonstrate that DNN-HMM can significantly outperform strong, discriminatively trained Gaussian mixture model-HMM (GMM-HMM) systems and six other reported methods that can be applied to this challenge. We applied our trained DNN-HMM to identify distinct replication domain types, namely the early replication domain (ERD), the down transition zone (DTZ), the late replication domain (LRD) and the up transition zone (UTZ), using newly replicated DNA sequencing (Repli-Seq) data across 15 human cells. A subsequent integrative analysis revealed that these replication domains harbour unique genomic and epigenetic patterns, transcriptional activity and higher-order chromosomal structure. Our findings support the 'replication-domain' model, which states (1) that ERDs and LRDs, connected by UTZs and DTZs, are spatially compartmentalized structural and functional units of higher-order chromosomal structure, (2) that the adjacent DTZ-UTZ pairs form chromatin loops and (3) that intra-interactions within ERDs and LRDs tend to be short-range and long-range, respectively. Our model reveals an important chromatin organizational principle of the human genome and represents a critical step towards understanding the mechanisms regulating replication timing. Our DNN-HMM method and three additional algorithms can be freely accessed at https://github.com/wenjiegroup/DNN-HMM. The replication domain regions identified in this study are available in GEO under the accession ID GSE53984.
Contact: shuwj@bmi.ac.cn or boxc@bmi.ac.cn. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
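The HMM decoding step at the heart of such a segmentation can be illustrated with a minimal Viterbi sketch: given per-bin log state posteriors (as a DNN would output) and a log transition matrix, it recovers the most likely state path along the timing profile. The two-state setup, posteriors, and transition probabilities below are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

def viterbi(log_post, log_trans, log_init):
    """Most likely state path given per-bin log state posteriors (e.g. from a
    DNN), a log transition matrix, and log initial state probabilities."""
    T, S = log_post.shape
    dp = np.full((T, S), -np.inf)          # best log score ending in each state
    back = np.zeros((T, S), dtype=int)     # backpointers
    dp[0] = log_init + log_post[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans      # scores[i, j]: i -> j
        back[t] = np.argmax(scores, axis=0)
        dp[t] = scores[back[t], np.arange(S)] + log_post[t]
    path = np.zeros(T, dtype=int)
    path[-1] = np.argmax(dp[-1])
    for t in range(T - 2, -1, -1):         # trace back the optimal path
        path[t] = back[t + 1, path[t + 1]]
    return path
```

For replication domains the state set would be {ERD, DTZ, LRD, UTZ} and the transition matrix would be constrained to the admissible domain ordering.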
2012-01-01
Background: Hidden Markov Models (HMMs) are a powerful tool for protein domain identification. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in newly sequenced organisms. In Pfam, each domain family is represented by a curated multiple sequence alignment from which a profile HMM is built. In spite of their high specificity, HMMs may lack sensitivity when searching for domains in divergent organisms. This is particularly the case for species with a biased amino-acid composition, such as P. falciparum, the main causal agent of human malaria. In this context, fitting HMMs to the specificities of the target proteome can help identify additional domains. Results: Using P. falciparum as an example, we compare approaches that have been proposed for this problem, and present two alternative methods. Because previous attempts strongly rely on known domain occurrences in the target species or its close relatives, they mainly improve the detection of domains which belong to already identified families. Our methods learn global correction rules that adjust amino-acid distributions associated with the match states of HMMs. These rules are applied to all match states of the whole HMM library, thus enabling the detection of domains from previously absent families. Additionally, we propose a procedure to estimate the proportion of false positives among the newly discovered domains. Starting with the Pfam standard library, we build several new libraries with the different HMM-fitting approaches. These libraries are first used to detect new domain occurrences with low E-values. Second, by applying the Co-Occurrence Domain Discovery (CODD) procedure we have recently proposed, the libraries are further used to identify likely occurrences among potential domains with higher E-values. Conclusion: We show that the new approaches allow identification of several domain families previously absent from the P. falciparum proteome and the Apicomplexa phylum, and identify many domains that are not detected by previous approaches. In terms of the number of newly discovered domains, the new approaches outperform the previous ones when no close species are available or when they are used to identify likely occurrences among potential domains with high E-values. All predictions on P. falciparum have been integrated into a dedicated website which pools all known/new annotations of protein domains and functions for this organism. Software implementing the two proposed approaches is available at the same address: http://www.lirmm.fr/∼terrapon/HMMfit/ PMID:22548871
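A global correction rule of the kind described can be sketched as rescaling each match-state emission distribution by the ratio of target-proteome to background amino-acid frequencies and renormalizing. This simple multiplicative form is an assumption for illustration; the actual rules in the paper are learned from data.

```python
import numpy as np

def correct_match_emissions(emissions, bg_freq, target_freq):
    """Bias HMM match-state emission probabilities toward a target-proteome
    amino-acid composition, then renormalize each state.

    emissions            : (n_states, 20) match-state emission probabilities
    bg_freq, target_freq : (20,) background / target amino-acid frequencies
    """
    corrected = emissions * (target_freq / bg_freq)   # broadcast over states
    return corrected / corrected.sum(axis=1, keepdims=True)
```

Applied uniformly to every match state of the library, such a rule can boost sensitivity for families with no known occurrences in the target species.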
Terrapon, Nicolas; Gascuel, Olivier; Maréchal, Eric; Bréhélin, Laurent
2012-05-01
Hidden Markov Models (HMMs) are a powerful tool for protein domain identification. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in newly sequenced organisms. In Pfam, each domain family is represented by a curated multiple sequence alignment from which a profile HMM is built. In spite of their high specificity, HMMs may lack sensitivity when searching for domains in divergent organisms. This is particularly the case for species with a biased amino-acid composition, such as P. falciparum, the main causal agent of human malaria. In this context, fitting HMMs to the specificities of the target proteome can help identify additional domains. Using P. falciparum as an example, we compare approaches that have been proposed for this problem, and present two alternative methods. Because previous attempts strongly rely on known domain occurrences in the target species or its close relatives, they mainly improve the detection of domains which belong to already identified families. Our methods learn global correction rules that adjust amino-acid distributions associated with the match states of HMMs. These rules are applied to all match states of the whole HMM library, thus enabling the detection of domains from previously absent families. Additionally, we propose a procedure to estimate the proportion of false positives among the newly discovered domains. Starting with the Pfam standard library, we build several new libraries with the different HMM-fitting approaches. These libraries are first used to detect new domain occurrences with low E-values. Second, by applying the Co-Occurrence Domain Discovery (CODD) procedure we have recently proposed, the libraries are further used to identify likely occurrences among potential domains with higher E-values. We show that the new approaches allow identification of several domain families previously absent from the P. falciparum proteome and the Apicomplexa phylum, and identify many domains that are not detected by previous approaches. In terms of the number of newly discovered domains, the new approaches outperform the previous ones when no close species are available or when they are used to identify likely occurrences among potential domains with high E-values. All predictions on P. falciparum have been integrated into a dedicated website which pools all known/new annotations of protein domains and functions for this organism. Software implementing the two proposed approaches is available at the same address: http://www.lirmm.fr/~terrapon/HMMfit/
The aerodynamics of propellers and rotors using an acoustic formulation in the time domain
NASA Technical Reports Server (NTRS)
Long, L. N.
1983-01-01
The aerodynamics of propellers and rotors is especially complicated because of the highly three-dimensional and compressible nature of the flow field. However, in linearized theory the problem is governed by the wave equation, and a numerically efficient integral formulation can be derived. This reduces the problem from one posed throughout three-dimensional space to one posed over a surface. Many such formulations exist in the aeroacoustics literature, but they become singular integral equations if one naively tries to use them to predict surface pressures, i.e., for aerodynamics. The present paper illustrates how these equations must be interpreted in order to obtain unambiguous results. After the regularized form of the integral equation is derived, a method for solving it numerically is described. This preliminary computer code uses Legendre-Gaussian quadrature to solve the equation. Numerical results are compared to experimental results for ellipsoids, wings, and rotors, including effects due to lift. Compressibility and the far-field boundary conditions are satisfied automatically with this method.
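The Legendre-Gaussian quadrature mentioned above can be sketched for a generic integral on [a, b]; the paper applies the same rule to the regularized surface integral equation.

```python
import numpy as np

def gauss_legendre(f, a, b, n):
    """Integrate f over [a, b] with an n-point Gauss-Legendre rule."""
    x, w = np.polynomial.legendre.leggauss(n)   # nodes/weights on [-1, 1]
    xm, xr = 0.5 * (a + b), 0.5 * (b - a)       # affine map to [a, b]
    return xr * np.sum(w * f(xm + xr * x))
```

An n-point rule integrates polynomials up to degree 2n-1 exactly, which is why a modest number of nodes per element suffices for smooth integrands.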
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2016-09-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of the Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.
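One simple way to combine heterogeneous quality scores, sketched below, is z-score normalization per method followed by averaging. This is an illustrative integration scheme only; MULTICOM's actual combination and clustering pipeline is more involved.

```python
import numpy as np

def consensus_rank(scores):
    """Rank models by averaging z-score-normalized quality scores.

    scores : (n_models, n_methods) array, higher = better for every method.
    Returns model indices ordered best-first.
    """
    z = (scores - scores.mean(axis=0)) / (scores.std(axis=0) + 1e-12)
    return np.argsort(-z.mean(axis=1))
```

Normalizing per method puts assessors with different score scales on an equal footing before they vote.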
Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki
2017-02-01
This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate the three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as the radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities, which in turn leads to large discrepancies among the different surveys. Copyright © 2016 Elsevier Ltd. All rights reserved.
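A toy, single-cell version of the Bayesian update underlying such data integration treats the coarse airborne estimate in a grid cell as a Gaussian prior and the ground measurements in that cell as the likelihood; the paper's hierarchical model additionally encodes geostatistical spatial correlation, which this sketch omits.

```python
import numpy as np

def fuse_cell(prior_mean, prior_sd, obs, obs_sd):
    """Conjugate Gaussian update of one grid cell's dose rate.

    prior_mean, prior_sd : airborne-survey estimate and its uncertainty
    obs                  : array of ground measurements falling in the cell
    obs_sd               : measurement standard deviation
    Returns the posterior mean and standard deviation.
    """
    prec = 1.0 / prior_sd**2 + len(obs) / obs_sd**2
    mean = (prior_mean / prior_sd**2 + np.sum(obs) / obs_sd**2) / prec
    return mean, np.sqrt(1.0 / prec)
```

The posterior standard deviation shrinks as ground data accumulate, which is how the method yields confidence intervals alongside the fused map.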
Memory integration in amnesia: prior knowledge supports verbal short-term memory.
Race, Elizabeth; Palombo, Daniela J; Cadden, Margaret; Burke, Keely; Verfaellie, Mieke
2015-04-01
Short-term memory (STM) and long-term memory (LTM) have traditionally been considered cognitively distinct. However, it is known that STM can improve when to-be-remembered information appears in contexts that make contact with prior knowledge, suggesting a more interactive relationship between STM and LTM. The current study investigated whether the ability to leverage LTM in support of STM critically depends on the integrity of the hippocampus. Specifically, we investigated whether the hippocampus differentially supports between-domain versus within-domain STM-LTM integration given prior evidence that the representational domain of the elements being integrated in memory is a critical determinant of whether memory performance depends on the hippocampus. In Experiment 1, we investigated hippocampal contributions to within-domain STM-LTM integration by testing whether immediate verbal recall of words improves in MTL amnesic patients when words are presented in familiar verbal contexts (meaningful sentences) compared to unfamiliar verbal contexts (random word lists). Patients demonstrated a robust sentence superiority effect, whereby verbal STM performance improved in familiar compared to unfamiliar verbal contexts, and the magnitude of this effect did not differ from that in controls. In Experiment 2, we investigated hippocampal contributions to between-domain STM-LTM integration by testing whether immediate verbal recall of digits improves in MTL amnesic patients when digits are presented in a familiar visuospatial context (a typical keypad layout) compared to an unfamiliar visuospatial context (a random keypad layout). Immediate verbal recall improved in both patients and controls when digits were presented in the familiar compared to the unfamiliar keypad array, indicating a preserved ability to integrate activated verbal information with stored visuospatial knowledge. 
Together, these results demonstrate that immediate verbal recall in amnesia can benefit from two distinct types of semantic support, verbal and visuospatial, and that the hippocampus is not critical for leveraging stored semantic knowledge to improve memory performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
Memory integration in amnesia: Prior knowledge supports verbal short-term memory
Race, Elizabeth; Palombo, Daniela J.; Cadden, Margaret; Burke, Keely; Verfaellie, Mieke
2015-01-01
Short-term memory (STM) and long-term memory (LTM) have traditionally been considered cognitively distinct. However, it is known that STM can improve when to-be-remembered information appears in contexts that make contact with prior knowledge, suggesting a more interactive relationship between STM and LTM. The current study investigated whether the ability to leverage LTM in support of STM critically depends on the integrity of the hippocampus. Specifically, we investigated whether the hippocampus differentially supports between-domain versus within-domain STM–LTM integration given prior evidence that the representational domain of the elements being integrated in memory is a critical determinant of whether memory performance depends on the hippocampus. In Experiment 1, we investigated hippocampal contributions to within-domain STM–LTM integration by testing whether immediate verbal recall of words improves in MTL amnesic patients when words are presented in familiar verbal contexts (meaningful sentences) compared to unfamiliar verbal contexts (random word lists). Patients demonstrated a robust sentence superiority effect, whereby verbal STM performance improved in familiar compared to unfamiliar verbal contexts, and the magnitude of this effect did not differ from that in controls. In Experiment 2, we investigated hippocampal contributions to between-domain STM–LTM integration by testing whether immediate verbal recall of digits improves in MTL amnesic patients when digits are presented in a familiar visuospatial context (a typical keypad layout) compared to an unfamiliar visuospatial context (a random keypad layout). Immediate verbal recall improved in both patients and controls when digits were presented in the familiar compared to the unfamiliar keypad array, indicating a preserved ability to integrate activated verbal information with stored visuospatial knowledge. 
Together, these results demonstrate that immediate verbal recall in amnesia can benefit from two distinct types of semantic support, verbal and visuospatial, and that the hippocampus is not critical for leveraging stored semantic knowledge to improve memory performance. PMID:25752585
Grabowski, Krzysztof; Gawronski, Mateusz; Baran, Ireneusz; Spychalski, Wojciech; Staszewski, Wieslaw J; Uhl, Tadeusz; Kundu, Tribikram; Packo, Pawel
2016-05-01
Acoustic Emission, as used in Non-Destructive Testing, is focused on the analysis of elastic waves propagating in mechanical structures. The information carried by the generated acoustic waves, recorded by a set of transducers, allows the integrity of these structures to be determined. It is clear that material properties and geometry strongly impact the result. In this paper a method for Acoustic Emission source localization in thin plates is presented. The approach is based on the Time-Distance Domain Transform, a wavenumber-frequency mapping technique for precise event localization. The major advantage of the technique is dispersion compensation through phase-shifting of the investigated waveforms to acquire the most accurate output, allowing source-sensor distance estimation using a single transducer. The accuracy and robustness of the above process are also investigated. This includes a study of the influence of the Young's modulus value and numerical parameters on damage detection. By merging the Time-Distance Domain Transform with an optimal distance selection technique, an identification-localization algorithm is achieved. The method is investigated analytically, numerically and experimentally. The latter involves both laboratory and large-scale industrial tests. Copyright © 2016 Elsevier B.V. All rights reserved.
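The phase-shifting idea behind the dispersion compensation can be sketched in its simplest form: multiply the recorded spectrum by exp(+jk(ω)d) to undo propagation over a distance d. For brevity a nondispersive relation k = ω/c is assumed here; for real Lamb waves the mode's dispersion curve k(ω) would be substituted.

```python
import numpy as np

def back_propagate(signal, fs, d, c):
    """Undo propagation over distance d by frequency-domain phase shifting.

    Nondispersive k = omega / c is used for simplicity; for Lamb waves,
    substitute the mode's dispersion curve k(omega).
    """
    n = len(signal)
    omega = 2 * np.pi * np.fft.rfftfreq(n, d=1.0 / fs)
    k = omega / c
    spec = np.fft.rfft(signal) * np.exp(1j * k * d)   # advance by d/c
    return np.fft.irfft(spec, n)
```

Scanning d and picking the value that best recompresses the waveform is, in essence, how a single transducer can estimate the source-sensor distance.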
Relationship auditing of the FMA ontology
Gu, Huanying (Helen); Wei, Duo; Mejino, Jose L.V.; Elhanan, Gai
2010-01-01
The Foundational Model of Anatomy (FMA) ontology is a domain reference ontology based on a disciplined modeling approach. Due to its large size, semantic complexity and manual data entry process, errors and inconsistencies are unavoidable and might remain within the FMA structure without detection. In this paper, we present computable methods to highlight candidate concepts for various relationship assignment errors. The process starts by locating structures formed by transitive structural relationships (part_of, tributary_of, branch_of) and examining their assignments in the context of the IS-A hierarchy. The algorithms were designed to detect five major categories of possible incorrect relationship assignments: circular, mutually exclusive, redundant, inconsistent, and missed entries. A domain expert reviewed samples of these presumptive errors to confirm the findings. Seven thousand and fifty-two presumptive errors were detected, with the largest proportion related to part_of relationship assignments. The results highlight the fact that errors are unavoidable in complex ontologies and that well designed algorithms can help domain experts to focus on concepts with high likelihood of errors and maximize their effort to ensure consistency and reliability. In the future, similar methods might be integrated with data entry processes to offer real-time error detection. PMID:19475727
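The circular-assignment category can be illustrated with a standard depth-first search for cycles over the directed relationship graph. The concept names in the test are hypothetical, not actual FMA entries.

```python
def find_cycle(edges):
    """Return one cycle in a directed relationship graph (e.g. part_of
    assignments), or None. edges maps each concept to a list of parents."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in edges}

    def dfs(v, stack):
        color[v] = GRAY
        stack.append(v)
        for w in edges.get(v, []):
            if color.get(w, WHITE) == GRAY:        # back edge: cycle found
                return stack[stack.index(w):] + [w]
            if color.get(w, WHITE) == WHITE:
                found = dfs(w, stack)
                if found:
                    return found
        color[v] = BLACK
        stack.pop()
        return None

    for v in list(edges):
        if color[v] == WHITE:
            found = dfs(v, [])
            if found:
                return found
    return None
```

Running such a check per relationship type flags candidate concepts for expert review rather than attempting automatic correction.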
Volumetric blood flow via time-domain correlation: experimental verification.
Embree, P M; O'Brien, W R
1990-01-01
A novel ultrasonic volumetric flow measurement method using time-domain correlation of consecutive pairs of echoes has been developed. An ultrasonic data acquisition system determined the time shift between a pair of range gated echoes by searching for the time shift with the maximum correlation between the RF sampled waveforms. Experiments with a 5-MHz transducer indicate that the standard deviation of the estimate of steady fluid velocity through 6-mm-diameter tubes is less than 10% of the mean. Experimentally, Sephadex (G-50; 20-80 μm dia.) particles in water and fresh porcine blood have been used as ultrasound scattering fluids. Two-dimensional (2-D) flow velocity can be estimated by slowly sweeping the ultrasonic beam across the blood vessel phantom. Volumetric flow through the vessel is estimated by integrating the 2-D flow velocity field and then is compared to hydrodynamic flow measurements to assess the overall experimental accuracy of the time-domain method. Flow rates from 50-500 ml/min have been estimated with an accuracy better than 10% under the idealized characteristics used in this study, which include straight circular thin-walled tubes, laminar axially-symmetric steady flow, and no intervening tissues.
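The core estimator, the lag that maximizes the cross-correlation of a pair of range-gated echoes, can be sketched as follows; converting the shift to an axial velocity (e.g. v = c·Δt / (2·T_prf) under a pulse-echo geometry) is stated here only for orientation and is an assumption of this sketch.

```python
import numpy as np

def time_shift(echo1, echo2, fs):
    """Time shift (s) between two sampled RF echoes, found as the lag that
    maximizes their cross-correlation; positive means echo2 lags echo1."""
    xc = np.correlate(echo2, echo1, mode='full')
    lag = int(np.argmax(xc)) - (len(echo1) - 1)
    return lag / fs
```

Sub-sample precision, as needed in practice, would require interpolating the correlation peak rather than taking the integer argmax.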
MPEG content summarization based on compressed domain feature analysis
NASA Astrophysics Data System (ADS)
Sugano, Masaru; Nakajima, Yasuyuki; Yanagihara, Hiromasa
2003-11-01
This paper addresses automatic summarization of MPEG audiovisual content in the compressed domain. By analyzing semantically important low-level and mid-level audiovisual features, our method universally summarizes MPEG-1/-2 content in the form of a digest or highlight. The former is a shortened version of the original, while the latter is an aggregation of important or interesting events. In our proposal, the incoming MPEG stream is first segmented into shots and the above features are derived from each shot. The features are then adaptively evaluated in an integrated manner, and finally the qualified shots are aggregated into a summary. Since all the processes are performed completely in the compressed domain, summarization is achieved at very low computational cost. The experimental results show that news highlights and sports highlights in TV baseball games can be successfully extracted according to simple shot transition models. As for digest extraction, subjective evaluation proves that meaningful shots are extracted from content without a priori knowledge, even if it contains multiple genres of programs. Our method also has the advantage of generating an MPEG-7 based description such as summary and audiovisual segments in the course of summarization.
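The final aggregation step admits a simple sketch: score each shot from its evaluated features, then greedily keep the highest-scoring shots within a duration budget and restore temporal order. The scoring itself, derived from the audiovisual features, is omitted here, and the greedy rule is an illustrative assumption rather than the paper's exact selection logic.

```python
def summarize(shot_scores, shot_durations, target_duration):
    """Greedy highlight selection: take the highest-scoring shots until the
    target duration is filled, then restore temporal order."""
    order = sorted(range(len(shot_scores)), key=lambda i: -shot_scores[i])
    chosen, total = [], 0.0
    for i in order:
        if total + shot_durations[i] <= target_duration:
            chosen.append(i)
            total += shot_durations[i]
    return sorted(chosen)   # summary shots in original temporal order
```

Re-sorting the chosen indices preserves narrative continuity, which matters for a digest even though the selection itself is score-driven.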
A customisable framework for the assessment of therapies in the solution of therapy decision tasks.
Manjarrés Riesco, A; Martínez Tomás, R; Mira Mira, J
2000-01-01
In current medical research, a growing interest can be observed in the definition of a global therapy-evaluation framework which integrates considerations such as patients' preferences and quality-of-life results. In this article, we propose the use of the research results in this domain as a source of knowledge in the design of support systems for therapy decision analysis, in particular with a view to application in oncology. We discuss the incorporation of these considerations in the definition of the therapy-assessment methods involved in the solution of a generic therapy decision task, described in the context of AI software development methodologies such as CommonKADS. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature. The assessment methods applied are based either on data obtained from statistics or on the specific idiosyncrasies of each patient, as identified from their responses to a suite of psychological tests. In the analysis of the therapy decision task we emphasise the importance, from a methodological perspective, of using a rigorous approach to the modelling of domain ontologies and domain-specific data. To this end, we make extensive use of UML, the semi-formal object-oriented analysis notation, to describe the domain level.
NASA Astrophysics Data System (ADS)
Alias, Maizam; Lashari, Tahira Anwar; Abidin Akasah, Zainal; Jahaya Kesot, Mohd.
2014-03-01
Learning in the cognitive domain is highly emphasised and has been widely investigated in engineering education. Less emphasis is placed on the affective dimension, although the role of affect has been supported by research. The lack of understanding of learning theories and how they may be translated into classroom application of teaching and learning is one factor that contributes to this situation. This paper proposes a working framework for integrating the affective dimension of learning into engineering education that is expected to promote better learning within the cognitive domain. Four major learning theories, namely behaviourism, cognitivism, socio-culturalism, and constructivism, were analysed, and the ways in which affect is postulated to influence cognition were identified. The affective domain constructs identified to be important are self-efficacy, attitude and locus of control. Based on the results of the analysis, a framework that integrates methodologies for achieving learning in the cognitive domain with the support of the affective dimension of learning is proposed. It is expected that this integrated approach can be used as a guideline by engineering educators in designing effective and sustainable instructional material that would produce effective engineers for future development.
Analog integrated circuits design for processing physiological signals.
Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting
2010-01-01
Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.
ParaExp Using Leapfrog as Integrator for High-Frequency Electromagnetic Simulations
NASA Astrophysics Data System (ADS)
Merkel, M.; Niyonzima, I.; Schöps, S.
2017-12-01
Recently, ParaExp was proposed for the time integration of linear hyperbolic problems. It splits the time interval of interest into subintervals and computes the solution on each subinterval in parallel. The overall solution is decomposed into a particular solution defined on each subinterval with zero initial conditions and a homogeneous solution propagated by the matrix exponential applied to the initial conditions. The efficiency of the method depends on fast approximations of this matrix exponential based on recent results from numerical linear algebra. This paper deals with the application of ParaExp in combination with Leapfrog to electromagnetic wave problems in time domain. Numerical tests are carried out for a simple toy problem and a realistic spiral inductor model discretized by the Finite Integration Technique.
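The decomposition can be sketched for a small linear system y' = Ay + f(t): each subinterval contributes a particular solution with zero initial condition (computable independently, hence in parallel), which is then propagated to the final time by the matrix exponential, together with the homogeneous solution from the initial condition. The RK4 integrator, dense `expm`, and the toy system in the test are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.linalg import expm

def rk4(A, f, v0, t0, t1, steps=200):
    """Classical RK4 for v' = A v + f(t), v(t0) = v0, on [t0, t1]."""
    h = (t1 - t0) / steps
    v, t = np.array(v0, dtype=float), t0
    rhs = lambda t, v: A @ v + f(t)
    for _ in range(steps):
        k1 = rhs(t, v)
        k2 = rhs(t + h / 2, v + h / 2 * k1)
        k3 = rhs(t + h / 2, v + h / 2 * k2)
        k4 = rhs(t + h, v + h * k3)
        v = v + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return v

def paraexp(A, f, y0, T, n_sub=4):
    """ParaExp sketch: per-subinterval particular solutions (zero initial
    condition; the loop iterations are independent, hence parallelizable)
    propagated to T by the matrix exponential, plus the homogeneous part."""
    ts = np.linspace(0.0, T, n_sub + 1)
    y = expm(A * T) @ y0                       # homogeneous solution
    for j in range(n_sub):
        v_end = rk4(A, f, np.zeros_like(y0, dtype=float), ts[j], ts[j + 1])
        y = y + expm(A * (T - ts[j + 1])) @ v_end
    return y
```

By superposition, the forcing on [t_{j-1}, t_j] contributes exactly exp(A(T - t_j))·v_j(t_j) to y(T), so the sum reproduces the serial solution; the method's efficiency rests on fast approximations of the exponential propagator.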
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, Jean-Luc, E-mail: jlvay@lbl.gov; Haber, Irving; Godfrey, Brendan B.
Pseudo-spectral electromagnetic solvers (i.e. representing the fields in Fourier space) have extraordinary precision. In particular, Haber et al. presented in 1973 a pseudo-spectral solver that integrates analytically the solution over a finite time step, under the usual assumption that the source is constant over that time step. Yet, pseudo-spectral solvers have not been widely used, due in part to the difficulty of efficient parallelization owing to the global communications associated with global FFTs on the entire computational domain. A method for the parallelization of electromagnetic pseudo-spectral solvers is proposed and tested on single electromagnetic pulses, and on Particle-In-Cell simulations of the wakefield formation in a laser plasma accelerator. The method takes advantage of the properties of the Discrete Fourier Transform, the linearity of Maxwell’s equations and the finite speed of light for limiting the communications of data within guard regions between neighboring computational domains. Although this requires a small approximation, test results show that no significant error is made on the test cases presented. The proposed method opens the way to solvers combining the favorable parallel scaling of standard finite-difference methods with the accuracy advantages of pseudo-spectral methods.
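The analytic-in-time integration that such pseudo-spectral solvers exploit can be shown on the scalar wave equation u_tt = c²u_xx: in Fourier space each mode is a harmonic oscillator whose update over a time step is exact for any step size. This scalar sketch stands in for the full Maxwell update and omits sources, parallel decomposition, and guard regions.

```python
import numpy as np

def spectral_wave_step(u, ut, dt, c, L):
    """Advance u_tt = c^2 u_xx on a periodic domain of length L by dt,
    integrating each Fourier mode analytically (exact for any dt)."""
    n = len(u)
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    w = c * np.abs(k)                           # per-mode angular frequency
    uh, uth = np.fft.fft(u), np.fft.fft(ut)
    w_safe = np.where(w == 0, 1.0, w)
    s = np.where(w == 0, dt, np.sin(w * dt) / w_safe)   # sin(w dt)/w, limit dt
    uh_new = uh * np.cos(w * dt) + uth * s
    uth_new = -uh * w * np.sin(w * dt) + uth * np.cos(w * dt)
    return np.fft.ifft(uh_new).real, np.fft.ifft(uth_new).real
```

Because the per-mode update is exact, there is no dispersion error and no CFL stability limit, which is the accuracy advantage the abstract refers to.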
Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter
2016-05-01
Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
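The peak-integration step can be sketched as a baseline-corrected numerical integral of one injection's differential heating power. The linear baseline between the window endpoints is a simplifying assumption of this sketch; NITPIC's actual automated shape analysis is considerably more sophisticated.

```python
import numpy as np

def integrate_injection(t, power, t_start, t_end):
    """Heat of one injection: baseline-corrected trapezoidal integral of the
    thermogram over [t_start, t_end] (linear baseline between endpoints)."""
    m = (t >= t_start) & (t <= t_end)
    tt, p = t[m], power[m]
    baseline = np.linspace(p[0], p[-1], len(p))   # linear baseline estimate
    y = p - baseline
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(tt)))
```

Repeating this per injection yields the isotherm (heat versus molar ratio) that is then passed to the global analysis.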
Ozhikandathil, J.; Packirisamy, M.
2012-01-01
Integration of nano-materials in optical microfluidic devices facilitates the realization of miniaturized analytical systems with enhanced sensing abilities for biological and chemical substances. In this work, a novel method of integration of gold nano-islands in a silica-on-silicon-polydimethylsiloxane microfluidic device is reported. The device works based on the nano-enhanced evanescence technique achieved by interacting the evanescent tail of propagating wave with the gold nano-islands integrated on the core of the waveguide resulting in the modification of the propagating UV-visible spectrum. The biosensing ability of the device is investigated by finite-difference time-domain simulation with a simplified model of the device. The performance of the proposed device is demonstrated for the detection of recombinant growth hormone based on antibody-antigen interaction. PMID:24106526
Bridging ultrahigh-Q devices and photonic circuits
NASA Astrophysics Data System (ADS)
Yang, Ki Youl; Oh, Dong Yoon; Lee, Seung Hoon; Yang, Qi-Fan; Yi, Xu; Shen, Boqiang; Wang, Heming; Vahala, Kerry
2018-05-01
Optical microresonators are essential to a broad range of technologies and scientific disciplines. However, many of their applications rely on discrete devices to attain challenging combinations of ultra-low-loss performance (ultrahigh Q) and resonator design requirements. This prevents access to scalable fabrication methods for photonic integration and lithographic feature control. Indeed, finding a microfabrication bridge that connects ultrahigh-Q device functions with photonic circuits is a priority of the microcavity field. Here, an integrated resonator having a record Q factor over 200 million is presented. Its ultra-low-loss and flexible cavity design brings performance to integrated systems that has been the exclusive domain of discrete silica and crystalline microcavity devices. Two distinctly different devices are demonstrated: soliton sources with electronic repetition rates and high-coherence/low-threshold Brillouin lasers. This multi-device capability and performance from a single integrated cavity platform represents a critical advance for future photonic circuits and systems.
Bridging the Resolution Gap in Structural Modeling of 3D Genome Organization
Marti-Renom, Marc A.; Mirny, Leonid A.
2011-01-01
Over the last decade, and especially after the advent of fluorescent in situ hybridization imaging and chromosome conformation capture methods, the availability of experimental data on genome three-dimensional organization has dramatically increased. We now have access to unprecedented details of how genomes organize within the interphase nucleus. Development of new computational approaches to leverage this data has already resulted in the first three-dimensional structures of genomic domains and genomes. Such approaches expand our knowledge of the chromatin folding principles, which has been classically studied using polymer physics and molecular simulations. Our outlook describes computational approaches for integrating experimental data with polymer physics, thereby bridging the resolution gap for structural determination of genomes and genomic domains. PMID:21779160
The unified acoustic and aerodynamic prediction theory of advanced propellers in the time domain
NASA Technical Reports Server (NTRS)
Farassat, F.
1984-01-01
This paper presents some numerical results for the noise of an advanced supersonic propeller based on a formulation published last year. This formulation was derived to overcome some of the practical numerical difficulties associated with other acoustic formulations. The approach is based on the Ffowcs Williams-Hawkings equation, and time domain analysis is used. To illustrate the method of solution, a model problem in three dimensions based on the Laplace equation is solved. A brief sketch of the derivation of the acoustic formula is then given. Another model problem is used to verify the validity of the acoustic formulation. A recent singular integral equation for aerodynamic applications derived from the acoustic formula is also presented here.
Enabling fast, stable and accurate peridynamic computations using multi-time-step integration
Lindsay, P.; Parks, M. L.; Prakash, A.
2016-04-13
Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.
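The subcycling idea above can be sketched on a toy two-subdomain system. The spring-mass model, stiffnesses, and substep count below are hypothetical illustrations, not the paper's peridynamic discretization: the stiff subdomain A is advanced with several small substeps per coarse step of subdomain B, with the interface state linearly interpolated across the coarse window.

```python
import numpy as np

def subcycled_step(xA, vA, xB, vB, dt, n_sub, kA=100.0, kB=1.0, kc=5.0):
    """Advance soft subdomain B by one coarse step dt and stiff subdomain A
    by n_sub substeps of dt/n_sub, linearly interpolating B's displacement
    across the coarse window for the coupling force."""
    xB0 = xB
    aB0 = -kB * xB0 + kc * (xA - xB0)
    xB_new = xB0 + dt * vB + 0.5 * dt**2 * aB0   # predict B over the window
    h = dt / n_sub
    for i in range(n_sub):
        xB_i = xB0 + (i + 0.5) / n_sub * (xB_new - xB0)  # interpolated interface state
        aA = -kA * xA + kc * (xB_i - xA)
        vA += h * aA                              # semi-implicit Euler in A
        xA += h * vA
    vB += dt * (-kB * xB0 + kc * (xA - xB0))      # B sees A's updated state
    return xA, vA, xB_new, vB

# stiff region A resolved with 8 substeps per coarse step of B
xA, vA, xB, vB = 1.0, 0.0, 0.0, 0.0
for _ in range(200):
    xA, vA, xB, vB = subcycled_step(xA, vA, xB, vB, dt=0.01, n_sub=8)
```

The small step keeps the stiff region stable and accurate, while the soft region pays only one (cheap) update per coarse step, mirroring the accuracy/efficiency trade described in the abstract.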
A new Watermarking System based on Discrete Cosine Transform (DCT) in color biometric images.
Dogan, Sengul; Tuncer, Turker; Avci, Engin; Gulten, Arif
2012-08-01
This paper proposes a watermarking system for biometric color images based on the Discrete Cosine Transform (DCT), used to protect the security and integrity of transmitted biometric color images. Watermarking is an important information-hiding technique for digital objects such as audio, video, color images, and gray-scale images, and its use has grown with developing technology in recent years. The DCT, which operates in the frequency domain, is one of the most common methods for hiding information in image files. In this study, the DCT method is used to embed watermark data into face images without corrupting their features.
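The frequency-domain embedding described above can be illustrated with a minimal sketch. The block size, coefficient position, and embedding strength below are made-up illustrative choices, not the paper's actual scheme: one watermark bit is written into a mid-frequency coefficient of an 8x8 block's 2-D DCT.

```python
import numpy as np

def dct_matrix(n=8):
    # orthonormal DCT-II basis: rows are cosine basis vectors
    C = np.cos(np.pi * np.outer(np.arange(n), 2 * np.arange(n) + 1) / (2 * n))
    C[0] *= np.sqrt(1.0 / n)
    C[1:] *= np.sqrt(2.0 / n)
    return C

def embed_bit(block, bit, pos=(3, 2), strength=20.0):
    """Hide one bit in a mid-frequency coefficient of the block's 2-D DCT."""
    C = dct_matrix(block.shape[0])
    D = C @ block @ C.T                        # forward 2-D DCT
    D[pos] = strength if bit else -strength    # the sign carries the bit
    return C.T @ D @ C                         # inverse 2-D DCT

def extract_bit(block, pos=(3, 2)):
    C = dct_matrix(block.shape[0])
    return 1 if (C @ block @ C.T)[pos] > 0 else 0

rng = np.random.default_rng(0)
face_block = rng.uniform(0.0, 255.0, (8, 8))   # stand-in for a face-image block
marked = embed_bit(face_block, 1)
```

Mid-frequency coefficients are a common choice because low-frequency changes are visible and high-frequency changes are fragile under compression.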
Modal identification of structures by a novel approach based on FDD-wavelet method
NASA Astrophysics Data System (ADS)
Tarinejad, Reza; Damadipour, Majid
2014-02-01
An important application of system identification in structural dynamics is the determination of natural frequencies, mode shapes and damping ratios during operation, which can then be used for calibrating numerical models. In this paper, two advanced Operational Modal Analysis (OMA) methods, Frequency Domain Decomposition (FDD) and the Continuous Wavelet Transform (CWT), are combined with a novel cyclic averaging of correlation functions (CACF) technique for the identification of dynamic properties. With this technique, the autocorrelation of the averaged correlation functions is used instead of the original signals. The integration of the FDD and CWT methods overcomes their individual deficiencies and exploits the unique capabilities of each. The FDD method accurately estimates the natural frequencies and mode shapes of structures in the frequency domain, while the CWT method operates in the time-frequency domain, decomposing a signal at different frequencies and determining the damping coefficients. In this paper, a new formulation applied to the wavelet transform of the averaged correlation function of an ambient response is proposed. This formulation enables accurate estimation of damping ratios from weak (ambient noise) or strong (earthquake) vibrations and from long or short duration records. For this purpose, the modified Morlet wavelet, which has two free parameters, is used. The optimum values of these two parameters are obtained by employing a technique which minimizes the entropy of the wavelet coefficient matrix. The capabilities of the novel FDD-Wavelet method in the system identification of various dynamic systems with regular or irregular distributions of mass and stiffness are illustrated. This combined approach is superior to classic methods and yields results that agree well with the exact solutions of the numerical models.
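The role of the wavelet stage in damping estimation can be sketched as follows, assuming a standard (not modified) Morlet wavelet and a synthetic single-mode free-decay response: on the wavelet ridge the log-amplitude decays linearly at rate zeta*wn, so a straight-line fit recovers the damping ratio.

```python
import numpy as np

fs = 200.0                                   # sampling rate [Hz]
t = np.arange(0.0, 10.0, 1.0 / fs)
wn, zeta = 2 * np.pi * 2.0, 0.02             # 2 Hz mode with 2% damping
wd = wn * np.sqrt(1.0 - zeta**2)
x = np.exp(-zeta * wn * t) * np.cos(wd * t)  # free-decay (impulse) response

w0 = 6.0                                     # Morlet centre frequency
s = w0 / wd                                  # scale tuned to the mode
tau = np.arange(-4.0 * s, 4.0 * s, 1.0 / fs)
psi = np.exp(1j * w0 * tau / s) * np.exp(-0.5 * (tau / s) ** 2) / np.sqrt(s)
W = np.convolve(x, np.conj(psi[::-1]), mode="same") / fs  # CWT at one scale

# on the ridge |W(t)| ~ exp(-zeta*wn*t): fit the log-amplitude slope
mask = (t > 1.0) & (t < 8.0)                 # keep clear of edge effects
slope = np.polyfit(t[mask], np.log(np.abs(W[mask])), 1)[0]
zeta_est = -slope / wn
```

In the paper's method the transform is applied to the averaged correlation function of an ambient response rather than to a clean free decay, and the two free wavelet parameters are tuned by entropy minimization; the linear-fit step is the same.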
High-Order Implicit-Explicit Multi-Block Time-stepping Method for Hyperbolic PDEs
NASA Technical Reports Server (NTRS)
Nielsen, Tanner B.; Carpenter, Mark H.; Fisher, Travis C.; Frankel, Steven H.
2014-01-01
This work seeks to explore and improve the current time-stepping schemes used in computational fluid dynamics (CFD) in order to reduce overall computational time. A high-order scheme has been developed using a combination of implicit and explicit (IMEX) time-stepping Runge-Kutta (RK) schemes which increases numerical stability with respect to the time step size, resulting in decreased computational time. The IMEX scheme alone does not yield the desired increase in numerical stability, but when used in conjunction with an overlapping partitioned (multi-block) domain, a significant increase in stability is observed. To show this, the Overlapping-Partition IMEX (OP IMEX) scheme is applied to both one-dimensional (1D) and two-dimensional (2D) problems, the nonlinear viscous Burgers equation and the 2D advection equation, respectively. The method uses two different summation by parts (SBP) derivative approximations, second-order and fourth-order accurate. The Dirichlet boundary conditions are imposed using the Simultaneous Approximation Term (SAT) penalty method. The 6-stage additive Runge-Kutta IMEX time integration schemes are fourth-order accurate in time. An increase in numerical stability 65 times greater than that of the fully explicit scheme is demonstrated to be achievable with the OP IMEX method applied to the 1D Burgers equation. Results from the 2D, purely convective, advection equation show stability increases on the order of 10 times that of the explicit scheme using the OP IMEX method. Also, the domain partitioning method in this work shows potential for breaking the computational domain into manageable sizes such that implicit solutions for full three-dimensional CFD simulations can be computed using direct solving methods rather than the standard iterative methods currently used.
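The benefit of implicit-explicit splitting can be sketched with a first-order IMEX Euler step on a stiff scalar model problem, a simplified stand-in for the fourth-order additive RK schemes above: the stiff linear term is treated implicitly and the remaining terms explicitly, so the step size is not limited by the stiff eigenvalue.

```python
import numpy as np

lam = -1000.0                        # stiff eigenvalue
g, dg = np.cos, lambda t: -np.sin(t) # forcing chosen so the exact solution is u = cos(t)

def imex_euler(u, t, dt):
    # u' = lam*(u - g(t)) + g'(t); treat lam*u implicitly, the rest explicitly:
    # (1 - dt*lam) * u_new = u + dt * (-lam*g(t+dt) + g'(t))
    rhs = u + dt * (-lam * g(t + dt) + dg(t))
    return rhs / (1.0 - dt * lam)

u, t, dt = 1.0, 0.0, 0.05            # dt far above the explicit limit 2/|lam|
while t < 5.0 - 1e-12:
    u = imex_euler(u, t, dt)
    t += dt
# u now tracks the exact solution cos(t) despite the large step
```

A fully explicit Euler step at this dt would amplify errors by |1 + dt*lam| = 49 per step and blow up immediately; the implicit treatment of the stiff term removes that restriction at the cost of one (here scalar) linear solve per step.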
CDAO-Store: Ontology-driven Data Integration for Phylogenetic Analysis
2011-01-01
Background The Comparative Data Analysis Ontology (CDAO) is an ontology developed, as part of the EvoInfo and EvoIO groups supported by the National Evolutionary Synthesis Center, to provide semantic descriptions of data and transformations commonly found in the domain of phylogenetic analysis. The core concepts of the ontology enable the description of phylogenetic trees and associated character data matrices. Results Using CDAO as the semantic back-end, we developed a triple-store, named CDAO-Store. CDAO-Store is an RDF-based store of phylogenetic data, including a complete import of TreeBASE. CDAO-Store provides a programmatic interface, in the form of web services, and a web-based front-end, to perform both user-defined as well as domain-specific queries; domain-specific queries include searches for nearest common ancestors and minimum spanning clades, and filtering of multiple trees in the store by size, author, taxa, tree identifier, algorithm, or method. In addition, CDAO-Store provides a visualization front-end, called CDAO-Explorer, which can be used to view both character data matrices and trees extracted from the CDAO-Store. CDAO-Store provides import capabilities, enabling the addition of new data to the triple-store; files in PHYLIP, MEGA, nexml, and NEXUS formats can be imported and their CDAO representations added to the triple-store. Conclusions CDAO-Store is made up of a versatile and integrated set of tools to support phylogenetic analysis. To the best of our knowledge, CDAO-Store is the first semantically-aware repository of phylogenetic data with domain-specific querying capabilities. The portal to CDAO-Store is available at http://www.cs.nmsu.edu/~cdaostore. PMID:21496247
CDAO-store: ontology-driven data integration for phylogenetic analysis.
Chisham, Brandon; Wright, Ben; Le, Trung; Son, Tran Cao; Pontelli, Enrico
2011-04-15
The Comparative Data Analysis Ontology (CDAO) is an ontology developed, as part of the EvoInfo and EvoIO groups supported by the National Evolutionary Synthesis Center, to provide semantic descriptions of data and transformations commonly found in the domain of phylogenetic analysis. The core concepts of the ontology enable the description of phylogenetic trees and associated character data matrices. Using CDAO as the semantic back-end, we developed a triple-store, named CDAO-Store. CDAO-Store is an RDF-based store of phylogenetic data, including a complete import of TreeBASE. CDAO-Store provides a programmatic interface, in the form of web services, and a web-based front-end, to perform both user-defined as well as domain-specific queries; domain-specific queries include searches for nearest common ancestors and minimum spanning clades, and filtering of multiple trees in the store by size, author, taxa, tree identifier, algorithm, or method. In addition, CDAO-Store provides a visualization front-end, called CDAO-Explorer, which can be used to view both character data matrices and trees extracted from the CDAO-Store. CDAO-Store provides import capabilities, enabling the addition of new data to the triple-store; files in PHYLIP, MEGA, nexml, and NEXUS formats can be imported and their CDAO representations added to the triple-store. CDAO-Store is made up of a versatile and integrated set of tools to support phylogenetic analysis. To the best of our knowledge, CDAO-Store is the first semantically-aware repository of phylogenetic data with domain-specific querying capabilities. The portal to CDAO-Store is available at http://www.cs.nmsu.edu/~cdaostore.
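One of the domain-specific queries named above, the nearest common ancestor of two taxa, can be sketched over a toy tree stored as a child-to-parent mapping (hypothetical data; CDAO-Store itself answers such queries over RDF triples via web services):

```python
# Toy phylogenetic tree as a child -> parent mapping (illustrative taxa).
parent = {
    "human": "primates", "chimp": "primates",
    "primates": "mammals", "mouse": "mammals",
    "mammals": "vertebrates", "zebrafish": "vertebrates",
}

def ancestors(taxon):
    """Chain from a taxon up to the root, inclusive."""
    chain = [taxon]
    while taxon in parent:
        taxon = parent[taxon]
        chain.append(taxon)
    return chain

def nearest_common_ancestor(a, b):
    seen = set(ancestors(a))
    for node in ancestors(b):      # first shared node walking up from b
        if node in seen:
            return node
    return None
```

The same walk-up-and-intersect logic underlies minimum spanning clade queries: the clade is the subtree rooted at the nearest common ancestor of the query taxa.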
Application of the Green's function method for 2- and 3-dimensional steady transonic flows
NASA Technical Reports Server (NTRS)
Tseng, K.
1984-01-01
A Time-Domain Green's function method for the nonlinear time-dependent three-dimensional aerodynamic potential equation is presented. Green's theorem is used to transform the partial differential equation into an integro-differential-delay equation. Finite-element and finite-difference methods are employed for the spatial and time discretizations to approximate the integral equation by a system of differential-delay equations. A solution may be obtained by integrating this nonlinear simultaneous system of equations in time. This paper discusses the application of the method to the Transonic Small Disturbance Equation, and numerical results for lifting and nonlifting airfoils and wings in steady flows are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frayce, D.; Khayat, R.E.; Derdouri, A.
The dual reciprocity boundary element method (DRBEM) is implemented to solve three-dimensional transient heat conduction problems in the presence of arbitrary sources, typically as these problems arise in materials processing. The DRBEM has a major advantage over conventional BEM, since it avoids the computation of volume integrals. These integrals stem from transient, nonlinear, and/or source terms. Thus there is no need to discretize the inner domain, since only a number of internal points are needed for the computation. The validity of the method is assessed upon comparison with results from benchmark problems where analytical solutions exist. There is generally good agreement. Comparison against finite element results is also favorable. Calculations are carried out in order to assess the influence of the number and location of internal nodes. The influence of the ratio of the numbers of internal to boundary nodes is also examined.
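The dual reciprocity idea that removes the volume integrals can be sketched in a few lines: the domain source term is collocated onto radial basis functions f_j = 1 + r_j, whose particular solutions are known analytically, so only boundary quantities remain to be integrated. The node positions and source function below are illustrative assumptions:

```python
import numpy as np

# boundary + internal collocation nodes of a 2-D problem (illustrative)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                [1.0, 1.0], [0.5, 0.5]])

def rbf_coefficients(pts, source):
    """Collocate the source b(x) onto the classic DRBEM basis f_j = 1 + r_j:
    solve F @ alpha = b, where F[i, j] = 1 + |x_i - x_j|."""
    r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    F = 1.0 + r
    b = np.array([source(p) for p in pts])
    return np.linalg.solve(F, b), F

alpha, F = rbf_coefficients(pts, lambda p: p[0] + 2.0 * p[1])
# F @ alpha reproduces the source exactly at every collocation node
```

In the full method each f_j is paired with a known particular solution of the Laplace operator, so the collocated source converts directly into extra boundary integrals, which is why no inner-domain mesh is needed.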
Jackson, Rebecca D; Best, Thomas M; Borlawsky, Tara B; Lai, Albert M; James, Stephen; Gurcan, Metin N
2012-01-01
The conduct of clinical and translational research regularly involves the use of a variety of heterogeneous and large-scale data resources. Scalable methods for the integrative analysis of such resources, particularly when attempting to leverage computable domain knowledge in order to generate actionable hypotheses in a high-throughput manner, remain an open area of research. In this report, we describe both a generalizable design pattern for such integrative knowledge-anchored hypothesis discovery operations and our experience in applying that design pattern in the experimental context of a set of driving research questions related to the publicly available Osteoarthritis Initiative data repository. We believe that this ‘test bed’ project and the lessons learned during its execution are both generalizable and representative of common clinical and translational research paradigms. PMID:22647689
Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Young, Steven D.
2005-01-01
In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
Babb, Jessica A.; Deligiannidis, Kristina M.; Murgatroyd, Christopher A.
2014-01-01
Exposure to high levels of early life stress has been identified as a potent risk factor for neurodevelopmental delays in infants, behavioral problems and autism in children, but also for several psychiatric illnesses in adulthood, such as depression, anxiety, autism, and posttraumatic stress disorder. Despite having robust adverse effects on both mother and infant, the pathophysiology of peripartum depression and anxiety are poorly understood. The objective of this review is to highlight the advantages of using an integrated approach addressing several behavioral domains in both animal and clinical studies of peripartum depression and anxiety. It is postulated that a greater focus on integrated cross domain studies will lead to advances in treatments and preventative measures for several disorders associated with peripartum depression and anxiety. PMID:24709228
NASA Astrophysics Data System (ADS)
Nehar, K. C.; Hachi, B. E.; Cazes, F.; Haboussi, M.
2017-12-01
The aim of the present work is to investigate the numerical modeling of interfacial cracks that may appear at the interface between two isotropic elastic materials. The extended finite element method is employed to analyze brittle and bi-material interfacial fatigue crack growth by computing the mixed mode stress intensity factors (SIF). Three different approaches are introduced to compute the SIFs. In the first one, the mixed mode SIF is deduced from the computation of the contour integral as per the classical J-integral method, whereas a displacement method is used to evaluate the SIF by using either one or two displacement jumps located along the crack path in the second and third approaches. The displacement jump method is rather classical for mono-materials but, to our knowledge, has not previously been used for a bi-material. Hence, the use of displacement jumps for characterizing bi-material cracks constitutes the main contribution of the present study. Several benchmark tests, including parametric studies, are performed to show the effectiveness of these computational methodologies for the SIF, considering static and fatigue problems of bi-material structures. It is found that results based on the displacement jump methods are in very good agreement with exact solutions, as are those of the J-integral method, but with a larger domain of applicability and better numerical efficiency (less time consuming and less prone to spurious boundary effects).
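The one-point displacement-jump evaluation for a homogeneous (mono-material) crack can be sketched from the leading square-root term of the crack-opening displacement. Plane strain is assumed, the elastic constants and sampling distance are illustrative, and the bi-material case with its oscillatory singularity is deliberately not covered:

```python
import numpy as np

E, nu = 200e9, 0.3                    # steel-like elastic constants (illustrative)
Ep = E / (1.0 - nu**2)                # plane-strain effective modulus E'

def K_I_from_jump(du_y, r):
    """Mode-I SIF from the crack-opening displacement jump du_y sampled a
    distance r behind the tip, using only the square-root singular term:
        du_y = (8*K_I/E') * sqrt(r/(2*pi))   =>   solve for K_I."""
    return Ep * du_y / 8.0 * np.sqrt(2.0 * np.pi / r)

# round-trip check: generate a jump from a known K_I and recover it
K_exact = 30e6                        # 30 MPa*sqrt(m)
r = 1e-3                              # sampling point 1 mm behind the tip
du = 8.0 * K_exact / Ep * np.sqrt(r / (2.0 * np.pi))
K_est = K_I_from_jump(du, r)
```

The two-jump variant in the paper evaluates the jump at two points, which cancels the leading contribution of non-singular terms; the one-point version above shows the core relation only.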
An efficient linear-scaling CCSD(T) method based on local natural orbitals.
Rolik, Zoltán; Szegedy, Lóránt; Ladjánszki, István; Ladóczki, Bence; Kállay, Mihály
2013-09-07
An improved version of our general-order local coupled-cluster (CC) approach [Z. Rolik and M. Kállay, J. Chem. Phys. 135, 104111 (2011)] and its efficient implementation at the CC singles and doubles with perturbative triples [CCSD(T)] level is presented. The method combines the cluster-in-molecule approach of Li and co-workers [J. Chem. Phys. 131, 114109 (2009)] with frozen natural orbital (NO) techniques. To break down the unfavorable fifth-power scaling of our original approach, a two-level domain construction algorithm has been developed. First, an extended domain of localized molecular orbitals (LMOs) is assembled based on the spatial distance of the orbitals. The necessary integrals are evaluated and transformed in these domains invoking the density fitting approximation. In the second step, for each occupied LMO of the extended domain a local subspace of occupied and virtual orbitals is constructed including approximate second-order Møller-Plesset NOs. The CC equations are solved and the perturbative corrections are calculated in the local subspace for each occupied LMO using a highly-efficient CCSD(T) code, which was optimized for the typical sizes of the local subspaces. The total correlation energy is evaluated as the sum of the individual contributions. The computation time of our approach scales linearly with the system size, while its memory and disk space requirements are independent thereof. Test calculations demonstrate that currently our method is one of the most efficient local CCSD(T) approaches and can be routinely applied to molecules of up to 100 atoms with reasonable basis sets.
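The first step of the two-level domain construction, assembling an extended domain for each localized orbital from spatial distance, can be sketched with a distance cutoff over orbital centres. The centres and cutoff below are made-up illustrative values, not the paper's defaults:

```python
import numpy as np

# hypothetical LMO centres in angstroms: three clustered, two far away
centres = np.array([[0.0, 0.0, 0.0],
                    [1.2, 0.0, 0.0],
                    [1.5, 0.8, 0.0],
                    [8.0, 0.0, 0.0],
                    [8.5, 0.4, 0.0]])

def extended_domains(centres, cutoff=3.0):
    """For each orbital, collect the indices of all orbitals whose centres
    lie within the distance cutoff (each orbital includes itself)."""
    diff = centres[:, None, :] - centres[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    return [np.flatnonzero(row <= cutoff).tolist() for row in dist]

domains = extended_domains(centres)
# orbitals 0-2 share one extended domain; 3 and 4 form their own
```

Because each domain's size is set by the cutoff rather than the molecule, the per-orbital work stays constant and the total cost grows linearly with the number of orbitals, which is the source of the linear scaling claimed in the abstract.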
Crystal structure of the Rous sarcoma virus intasome
Yin, Zhiqi; Shi, Ke; Banerjee, Surajit; ...
2016-02-17
Integration of the reverse-transcribed viral DNA into the host genome is an essential step in the life cycle of retroviruses. Retrovirus integrase catalyses insertions of both ends of the linear viral DNA into a host chromosome. Integrases from HIV-1 and closely related retroviruses share a three-domain organization, consisting of a catalytic core domain flanked by amino- and carboxy-terminal domains essential for the concerted integration reaction. Although structures of the tetrameric integrase–DNA complexes have been reported for integrase from prototype foamy virus, featuring an additional DNA-binding domain and longer interdomain linkers, the architecture of a canonical three-domain integrase bound to DNA remained elusive. In this paper, we report a crystal structure of the three-domain integrase from Rous sarcoma virus in complex with viral and target DNAs. The structure shows an octameric assembly of integrase, in which a pair of integrase dimers engage viral DNA ends for catalysis while another pair of non-catalytic integrase dimers bridge between the two viral DNA molecules and help capture target DNA. The individual domains of the eight integrase molecules play varying roles to hold the complex together, making an extensive network of protein–DNA and protein–protein contacts that show both conserved and distinct features compared with those observed for prototype foamy virus integrase. Finally, our work highlights the diversity of retrovirus intasome assembly and provides insights into the mechanisms of integration by HIV-1 and related retroviruses.
Ground Operations Autonomous Control and Integrated Health Management
NASA Technical Reports Server (NTRS)
Daniels, James
2014-01-01
The Ground Operations Autonomous Control and Integrated Health Management system plays a key role for future ground operations at NASA. The software integrated into this system is G2 2011 from Gensym. The purpose of this report is to describe the Ground Operations Autonomous Control and Integrated Health Management system, built with the G2 Gensym software and the G2 NASA toolkit for Integrated System Health Management (ISHM), which is a Computer Software Configuration Item (CSCI). The rationale for the use of the G2 platform is to develop a modular capability for ISHM and autonomous control (AC). Toolkit modules include knowledge bases that are generic and can be applied in any application domain module, maximizing reusability, maintainability, systematic evolution, portability, and scalability. Engine modules are generic, while application modules represent the domain model of a specific application. Furthermore, the NASA toolkit, developed since 2006 as a set of modules, makes it possible to create application domain models quickly, using pre-defined objects that include sensor and component libraries for typical fluid, electrical, and mechanical systems.
Molecular origin of the binding of WWOX tumor suppressor to ErbB4 receptor tyrosine kinase.
Schuchardt, Brett J; Bhat, Vikas; Mikles, David C; McDonald, Caleb B; Sudol, Marius; Farooq, Amjad
2013-12-23
The ability of WWOX tumor suppressor to physically associate with the intracellular domain (ICD) of ErbB4 receptor tyrosine kinase is believed to play a central role in downregulating the transcriptional function of the latter. Herein, using various biophysical methods, we show that while the WW1 domain of WWOX binds to PPXY motifs located within the ICD of ErbB4 in a physiologically relevant manner, the WW2 domain does not. Importantly, while the WW1 domain absolutely requires the integrity of the PPXY consensus sequence, nonconsensus residues within and flanking this motif do not appear to be critical for binding. This strongly suggests that the WW1 domain of WWOX is rather promiscuous toward its cellular partners. We also provide evidence that the lack of binding of the WW2 domain of WWOX to PPXY motifs is due to the replacement of a signature tryptophan, lining the hydrophobic ligand binding groove, with tyrosine (Y85). Consistent with this notion, the Y85W substitution within the WW2 domain exquisitely restores its binding to PPXY motifs in a manner akin to the binding of the WW1 domain of WWOX. Of particular significance is the observation that the WW2 domain augments the binding of the WW1 domain to ErbB4, implying that the former serves as a chaperone within the context of the WW1-WW2 tandem module of WWOX in agreement with our findings reported previously. Altogether, our study sheds new light on the molecular basis of an important WW-ligand interaction involved in mediating a plethora of cellular processes.
Molecular Origin of the Binding of WWOX Tumor Suppressor to ErbB4 Receptor Tyrosine Kinase
Schuchardt, Brett J.; Bhat, Vikas; Mikles, David C.; McDonald, Caleb B.; Sudol, Marius; Farooq, Amjad
2014-01-01
The ability of WWOX tumor suppressor to physically associate with the intracellular domain (ICD) of ErbB4 receptor tyrosine kinase is believed to play a central role in down-regulating the transcriptional function of the latter. Herein, using various biophysical methods, we show that while the WW1 domain of WWOX binds to PPXY motifs located within the ICD of ErbB4 in a physiologically-relevant manner, the WW2 domain does not. Importantly, while the WW1 domain absolutely requires the integrity of the PPXY consensus sequence, non-consensus residues within and flanking this motif do not appear to be critical for binding. This strongly suggests that the WW1 domain of WWOX is rather promiscuous toward its cellular partners. We also provide evidence that the lack of binding of WW2 domain of WWOX to PPXY motifs is due to the replacement of a signature tryptophan, lining the hydrophobic ligand binding groove, with tyrosine (Y85). Consistent with this notion, the Y85W substitution within the WW2 domain exquisitely restores its binding to PPXY motifs in a manner akin to the binding of WW1 domain of WWOX. Of particular significance is the observation that WW2 domain augments the binding of WW1 domain to ErbB4, implying that the former serves as a chaperone within the context of the WW1–WW2 tandem module of WWOX in agreement with our findings reported previously. Taken together, our study sheds new light on the molecular basis of an important WW-ligand interaction involved in mediating a plethora of cellular processes. PMID:24308844
Panigrahi, Priyabrata; Jere, Abhay; Anamika, Krishanpal
2018-01-01
Gene fusion is a chromosomal rearrangement event which plays a significant role in cancer due to the oncogenic potential of the chimeric proteins generated through fusions. Many databases are available in the public domain which provide detailed information about known gene fusion events and their functional roles. Existing gene fusion detection tools, based on analysis of transcriptomics data, usually report a large number of fusion genes as potential candidates, which could be known, novel, or false positives, and manual annotation of these putative genes is time-consuming. We have developed a web platform, FusionHub, which acts as an integrated search engine interfacing various fusion gene databases and simplifies large-scale annotation of fusion genes in a seamless way. In addition, FusionHub provides three ways of visualizing fusion events: a circular view, a domain architecture view, and a network view. Design of potential siRNA molecules through an ensemble method is another utility integrated into FusionHub that could aid in siRNA-based targeted therapy. FusionHub is freely available at https://fusionhub.persistent.co.in.
Ling, Shenglong; Wang, Wei; Yu, Lu; Peng, Junhui; Cai, Xiaoying; Xiong, Ying; Hayati, Zahra; Zhang, Longhua; Zhang, Zhiyong; Song, Likai; Tian, Changlin
2016-01-01
Electron paramagnetic resonance (EPR)-based hybrid experimental and computational approaches were applied to determine the structure of a full-length E. coli integral membrane sulfurtransferase, dimeric YgaP, and its structural and dynamic changes upon ligand binding. The solution NMR structures of the YgaP transmembrane domain (TMD) and cytosolic catalytic rhodanese domain were reported recently, but the tertiary fold of full-length YgaP was not yet available. Here, systematic site-specific EPR analysis defined a helix-loop-helix secondary structure of the YgaP-TMD monomers using mobility, accessibility and membrane immersion measurements. The tertiary folds of dimeric YgaP-TMD and full-length YgaP in detergent micelles were determined through inter- and intra-monomer distance mapping and rigid-body computation. Further EPR analysis demonstrated the tight packing of the two YgaP second transmembrane helices upon binding of the catalytic product SCN−, which provides insight into the thiocyanate exportation mechanism of YgaP in the E. coli membrane. PMID:26817826
NASA Astrophysics Data System (ADS)
Zimmermann, Bernhard B.; Deng, Bin; Singh, Bhawana; Martino, Mark; Selb, Juliette; Fang, Qianqian; Sajjadi, Amir Y.; Cormier, Jayne; Moore, Richard H.; Kopans, Daniel B.; Boas, David A.; Saksena, Mansi A.; Carp, Stefan A.
2017-04-01
Diffuse optical tomography (DOT) is emerging as a noninvasive functional imaging method for breast cancer diagnosis and neoadjuvant chemotherapy monitoring. In particular, the multimodal approach of combining DOT with x-ray digital breast tomosynthesis (DBT) is especially synergistic as DBT prior information can be used to enhance the DOT reconstruction. DOT, in turn, provides a functional information overlay onto the mammographic images, increasing sensitivity and specificity to cancer pathology. We describe a dynamic DOT apparatus designed for tight integration with commercial DBT scanners and providing a fast (up to 1 Hz) image acquisition rate to enable tracking hemodynamic changes induced by the mammographic breast compression. The system integrates 96 continuous-wave and 24 frequency-domain source locations as well as 32 continuous wave and 20 frequency-domain detection locations into low-profile plastic plates that can easily mate to the DBT compression paddle and x-ray detector cover, respectively. We demonstrate system performance using static and dynamic tissue-like phantoms as well as in vivo images acquired from the pool of patients recalled for breast biopsies at the Massachusetts General Hospital Breast Imaging Division.
Advancing the expectancy concept via the interplay between theory and research.
Del Boca, Frances K; Darkes, Jack; Goldman, Mark S; Smith, Gregory T
2002-06-01
Four papers from a 2001 Research Society on Alcoholism symposium on expectancy theory and research are summarized. The symposium contributors describe recent advances in expectancy theory and discuss their implications for assessment and for understanding the processes of development and change in the behavioral domain of alcohol use. First, findings are integrated across the diverse domains in which the expectancy concept has been applied. Second, the implications of expectancy theory for the measurement of expectancy structure and process are examined. Third, research and theory regarding alcohol expectancy development and change are presented, with an emphasis on the role of expectancies as mediators of known antecedents of drinking. Finally, an experimental procedure for investigating the causal role of expectancies is described, together with its implications for theory testing and prevention or intervention programming. Collectively, the symposium contributions demonstrate the utility of an integrated expectancy theory for the generation of innovative research operations and new insights regarding behavior development and change. Consistent with the notion of consilience, expectancy theory has demonstrated a convergence of findings across different levels of analysis, as well as across different operations, methods, and research designs.
ERIC Educational Resources Information Center
Cote, Heloise; Simard, Denis
2008-01-01
Since 1992, Quebec's Ministry of Education and Ministry of Culture and Communications have been creating programs designed to integrate a cultural dimension into schools--a process requiring partnerships between teachers and professionals in the cultural domain. This domain comprises the objects and practices pertaining to the realm of arts and…
Using component technologies for web based wavelet enhanced mammographic image visualization.
Sakellaropoulos, P; Costaridou, L; Panayiotakis, G
2000-01-01
The poor contrast detectability of mammography can be dealt with by domain specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access the tool was redesigned by exploring component technologies, enabling the integration of stand alone domain specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real time wavelet based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
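The real-time wavelet processing mentioned above can be sketched with a single-level 2-D Haar transform and soft thresholding of the detail bands. The tool uses a fast general wavelet implementation; Haar and the threshold value are simplifying assumptions here to keep the sketch short:

```python
import numpy as np

def haar2_forward(img):
    """One level of the 2-D Haar transform (rows, then columns)."""
    a = (img[0::2, :] + img[1::2, :]) / np.sqrt(2)   # row averages
    d = (img[0::2, :] - img[1::2, :]) / np.sqrt(2)   # row details
    ll = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    lh = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    hl = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    hh = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def haar2_inverse(ll, lh, hl, hh):
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = (ll + lh) / np.sqrt(2), (ll - lh) / np.sqrt(2)
    d[:, 0::2], d[:, 1::2] = (hl + hh) / np.sqrt(2), (hl - hh) / np.sqrt(2)
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2, :], img[1::2, :] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return img

def soft(x, thr):
    # soft thresholding: shrink coefficients toward zero
    return np.sign(x) * np.maximum(np.abs(x) - thr, 0.0)

def denoise(img, thr=10.0):
    """Threshold only the detail bands; the approximation band is kept."""
    ll, lh, hl, hh = haar2_forward(img)
    return haar2_inverse(ll, soft(lh, thr), soft(hl, thr), soft(hh, thr))

rng = np.random.default_rng(1)
noisy = rng.uniform(0.0, 255.0, (16, 16))   # stand-in for a mammogram ROI
smoothed = denoise(noisy, thr=10.0)
```

Contrast enhancement works the same way with the opposite operation: amplifying rather than shrinking selected detail coefficients before the inverse transform.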
NASA Astrophysics Data System (ADS)
Zheng, Chang-Jun; Gao, Hai-Feng; Du, Lei; Chen, Hai-Bo; Zhang, Chuanzeng
2016-01-01
An accurate numerical solver is developed in this paper for eigenproblems governed by the Helmholtz equation and formulated through the boundary element method. A contour integral method is used to convert the nonlinear eigenproblem into an ordinary eigenproblem, so that eigenvalues can be extracted accurately by solving a set of standard boundary element systems of equations. In order to accelerate the solution procedure, the parameters affecting the accuracy and efficiency of the method are studied and two contour paths are compared. Moreover, a wideband fast multipole method is implemented with a block IDR(s) solver to reduce the overall solution cost of the boundary element systems of equations with multiple right-hand sides. The Burton-Miller formulation is employed to identify the fictitious eigenfrequencies of the interior acoustic problems with multiply connected domains. The actual effect of the Burton-Miller formulation on tackling the fictitious eigenfrequency problem is investigated and the optimal choice of the coupling parameter as α = i/k is confirmed through exterior sphere examples. Furthermore, the numerical eigenvalues obtained by the developed method are compared with the results obtained by the finite element method to show the accuracy and efficiency of the developed method.
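The abstract does not name the specific contour integral algorithm used; a common choice for converting a nonlinear eigenproblem T(z)v = 0 into an ordinary one is Beyn's method, sketched here for a generic matrix-valued function T(z) and a circular contour. All parameter names are ours, and the dense linear solves stand in for the boundary element systems.

```python
import numpy as np

def beyn_contour_eigs(T, n, center, radius, l=4, N=64, tol=1e-8):
    """Beyn's contour integral method: eigenvalues of T(z) v = 0 lying
    inside the circle |z - center| = radius. l is the number of probing
    columns (must be >= the number of enclosed eigenvalues)."""
    rng = np.random.default_rng(0)
    V = rng.standard_normal((n, l))
    A0 = np.zeros((n, l), dtype=complex)
    A1 = np.zeros((n, l), dtype=complex)
    for j in range(N):                      # trapezoidal rule on the circle
        z = center + radius * np.exp(2j * np.pi * j / N)
        X = np.linalg.solve(T(z), V)        # T(z)^{-1} V, one solve per node
        w = (z - center) / N                # quadrature weight times dz/(2*pi*i)
        A0 += w * X
        A1 += w * z * X
    U, s, Wh = np.linalg.svd(A0, full_matrices=False)
    k = int(np.sum(s > tol * s[0]))         # numerical rank = eigenvalue count
    B = (U[:, :k].conj().T @ A1 @ Wh[:k].conj().T) / s[:k]
    return np.linalg.eigvals(B)             # eigenvalues inside the contour
```

For a linear pencil T(z) = A - zI the quadrature is spectrally accurate, so a few dozen contour nodes already recover the enclosed eigenvalues to high precision.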
The electronic patient record: a strategic planning framework.
Gordon, D B; Marafioti, S; Carter, M; Kunov, H; Dolan, A
1995-01-01
Sunnybrook Health Science Center (Sunnybrook) is a multifacility academic teaching center. In May 1994, Sunnybrook struck an electronic patient record taskforce to develop a strategic plan for the implementation of a comprehensive, facility-wide electronic patient record (EPR). The taskforce sought to create a conceptual framework which provides context and integrates decision-making related to the comprehensive electronic patient record. The EPR is much broader in scope than the traditional paper-based record. It is not restricted to simply reporting individual patient data. By the Institute of Medicine's definition, the electronic patient record resides in a system specifically designed to support users through availability of complete and accurate data, practitioner reminders and alerts, clinical decision support systems, links to bodies of medical knowledge, and other aids [1]. It is a comprehensive resource for patient care. The taskforce proposed a three-domain model for determining how the EPR affects Sunnybrook. The EPR enables Sunnybrook to have a high-performance team structure (domain 1), to function as an integrated organization (domain 2), and to reach out and develop new relationships with external organizations to become an extended enterprise (domain 3) [2]. Domain 1: Sunnybrook's high-performance teams, or 'patient service units' (PSUs), are decentralized, autonomous operating units that provide care to patients grouped by 'like' diagnosis and resource needs. The EPR must provide functions and applications which promote patient-focused care, such as cross-functional charting and care maps, group scheduling, clinical email, and a range of enabling technologies for multiskilled workers. Domain 2: In the integrated organization domain, the EPR should facilitate closer linkages between the arrangement of PSUs into clinical teams and with other facilities within the center in order to provide a longitudinal record that covers a continuum of care.
Domain 3: In the inter-enterprise domain, the EPR must allow for patient information to be exchanged with external providers including referring doctors, laboratories, and other hospitals via community health information networks (CHINs). Sunnybrook will prioritize the development of first domain functionality within the corporate constraints imposed by the integrated organization domain. Inter-enterprise computing will be less of a priority until Sunnybrook has developed a critical mass of the electronic patient record internally. The three domain description is a useful model for describing the relationship between the electronic patient record enabling technologies and the Sunnybrook organizational structures. The taskforce has used this model to determine EPR development guidelines and implementation priorities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidler, Rolf, E-mail: rsidler@gmail.com; Carcione, José M.; Holliger, Klaus
We present a novel numerical approach for the comprehensive, flexible, and accurate simulation of poro-elastic wave propagation in 2D polar coordinates. An important application of this method and its extensions will be the modeling of complex seismic wave phenomena in fluid-filled boreholes, which represents a major, and as of yet largely unresolved, computational problem in exploration geophysics. In view of this, we consider a numerical mesh, which can be arbitrarily heterogeneous, consisting of two or more concentric rings representing the fluid in the center and the surrounding porous medium. The spatial discretization is based on a Chebyshev expansion in the radial direction and a Fourier expansion in the azimuthal direction, with a Runge–Kutta integration scheme for the time evolution. A domain decomposition method is used to match the fluid–solid boundary conditions based on the method of characteristics. This multi-domain approach allows for significant reductions of the number of grid points in the azimuthal direction for the inner grid domain and thus for corresponding increases of the time step and enhancements of computational efficiency. The viability and accuracy of the proposed method have been rigorously tested and verified through comparisons with analytical solutions as well as with the results obtained with a corresponding, previously published, and independently benchmarked solution for 2D Cartesian coordinates. Finally, the proposed numerical solution also satisfies the reciprocity theorem, which indicates that the inherent singularity associated with the origin of the polar coordinate system is adequately handled.
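The Chebyshev discretization in the radial direction can be sketched with the standard Chebyshev differentiation matrix on Gauss-Lobatto nodes (Trefethen's classic construction). This is a generic spectral building block, not the authors' poro-elastic solver.

```python
import numpy as np

def cheb(n):
    """Chebyshev differentiation matrix D and Gauss-Lobatto nodes x on
    [-1, 1], following Trefethen's construction: (D @ f)(x_i) approximates
    f'(x_i) with spectral accuracy for smooth f."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))   # fix the diagonal so constants map to zero
    return D, x
```

Differentiation of any polynomial up to degree n is exact to rounding, which is what makes a handful of radial nodes sufficient in such schemes.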
Garcia Lopez, Sebastian; Kim, Philip M.
2014-01-01
Advances in sequencing have led to a rapid accumulation of mutations, some of which are associated with diseases. However, to draw mechanistic conclusions, a biochemical understanding of these mutations is necessary. For coding mutations, accurate prediction of significant changes in either the stability of proteins or their affinity to their binding partners is required. Traditional methods have used semi-empirical force fields, while newer methods employ machine learning of sequence and structural features. Here, we show how combining both of these approaches leads to a marked boost in accuracy. We introduce ELASPIC, a novel ensemble machine learning approach that is able to predict stability effects upon mutation in both domain cores and domain-domain interfaces. We combine semi-empirical energy terms, sequence conservation, and a wide variety of molecular details with a Stochastic Gradient Boosting of Decision Trees (SGB-DT) algorithm. The accuracy of our predictions surpasses existing methods by a considerable margin, achieving correlation coefficients of 0.77 for stability and 0.75 for affinity predictions. Notably, we integrated homology modeling to enable proteome-wide prediction and show that accurate prediction on modeled structures is possible. Lastly, ELASPIC showed significant differences between various types of disease-associated mutations, as well as between disease and common neutral mutations. Unlike pure sequence-based prediction methods that try to predict phenotypic effects of mutations, our predictions unravel the molecular details governing protein instability, and help us better understand the molecular causes of diseases. PMID:25243403
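A minimal sketch of stochastic gradient boosting with depth-1 trees (stumps) under squared loss conveys the flavor of the SGB-DT learner: each round fits a stump to the residuals on a random subsample and adds a shrunken copy to the ensemble. The real ELASPIC model, its features, and its hyperparameters are of course far richer.

```python
import numpy as np

def fit_sgb_stumps(X, y, n_rounds=50, lr=0.1, subsample=0.7, seed=0):
    """Stochastic gradient boosting of depth-1 regression trees (stumps)
    under squared loss; a toy stand-in for the SGB-DT algorithm."""
    rng = np.random.default_rng(seed)
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        idx = rng.choice(len(y), int(subsample * len(y)), replace=False)
        resid = y[idx] - pred[idx]          # gradient of squared loss
        best = None
        for f in range(X.shape[1]):
            for t in np.unique(X[idx, f])[:-1]:   # drop max: degenerate split
                left = X[idx, f] <= t
                lv, rv = resid[left].mean(), resid[~left].mean()
                sse = ((resid[left] - lv) ** 2).sum() + ((resid[~left] - rv) ** 2).sum()
                if best is None or sse < best[0]:
                    best = (sse, f, t, lv, rv)
        _, f, t, lv, rv = best
        stumps.append((f, t, lr * lv, lr * rv))   # shrunken stump
        pred = pred + np.where(X[:, f] <= t, lr * lv, lr * rv)
    return y.mean(), stumps

def predict_sgb(base, stumps, X):
    out = np.full(len(X), base)
    for f, t, lv, rv in stumps:
        out = out + np.where(X[:, f] <= t, lv, rv)
    return out
```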
A taxonomy of adolescent health: development of the adolescent health profile-types.
Riley, A W; Green, B F; Forrest, C B; Starfield, B; Kang, M; Ensminger, M E
1998-08-01
The aim of this study was to develop a taxonomy of health profile-types that describe adolescents' patterns of health as self-reported on a health status questionnaire. The intent was to be able to assign individuals to mutually exclusive and exhaustive groups that characterize the important aspects of their health and need for health services. Cluster analytic empirical methods and clinically based conceptual methods were used to identify patterns of health in samples of adolescents from schools and from clinics that serve adolescents with chronic conditions and acute illnesses. Individuals with similar patterns of scores across multiple domains were assigned to the same profile-type. Results from the empirical and conceptually based methods were integrated to produce a practical system for assigning youths to profile-types. Four domains of health (Satisfaction, Discomfort, Risks and Resilience) were used to group individuals into 13 distinct profile-types. The profile-types were characterized primarily by the number of domains in which health is poor, identifying the unique combinations of problems that characterize different subgroups of adolescents. This method of reporting the information available on health status surveys is potentially a more informative way of identifying and classifying the health needs of subgroups in the population than is available from global scores or multiple scale scores. The reliability and validity of this taxonomy of health profile-types for the purposes of planning and evaluating health services must be demonstrated. That is the purpose of the accompanying study.
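The cluster-analytic step can be sketched with a plain k-means over per-domain score vectors: individuals with similar profiles across the four domains end up in the same cluster. This is a generic illustration, not the authors' actual procedure, and the deterministic spread-out initialisation is our simplification.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Plain k-means over multi-domain score profiles: assign each person
    to the nearest centroid, then update centroids until convergence."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

With well-separated score patterns (e.g. "poor in no domain" vs "poor in all domains"), each group lands in its own cluster.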
Generalized vector calculus on convex domain
NASA Astrophysics Data System (ADS)
Agrawal, Om P.; Xu, Yufeng
2015-06-01
In this paper, we apply recently proposed generalized integral and differential operators to develop generalized vector calculus and generalized variational calculus for problems defined over a convex domain. In particular, we present some generalization of Green's and Gauss divergence theorems involving some new operators, and apply these theorems to generalized variational calculus. For fractional power kernels, the formulation leads to fractional vector calculus and fractional variational calculus for problems defined over a convex domain. In special cases, when certain parameters take integer values, we obtain formulations for integer order problems. Two examples are presented to demonstrate applications of the generalized variational calculus which utilize the generalized vector calculus developed in the paper. The first example leads to a generalized partial differential equation and the second example leads to a generalized eigenvalue problem, both in two dimensional convex domains. We solve the generalized partial differential equation by using polynomial approximation. A special case of the second example is a generalized isoperimetric problem. We find an approximate solution to this problem. Many physical problems containing integer order integrals and derivatives are defined over arbitrary domains. We speculate that future problems containing fractional and generalized integrals and derivatives in fractional mechanics will be defined over arbitrary domains, and therefore, a general variational calculus incorporating a general vector calculus will be needed for these problems. This research is our first attempt in that direction.
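For reference, the integer-order special case mentioned in the abstract reduces to the classical Green's theorem on a plane domain Ω with boundary ∂Ω, which the generalized operators are designed to recover:

```latex
% Classical (integer-order) Green's theorem, the special case recovered
% when the generalized operators' parameters take integer values:
\oint_{\partial\Omega} \left( P\,dx + Q\,dy \right)
  = \iint_{\Omega} \left( \frac{\partial Q}{\partial x}
  - \frac{\partial P}{\partial y} \right) dA
```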
Meng, Q.; Garcia-Rodriguez, C.; Manzanarez, G.; Silberg, M.A.; Conrad, F.; Bettencourt, J.; Pan, X.; Breece, T.; To, R.; Li, M.; Lee, D.; Thorner, L.; Tomic, M.T.; Marks, J.D.
2014-01-01
Quantitation of individual mAbs within a combined antibody drug product is required for preclinical and clinical drug development. We have developed two antitoxins (XOMA 3B and XOMA 3E), each consisting of three monoclonal antibodies (mAbs) that neutralize type B and type E botulinum neurotoxin (BoNT/B and BoNT/E), to treat serotype B and E botulism. To develop mAb-specific binding assays for each antitoxin, we mapped the epitopes of the six mAbs. Each mAb bound an epitope on either the BoNT light chain (LC) or translocation domain (HN). Epitope mapping data were used to design LC-HN domains with orthogonal mutations to make them specific for only one mAb in either XOMA 3B or 3E. Mutant LC-HN domains were cloned, expressed, and purified from E. coli. Each mAb bound only to its specific domain with affinity comparable to its binding to holotoxin. Further engineering of the domains allowed construction of ELISAs that could characterize the integrity, binding affinity, and identity of each of the six mAbs in XOMA 3B and 3E without interference from the three BoNT/A mAbs in XOMA 3AB. Such antigen engineering is a general method allowing quantitation and characterization of individual mAbs in a mAb cocktail that bind the same protein. PMID:22922799
Schapire, Arnaldo L; Voigt, Boris; Jasik, Jan; Rosado, Abel; Lopez-Cobollo, Rosa; Menzel, Diedrik; Salinas, Julio; Mancuso, Stefano; Valpuesta, Victoriano; Baluska, Frantisek; Botella, Miguel A
2008-12-01
Plasma membrane repair in animal cells uses synaptotagmin 7, a Ca(2+)-activated membrane fusion protein that mediates delivery of intracellular membranes to wound sites by a mechanism resembling neuronal Ca(2+)-regulated exocytosis. Here, we show that loss of function of the homologous Arabidopsis thaliana Synaptotagmin 1 protein (SYT1) reduces the viability of cells as a consequence of a decrease in the integrity of the plasma membrane. This reduced integrity is enhanced in the syt1-2 null mutant under conditions of osmotic stress, likely caused by defective plasma membrane repair. Consistent with a role in plasma membrane repair, SYT1 is ubiquitously expressed, is located at the plasma membrane, and shares all domains characteristic of animal synaptotagmins (i.e., an N-terminal transmembrane domain and a cytoplasmic region containing two C2 domains with phospholipid-binding activities). Our analyses support that membrane trafficking mediated by SYT1 is important for plasma membrane integrity and plant fitness.
Bhasi, Ashwini; Philip, Philge; Manikandan, Vinu; Senapathy, Periannan
2009-01-01
We have developed ExDom, a unique database for the comparative analysis of the exon–intron structures of 96 680 protein domains from seven eukaryotic organisms (Homo sapiens, Mus musculus, Bos taurus, Rattus norvegicus, Danio rerio, Gallus gallus and Arabidopsis thaliana). ExDom provides integrated access to exon-domain data through a sophisticated web interface which has the following analytical capabilities: (i) intergenomic and intragenomic comparative analysis of exon–intron structure of domains; (ii) color-coded graphical display of the domain architecture of proteins correlated with their corresponding exon-intron structures; (iii) graphical analysis of multiple sequence alignments of amino acid and coding nucleotide sequences of homologous protein domains from seven organisms; (iv) comparative graphical display of exon distributions within the tertiary structures of protein domains; and (v) visualization of exon–intron structures of alternative transcripts of a gene correlated to variations in the domain architecture of corresponding protein isoforms. These novel analytical features are highly suited for detailed investigations on the exon–intron structure of domains and make ExDom a powerful tool for exploring several key questions concerning the function, origin and evolution of genes and proteins. ExDom database is freely accessible at: http://66.170.16.154/ExDom/. PMID:18984624
Transient thermal stresses of work roll by coupled thermoelasticity
NASA Astrophysics Data System (ADS)
Lai, W. B.; Chen, T. C.; Weng, C. I.
1991-01-01
A numerical method, based on a two-dimensional plane strain model, is developed to predict the transient responses (including the distributions of temperature, thermal deformation, and thermal stress) of a work roll during strip rolling by coupled thermoelasticity. The method consists of discretizing the spatial domain of the problem by the finite element method and then treating the time domain by implicit time integration techniques. In order to avoid the difficulty in analysis due to the relative movement between the work roll and its thermal boundary, the energy equation is formulated with respect to a fixed Eulerian reference frame. The effect of the thermoelastic coupling term, which is generally disregarded in strip rolling, can thus be considered and assessed. The influences of some important process parameters, such as the rotational speed of the roll and the intensity of heat flux, on the transient solutions are also included and discussed. Furthermore, since the stress history at any point of the roll in both the transient and steady state can be accurately evaluated, a thermal fatigue analysis of the roll can be performed using these data.
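The implicit time integration step can be illustrated on a 1D heat equation with backward Euler, using a finite-difference Laplacian as a stand-in for the finite element discretization. This is a generic sketch of the "discretize in space, then step implicitly in time" pattern, not the coupled thermoelastic solver.

```python
import numpy as np

def backward_euler_heat(u0, alpha, dx, dt, steps):
    """Backward-Euler (implicit) time stepping for the 1D heat equation
    u_t = alpha * u_xx with fixed (Dirichlet) boundary values: each step
    solves (I - dt*alpha*L) u^{n+1} = u^n, unconditionally stable."""
    n = len(u0)
    r = alpha * dt / dx**2
    A = np.eye(n)                       # boundary rows stay identity
    for i in range(1, n - 1):
        A[i, i - 1] = -r
        A[i, i] = 1 + 2 * r
        A[i, i + 1] = -r
    u = u0.astype(float).copy()
    for _ in range(steps):
        u = np.linalg.solve(A, u)       # one implicit solve per time step
    return u
```

An initial temperature spike decays monotonically even at a diffusion number r = 1, which an explicit scheme would not tolerate.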
Transport of phase space densities through tetrahedral meshes using discrete flow mapping
NASA Astrophysics Data System (ADS)
Bajars, Janis; Chappell, David J.; Søndergaard, Niels; Tanner, Gregor
2017-01-01
Discrete flow mapping was recently introduced as an efficient ray-based method for determining wave energy distributions in complex built-up structures. Wave energy densities are transported along ray trajectories through polygonal mesh elements using a finite dimensional approximation of a ray transfer operator. In this way the method can be viewed as a smoothed ray tracing method defined over meshed surfaces. Many applications require the resolution of wave energy distributions in three-dimensional domains, such as in room acoustics, underwater acoustics and for electromagnetic cavity problems. In this work we extend discrete flow mapping to three-dimensional domains by propagating wave energy densities through tetrahedral meshes. The geometric simplicity of the tetrahedral mesh elements is utilised to efficiently compute the ray transfer operator using a mixture of analytic and spectrally accurate numerical integration. The important issue of how to choose a suitable basis approximation in phase space whilst maintaining a reasonable computational cost is addressed via low order local approximations on tetrahedral faces in the position coordinate and high order orthogonal polynomial expansions in momentum space.
Image Fusion of CT and MR with Sparse Representation in NSST Domain.
Qiu, Chenhui; Wang, Yuanyuan; Zhang, Huan; Xia, Shunren
2017-01-01
Multimodal image fusion techniques can integrate the information from different medical images to produce an informative image that is more suitable for joint diagnosis, preoperative planning, intraoperative guidance, and interventional treatment. The fusion of CT images with different MR modalities is studied in this paper. First, the CT and MR images are both transformed to the nonsubsampled shearlet transform (NSST) domain, yielding low-frequency and high-frequency components. The high-frequency components are then merged using the absolute-maximum rule, while the low-frequency components are merged by a sparse representation (SR)-based approach; a dynamic group sparsity recovery (DGSR) algorithm is proposed to improve the performance of the SR-based approach. Finally, the fused image is obtained by performing the inverse NSST on the merged components. The proposed fusion method is tested on a number of clinical CT and MR images and compared with several popular image fusion methods. The experimental results demonstrate that the proposed method provides better fusion results in terms of subjective quality and objective evaluation.
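A toy version of the fusion pipeline can be written with a one-level 2D Haar transform standing in for the NSST and plain averaging standing in for the SR-based low-frequency merging (both substitutions are ours); the absolute-maximum rule on the high-frequency bands is as described in the abstract.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform: low-pass band plus three detail bands."""
    a = (img[0::2] + img[1::2]) / 2.0        # row average
    d = (img[0::2] - img[1::2]) / 2.0        # row detail
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def ihaar2d(ll, highs):
    """Exact inverse of haar2d."""
    lh, hl, hh = highs
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2], out[1::2] = a + d, a - d
    return out

def fuse(ct, mr):
    """Absolute-maximum rule on detail bands; averaging on the low-pass
    band as a crude stand-in for the SR-based merging."""
    ll1, h1 = haar2d(ct)
    ll2, h2 = haar2d(mr)
    ll = (ll1 + ll2) / 2.0
    highs = tuple(np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(h1, h2))
    return ihaar2d(ll, highs)
```

Fusing an image with itself returns the image unchanged, a quick sanity check that the transform pair is exact.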
Atwood, Angela; Choi, Jeannie; Levin, Henry L.
1998-01-01
Retroviruses and their relatives, the LTR-retrotransposons, possess an integrase protein (IN) that is required for the insertion of reverse transcripts into the genome of host cells. Schizosaccharomyces pombe is the host of Tf1, an LTR-retrotransposon with integration activity that can be studied by using techniques of yeast genetics. In this study, we sought to identify amino acid substitutions in Tf1 that specifically affected the integration step of transposition. In addition to seeking amino acid substitutions in IN, we also explored the possibility that other Tf1 proteins contributed to integration. By comparing the results of genetic assays that monitored both transposition and reverse transcription, we were able to seek point mutations throughout Tf1 that blocked transposition but not the synthesis of reverse transcripts. These mutant versions of Tf1 were candidates of elements that possessed defects in the integration step of transposition. Five mutations in Tf1 that resulted in low levels of integration were found to be located in the IN protein: two substitutions in the N-terminal Zn domain, two in the catalytic core, and one in the C-terminal domain. These results suggested that each of the three IN domains was required for Tf1 transposition. The potential role of these five amino acid residues in the function of IN is discussed. Two of the mutations that reduced integration mapped to the RNase H (RH) domain of Tf1 reverse transcriptase. The Tf1 elements with the RH mutations produced high levels of reverse transcripts, as determined by recombination and DNA blot analysis. These results indicated that the RH of Tf1 possesses a function critical for transposition that is independent of the accumulation of reverse transcripts. PMID:9445033
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Cameron W.; Granzow, Brian; Diamond, Gerrett
2017-01-01
Unstructured mesh methods, like finite elements and finite volumes, support the effective analysis of complex physical behaviors modeled by partial differential equations over general three-dimensional domains. The most reliable and efficient methods apply adaptive procedures with a-posteriori error estimators that indicate where and how the mesh is to be modified. Although adaptive meshes can have two to three orders of magnitude fewer elements than a more uniform mesh for the same level of accuracy, there are many complex simulations where the meshes required are so large that they can only be solved on massively parallel systems.
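The role of an a-posteriori error indicator can be sketched in 1D: intervals where a simple midpoint-deviation indicator exceeds a tolerance are split, so the mesh concentrates points where the solution varies rapidly. This is a generic illustration of adaptive refinement, not the authors' parallel mesh adaptation procedure.

```python
import numpy as np

def adapt(xs, f, tol):
    """One pass of a-posteriori refinement: split every interval whose
    error indicator (deviation of the midpoint value from the linear
    interpolant) exceeds tol."""
    new = [xs[0]]
    for a, b in zip(xs[:-1], xs[1:]):
        mid = 0.5 * (a + b)
        err = abs(f(mid) - 0.5 * (f(a) + f(b)))   # interpolation-error indicator
        if err > tol:
            new.append(mid)                        # refine this interval
        new.append(b)
    return np.array(new)
```

Repeated passes on a function with a sharp internal layer, such as tanh(20x), cluster points near the layer while leaving the smooth regions coarse.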
Identification of the structure parameters using short-time non-stationary stochastic excitation
NASA Astrophysics Data System (ADS)
Jarczewska, Kamila; Koszela, Piotr; Śniady, Paweł; Korzec, Aleksandra
2011-07-01
In this paper, we propose an approach to the identification of the flexural stiffness or eigenfrequencies of a linear structure using a non-stationary stochastic excitation process. The idea of the proposed approach lies within time-domain input-output methods. The proposed method is based on transforming the dynamical problem into a static one by integrating the input and the output signals. The output signal is the structural response, i.e. the structure displacements due to a short-time, irregular load of random type. Systems with single and multiple degrees of freedom, as well as continuous systems, are considered.
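The core trick, integrating the signals so the dynamic problem becomes an algebraic one in the unknown stiffness, can be sketched for a single-degree-of-freedom system m·u'' + k·u = p(t) with zero initial conditions (our simplification). Integrating the equation of motion twice removes the need to differentiate the measured response: m·u(t) + k·∫∫u = ∫∫p, and k follows from a least-squares fit over the record.

```python
import numpy as np

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral of a sampled signal."""
    out = np.zeros_like(y, dtype=float)
    out[1:] = np.cumsum((y[1:] + y[:-1]) * 0.5 * dt)
    return out

def identify_stiffness(u, p, m, dt):
    """Estimate k in m*u'' + k*u = p(t) (zero initial conditions) by
    double integration of the equation of motion; the double integral
    of u'' then equals u(t), so no numerical differentiation is needed."""
    I2u = cumtrapz(cumtrapz(u, dt), dt)
    I2p = cumtrapz(cumtrapz(p, dt), dt)
    rhs = I2p - m * u                      # k * I2u = I2p - m * u
    return float(np.dot(I2u, rhs) / np.dot(I2u, I2u))   # least-squares k
```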
Algebraic methods for the solution of some linear matrix equations
NASA Technical Reports Server (NTRS)
Djaferis, T. E.; Mitter, S. K.
1979-01-01
The characterization of polynomials whose zeros lie in certain algebraic domains (and the unification of the ideas of Hermite and Lyapunov) is the basis for developing finite algorithms for the solution of linear matrix equations. Particular attention is given to the equations PA + A'P = Q (the Lyapunov equation) and P - A'PA = Q (the discrete Lyapunov equation). The Lyapunov equation appears in several areas of control theory such as stability theory, optimal control (evaluation of quadratic integrals), stochastic control (evaluation of covariance matrices) and in the solution of the algebraic Riccati equation using Newton's method.
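For small systems the Lyapunov equation PA + A'P = Q can be solved directly by Kronecker vectorization; this generic sketch is not the finite algorithms developed in the paper, and production codes use the Bartels-Stewart algorithm (e.g. scipy.linalg.solve_continuous_lyapunov).

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve P A + A.T P = Q by vectorization: with column-stacking vec,
    vec(P A) = (A.T kron I) vec(P) and vec(A.T P) = (I kron A.T) vec(P),
    so a single n^2 x n^2 linear solve yields P. O(n^6): small n only."""
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(A.T, I) + np.kron(I, A.T)
    vecP = np.linalg.solve(M, Q.reshape(-1, order="F"))
    return vecP.reshape(n, n, order="F")
```

The same vectorization idea handles the discrete equation P - A'PA = Q with M = I - kron(A.T, A.T).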
NASA Astrophysics Data System (ADS)
Ge, Liang; Sotiropoulos, Fotis
2007-08-01
A novel numerical method is developed that integrates boundary-conforming grids with a sharp interface, immersed boundary methodology. The method is intended for simulating internal flows containing complex, moving immersed boundaries such as those encountered in several cardiovascular applications. The background domain (e.g. the empty aorta) is discretized efficiently with a curvilinear boundary-fitted mesh while the complex moving immersed boundary (say a prosthetic heart valve) is treated with the sharp-interface, hybrid Cartesian/immersed-boundary approach of Gilmanov and Sotiropoulos [A. Gilmanov, F. Sotiropoulos, A hybrid cartesian/immersed boundary method for simulating flows with 3d, geometrically complex, moving bodies, Journal of Computational Physics 207 (2005) 457-492.]. To facilitate the implementation of this novel modeling paradigm in complex flow simulations, an accurate and efficient numerical method is developed for solving the unsteady, incompressible Navier-Stokes equations in generalized curvilinear coordinates. The method employs a novel, fully-curvilinear staggered grid discretization approach, which does not require either the explicit evaluation of the Christoffel symbols or the discretization of all three momentum equations at cell interfaces as done in previous formulations. The equations are integrated in time using an efficient, second-order accurate fractional step methodology coupled with a Jacobian-free, Newton-Krylov solver for the momentum equations and a GMRES solver enhanced with multigrid as preconditioner for the Poisson equation. Several numerical experiments are carried out on fine computational meshes to demonstrate the accuracy and efficiency of the proposed method for standard benchmark problems as well as for unsteady, pulsatile flow through a curved pipe bend.
To demonstrate the ability of the method to simulate flows with complex, moving immersed boundaries we apply it to calculate pulsatile, physiological flow through a mechanical, bileaflet heart valve mounted in a model straight aorta with an anatomical-like triple sinus.
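The Jacobian-free Newton-Krylov idea, approximating Jacobian-vector products by a finite difference of the residual so the Jacobian is never formed, can be sketched with a minimal full GMRES. This generic sketch illustrates only the solver pattern, not the curvilinear discretization or preconditioning of the paper.

```python
import numpy as np

def gmres(matvec, b, tol=1e-10, maxiter=50):
    """Minimal full GMRES (Arnoldi plus least squares), matrix-free."""
    n = len(b)
    Q = np.zeros((n, maxiter + 1))
    H = np.zeros((maxiter + 1, maxiter))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    y = np.zeros(0)
    for k in range(maxiter):
        v = matvec(Q[:, k])
        for j in range(k + 1):               # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ v
            v = v - H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = v / H[k + 1, k]
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        if np.linalg.norm(H[:k + 2, :k + 1] @ y - e1) < tol * beta:
            return Q[:, :k + 1] @ y
    return Q[:, :len(y)] @ y

def jfnk(F, u0, iters=20, eps=1e-7):
    """Jacobian-free Newton-Krylov: J v is approximated by a directional
    finite difference of the residual F, so J is never formed or stored."""
    u = u0.astype(float).copy()
    for _ in range(iters):
        r = F(u)
        if np.linalg.norm(r) < 1e-10:
            break
        matvec = lambda v: (F(u + eps * v) - F(u)) / eps
        u = u - gmres(matvec, r)             # Newton update u <- u - J^{-1} F(u)
    return u
```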
Sojic, Aleksandra; Terkaj, Walter; Contini, Giorgia; Sacco, Marco
2016-05-04
The public health initiatives for obesity prevention are increasingly exploiting the advantages of smart technologies that can register various kinds of data related to physical, physiological, and behavioural conditions. Since individual features and habits vary among people, the design of appropriate intervention strategies for motivating changes in behavioural patterns towards a healthy lifestyle requires the interpretation and integration of collected information, while considering individual profiles in a personalised manner. The ontology-based modelling is recognised as a promising approach in facing the interoperability and integration of heterogeneous information related to characterisation of personal profiles. The presented ontology captures individual profiles across several obesity-related knowledge-domains structured into dedicated modules in order to support inference about health condition, physical features, behavioural habits associated with a person, and relevant changes over time. The modularisation strategy is designed to facilitate ontology development, maintenance, and reuse. The domain-specific modules formalised in the Web Ontology Language (OWL) integrate the domain-specific sets of rules formalised in the Semantic Web Rule Language (SWRL). The inference rules follow a modelling pattern designed to support personalised assessment of health condition as age- and gender-specific. The test cases exemplify a personalised assessment of the obesity-related health conditions for the population of teenagers. The paper addresses several issues concerning the modelling of normative concepts related to obesity and depicts how the public health concern impacts classification of teenagers according to their phenotypes. The modelling choices regarding the ontology-structure are explained in the context of the modelling goal to integrate multiple knowledge-domains and support reasoning about the individual changes over time. 
The presented modularisation pattern enhances reusability of the domain-specific modules across various health care domains.
NASA Astrophysics Data System (ADS)
Welter, Petra; Deserno, Thomas M.; Gülpers, Ralph; Wein, Berthold B.; Grouls, Christoph; Günther, Rolf W.
2010-03-01
The large and continuously growing amount of medical image data demands access methods based on content rather than simple text-based queries. The potential benefits of content-based image retrieval (CBIR) systems for computer-aided diagnosis (CAD) are evident and have been demonstrated. Still, CBIR is not a well-established part of the daily routine of radiologists. We have already presented a concept of CBIR integration for the radiology workflow in accordance with the Integrating the Healthcare Enterprise (IHE) framework. The retrieval result is composed as a Digital Imaging and Communications in Medicine (DICOM) Structured Reporting (SR) document. The use of DICOM SR provides interchange with the PACS archive and image viewer. It offers the possibility of further data mining and automatic interpretation of CBIR results. However, existing standard templates do not address the domain of CBIR. We present a design of an SR template customized for CBIR. Our approach is based on the DICOM standard templates and makes use of the mammography and chest CAD SR templates. Reuse of approved SR sub-trees promises a reliable design, which is further adapted to the CBIR domain. We analyze the special CBIR requirements and integrate the new concept of similar images into our template. Our approach also includes the new concept of a set of selected images for defining the processed images for CBIR. A commonly accepted pre-defined template for the presentation and exchange of results in a standardized format promotes the widespread application of CBIR in radiological routine.
Ghouila, Amel; Florent, Isabelle; Guerfali, Fatma Zahra; Terrapon, Nicolas; Laouini, Dhafer; Yahia, Sadok Ben; Gascuel, Olivier; Bréhélin, Laurent
2014-01-01
Identification of protein domains is a key step for understanding protein function. Hidden Markov Models (HMMs) have proved to be a powerful tool for this task. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in sequenced organisms. This is done via sequence/HMM comparisons. However, this approach may lack sensitivity when searching for domains in divergent species. Recently, methods for HMM/HMM comparisons have been proposed and proved to be more sensitive than sequence/HMM approaches in certain cases. However, these approaches are usually not used for protein domain discovery at a genome scale, and the benefit that could be expected from their utilization for this problem has not been investigated. Using proteins of P. falciparum and L. major as examples, we investigate the extent to which HMM/HMM comparisons can identify new domain occurrences not already identified by sequence/HMM approaches. We show that although HMM/HMM comparisons are much more sensitive than sequence/HMM comparisons, they are not sufficiently accurate to serve on their own as a complement to sequence/HMM approaches at the genome scale. Hence, we propose to use domain co-occurrence, the general tendency of a domain to appear preferentially alongside certain favored domains within proteins, to improve the accuracy of the approach. We show that the combination of HMM/HMM comparisons and co-occurrence domain detection boosts protein annotations. At an estimated False Discovery Rate of 5%, it revealed 901 and 1098 new domains in Plasmodium and Leishmania proteins, respectively. Manual inspection of a subset of these predictions shows that they contain several domain families that were missing in the two organisms. All new domain occurrences have been integrated into the EuPathDomains database, along with the GO annotations that can be deduced from them. PMID:24901648
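The co-occurrence idea can be sketched as a simple filtering rule: a candidate domain found only by the sensitive but less specific HMM/HMM comparison is accepted either on a strict score alone, or on a relaxed score backed by co-occurrence with a domain already reliably annotated on the same protein. The sketch below is illustrative: the scores, the halved relaxed threshold, the co-occurrence table, and the Pfam-style identifiers are all assumptions, not the authors' actual scoring scheme.

```python
# Hedged sketch of co-occurrence filtering for HMM/HMM candidate domains.
# Thresholds, the "halved" relaxed cut-off, and the Pfam-style labels are
# illustrative assumptions, not the paper's actual procedure.

def accept_candidates(known_domains, candidates, co_occurs_with, score_threshold):
    """known_domains: domains already found by sequence/HMM search on a protein.
    candidates: {domain: score} from the HMM/HMM comparison.
    co_occurs_with: {domain: set of domains it frequently co-occurs with}.
    A candidate passes with a strict score alone, or with a relaxed score
    plus co-occurrence support from an already-annotated domain."""
    accepted = set()
    for dom, score in candidates.items():
        supported = bool(co_occurs_with.get(dom, set()) & set(known_domains))
        if score >= score_threshold or (supported and score >= score_threshold / 2):
            accepted.add(dom)
    return accepted
```

This captures the intuition in the abstract: co-occurrence evidence lets weaker HMM/HMM hits through when they sit in a plausible domain context, while isolated weak hits are rejected, which is how the combination controls the false discovery rate.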
Gurtoo, Anil; Ranjan, Piyush; Sud, Ritika; Kumari, Archana
2013-01-01
Background & objectives: The field of medical education in our country remains deeply fragmented and polarised between the biomedical technical domains, which are over-represented, and the humanitarian domains, which are under-represented, within the universe of medical pedagogy. To overcome this imbalance, we designed a module that integrates the two domains in a holistic biomedical and socio-cultural framework, with the objective of providing a unified field of learning experience to undergraduate medical students attending rotatory clinical postings in a medical college in New Delhi, India. Methods: Undergraduate medical students of the 6th and 8th semesters were enrolled in a humanities-based study module (HSM) on a voluntary basis for a total duration of six months. During their compulsory rotatory medicine ward posting, they were introduced to the bedside learning experience of the HSM through various tools of art and literature, in the form of poems, short narratives, paintings, sketches, and group discussions, to express their feelings about patients' sufferings. Students' feedback was recorded through an anonymized questionnaire. Results: Of the 235 students, 223 (95%) enrolled voluntarily and 94 per cent (210 of 223) of them completed the full six-month study module. Seventy-three per cent of the students found the HSM effective in improving their affective motivational behavior, 82 per cent found it effective in motivating them to learn more about core medical subjects, and 85 per cent wanted its continuation as part of the medical curriculum. Interpretation & conclusions: The positive response of the students towards the HSM was an indicator of the potential for integrating the module into the undergraduate medical curriculum. PMID:23481073