Sample records for integrated method consisting

  1. An equivalent domain integral for analysis of two-dimensional mixed mode problems

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Shivakumar, K. N.

    1989-01-01

    An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies subjected to mixed mode loading is presented. The total and product integrals consist of the sum of an area or domain integral and line integrals on the crack faces. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all the problems analyzed.
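
The mode-separation procedure mentioned in this record rests on a simple identity: a field sampled at points mirrored across the crack line splits uniquely into a symmetric part (associated with mode I) and an antisymmetric part (associated with mode II). A minimal numerical sketch of that split, using an illustrative field rather than anything from the paper:

```python
import numpy as np

def split_modes(u_upper, u_lower):
    """Split a field sampled at mirrored points above/below the crack
    line into symmetric (mode I-like) and antisymmetric (mode II-like)
    components, so that u_upper = u_sym + u_antisym."""
    u_sym = 0.5 * (u_upper + u_lower)
    u_antisym = 0.5 * (u_upper - u_lower)
    return u_sym, u_antisym

# Illustrative samples: an even part cos(y) plus an odd part sin(y).
y = np.linspace(0.1, 1.0, 5)
u_upper = np.cos(y) + np.sin(y)    # field at (x, +y)
u_lower = np.cos(y) - np.sin(y)    # field at (x, -y)
u_s, u_a = split_modes(u_upper, u_lower)
```

The two components can then be fed separately into the domain integral to obtain the individual mode contributions.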

  2. Biological Modeling As A Method for Data Evaluation and ...

    EPA Pesticide Factsheets

    Biological Models, evaluating consistency of data and integrating diverse data, examples of pharmacokinetics and response and pharmacodynamics

  3. Indirect (source-free) integration method. II. Self-force consistent radial fall

    NASA Astrophysics Data System (ADS)

    Ritter, Patxi; Aoudia, Sofiane; Spallicci, Alessandro D. A. M.; Cordier, Stéphane

    2016-12-01

    We apply our method of indirect integration, described in Part I, at fourth order, to the radial fall affected by the self-force (SF). The Mode-Sum regularization is performed in the Regge-Wheeler gauge using the equivalence with the harmonic gauge for this orbit. We consider also the motion subjected to a self-consistent and iterative correction determined by the SF through osculating stretches of geodesics. The convergence of the results confirms the validity of the integration method. This work complements and justifies the analysis and the results appeared in [Int. J. Geom. Meth. Mod. Phys. 11 (2014) 1450090].

  4. An equivalent domain integral method in the two-dimensional analysis of mixed mode crack problems

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Shivakumar, K. N.

    1990-01-01

    An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented.

  5. Implementation of equivalent domain integral method in the two-dimensional analysis of mixed mode problems

    NASA Technical Reports Server (NTRS)

    Raju, I. S.; Shivakumar, K. N.

    1989-01-01

    An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The total and product integrals consist of the sum of an area or domain integral and line integrals on the crack faces. The line integrals vanish only when the crack faces are traction free and the loading is either pure mode I or pure mode II or a combination of both with only the square-root singular term in the stress field. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented. The procedure that uses the symmetric and antisymmetric components of the stress and displacement fields to calculate the individual modes gave accurate values of the integrals for all problems analyzed. The EDI method, when applied to a problem of an interface crack in two different materials, showed that the mode I and mode II components are domain dependent while the total integral is not. This behavior is caused by the presence of the oscillatory part of the singularity in bimaterial crack problems. The EDI method thus shows behavior similar to the virtual crack closure method for bimaterial problems.

  6. Robust rotational-velocity-Verlet integration methods.

    PubMed

    Rozmanov, Dmitri; Kusalik, Peter G

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time-consistent positions and momenta. The second method is also formulated in terms of quaternions, but it is not quaternion specific and can easily be adapted to any other orientational representation. Both methods are tested extensively and compared to existing rotational integrators. The proposed integrators demonstrated performance at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.
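
The quaternion kinematics underlying integrators like these follow q̇ = ½ q ⊗ (0, ω). The sketch below is a deliberately simple explicit update with renormalization — not Svanberg's leap-frog or the authors' velocity-Verlet scheme — shown only to illustrate how a quaternion orientation is propagated:

```python
import numpy as np

def quat_mult(a, b):
    """Hamilton product of quaternions a, b given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_step(q, omega_body, dt):
    """One explicit step of q' = 0.5 * q ⊗ (0, ω), followed by
    renormalization to keep q on the unit sphere."""
    dq = 0.5 * quat_mult(q, np.concatenate(([0.0], omega_body)))
    q_new = q + dt * dq
    return q_new / np.linalg.norm(q_new)

q = np.array([1.0, 0.0, 0.0, 0.0])       # identity orientation
omega = np.array([0.0, 0.0, 1.0])        # spin about the body z-axis, 1 rad/s
for _ in range(1000):
    q = quat_step(q, omega, 1e-3)        # total rotation: 1 rad
```

A production integrator would instead propagate quaternion and angular momentum together in a time-reversible way, which is what the methods in this record provide.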

  7. Robust rotational-velocity-Verlet integration methods

    NASA Astrophysics Data System (ADS)

    Rozmanov, Dmitri; Kusalik, Peter G.

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time-consistent positions and momenta. The second method is also formulated in terms of quaternions, but it is not quaternion specific and can easily be adapted to any other orientational representation. Both methods are tested extensively and compared to existing rotational integrators. The proposed integrators demonstrated performance at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.

  8. Conceptual Understanding of Definite Integral with Geogebra

    ERIC Educational Resources Information Center

    Tatar, Enver; Zengin, Yilmaz

    2016-01-01

    This study aimed to determine the effect of a computer-assisted instruction method using GeoGebra on achievement of prospective secondary mathematics teachers in the definite integral topic and to determine their opinions about this method. The study group consisted of 35 prospective secondary mathematics teachers studying in the mathematics…

  9. A method for exponential propagation of large systems of stiff nonlinear differential equations

    NASA Technical Reports Server (NTRS)

    Friesner, Richard A.; Tuckerman, Laurette S.; Dornblaser, Bright C.; Russo, Thomas V.

    1989-01-01

    A new time integrator for large, stiff systems of linear and nonlinear coupled differential equations is described. For linear systems, the method consists of forming a small (5-15-term) Krylov space using the Jacobian of the system and carrying out exact exponential propagation within this space. Nonlinear corrections are incorporated via a convolution integral formalism; the integral is evaluated via approximate Krylov methods as well. Gains in efficiency ranging from factors of 2 to 30 are demonstrated for several test problems as compared to a forward Euler scheme and to the integration package LSODE.
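
The Krylov propagation idea in this record — build a small space from the Jacobian and exponentiate exactly within it — can be sketched as follows. This is a generic Arnoldi-based sketch, not the authors' integrator; the test matrix, vector, and the small dense-exponential helper are illustrative:

```python
import numpy as np

def expm_dense(M, terms=30):
    """Matrix exponential via scaling-and-squaring with a truncated
    Taylor series (adequate for the small, well-scaled matrices here)."""
    norm = np.linalg.norm(M, 1)
    s = int(np.ceil(np.log2(norm))) if norm > 1.0 else 0
    Ms = M / (2.0 ** s)
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ Ms / k
        E = E + term
    for _ in range(s):
        E = E @ E
    return E

def krylov_expm_apply(A, v, dt, m=15):
    """Approximate exp(dt*A) @ v by exact exponentiation inside an
    m-dimensional Krylov space built with the Arnoldi process."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:          # happy breakdown: invariant subspace
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = 1.0
    return beta * V[:, :m] @ (expm_dense(dt * H[:m, :m]) @ e1)

# Stiff linear test: 1-D diffusion-like tridiagonal system.
n = 50
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
v = np.cos(np.linspace(0.0, 3.0, n))
approx = krylov_expm_apply(A, v, dt=0.1, m=15)
exact = expm_dense(0.1 * A) @ v
```

For this well-scaled problem a 15-dimensional Krylov space already reproduces the full-space exponential to high accuracy, which is the source of the efficiency gains the abstract reports.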

  10. Integrating Mixed Method Data in Psychological Research: Combining Q Methodology and Questionnaires in a Study Investigating Cultural and Psychological Influences on Adolescent Sexual Behavior

    ERIC Educational Resources Information Center

    Franz, Anke; Worrell, Marcia; Vögele, Claus

    2013-01-01

    In recent years, combining quantitative and qualitative research methods in the same study has become increasingly acceptable in both applied and academic psychological research. However, a difficulty for many mixed methods researchers is how to integrate findings consistently. The value of using a coherent framework throughout the research…

  11. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  12. Measuring ITS deployment and integration

    DOT National Transportation Integrated Search

    1999-01-01

    A consistent and simple methodology was developed to assess both the level of deployment of individual ITS elements and the level of integration between these elements. This method is based on the metropolitan ITS infrastructure, a blueprint defined ...

  13. How Integration Can Benefit Physical Education

    ERIC Educational Resources Information Center

    Wilson-Parish, Nichelle; Parish, Anthony

    2016-01-01

    One method for physical educators to increase their contact hours with their students is curricular integration, which consists of combining two or more subject areas with the goal of fostering enhanced learning in each subject area. This article provides an example of a possible integrated lesson plan involving physical education and art.

  14. Solution of the nonlinear mixed Volterra-Fredholm integral equations by hybrid of block-pulse functions and Bernoulli polynomials.

    PubMed

    Mashayekhi, S; Razzaghi, M; Tripak, O

    2014-01-01

    A new numerical method for solving the nonlinear mixed Volterra-Fredholm integral equations is presented. This method is based upon hybrid functions approximation. The properties of hybrid functions consisting of block-pulse functions and Bernoulli polynomials are presented. The operational matrices of integration and product are given. These matrices are then utilized to reduce the nonlinear mixed Volterra-Fredholm integral equations to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and applicability of the technique.
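
The operational-matrix idea can be illustrated for the block-pulse part of the basis alone: integration of a block-pulse expansion reduces to a matrix-vector product. A sketch under the usual block-pulse conventions (the test function is illustrative; the paper's hybrid basis also includes Bernoulli polynomials):

```python
import numpy as np

def blockpulse_integration_matrix(N, T=1.0):
    """Operational matrix of integration P for N block-pulse functions
    on [0, T): if f ~ c.psi(t), then the running integral of f is
    approximately (P.T @ c).psi(t)."""
    h = T / N
    return h * (0.5 * np.eye(N) + np.triu(np.ones((N, N)), k=1))

N, T = 64, 1.0
h = T / N
t_mid = (np.arange(N) + 0.5) * h          # block midpoints
c = t_mid**2                              # block-pulse coefficients of f(t) = t^2
P = blockpulse_integration_matrix(N, T)
integral_coeffs = P.T @ c                 # coefficients approximating t^3 / 3
```

Replacing the integral operator by P (and products by a product matrix) is what reduces the integral equation to an algebraic system, as the abstract describes.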

  15. Solution of the Nonlinear Mixed Volterra-Fredholm Integral Equations by Hybrid of Block-Pulse Functions and Bernoulli Polynomials

    PubMed Central

    Mashayekhi, S.; Razzaghi, M.; Tripak, O.

    2014-01-01

    A new numerical method for solving the nonlinear mixed Volterra-Fredholm integral equations is presented. This method is based upon hybrid functions approximation. The properties of hybrid functions consisting of block-pulse functions and Bernoulli polynomials are presented. The operational matrices of integration and product are given. These matrices are then utilized to reduce the nonlinear mixed Volterra-Fredholm integral equations to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and applicability of the technique. PMID:24523638

  16. Accuracy of the Generalized Self-Consistent Method in Modelling the Elastic Behaviour of Periodic Composites

    NASA Technical Reports Server (NTRS)

    Walker, Kevin P.; Freed, Alan D.; Jordan, Eric H.

    1993-01-01

    Local stress and strain fields in the unit cell of an infinite, two-dimensional, periodic fibrous lattice have been determined by an integral equation approach. The effect of the fibres is represented by an infinite two-dimensional array of fictitious body forces in the matrix constituent phase of the unit cell. By subtracting a volume-averaged strain polarization term from the integral equation, we effectively embed a finite number of unit cells in a homogenized medium in which the overall stress and strain correspond to the volume-averaged stress and strain of the constrained unit cell. This paper demonstrates that the zeroth term in the governing integral equation expansion, which embeds one unit cell in the homogenized medium, corresponds to the generalized self-consistent approximation. By comparing the zeroth-term approximation with higher-order approximations to the integral equation summation, both the accuracy of the generalized self-consistent composite model and the rate of convergence of the integral summation can be assessed. Two example composites are studied. For a tungsten/copper elastic fibrous composite, the generalized self-consistent model is shown to provide accurate effective elastic moduli and local field representations. The local elastic transverse stress field within the representative volume element of the generalized self-consistent method is shown to be in error by much larger amounts for a composite with periodically distributed voids, but homogenization leads to a cancelling of errors, and the effective transverse Young's modulus of the voided composite is shown to be in error by only 23% at a void volume fraction of 75%.

  17. An equivalent domain integral method for three-dimensional mixed-mode fracture problems

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Raju, I. S.

    1991-01-01

    A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III representing the severity of the crack front in three modes of deformations. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, were analyzed, and results were found to agree well with those available in the literature. The method lends itself to be used as a post-processing subroutine in a general purpose finite element program.

  18. An equivalent domain integral method for three-dimensional mixed-mode fracture problems

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Raju, I. S.

    1992-01-01

    A general formulation of the equivalent domain integral (EDI) method for mixed mode fracture problems in cracked solids is presented. The method is discussed in the context of a 3-D finite element analysis. The J integral consists of two parts: the volume integral of the crack front potential over a torus enclosing the crack front and the crack surface integral due to the crack front potential plus the crack face loading. In mixed mode crack problems the total J integral is split into J sub I, J sub II, and J sub III representing the severity of the crack front in three modes of deformations. The direct and decomposition methods are used to separate the modes. These two methods were applied to several mixed mode fracture problems, were analyzed, and results were found to agree well with those available in the literature. The method lends itself to be used as a post-processing subroutine in a general purpose finite element program.

  19. Adaptive Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Fasnacht, Marc

    We develop adaptive Monte Carlo methods for the calculation of the free energy as a function of a parameter of interest. The methods presented are particularly well-suited for systems with complex energy landscapes, where standard sampling techniques have difficulties. The Adaptive Histogram Method uses a biasing potential derived from histograms recorded during the simulation to achieve uniform sampling in the parameter of interest. The Adaptive Integration Method directly calculates an estimate of the free energy from the average derivative of the Hamiltonian with respect to the parameter of interest and uses it as a biasing potential. We compare both methods to a state-of-the-art method and demonstrate that they compare favorably for the calculation of potentials of mean force of dense Lennard-Jones fluids. We use the Adaptive Integration Method to calculate accurate potentials of mean force for different types of simple particles in a Lennard-Jones fluid. Our approach allows us to separate the contributions of the solvent to the potential of mean force from the effect of the direct interaction between the particles. With contributions of the solvent determined, we can find the potential of mean force directly for any other direct interaction without additional simulations. We also test the accuracy of the Adaptive Integration Method on a thermodynamic cycle, which allows us to perform a consistency check between potentials of mean force and chemical potentials calculated using the Adaptive Integration Method. The results demonstrate a high degree of consistency of the method.
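
The core of the Adaptive Integration Method — recover a free energy by integrating the averaged derivative of the Hamiltonian over the parameter of interest — can be checked on a case where the average is known in closed form. A sketch with a harmonic toy Hamiltonian, where the analytic average stands in for the simulation estimate the method would record:

```python
import numpy as np

# Toy Hamiltonian H(x; k) = k x^2 / 2 at temperature kT.  The canonical
# average <dH/dk> = <x^2>/2 = kT/(2k) is known analytically, so the
# free-energy difference between spring constants k0 and k1 should be
#   dF = integral of <dH/dk> dk = (kT/2) ln(k1/k0).
kT = 1.0
k0, k1 = 1.0, 4.0
ks = np.linspace(k0, k1, 2001)
mean_dH_dk = kT / (2.0 * ks)              # stands in for a simulation average

# Thermodynamic integration of the averaged derivative (trapezoid rule).
dF_ti = np.sum((mean_dH_dk[1:] + mean_dH_dk[:-1]) * 0.5 * np.diff(ks))
dF_exact = 0.5 * kT * np.log(k1 / k0)     # analytic free-energy difference
```

In the actual method the averages come from adaptively biased sampling rather than a formula, but the integration step is the same.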

  20. Survey of Pharmacy Schools' Approaches and Attitudes toward Curricular Integration.

    PubMed

    Poirier, Therese I; Fan, Jingyang; Nieto, Marcelo J

    2016-08-25

    Objective. To identify ways in which curricular integration is addressed in US pharmacy schools, the structure of therapeutics and foundational science courses, and perceptions of the effects current curricular integration methods have on student learning. Methods. An electronic survey was sent to academic leaders representing 131 pharmacy schools in the United States. Frequency data were tabulated and demographic analysis was performed. Results. Respondent data represent 94 schools of pharmacy. Arranging similar content from various disciplines in a course, a skills laboratory, and pharmacy practice experiences were the most common methods for achieving curricular integration. More than one half of the schools indicated that foundational sciences were integrated with therapeutics. The most commonly reported challenge to curricular integration was logistics. Conclusion. Pharmacy education in the United States has evolved in addressing curricular integration in the curricula, which is consistent with changes in accreditation standards. Most pharmacy schools reported a variety of methods for achieving the intent of curricular integration.

  1. A new integrated evaluation method of heavy metals pollution control during melting and sintering of MSWI fly ash.

    PubMed

    Li, Rundong; Li, Yanlong; Yang, Tianhua; Wang, Lei; Wang, Weiyun

    2015-05-30

    Evaluations of technologies for heavy metal control mainly examine the residual and leaching rates of a single heavy metal, so existing evaluation methods lack coordination and uniqueness and are therefore unsuitable for evaluating hazard control effects. An overall pollution toxicity index (OPTI) was established in this paper; based on this index, an integrated evaluation method for heavy metal pollution control was established. Application of this method to the melting and sintering of fly ash revealed the following results: The integrated control efficiency of the melting process was higher in all instances than that of the sintering process. The lowest integrated control efficiency of melting was 56.2%, and the highest integrated control efficiency of sintering was 46.6%. Using the same technology, higher integrated control efficiencies were achieved at lower temperatures and shorter times. This study demonstrated the unification and consistency of this method. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Automatic Authorship Detection Using Textual Patterns Extracted from Integrated Syntactic Graphs

    PubMed Central

    Gómez-Adorno, Helena; Sidorov, Grigori; Pinto, David; Vilariño, Darnes; Gelbukh, Alexander

    2016-01-01

    We apply the integrated syntactic graph feature extraction methodology to the task of automatic authorship detection. This graph-based representation allows integrating different levels of language description into a single structure. We extract textual patterns based on features obtained from shortest path walks over integrated syntactic graphs and apply them to determine the authors of documents. On average, our method outperforms state-of-the-art approaches and gives consistently high results across different corpora, unlike existing methods. Our results show that our textual patterns are useful for the task of authorship attribution. PMID:27589740

  3. Singularity Preserving Numerical Methods for Boundary Integral Equations

    NASA Technical Reports Server (NTRS)

    Kaneko, Hideaki (Principal Investigator)

    1996-01-01

    In the past twelve months (May 8, 1995 - May 8, 1996), under the cooperative agreement with the Division of Multidisciplinary Optimization at NASA Langley, we have accomplished the following five projects: a note on the finite element method with singular basis functions; numerical quadrature for weakly singular integrals; superconvergence of the degenerate kernel method; superconvergence of the iterated collocation method for Hammerstein equations; and the singularity preserving Galerkin method for Hammerstein equations with logarithmic kernel. This final report consists of five papers describing these projects. Each project is preceded by a brief abstract.

  4. Geometric integration in Born-Oppenheimer molecular dynamics.

    PubMed

    Odell, Anders; Delin, Anna; Johansson, Börje; Cawkwell, Marc J; Niklasson, Anders M N

    2011-12-14

    Geometric integration schemes for extended Lagrangian self-consistent Born-Oppenheimer molecular dynamics, including a weak dissipation to remove numerical noise, are developed and analyzed. The extended Lagrangian framework enables the geometric integration of both the nuclear and electronic degrees of freedom. This provides highly efficient simulations that are stable and energy conserving even under incomplete and approximate self-consistent field (SCF) convergence. We investigate three different geometric integration schemes: (1) regular time reversible Verlet, (2) second order optimal symplectic, and (3) third order optimal symplectic. We look at energy conservation, accuracy, and stability as a function of dissipation, integration time step, and SCF convergence. We find that the inclusion of dissipation in the symplectic integration methods gives an efficient damping of numerical noise or perturbations that otherwise may accumulate from finite arithmetics in a perfect reversible dynamics. © 2011 American Institute of Physics

  5. Multi-domain boundary element method for axi-symmetric layered linear acoustic systems

    NASA Astrophysics Data System (ADS)

    Reiter, Paul; Ziegelwanger, Harald

    2017-12-01

    Homogeneous porous materials like rock wool or synthetic foam are the main tool for acoustic absorption. The conventional absorbing structure for sound-proofing consists of one or multiple absorbers placed in front of a rigid wall, with or without air gaps in between. Various models exist to describe these so-called multi-layered acoustic systems mathematically for incoming plane waves. However, there is no efficient method to calculate the sound field in a half space above a multi-layered acoustic system for an incoming spherical wave. In this work, an axi-symmetric multi-domain boundary element method (BEM) for absorbing multi-layered acoustic systems and incoming spherical waves is introduced. In the proposed BEM formulation, a complex wave number is used to model absorbing materials as a fluid, and a coordinate transformation is introduced which simplifies the singular integrals of the conventional BEM to non-singular radial and angular integrals. The radial and angular parts are integrated analytically and numerically, respectively. The output of the method can be interpreted as a numerical half-space Green's function for grounds consisting of layered materials.

  6. A high-order relaxation method with projective integration for solving nonlinear systems of hyperbolic conservation laws

    NASA Astrophysics Data System (ADS)

    Lafitte, Pauline; Melis, Ward; Samaey, Giovanni

    2017-07-01

    We present a general, high-order, fully explicit relaxation scheme which can be applied to any system of nonlinear hyperbolic conservation laws in multiple dimensions. The scheme consists of two steps. In a first (relaxation) step, the nonlinear hyperbolic conservation law is approximated by a kinetic equation with stiff BGK source term. Then, this kinetic equation is integrated in time using a projective integration method. After taking a few small (inner) steps with a simple, explicit method (such as direct forward Euler) to damp out the stiff components of the solution, the time derivative is estimated and used in an (outer) Runge-Kutta method of arbitrary order. We show that, with an appropriate choice of inner step size, the time step restriction on the outer time step is similar to the CFL condition for the hyperbolic conservation law. Moreover, the number of inner time steps is also independent of the stiffness of the BGK source term. We discuss stability and consistency, and illustrate with numerical results (linear advection, Burgers' equation and the shallow water and Euler equations) in one and two spatial dimensions.
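
The two-step structure described here — a few damping inner steps, then an extrapolated outer step — can be sketched for a stiff linear test system. This is a generic first-order projective forward-Euler sketch, not the paper's high-order relaxation scheme:

```python
import numpy as np

def projective_euler_step(f, y, dt_inner, n_inner, dt_outer):
    """One projective forward-Euler step: a few small inner Euler steps
    damp the stiff components, then the slow derivative is estimated
    from the last two inner iterates and extrapolated over the rest
    of the outer interval."""
    for _ in range(n_inner):
        y_prev = y
        y = y + dt_inner * f(y)
    deriv = (y - y_prev) / dt_inner
    return y + (dt_outer - n_inner * dt_inner) * deriv

# Stiff linear test: fast mode (rate -1000) and slow mode (rate -1).
rates = np.array([-1000.0, -1.0])
f = lambda y: rates * y
y = np.array([1.0, 1.0])
t, dt_outer = 0.0, 0.05
while t < 1.0 - 1e-12:
    y = projective_euler_step(f, y, dt_inner=1e-3, n_inner=3, dt_outer=dt_outer)
    t += dt_outer
# y[0] is damped to ~0; y[1] tracks exp(-t) although dt_outer >> 1/1000.
```

The outer step size is limited by the slow dynamics only, which mirrors the abstract's point that the outer time-step restriction resembles the CFL condition of the non-stiff part.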

  7. Structural reliability calculation method based on the dual neural network and direct integration method.

    PubMed

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects structural characteristics and actual bearing conditions. The direct integration method, which starts from the definition of reliability theory, is easy to understand, but mathematical difficulties remain in the calculation of multiple integrals. Therefore, a dual neural network method for calculating multiple integrals is proposed in this paper. The dual neural network consists of two neural networks: neural network A is used to learn the integrand function, and neural network B is used to simulate the original (antiderivative) function. Using the derivative relationship between the network output and the network input, neural network B is derived from neural network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
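
The direct-integration viewpoint this record starts from defines the failure probability as an integral of the joint density over the failure domain. For a linear limit state with normal variables that integral has a closed form, which gives a simple check against a Monte Carlo estimate (the limit state and distributions below are illustrative, not from the paper):

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Illustrative linear limit state g = R - S (failure when g < 0),
# with resistance R ~ N(5, 1) and load S ~ N(2, 1).
mu_R, sig_R, mu_S, sig_S = 5.0, 1.0, 2.0, 1.0
beta = (mu_R - mu_S) / sqrt(sig_R**2 + sig_S**2)   # reliability index
pf_exact = phi(-beta)                              # direct-integration result

# Crude Monte Carlo estimate of the same failure probability.
rng = np.random.default_rng(0)
n = 2_000_000
g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
pf_mc = float(np.mean(g < 0.0))
```

For nonlinear limit states or non-normal variables no such closed form exists, which is the gap the paper's dual-network evaluation of the multiple integral targets.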

  8. The innovative characteristics and obstruction of technology adoption for management of integrated plants (PTT) of corn in Gowa Regency Indonesia

    NASA Astrophysics Data System (ADS)

    Jamil, M. H.; Musa, Y.; Tenriawaru, A. N.; Rahayu, N. E.

    2018-05-01

    The research aimed to analyze the effects of farmers' characteristics, innovation characteristics, and the obstructions faced in the adoption of integrated crop management (PTT) technology for corn in Gowa Regency. The research method was explanatory in nature. Respondents comprised 80 corn farmers chosen randomly. Data were collected through interviews, quantified using a Likert scale, and analyzed by binary logistic regression. The results indicated that farmers' characteristics, consisting of age, education, experience, and land area, had no significant effect on the adoption of maize integrated crop management (PTT) technology. The obstructions to adoption, consisting of limited capital, availability of inputs, and intensity of counseling, had a significant effect on adoption, while farmers' knowledge had no significant effect. The variable of limited capital had a positive coefficient: the more capital available to farmers, the higher their chance of adopting integrated crop management technology. Likewise, the higher the extension intensity, the higher the farmers' chance of adopting maize integrated crop management technology.

  9. System and method for integrating hazard-based decision making tools and processes

    DOEpatents

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  10. Cochrane Qualitative and Implementation Methods Group guidance series-paper 5: methods for integrating qualitative and implementation evidence within intervention effectiveness reviews.

    PubMed

    Harden, Angela; Thomas, James; Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Flemming, Kate; Booth, Andrew; Garside, Ruth; Hannes, Karin; Noyes, Jane

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has focused on how to integrate these syntheses within intervention effectiveness reviews. In this article, we report updated guidance from the group on approaches, methods, and tools, which can be used to integrate the findings from quantitative studies evaluating intervention effectiveness with those from qualitative studies and process evaluations. We draw on conceptual analyses of mixed methods systematic review designs and the range of methods and tools that have been used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration which vary in terms of the levels at which integration takes place; the specialist skills and expertise required within the review team; and their appropriateness in the context of limited evidence. In situations where the requirement is the integration of qualitative and process evidence within intervention effectiveness reviews, we recommend the use of a sequential approach. Here, evidence from each tradition is synthesized separately using methods consistent with each tradition before integration takes place using a common framework. Reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness in a systematic way are rare. This guidance aims to support review teams to achieve integration and we encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Integration of forward-looking infrared (FLIR) and traffic information for moving obstacle detection with integrity

    NASA Astrophysics Data System (ADS)

    Zhu, Zhen; Vana, Sudha; Bhattacharya, Sumit; Uijt de Haag, Maarten

    2009-05-01

    This paper discusses the integration of Forward-looking Infrared (FLIR) and traffic information from, for example, the Automatic Dependent Surveillance - Broadcast (ADS-B) or the Traffic Information Service-Broadcast (TIS-B). The goal of this integration method is to obtain an improved state estimate of a moving obstacle within the Field-of-View of the FLIR with added integrity. The focus of the paper will be on the approach phase of the flight. The paper will address methods to extract moving objects from the FLIR imagery and geo-reference these objects using outputs of both the onboard Global Positioning System (GPS) and the Inertial Navigation System (INS). The proposed extraction method uses a priori airport information and terrain databases. Furthermore, state information from the traffic information sources will be extracted and integrated with the state estimates from the FLIR. Finally, a method will be addressed that performs a consistency check between both sources of traffic information. The methods discussed in this paper will be evaluated using flight test data collected with a Gulfstream V in Reno, NV (GVSITE) and simulated ADS-B.
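A consistency check between two independent traffic state estimates is commonly implemented as a Mahalanobis (chi-square) gate on their difference. The sketch below is a generic illustration of that idea under invented numbers, not the paper's actual algorithm; the 5.99 gate is the 95% chi-square quantile for 2 degrees of freedom.

```python
def consistency_check(x1, P1, x2, P2, gate=5.99):
    """Chi-square consistency test between two 2-D position estimates
    (e.g., a FLIR-derived track vs. a broadcast traffic report).
    x1, x2 are (x, y) tuples; P1, P2 are 2x2 covariance matrices."""
    dx = (x1[0] - x2[0], x1[1] - x2[1])
    # Combined covariance S = P1 + P2, inverted in closed form for 2x2
    a = P1[0][0] + P2[0][0]; b = P1[0][1] + P2[0][1]
    c = P1[1][0] + P2[1][0]; d = P1[1][1] + P2[1][1]
    det = a * d - b * c
    # Mahalanobis distance dx^T S^-1 dx
    d2 = (d * dx[0] ** 2 - (b + c) * dx[0] * dx[1] + a * dx[1] ** 2) / det
    return d2, d2 <= gate

# Two estimates 1 m apart with unit covariances: consistent
print(consistency_check((0.0, 0.0), [[1, 0], [0, 1]],
                        (1.0, 0.0), [[1, 0], [0, 1]]))
```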

  12. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    DOE PAGES

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...

    2016-02-05

Evaluating the marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing it with two variants of the Laplace approximation method and three MC methods, including the nested sampling method recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. As a result, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
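The power-posterior path sampling described above can be illustrated on a toy conjugate model where the marginal likelihood is known in closed form. In the sketch below, each tempered posterior happens to be Gaussian and is sampled directly (standing in for the MCMC runs a real problem would need); the model, temperature grid, and sample sizes are illustrative assumptions, not details from the paper.

```python
import math
import random

def thermodynamic_integration(y=1.0, n_beta=21, n_samp=5000, seed=0):
    """Estimate log marginal likelihood of a toy conjugate model
    (prior theta ~ N(0,1), likelihood y ~ N(theta,1)) by path sampling:
    log Z = integral over beta in [0,1] of E_beta[log likelihood]."""
    rng = random.Random(seed)
    betas = [i / (n_beta - 1) for i in range(n_beta)]
    means = []
    for b in betas:
        # Power posterior p_b(theta) ~ N(theta;0,1) * N(y;theta,1)^b is
        # Gaussian with precision 1 + b and mean b*y/(1+b); sample it directly.
        mu, sigma = b * y / (1 + b), math.sqrt(1 / (1 + b))
        acc = 0.0
        for _ in range(n_samp):
            theta = rng.gauss(mu, sigma)
            acc += -0.5 * math.log(2 * math.pi) - 0.5 * (y - theta) ** 2
        means.append(acc / n_samp)
    # Trapezoidal rule over the power-coefficient path
    h = betas[1] - betas[0]
    return h * (sum(means) - 0.5 * (means[0] + means[-1]))

est = thermodynamic_integration()
exact = -0.5 * math.log(4 * math.pi) - 0.25  # log N(y=1; 0, 2), closed form
print(est, exact)
```

With the conjugate choice above the estimate can be checked against the exact marginal likelihood; in a realistic model only the path-sampling estimate is available.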

  13. An Integrated Curriculum at an Islamic University: Perceptions of Students and Lecturers

    ERIC Educational Resources Information Center

    Suraydi, Bambang; Ekayanti, Fika; Amalia, Euis

    2018-01-01

    Methods: A descriptive quantitative research study was conducted with 670 students and 90 lecturers from 11 faculties at UIN Jakarta. The student samples consisted of 270 men and 400 women, while lecturer samples consisted of 44 men and 46 women. Data were collected via interviews and a perceptual questionnaire consisting of 54 items scored on a…

  14. Onion cells after high pressure and thermal processing: comparison of membrane integrity changes using different analytical methods and impact on tissue texture.

    PubMed

    Gonzalez, Maria E; Anthon, Gordon E; Barrett, Diane M

    2010-09-01

    Two different analytical methods were evaluated for their capacity to provide quantitative information on onion cell membrane permeability and integrity after high pressure and thermal processing and to study the impact of these processing treatments on cell compartmentalization and texture quality. To determine changes in cell membrane permeability and/or integrity the methodologies utilized were: (1) measurement of a biochemical product, pyruvate, formed as a result of membrane permeabilization followed by enzymatic activity and (2) leakage of electrolytes into solution. These results were compared to previously determined methods that quantified cell viability and ¹H-NMR T(2) of onions. These methods allowed for the monitoring of changes in the plasma and tonoplast membranes after high pressure or thermal processing. High pressure treatments consisted of 5 min holding times at 50, 100, 200, 300, or 600 MPa. Thermal treatments consisted of 30 min water bath exposure to 40, 50, 60, 70, or 90 °C. There was strong agreement between the methods in the determination of the ranges of high pressure and temperature that induce changes in the integrity of the plasma and tonoplast membranes. Membrane rupture could clearly be identified at 300 MPa and above in high pressure treatments and at 60 °C and above in the thermal treatments. Membrane destabilization effects could already be visualized following the 200 MPa and 50 °C treatments. The texture of onions was influenced by the state of the membranes and was abruptly modified once membrane integrity was lost. In this study, we used chemical, biochemical, and histological techniques to obtain information on cell membrane permeability and onion tissue integrity after high pressure and thermal processing. 
Because there was strong agreement between the various methods used, it is possible to implement something relatively simple, such as ion leakage, into routine quality assurance measurements to determine the severity of preservation methods and the shelf life of processed vegetables.

  15. Integral processing in beyond-Hartree-Fock calculations

    NASA Technical Reports Server (NTRS)

    Taylor, P. R.

    1986-01-01

    The increasing rate at which improvements in processing capacity outstrip improvements in input/output performance of large computers has led to recent attempts to bypass generation of a disk-based integral file. The direct self-consistent field (SCF) method of Almlof and co-workers represents a very successful implementation of this approach. This paper is concerned with the extension of this general approach to configuration interaction (CI) and multiconfiguration-self-consistent field (MCSCF) calculations. After a discussion of the particular types of molecular orbital (MO) integrals for which -- at least for most current generation machines -- disk-based storage seems unavoidable, it is shown how all the necessary integrals can be obtained as matrix elements of Coulomb and exchange operators that can be calculated using a direct approach. Computational implementations of such a scheme are discussed.

  16. Optimal solution of full fuzzy transportation problems using total integral ranking

    NASA Astrophysics Data System (ADS)

    Sam’an, M.; Farikhin; Hariyanto, S.; Surarso, B.

    2018-03-01

The full fuzzy transportation problem (FFTP) is a transportation problem in which transport costs, demand, supply, and decision variables are expressed as fuzzy numbers. To solve a fuzzy transportation problem, the fuzzy parameters must first be converted to crisp numbers, a step known as defuzzification. In this work, a new total integral ranking method based on converting trapezoidal fuzzy numbers to hexagonal fuzzy numbers yields defuzzification results for symmetric hexagonal and non-symmetric type-2 fuzzy numbers that are consistent with those for triangular fuzzy numbers. The optimal solution of the FFTP is then computed with a fuzzy transportation algorithm using the least cost method. From this optimal solution, it is found that the total integral ranking of fuzzy numbers yields different optimum values depending on the index of optimism. In addition, the total integral ranking computed from hexagonal fuzzy numbers gives a better optimal value than the total integral ranking computed from trapezoidal fuzzy numbers.
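Assuming the standard Liou-Wang form of the total integral value, in which a trapezoidal fuzzy number (a, b, c, d) maps to the crisp value alpha*(c+d)/2 + (1-alpha)*(a+b)/2 for an optimism index alpha in [0, 1], a minimal sketch with invented fuzzy costs shows how the index of optimism can change the ranking, consistent with the abstract's observation:

```python
def total_integral(tfn, alpha=0.5):
    """Crisp ranking value of a trapezoidal fuzzy number tfn = (a, b, c, d)
    via the total integral ranking with index of optimism alpha in [0, 1]."""
    a, b, c, d = tfn
    left = 0.5 * (a + b)    # left integral value (pessimistic side)
    right = 0.5 * (c + d)   # right integral value (optimistic side)
    return alpha * right + (1.0 - alpha) * left

# Two hypothetical fuzzy transport costs: their order flips with alpha.
A, B = (1, 3, 5, 9), (2, 4, 5, 7)
print([total_integral(A, a) for a in (0.0, 0.5, 1.0)])
print([total_integral(B, a) for a in (0.0, 0.5, 1.0)])
```

Here A ranks below B for a pessimistic decision maker (alpha = 0) but above B for an optimistic one (alpha = 1), so the least-cost choice, and hence the optimum, depends on alpha.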

  17. Boundary element modelling of dynamic behavior of piecewise homogeneous anisotropic elastic solids

    NASA Astrophysics Data System (ADS)

    Igumnov, L. A.; Markov, I. P.; Litvinchuk, S. Yu

    2018-04-01

A traditional direct boundary integral equations method is applied to solve three-dimensional dynamic problems of piecewise homogeneous linear elastic solids. The materials of the homogeneous parts are considered to be generally anisotropic. The technique used to solve the boundary integral equations is based on the boundary element method applied together with the Radau IIA convolution quadrature method. A numerical example of a suddenly loaded 3D prismatic rod consisting of two subdomains with different anisotropic elastic properties is presented to verify the accuracy of the proposed formulation.

  18. 2012-13 Integrated Postsecondary Education Data System (IPEDS) Methodology Report. NCES 2013-293

    ERIC Educational Resources Information Center

    Ginder, Scott A.; Kelly-Reid, Janice E.

    2013-01-01

    This report describes the universe, methods, and editing procedures used in the 2012-13 Integrated Postsecondary Education Data System (IPEDS) data collection. IPEDS data consist of basic statistics on postsecondary institutions regarding tuition and fees, number and types of degrees and certificates conferred, number of students enrolled, number…

  19. 2013-14 Integrated Postsecondary Education Data System (IPEDS) Methodology Report. NCES 2014-067

    ERIC Educational Resources Information Center

    Ginder, Scott A.; Kelly-Reid, Janice E.; Mann, Farrah B.

    2014-01-01

    This report describes the universe, methods, and editing procedures used in the 2013-14 Integrated Postsecondary Education Data System (IPEDS) data collection. IPEDS data consist of basic statistics on postsecondary institutions regarding tuition and fees, number and types of degrees and certificates conferred, number of students enrolled, number…

  20. 2011-12 Integrated Postsecondary Education Data System (IPEDS) Methodology Report. NCES 2012-293

    ERIC Educational Resources Information Center

    Knapp, Laura G.; Kelly-Reid, Janice E.; Ginder, Scott A.

    2012-01-01

    This report describes the universe, methods, and editing procedures used in the 2011-12 Integrated Postsecondary Education Data System (IPEDS) data collection. IPEDS data consist of basic statistics on postsecondary institutions regarding tuition and fees, number and types of degrees and certificates conferred, number of students enrolled, number…

  1. The Babushka Concept--An Instructional Sequence to Enhance Laboratory Learning in Science Education

    ERIC Educational Resources Information Center

    Gårdebjer, Sofie; Larsson, Anette; Adawi, Tom

    2017-01-01

    This paper deals with a novel method for improving the traditional "verification" laboratory in science education. Drawing on the idea of integrated instructional units, we describe an instructional sequence which we call the Babushka concept. This concept consists of three integrated instructional units: a start-up lecture, a laboratory…

  2. Using historical perspective in designing discovery learning on Integral for undergraduate students

    NASA Astrophysics Data System (ADS)

    Abadi; Fiangga, S.

    2018-01-01

In an Integral Calculus course, the ability to calculate the integral of a given function becomes the main focus of teaching, alongside the ability to apply integration. Students, however, often fail to grasp the conceptual idea of what integration actually is. One promising perspective that can be used to invite students to discover the idea of the integral is the History and Pedagogy of Mathematics (HPM). The methods of exhaustion and indivisibles appear in discussions of the early history of area measurement. This paper discusses learning activities, designed using design research and based on the methods of exhaustion and indivisibles, that provide undergraduate students with discovery materials for the integral. The designed learning activities were implemented in a design experiment consisting of three phases: preliminary, experimental design, and teaching experiment. The teaching experiment phase was conducted in two cycles for refinement purposes. The findings suggest that implementing the methods of exhaustion and indivisibles enabled students to reinvent the idea of the integral by using the concept of the derivative.
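The method of exhaustion that these activities build on can be sketched numerically: inscribe ever more rectangles under a curve and watch the total area converge to the integral. A minimal illustration (not part of the study's materials):

```python
def exhaustion_area(f, a, b, n):
    """Approximate the area under f on [a, b] with n left-endpoint
    rectangles; as n grows, the estimate approaches the integral,
    mirroring the classical method of exhaustion."""
    h = (b - a) / n
    return sum(f(a + i * h) * h for i in range(n))

# Area under y = x^2 on [0, 1]; the exact value is 1/3.
for n in (4, 64, 1024):
    print(n, exhaustion_area(lambda x: x * x, 0.0, 1.0, n))
```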

  3. Self-consistent Bulge/Disk/Halo Galaxy Dynamical Modeling Using Integral Field Kinematics

    NASA Astrophysics Data System (ADS)

    Taranu, D. S.; Obreschkow, D.; Dubinski, J. J.; Fogarty, L. M. R.; van de Sande, J.; Catinella, B.; Cortese, L.; Moffett, A.; Robotham, A. S. G.; Allen, J. T.; Bland-Hawthorn, J.; Bryant, J. J.; Colless, M.; Croom, S. M.; D'Eugenio, F.; Davies, R. L.; Drinkwater, M. J.; Driver, S. P.; Goodwin, M.; Konstantopoulos, I. S.; Lawrence, J. S.; López-Sánchez, Á. R.; Lorente, N. P. F.; Medling, A. M.; Mould, J. R.; Owers, M. S.; Power, C.; Richards, S. N.; Tonini, C.

    2017-11-01

    We introduce a method for modeling disk galaxies designed to take full advantage of data from integral field spectroscopy (IFS). The method fits equilibrium models to simultaneously reproduce the surface brightness, rotation, and velocity dispersion profiles of a galaxy. The models are fully self-consistent 6D distribution functions for a galaxy with a Sérsic profile stellar bulge, exponential disk, and parametric dark-matter halo, generated by an updated version of GalactICS. By creating realistic flux-weighted maps of the kinematic moments (flux, mean velocity, and dispersion), we simultaneously fit photometric and spectroscopic data using both maximum-likelihood and Bayesian (MCMC) techniques. We apply the method to a GAMA spiral galaxy (G79635) with kinematics from the SAMI Galaxy Survey and deep g- and r-band photometry from the VST-KiDS survey, comparing parameter constraints with those from traditional 2D bulge-disk decomposition. Our method returns broadly consistent results for shared parameters while constraining the mass-to-light ratios of stellar components and reproducing the H I-inferred circular velocity well beyond the limits of the SAMI data. Although the method is tailored for fitting integral field kinematic data, it can use other dynamical constraints like central fiber dispersions and H I circular velocities, and is well-suited for modeling galaxies with a combination of deep imaging and H I and/or optical spectra (resolved or otherwise). Our implementation (MagRite) is computationally efficient and can generate well-resolved models and kinematic maps in under a minute on modern processors.

  4. Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain

    NASA Astrophysics Data System (ADS)

    Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.

    2018-04-01

High resolution satellites with longer focal lengths and larger apertures have been widely used in recent years for georeferencing observed scenes. A consistent end-to-end model of the high resolution remote sensing satellite geometric chain is presented, consisting of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is rigorously validated by a geolocation accuracy simulation following the test method for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.

  5. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    NASA Astrophysics Data System (ADS)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

An integrated geophysical investigation was performed at the S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key methodology of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. The flow-field fitting method was complemented with resistivity logging to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.

  6. Evaluation of Container Closure System Integrity for Frozen Storage Drug Products.

    PubMed

    Nieto, Alejandra; Roehl, Holger; Brown, Helen; Nikoloff, Jonas; Adler, Michael; Mahler, Hanns-Christian

    2016-01-01

    Sometimes, drug product for parenteral administration is stored in a frozen state (e.g., -20 °C or -80 °C), particularly during early stages of development of some biotech molecules in order to provide sufficient stability. Shipment of frozen product could potentially be performed in the frozen state, yet possibly at different temperatures, for example, using dry ice (-80 °C). Container closure systems of drug products usually consist of a glass vial, rubber stopper, and an aluminum crimped cap. In the frozen state, the glass transition temperature (Tg) of commonly used rubber stoppers is between -55 and -65 °C. Below their Tg, rubber stoppers are known to lose their elastic properties and become brittle, and thus potentially fail to maintain container closure integrity in the frozen state. Leaks during frozen temperature storage and transportation are likely to be transient, yet, can possibly risk container closure integrity and lead to microbial contamination. After thawing, the rubber stopper is supposed to re-seal the container closure system. Given the transient nature of the possible impact on container closure integrity in the frozen state, typical container closure integrity testing methods (used at room temperature conditions) are unable to evaluate and thus confirm container closure integrity in the frozen state. Here we present the development of a novel method (thermal physical container closure integrity) for direct assessment of container closure integrity by a physical method (physical container closure integrity) at frozen conditions, using a modified He leakage test. In this study, different container closure systems were evaluated with regard to physical container closure integrity in the frozen state to assess the suitability of vial/stopper combinations and were compared to a gas headspace method. 
In summary, the thermal physical container closure integrity He leakage method was more sensitive in detecting physical container closure integrity impact than gas headspace and aided identification of an unsuitable container closure system. © PDA, Inc. 2016.

  7. The role of pollinator diversity in the evolution of corolla-shape integration in a pollination-generalist plant clade

    PubMed Central

    Gómez, José María; Perfectti, Francisco; Klingenberg, Christian Peter

    2014-01-01

    Flowers of animal-pollinated plants are integrated structures shaped by the action of pollinator-mediated selection. It is widely assumed that pollination specialization increases the magnitude of floral integration. However, empirical evidence is still inconclusive. In this study, we explored the role of pollinator diversity in shaping the evolution of corolla-shape integration in Erysimum, a plant genus with generalized pollination systems. We quantified floral integration in Erysimum using geometric morphometrics and explored its evolution using phylogenetic comparative methods. Corolla-shape integration was low but significantly different from zero in all study species. Spatial autocorrelation and phylogenetic signal in corolla-shape integration were not detected. In addition, integration in Erysimum seems to have evolved in a way that is consistent with Brownian motion, but with frequent convergent evolution. Corolla-shape integration was negatively associated with the number of pollinators visiting the flowers of each Erysimum species. That is, it was lower in those species having a more generalized pollination system. This negative association may occur because the co-occurrence of many pollinators imposes conflicting selection and cancels out any consistent selection on specific floral traits, preventing the evolution of highly integrated flowers. PMID:25002702

  8. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models

    PubMed Central

    Ataman, Meric

    2017-01-01

Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks, as they encapsulate all known metabolic capabilities of an organism, from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in practical applications. Although reduced models are commonly used for modeling and for integrating experimental data, they are often inconsistent across different studies and laboratories due to differing criteria and levels of detail, which can compromise the transferability of findings and the integration of experimental data from different groups. In this study, we have developed a systematic, semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner, focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to capture key properties of the genome-scale models and to preserve consistency in terms of biomass and by-product yields, flux and concentration variability, and gene essentiality. The development of these “consistently-reduced” models will help to clarify and facilitate the integration of different experimental data to draw new understanding that is directly extendable to genome-scale models. PMID:28727725

  9. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605

  10. Different integrated geophysical approaches to investigate archaeological sites in urban and suburban area.

    NASA Astrophysics Data System (ADS)

    Piro, Salvatore; Papale, Enrico; Zamuner, Daniela

    2016-04-01

Geophysical methods are frequently used in archaeological prospection to provide detailed information about the presence of structures in the subsurface, as well as their position and geometrical reconstruction, by measuring variations of some physical properties. Often, due to the limited size and depth of an archaeological structure, it may be rather difficult to single out its position and extent because of the generally low signal-to-noise ratio. This problem can be overcome by improving data acquisition and processing techniques and by integrating different geophysical methods. In this work, two sites of archaeological interest were investigated with several methods (Ground Penetrating Radar (GPR), Electrical Resistivity Tomography (ERT), and fluxgate differential magnetics) to obtain precise and detailed maps of subsurface bodies. The first site, in a suburban area between Itri and Fondi in the Aurunci Natural Regional Park (Central Italy), is characterized by remains of past human activity dating from the third century B.C. The second site is in an urban area in the city of Rome (Basilica di Santa Balbina), where historical evidence is also present. The methods employed allowed us to determine the position and geometry of subsurface structures related to this past human activity. To better understand the subsurface, we then performed a qualitative and a quantitative integration of the data, fusing the results from all the methods used into a complete visualization of the investigated area. Qualitative integration consists in graphically overlaying the maps obtained by the single methods; it yields only images, not new data that may be subsequently analyzed. Quantitative integration is instead performed by mathematical and statistical means, which allow a more accurate reconstruction of the subsurface and generate new data with high information content.

  11. Green's function integral equation method for propagation of electromagnetic waves in an anisotropic dielectric-magnetic slab

    NASA Astrophysics Data System (ADS)

    Shu, Weixing; Lv, Xiaofang; Luo, Hailu; Wen, Shuangchun

    2010-08-01

We extend the Green's function integral equation method to investigate the propagation of electromagnetic waves through an anisotropic dielectric-magnetic slab. From a microscopic perspective, we analyze the interaction of the wave with the slab and derive the propagation characteristics through self-consistent analyses. Applying the results, we find an alternative explanation of the general mechanism of photon tunneling. The results are confirmed by numerical simulations and disclose the underlying physics of wave propagation through the slab. The extended method is applicable to other propagation problems in dielectric-magnetic materials, including metamaterials.

  12. Parallel Implementation of Numerical Solution of Few-Body Problem Using Feynman's Continual Integrals

    NASA Astrophysics Data System (ADS)

    Naumenko, Mikhail; Samarin, Viacheslav

    2018-02-01

A modern parallel computing algorithm has been applied to the solution of the few-body problem. The approach is based on Feynman's continual integrals method, implemented in the C++ programming language using NVIDIA CUDA technology. A wide range of 3-body and 4-body bound systems has been considered, including nuclei described as consisting of protons and neutrons (e.g., 3,4He) and nuclei described as consisting of clusters and nucleons (e.g., 6He). The correctness of the results was checked by comparison with the exactly solvable 4-body oscillatory system and with experimental data.

  13. Calculation of Scattering Amplitude Without Partial Analysis. II; Inclusion of Exchange

    NASA Technical Reports Server (NTRS)

    Temkin, Aaron; Shertzer, J.; Fisher, Richard R. (Technical Monitor)

    2002-01-01

A method was previously developed for calculating the whole scattering amplitude, f(Omega(sub k)), directly. The idea was to calculate the complete wave function Psi numerically and use it in an integral expression for f, which can be reduced to a 2-dimensional quadrature. The original application was to e-H scattering without exchange, where the Schrodinger equation reduces to a 2-d partial differential equation (PDE), which was solved using the finite element method (FEM). Here we extend the method to the exchange approximation. The Schrodinger equation can be reduced to a pair of coupled PDEs, which are again solved by the FEM. The formal expression for f(Omega(sub k)) consists of two integrals, f+/- = f(sub d) +/- f(sub e); f(sub d) is formally the same integral as the no-exchange f. We have also succeeded in reducing f(sub e) to a 2-d integral. Results will be presented at the meeting.

  14. Combined Use of Integral Experiments and Covariance Data

    NASA Astrophysics Data System (ADS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.; Herman, M.; Hoblit, S. D.; McKnight, R. D.; Obložinský, P.; Talou, P.; Hale, G. M.; Hiruta, H.; Kawano, T.; Mattoon, C. M.; Nobre, G. P. A.; Palumbo, A.; Pigni, M.; Rising, M. E.; Yang, W.-S.; Kahler, A. C.

    2014-04-01

    In the frame of a US-DOE sponsored project, ANL, BNL, INL and LANL have performed a joint multidisciplinary research activity exploring the combined use of integral experiments and covariance data, with the twin objectives of giving quantitative indications of possible improvements to the ENDF evaluated data files and of reducing crucial reactor design parameter uncertainties. Methods developed over the last four decades for these purposes have been improved by new developments that also benefited from continuous exchanges with international groups working in similar areas. The major new developments that allowed significant progress are found in several specific domains: a) new science-based covariance data; b) integral experiment covariance data assessment and improved experiment analysis, e.g., of sample irradiation experiments; c) sensitivity analysis, where several improvements were necessary despite the generally good understanding of these techniques, e.g., to account for fission spectrum sensitivity; d) a critical approach to the analysis of statistical adjustment performance, both a priori and a posteriori; e) generalization of the assimilation method, now applied for the first time not only to multigroup cross section data but also to nuclear model parameters (the "consistent" method). This article describes the major results obtained in each of these areas; a large-scale nuclear data adjustment, based on approximately one hundred high-accuracy integral experiments, is reported along with a significant example of the application of the new "consistent" method of data assimilation.
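A minimal sketch of the generalized-least-squares adjustment that underlies such studies, with invented numbers: prior parameter covariance M, sensitivity matrix S of an integral response, experimental covariance V, and discrepancy d = E - C between experiment and calculation. The update shrinks the parameter uncertainties wherever the experiment is informative.

```python
import numpy as np

# Toy GLS adjustment: two cross-section-like parameters, one integral
# experiment. All numerical values are illustrative only.
M = np.diag([0.04, 0.09])          # prior parameter covariance
S = np.array([[0.8, 0.5]])         # sensitivity of the integral response
V = np.array([[0.01]])             # experimental covariance
d = np.array([0.05])               # E - C: experiment minus calculation

G = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)   # gain matrix
dp = G @ d                          # parameter adjustments
M_post = M - G @ S @ M              # posterior (reduced) covariance
```

The posterior diagonal is strictly smaller than the prior one, which is the "uncertainty reduction" the abstract refers to.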

  15. Integrated nonthermal treatment system study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biagi, C.; Bahar, D.; Teheranian, B.

    1997-01-01

    This report presents the results of a study of nonthermal treatment technologies. The study consisted of a systematic assessment of five nonthermal treatment alternatives. The treatment alternatives consist of widely varying technologies for safely destroying the hazardous organic components, reducing the volume, and preparing for final disposal of the contact-handled mixed low-level waste (MLLW) currently stored in the US Department of Energy complex. The alternatives considered were innovative nonthermal treatments for organic liquids and sludges, process residue, soil and debris. Vacuum desorption or various washing approaches are considered for treatment of soil, residue and debris. Organic destruction methods include mediated electrochemical oxidation, catalytic wet oxidation, and acid digestion. Other methods studied included stabilization technologies and mercury separation of treatment residues. This study is a companion to the integrated thermal treatment study, which examined 19 alternatives for thermal treatment of MLLW waste. The quantities and physical and chemical compositions of the input waste are based on the inventory database developed by the US Department of Energy. The Integrated Nonthermal Treatment Systems (INTS) systems were evaluated using the same waste input (2,927 pounds per hour) as the Integrated Thermal Treatment Systems (ITTS). 48 refs., 68 figs., 37 tabs.

  16. Towards a taxonomy for integrated care: a mixed-methods study

    PubMed Central

    Valentijn, Pim P.; Boesveld, Inge C.; van der Klauw, Denise M.; Ruwaard, Dirk; Struijs, Jeroen N.; Molema, Johanna J.W.; Bruijnzeels, Marc A.; Vrijhoef, Hubertus JM.

    2015-01-01

    Introduction Building integrated services in a primary care setting is considered an essential strategy for establishing a high-quality and affordable health care system. The theoretical foundations of such integrated service models are described by the Rainbow Model of Integrated Care, which distinguishes six integration dimensions (clinical, professional, organisational, system, functional and normative integration). The aim of the present study is to refine the Rainbow Model of Integrated Care by developing a taxonomy that specifies the underlying key features of the six dimensions. Methods First, a literature review was conducted to identify features for achieving integrated service delivery. Second, a thematic analysis method was used to develop a taxonomy of key features organised into the dimensions of the Rainbow Model of Integrated Care. Finally, the appropriateness of the key features was tested in a Delphi study among Dutch experts. Results The taxonomy consists of 59 key features distributed across the six integration dimensions of the Rainbow Model of Integrated Care. Key features associated with the clinical, professional, organisational and normative dimensions were considered appropriate by the experts. Key features linked to the functional and system dimensions were considered less appropriate. Discussion This study contributes to the ongoing debate of defining the concept and typology of integrated care. This taxonomy provides a development agenda for establishing an accepted scientific framework of integrated care from an end-user, professional, managerial and policy perspective. PMID:25759607

  17. Exploring in integrated quality evaluation of Chinese herbal medicines: the integrated quality index (IQI) for aconite.

    PubMed

    Zhang, Ding-kun; Wang, Jia-bo; Yang, Ming; Peng, Cheng; Xiao, Xiao-he

    2015-07-01

    Good medicinal herbs make for good drugs, and good evaluation methods and indices are the prerequisite of good medicinal herbs. Numerous indices exist for quality evaluation and control of Chinese medicinal materials; however, most of them are unrelated to one another and have little relationship with efficacy and safety, and the results of different evaluation methods may be inconsistent or even contradictory. Given the complex material properties of Chinese medicinal materials, a single method or index can hardly reflect their quality objectively and comprehensively. It is therefore essential to explore integrated evaluation methods. In this paper, oriented by the integrated evaluation strategies for traditional Chinese medicine quality, a new method called the integrated quality index (IQI), which integrates empirical evaluation, chemical evaluation, and biological evaluation, is proposed. A case study of the highly toxic herb Aconitum carmichaelii Debx. is provided to explain the method in detail. The results suggested that, in terms of specifications, the average weight of Jiangyou aconite was the greatest, followed by Weishan aconite, Butuo aconite, Hanzhong aconite, and Anxian aconite; in terms of chemical components, Jiangyou aconite was characterized by strong efficacy and weak toxicity, followed by Hanzhong aconite, Butuo aconite, Weishan aconite, and Anxian aconite; taking toxic potency as the index, Hanzhong aconite and Jiangyou aconite had the lower toxicity, while Butuo aconite, Weishan aconite, and Anxian aconite had relatively higher toxicity. After normalization and integration of the evaluation results, the calculated IQI values of Jiangyou aconite, Hanzhong aconite, Butuo aconite, Weishan aconite, and Anxian aconite were 0.842 +/- 0.091, 0.597 +/- 0.047, 0.442 +/- 0.033, 0.454 +/- 0.038, and 0.170 +/- 0.021, respectively. The quality of Jiangyou aconite is significantly better than the others (P < 0.05), followed by Hanzhong aconite, which is consistent with the traditional understanding of genuineness. It can be concluded that the IQI achieves integrated control and evaluation of the quality of Chinese medicinal materials and is an exploration toward building good medicinal herb standards. In addition, the IQI provides technical support for geoherbalism evaluation, selective breeding, the development of precision decoction pieces, high quality at a favourable price in market circulation, and rational drug use.
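The normalization-and-integration step can be sketched as follows, with hypothetical scores and equal weights (the paper's actual scores and weighting scheme are not reproduced here): each evaluation axis is min-max normalized and the axes are then averaged into a single index.

```python
def min_max(scores):
    """Rescale a list of raw scores onto [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

# Hypothetical scores for five production regions on three evaluation axes.
empirical  = [9.1, 6.0, 5.2, 5.5, 3.0]   # e.g. average weight / appearance
chemical   = [8.8, 7.0, 5.5, 5.0, 2.5]   # efficacy vs. toxicity markers
biological = [8.5, 7.5, 5.0, 5.2, 2.8]   # e.g. toxicity-based potency assay

weights = (1 / 3, 1 / 3, 1 / 3)          # equal weighting (an assumption)
axes = [min_max(a) for a in (empirical, chemical, biological)]
iqi = [sum(w * axis[i] for w, axis in zip(weights, axes)) for i in range(5)]
best = iqi.index(max(iqi))               # index of the top-ranked sample
```

With these invented numbers the first region dominates every axis, so it tops the combined index, mirroring how the published IQI ranks regions after integrating the three evaluations.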

  18. Vision-Aided RAIM: A New Method for GPS Integrity Monitoring in Approach and Landing Phase

    PubMed Central

    Fu, Li; Zhang, Jun; Li, Rui; Cao, Xianbin; Wang, Jinling

    2015-01-01

    In the 1980s, Global Positioning System (GPS) receiver autonomous integrity monitoring (RAIM) was proposed to provide the integrity of a navigation system by checking the consistency of GPS measurements. However, during the approach and landing phase of a flight path, where low GPS visibility conditions are common, the performance of the existing RAIM method may not meet the stringent aviation requirements for availability and integrity due to insufficient observations. To solve this problem, a new RAIM method, named vision-aided RAIM (VA-RAIM), is proposed for GPS integrity monitoring in the approach and landing phase. By introducing landmarks as pseudo-satellites, the VA-RAIM enriches the navigation observations to improve the performance of RAIM. In the method, a computer vision system photographs and matches these landmarks to obtain additional measurements for navigation. Nevertheless, the challenging issue is that such additional measurements may suffer from vision errors. To ensure the reliability of the vision measurements, a GPS-based calibration algorithm is presented to reduce the time-invariant part of the vision errors. Then, the calibrated vision measurements are integrated with the GPS observations for integrity monitoring. Simulation results show that the VA-RAIM outperforms the conventional RAIM with a higher level of availability and fault detection rate. PMID:26378533
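A toy numeric sketch of the residual-based consistency test at the heart of RAIM, and of why an extra landmark-derived measurement helps: with four pseudoranges and four unknowns there is no redundancy, so even a large injected fault leaves zero residuals; a fifth (vision) row restores redundancy and the fault inflates the test statistic. The geometry, noise levels, and fault size below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Linearized pseudorange model: rows of H are unit line-of-sight vectors
# plus a receiver-clock column; x = [dx, dy, dz, clock bias].
H_gps = np.array([
    [ 0.3,  0.5, 0.81, 1.0],
    [-0.6,  0.2, 0.77, 1.0],
    [ 0.1, -0.7, 0.71, 1.0],
    [ 0.5,  0.5, 0.70, 1.0],
])
x_true = np.array([1.0, -2.0, 0.5, 3.0])
y_gps = H_gps @ x_true + 0.01 * rng.standard_normal(4)
y_gps[2] += 50.0                    # inject a large fault on satellite 3

def raim_statistic(H, y):
    """Sum of squared residuals of the least-squares navigation solution."""
    x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
    r = y - H @ x_hat
    return float(r @ r)

# 4 measurements, 4 unknowns: no redundancy, the fault is invisible.
t4 = raim_statistic(H_gps, y_gps)

# A vision landmark contributes a clock-free range-like 5th row,
# restoring redundancy so the fault now shows up in the residuals.
h_vis = np.array([[0.9, 0.1, 0.42, 0.0]])
y_vis = h_vis @ x_true + 0.01 * rng.standard_normal(1)
t5 = raim_statistic(np.vstack([H_gps, h_vis]), np.concatenate([y_gps, y_vis]))
```

In a real RAIM implementation the statistic would be compared against a chi-square threshold set by the required false-alarm rate; here it is enough to see it jump from essentially zero to a clearly detectable value.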

  19. Vision-Aided RAIM: A New Method for GPS Integrity Monitoring in Approach and Landing Phase.

    PubMed

    Fu, Li; Zhang, Jun; Li, Rui; Cao, Xianbin; Wang, Jinling

    2015-09-10

    In the 1980s, Global Positioning System (GPS) receiver autonomous integrity monitoring (RAIM) was proposed to provide the integrity of a navigation system by checking the consistency of GPS measurements. However, during the approach and landing phase of a flight path, where low GPS visibility conditions are common, the performance of the existing RAIM method may not meet the stringent aviation requirements for availability and integrity due to insufficient observations. To solve this problem, a new RAIM method, named vision-aided RAIM (VA-RAIM), is proposed for GPS integrity monitoring in the approach and landing phase. By introducing landmarks as pseudo-satellites, the VA-RAIM enriches the navigation observations to improve the performance of RAIM. In the method, a computer vision system photographs and matches these landmarks to obtain additional measurements for navigation. Nevertheless, the challenging issue is that such additional measurements may suffer from vision errors. To ensure the reliability of the vision measurements, a GPS-based calibration algorithm is presented to reduce the time-invariant part of the vision errors. Then, the calibrated vision measurements are integrated with the GPS observations for integrity monitoring. Simulation results show that the VA-RAIM outperforms the conventional RAIM with a higher level of availability and fault detection rate.

  20. Nanocarrier-Integrated Microspheres: Nanogel Tectonic Engineering for Advanced Drug-Delivery Systems.

    PubMed

    Tahara, Yoshiro; Mukai, Sada-Atsu; Sawada, Shin-Ichi; Sasaki, Yoshihiro; Akiyoshi, Kazunari

    2015-09-09

    A nanocarrier-integrated bottom-up method is a promising strategy for advanced drug-release systems. Self-assembled nanogels, which are one of the most beneficial nanocarriers for drug-delivery systems, are tectonically integrated to prepare nanogel-crosslinked (NanoClik) microspheres. NanoClik microspheres consisting of nanogel-derived structures (observed by STED microscopy) release "drug-loaded nanogels" after hydrolysis, resulting in successful sustained drug delivery in vivo. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. The role of pollinator diversity in the evolution of corolla-shape integration in a pollination-generalist plant clade.

    PubMed

    Gómez, José María; Perfectti, Francisco; Klingenberg, Christian Peter

    2014-08-19

    Flowers of animal-pollinated plants are integrated structures shaped by the action of pollinator-mediated selection. It is widely assumed that pollination specialization increases the magnitude of floral integration. However, empirical evidence is still inconclusive. In this study, we explored the role of pollinator diversity in shaping the evolution of corolla-shape integration in Erysimum, a plant genus with generalized pollination systems. We quantified floral integration in Erysimum using geometric morphometrics and explored its evolution using phylogenetic comparative methods. Corolla-shape integration was low but significantly different from zero in all study species. Spatial autocorrelation and phylogenetic signal in corolla-shape integration were not detected. In addition, integration in Erysimum seems to have evolved in a way that is consistent with Brownian motion, but with frequent convergent evolution. Corolla-shape integration was negatively associated with the number of pollinators visiting the flowers of each Erysimum species. That is, it was lower in those species having a more generalized pollination system. This negative association may occur because the co-occurrence of many pollinators imposes conflicting selection and cancels out any consistent selection on specific floral traits, preventing the evolution of highly integrated flowers. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
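Morphological integration of this kind is commonly quantified as the relative variance of the eigenvalues of the trait correlation matrix (0 for independent traits, 1 when a single axis explains all variation). The sketch below applies this standard index to synthetic data; it is an illustration of the concept, not the authors' geometric-morphometrics pipeline.

```python
import numpy as np

def integration_index(data):
    """Relative eigenvalue variance of the correlation matrix:
    ~0 for uncorrelated traits, ->1 as one axis dominates."""
    R = np.corrcoef(data, rowvar=False)
    eig = np.linalg.eigvalsh(R)
    p = len(eig)
    # Mean eigenvalue of a correlation matrix is 1, max variance is p-1.
    return float(np.var(eig) / (p - 1))

rng = np.random.default_rng(0)
n, p = 500, 6
weak = rng.standard_normal((n, p))                        # independent traits
shared = rng.standard_normal((n, 1))                      # one common factor
strong = 0.9 * shared + 0.1 * rng.standard_normal((n, p)) # tightly integrated

ii_weak = integration_index(weak)
ii_strong = integration_index(strong)
```

The index cleanly separates the nearly independent trait set from the one driven by a single shared factor, which is the contrast the abstract draws between generalist and more specialized pollination systems.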

  2. The fabrication of integrated carbon pipes with sub-micron diameters

    NASA Astrophysics Data System (ADS)

    Kim, B. M.; Murray, T.; Bau, H. H.

    2005-08-01

    A method for fabricating integrated carbon pipes (nanopipettes) of sub-micron diameters and tens of microns in length is demonstrated. The carbon pipes are formed from a template consisting of the tip of a pulled alumino-silicate glass capillary coated with carbon deposited from a vapour phase. This method renders carbon nanopipettes without the need for ex situ assembly and facilitates parallel production of multiple carbon-pipe devices. An electric-field-driven transfer of ions in a KCl solution through the integrated carbon pipes exhibits nonlinear current-voltage (I-V) curves, markedly different from the Ohmic I-V curves observed in glass pipettes under similar conditions. The filling of the nanopipette with fluorescent suspension is also demonstrated.

  3. Indirect (source-free) integration method. I. Wave-forms from geodesic generic orbits of EMRIs

    NASA Astrophysics Data System (ADS)

    Ritter, Patxi; Aoudia, Sofiane; Spallicci, Alessandro D. A. M.; Cordier, Stéphane

    2016-12-01

    The Regge-Wheeler-Zerilli (RWZ) wave-equation describes Schwarzschild-Droste black hole perturbations. The source term contains a Dirac distribution and its derivative. We have previously designed a method of integration in the time domain. It consists of a finite difference scheme where analytic expressions, dealing with the wave-function discontinuity through the jump conditions, replace the direct integration of the source and the potential. Herein, we successfully apply the same method to the geodesic generic orbits of EMRI (Extreme Mass Ratio Inspiral) sources, at second order. An EMRI is a Compact Star (CS) captured by a Super-Massive Black Hole (SMBH). These are considered the best probes for testing gravitation in the strong-field regime. The gravitational wave-forms, the radiated energy and angular momentum at infinity are computed and extensively compared with other methods, for different orbits (circular, elliptic, parabolic, including zoom-whirl).

  4. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    NASA Astrophysics Data System (ADS)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated global conceptual model. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without compromising the autonomy of the underlying databases.

  5. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of repeatability/reliability across animals have been made for phrenic recordings performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups studied by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.
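The two normalizations mentioned in the abstract (percent change from baseline within an animal, and the left-to-right ratio for bilateral recordings) are simple to state explicitly; the amplitudes below are hypothetical, not values from the meta-analysis.

```python
# Hypothetical integrated phrenic amplitudes (arbitrary units).
baseline = {"left": 0.042, "right": 0.040}
maximal  = {"left": 0.118, "right": 0.112}  # e.g. chemoreceptor stimulation

# Within-animal change expressed as percent of baseline, per side.
pct_change = {side: 100.0 * (maximal[side] - baseline[side]) / baseline[side]
              for side in baseline}

# Bilateral symmetry check: a left-to-right ratio near 1.0 is expected
# in healthy preparations.
lr_ratio = baseline["left"] / baseline["right"]
```

Percent-of-baseline removes between-recording gain differences, which is exactly why absolute comparisons across animals need the kind of standardized-methods evidence the meta-analysis provides.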

  6. Thermal integration of Spacelab experiments

    NASA Technical Reports Server (NTRS)

    Patterson, W. C.; Hopson, G. D.

    1978-01-01

    The method of thermally integrating the experiments for Spacelab is discussed. The scientific payload consists of a combination of European and United States sponsored experiments located in the module as well as on a single Spacelab pallet. The thermal integration must accommodate the individual experiment requirements while ensuring that the total payload is within the Spacelab Environmental Control System (ECS) resource capability. An integrated thermal/ECS analysis of the module and pallet is performed in concert with the mission timeline to ensure that the agreed-upon experiment requirements are accommodated and that the total payload is within the Spacelab ECS resources.

  7. A method for modifying two-dimensional adaptive wind-tunnel walls including analytical and experimental verification

    NASA Technical Reports Server (NTRS)

    Everhart, J. L.

    1983-01-01

    The theoretical development of a simple and consistent method for removing the interference in adaptive-wall wind tunnels is reported. A Cauchy integral formulation of the velocities in an imaginary infinite extension of the real wind-tunnel flow is obtained and evaluated on a closed contour dividing the real and imaginary flow. The contour consists of the upper and lower effective wind-tunnel walls (wall plus boundary-layer displacement thickness) and upstream and downstream boundaries perpendicular to the axial tunnel flow. The resulting integral expressions for the streamwise and normal perturbation velocities on the contour are integrated by assuming a linear variation of the velocities between data-measurement stations along the contour. In an iterative process, the velocity components calculated on the upper and lower boundaries are then used to correct the shape of the wall to remove the interference. Convergence of the technique is shown numerically for the cases of a circular cylinder and a lifting and nonlifting NACA 0012 airfoil in incompressible flow. Experimental convergence at a transonic Mach number is demonstrated by using an NACA 0012 airfoil at zero lift.
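The core machinery, evaluating velocities in the flow from boundary data alone via a Cauchy integral, can be checked on a toy case: for an analytic function sampled on the unit circle, the discretized Cauchy integral formula recovers interior values. The test function and circular contour are assumptions for illustration, not the wind-tunnel contour of the paper.

```python
import cmath

def cauchy_eval(f_boundary, z0, n=400):
    """Evaluate f(z0) from boundary samples on the unit circle via
    f(z0) = (1/(2*pi*i)) * contour integral of f(z)/(z - z0) dz,
    discretized with the trapezoid rule (z = e^{it}, dz = i z dt)."""
    total = 0 + 0j
    for k in range(n):
        t = 2.0 * cmath.pi * k / n
        z = cmath.exp(1j * t)
        total += f_boundary(z) * (1j * z) / (z - z0)
    # (2*pi/n) step combined with 1/(2*pi*i) prefactor gives 1/(i*n).
    return total / (1j * n)

f = lambda z: z ** 3 + 2.0 * z + 1.0     # analytic "velocity" field
approx = cauchy_eval(f, 0.3 + 0.2j)      # interior evaluation point
exact = f(0.3 + 0.2j)
```

The trapezoid rule on a periodic analytic integrand converges spectrally, which is why boundary-data formulations like this can be made very accurate with modest sampling.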

  8. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations

    PubMed Central

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J. Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383
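A sketch of EOF-based fusion on synthetic data (not the MOD16/PT-JPL products): EOF analysis here is an SVD of the time-by-site anomaly matrix; averaging the two products and keeping only the leading mode suppresses the mutually inconsistent noise while retaining the shared signal. The dimensions, noise level, and single-mode truncation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "truth": one dominant spatial pattern evolving in time.
t, s = 120, 40                               # time steps x spatial sites
pattern = np.sin(np.linspace(0, np.pi, s))
signal = np.outer(np.sin(np.linspace(0, 12, t)), pattern)

product_a = signal + 0.3 * rng.standard_normal((t, s))  # two noisy products
product_b = signal + 0.3 * rng.standard_normal((t, s))

def eof_fuse(a, b, n_modes=1):
    """Average the products, then keep only the leading EOF mode(s)."""
    merged = 0.5 * (a + b)
    mean = merged.mean(axis=0)
    u, sv, vt = np.linalg.svd(merged - mean, full_matrices=False)
    return (u[:, :n_modes] * sv[:n_modes]) @ vt[:n_modes] + mean

fused = eof_fuse(product_a, product_b)
rmse_a = float(np.sqrt(np.mean((product_a - signal) ** 2)))
rmse_fused = float(np.sqrt(np.mean((fused - signal) ** 2)))
```

Because the shared signal concentrates in the leading mode while the independent noise spreads across all modes, the fused field tracks the truth substantially better than either input.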

  9. Integrative Approaches for Predicting in vivo Effects of Chemicals from their Structural Descriptors and the Results of Short-term Biological Assays

    PubMed Central

    Low, Yen S.; Sedykh, Alexander; Rusyn, Ivan; Tropsha, Alexander

    2017-01-01

    Cheminformatics approaches such as Quantitative Structure Activity Relationship (QSAR) modeling have been used traditionally for predicting chemical toxicity. In recent years, high throughput biological assays have been increasingly employed to elucidate mechanisms of chemical toxicity and predict toxic effects of chemicals in vivo. The data generated in such assays can be considered as biological descriptors of chemicals that can be combined with molecular descriptors and employed in QSAR modeling to improve the accuracy of toxicity prediction. In this review, we discuss several approaches for integrating chemical and biological data for predicting biological effects of chemicals in vivo and compare their performance across several data sets. We conclude that while no method consistently shows superior performance, the integrative approaches rank consistently among the best yet offer enriched interpretation of models over those built with either chemical or biological data alone. We discuss the outlook for such interdisciplinary methods and offer recommendations to further improve the accuracy and interpretability of computational models that predict chemical toxicity. PMID:24805064

  10. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    PubMed

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types.

  11. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data has become so crucial that many organizations have begun to produce it themselves. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps. The first step is identification of the objects in a relational database; the semantic relationships between them are then modelled and, subsequently, the ontology of each database is created. In the second step, the relevant ontology is inserted into the database and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy and leaves the data unchanged, thus preserving the legacy applications provided.
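A toy version of the object-identification and ontology-mapping steps, with entirely hypothetical schemas and concept names: columns from two heterogeneous databases are mapped to shared ontology-style properties (including a unit conversion) to produce one integrated view.

```python
# Two "organizations" store the same kind of real-world feature under
# different schemas; a shared-concept mapping (a stand-in for ontology
# mapping) merges them into one view. All names are hypothetical.
db_a = [{"road_name": "Elm St", "len_km": 1.2}]
db_b = [{"street": "Oak Ave", "length_m": 800}]

mapping = {
    "db_a": {"road_name": "hasName", "len_km": "hasLengthKm"},
    "db_b": {"street": "hasName",
             # heterogeneous units: convert metres to kilometres on the fly
             "length_m": lambda v: ("hasLengthKm", v / 1000.0)},
}

def to_concepts(rows, source):
    """Lift relational rows into shared-ontology objects."""
    out = []
    for row in rows:
        obj = {"concept": "Road"}
        for col, target in mapping[source].items():
            if callable(target):
                key, val = target(row[col])
            else:
                key, val = target, row[col]
            obj[key] = val
        out.append(obj)
    return out

integrated = to_concepts(db_a, "db_a") + to_concepts(db_b, "db_b")
```

The source databases are never modified, only wrapped, which mirrors the abstract's point that the underlying data remain unchanged and legacy applications keep working.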

  12. A Novel Multilayered RFID Tagged Cargo Integrity Assurance Scheme

    PubMed Central

    Yang, Ming Hour; Luo, Jia Ning; Lu, Shao Yong

    2015-01-01

    To minimize cargo theft during transport, mobile radio frequency identification (RFID) grouping proof methods are generally employed to ensure the integrity of entire cargo loads. However, conventional grouping proofs cannot simultaneously generate grouping proofs for a specific group of RFID tags. The most serious problem of these methods is that nonexistent tags are included in the grouping proofs because of the considerable amount of time it takes to scan a high number of tags. Thus, applying grouping proof methods in the current logistics industry is difficult. To solve this problem, this paper proposes a method for generating multilayered offline grouping proofs. The proposed method provides tag anonymity; moreover, resolving disputes between recipients and transporters over the integrity of cargo deliveries can be expedited by generating grouping proofs and automatically authenticating the consistency between the receipt proof and pick proof. The proposed method can also protect against replay attacks, multi-session attacks, and concurrency attacks. Finally, experimental results verify that, compared with other methods for generating grouping proofs, the proposed method can efficiently generate offline grouping proofs involving several parties in a supply chain using mobile RFID. PMID:26512673
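Not the paper's protocol, but a minimal chained-proof sketch of the general idea: each tag folds its secret key into a running HMAC over a fresh session nonce, so the final digest can only be produced if every tag in the group actually participated, and a missing tag changes the proof. Key and nonce values are invented.

```python
import hashlib
import hmac

def grouping_proof(tag_keys, nonce):
    """Chain an HMAC through every tag: the result depends on the nonce
    and on each tag's secret key, in order."""
    proof = nonce
    for key in tag_keys:
        proof = hmac.new(key, proof, hashlib.sha256).digest()
    return proof

tags = [b"tag-key-%d" % i for i in range(4)]   # per-tag secrets (hypothetical)
nonce = b"session-nonce-123"                   # fresh per session vs. replays

receipt_proof = grouping_proof(tags, nonce)    # computed at receipt
pick_proof = grouping_proof(tags, nonce)       # recomputed at pickup
tampered = grouping_proof(tags[:-1], nonce)    # one tag missing from cargo
```

Matching receipt and pick proofs lets the recipient and transporter settle disputes automatically, while any absent or swapped tag breaks the chain; a real scheme would add the anonymity and multi-session protections the paper describes.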

  13. A high-order boundary integral method for surface diffusions on elastically stressed axisymmetric rods.

    PubMed

    Li, Xiaofan; Nie, Qing

    2009-07-01

    Many applications in materials science involve surface diffusion of elastically stressed solids. Study of singularity formation and long-time behavior of such solid surfaces requires accurate simulations in both space and time. Here we present a high-order boundary integral method for an elastically stressed solid with axi-symmetry due to surface diffusion. In this method, the boundary integrals for isotropic elasticity in axi-symmetric geometry are approximated through modified alternating quadratures along with an extrapolation technique, leading to an arbitrarily high-order quadrature; in addition, a high-order (temporal) integration factor method, based on explicit representation of the mean curvature, is used to reduce the stability constraint on the time-step. To apply this method to a periodic (in the axial direction) and axi-symmetric elastically stressed cylinder, we also present a fast and accurate summation method for the periodic Green's functions of isotropic elasticity. Using the high-order boundary integral method, we demonstrate that in the absence of elasticity the cylinder surface pinches in finite time at the axis of symmetry, and the universal cone angle of the pinching is found to be consistent with previous studies based on a self-similar assumption. In the presence of elastic stress, we show that a finite-time geometrical singularity occurs well before the cylindrical solid collapses onto the axis of symmetry, and the angle of the corner singularity on the cylinder surface is also estimated.
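The temporal integrating-factor idea can be shown on a scalar stiff model problem u' = -lambda*u + g(t) (an assumption for illustration, not the paper's surface-diffusion operator): treating the stiff linear part exactly via exp(-lambda*h) removes the stability constraint that defeats explicit Euler at the same step size.

```python
import math

lam, h, steps = 50.0, 0.1, 200   # lam*h = 5: explicit Euler is unstable
g = lambda t: math.sin(t)        # smooth, non-stiff forcing

u_if, u_euler, t = 1.0, 1.0, 0.0
for _ in range(steps):
    # Integrating-factor Euler: the stiff linear part is integrated
    # exactly through exp(-lam*h); only the forcing is explicit.
    u_if = math.exp(-lam * h) * (u_if + h * g(t))
    # Plain explicit Euler for comparison (amplification factor 1 - lam*h).
    u_euler = u_euler + h * (-lam * u_euler + g(t))
    t += h
```

With lam*h = 5 the explicit scheme's amplification factor is -4, so it diverges rapidly, while the integrating-factor update stays bounded near the true quasi-steady response; this is the sense in which the integration factor "reduces the stability constraint on the time-step".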

  14. A boundary integral equation method using auxiliary interior surface approach for acoustic radiation and scattering in two dimensions.

    PubMed

    Yang, S A

    2002-10-01

    This paper presents an effective solution method for predicting acoustic radiation and scattering fields in two dimensions. The difficulty of the fictitious characteristic frequency is overcome by incorporating an auxiliary interior surface that satisfies certain boundary condition into the body surface. This process gives rise to a set of uniquely solvable boundary integral equations. Distributing monopoles with unknown strengths over the body and interior surfaces yields the simple source formulation. The modified boundary integral equations are further transformed to ordinary ones that contain nonsingular kernels only. This implementation allows direct application of standard quadrature formulas over the entire integration domain; that is, the collocation points are exactly the positions at which the integration points are located. Selecting the interior surface is an easy task. Moreover, only a few corresponding interior nodal points are sufficient for the computation. Numerical calculations consist of the acoustic radiation and scattering by acoustically hard elliptic and rectangular cylinders. Comparisons with analytical solutions are made. Numerical results demonstrate the efficiency and accuracy of the current solution method.

  15. Guide for Regional Integrated Assessments: Handbook of Methods and Procedures, Version 5.1. Appendix 1

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia E.; Jones, James W.; Hatfield, Jerry; Antle, John; Ruane, Alex; Boote, Ken; Thorburn, Peter; Valdivia, Roberto; Porter, Cheryl; Janssen, Sander; hide

    2015-01-01

    The purpose of this handbook is to describe recommended methods for a trans-disciplinary, systems-based approach for regional-scale (local to national scale) integrated assessment of agricultural systems under future climate, bio-physical and socio-economic conditions. An earlier version of this Handbook was developed and used by several AgMIP Regional Research Teams (RRTs) in Sub-Saharan Africa (SSA) and South Asia (SA)(AgMIP handbook version 4.2, www.agmip.org/regional-integrated-assessments-handbook/). In contrast to the earlier version, which was written specifically to guide a consistent set of integrated assessments across SSA and SA, this version is intended to be more generic such that the methods can be applied to any region globally. These assessments are the regional manifestation of research activities described by AgMIP in its online protocols document (available at www.agmip.org). AgMIP Protocols were created to guide climate, crop modeling, economics, and information technology components of its projects.

  16. Integration of phase separation with ultrasound-assisted salt-induced liquid-liquid microextraction for analyzing the fluoroquinolones in human body fluids by liquid chromatography.

    PubMed

    Wang, Huili; Gao, Ming; Wang, Mei; Zhang, Rongbo; Wang, Wenwei; Dahlgren, Randy A; Wang, Xuedong

    2015-03-15

    Herein, we developed a novel integrated device to perform phase separation based on ultrasound-assisted salt-induced liquid-liquid microextraction for determination of five fluoroquinolones (FQs) in human body fluids. The integrated device consisted of three simple HDPE components used to separate the extraction solvent from the aqueous phase prior to retrieving the extractant. A series of extraction parameters were optimized using the response surface method based on a central composite design. Optimal conditions consisted of 945 μL acetone extraction solvent, pH 2.1, 4.1 min stir time, 5.9 g Na2SO4, and 4.0 min centrifugation. Under optimized conditions, the limits of detection (at S/N = 3) were 0.12-0.66 μg L⁻¹, the linear range was 0.5-500 μg L⁻¹, and recoveries were 92.6-110.9% for the five FQs extracted from plasma and urine. The proposed method has several advantages, such as easy construction from inexpensive materials, high extraction efficiency, short extraction time, and compatibility with HPLC analysis. Thus, this method shows excellent prospects for sample pretreatment and analysis of FQs in human body fluids. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. 'The singing hospital'--integrated group therapy in the Black mentally ill.

    PubMed

    Benjamin, B

    1983-06-04

    Integrated group therapy was originally introduced at Sterkfontein Hospital in 1957, and reintroduced 6 years ago in an effort to overcome difficulties in communication with approximately 100 Black male and female mental hospital patients. This therapy consisted mainly of song and dance activation, sociodrama, psychodrama and behavioural modification methods. These techniques are flexible, and can be carried out by proxy therapists working with doctors and psychologists.

  18. Chloride mass-balance method for estimating ground water recharge in arid areas: examples from western Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Bazuhair, Abdulghaffar S.; Wood, Warren W.

    1996-11-01

    The chloride mass-balance method, which integrates the time and areal distribution of ground water recharge, was applied to small alluvial aquifers in the wadi systems of the Asir and Hijaz mountains in western Saudi Arabia. This application is an extension of the method shown to be suitable for estimating recharge in regional aquifers in semi-arid areas. Because the method integrates recharge in time and space, it appears to be, with certain assumptions, particularly well suited for arid areas with large temporal and spatial variation in recharge. In general, recharge was found to be between 3 and 4% of precipitation, a range consistent with recharge rates found in other arid and semi-arid areas of the earth.
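
    The chloride mass-balance estimate itself reduces to a one-line computation: recharge equals precipitation scaled by the ratio of chloride concentration in precipitation to that in ground water. A minimal sketch with hypothetical wadi values (not data from this study):

```python
def cmb_recharge(precip_mm, cl_precip, cl_groundwater):
    """Chloride mass-balance: recharge = precipitation * (Cl_precip / Cl_gw).

    Assumes chloride enters only via precipitation and dry fallout and is
    concentrated solely by evapotranspiration; values used below are
    illustrative, not from the Saudi Arabian study.
    """
    return precip_mm * cl_precip / cl_groundwater

# A wadi receiving 200 mm/yr of rain at 5 mg/L chloride, with ground water
# at 250 mg/L, implies 4 mm/yr of recharge (2% of precipitation):
recharge = cmb_recharge(200.0, 5.0, 250.0)
print(recharge)  # 4.0
```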

  19. Chloride mass-balance method for estimating ground water recharge in arid areas: Examples from western Saudi Arabia

    USGS Publications Warehouse

    Bazuhair, A.S.; Wood, W.W.

    1996-01-01

    The chloride mass-balance method, which integrates the time and areal distribution of ground water recharge, was applied to small alluvial aquifers in the wadi systems of the Asir and Hijaz mountains in western Saudi Arabia. This application is an extension of the method shown to be suitable for estimating recharge in regional aquifers in semi-arid areas. Because the method integrates recharge in time and space, it appears to be, with certain assumptions, particularly well suited for arid areas with large temporal and spatial variation in recharge. In general, recharge was found to be between 3 and 4% of precipitation - a range consistent with recharge rates found in other arid and semi-arid areas of the earth.

  20. Characterizing regional-scale temporal evolution of air dose rates after the Fukushima Daiichi Nuclear Power Plant accident.

    PubMed

    Wainwright, Haruko M; Seki, Akiyuki; Mikami, Satoshi; Saito, Kimiaki

    2018-09-01

    In this study, we quantify the temporal changes of air dose rates on the regional scale around the Fukushima Dai-ichi Nuclear Power Plant in Japan, and predict the spatial distribution of air dose rates in the future. We first apply the Bayesian geostatistical method developed by Wainwright et al. (2017) to integrate multiscale datasets including ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatially heterogeneous structures, and on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. We apply this method to the datasets from three years: 2014 to 2016. The temporal changes among the three integrated maps enable us to characterize the spatiotemporal dynamics of radiation air dose rates. A data-driven ecological decay model is then coupled with the integrated map to predict future dose rates. Results show that the air dose rates are decreasing consistently across the region. While slower in the forested region, the decrease is particularly significant in the town area. Decontamination has contributed to a significant reduction of air dose rates. By 2026, the air dose rates will continue to decrease, and the area above 3.8 μSv/h will be almost fully contained within the non-residential forested zone. Copyright © 2018 Elsevier Ltd. All rights reserved.
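
    A hedged sketch of the kind of two-component decay model on which such predictions typically rest: radioactive decay of Cs-134 and Cs-137 (half-lives of about 2.06 and 30.17 years, which are physical constants) multiplied by an environmental attenuation term. The source fraction and environmental rate below are illustrative, not the study's fitted values:

```python
import math

def dose_rate(t_years, d0, frac_cs134=0.5, lam_env=0.05):
    """Air dose rate at time t: physical decay of the two Cs isotopes
    times an exponential environmental attenuation (weathering) factor.
    frac_cs134 and lam_env are hypothetical parameters."""
    lam134 = math.log(2) / 2.06   # Cs-134 decay constant, 1/yr
    lam137 = math.log(2) / 30.17  # Cs-137 decay constant, 1/yr
    physical = frac_cs134 * math.exp(-lam134 * t_years) \
             + (1 - frac_cs134) * math.exp(-lam137 * t_years)
    return d0 * physical * math.exp(-lam_env * t_years)

print(dose_rate(0.0, 3.8))         # 3.8 at t = 0
print(dose_rate(10.0, 3.8) < 3.8)  # True: dose rates decrease with time
```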

  1. Uberon, an integrative multi-species anatomy ontology

    PubMed Central

    2012-01-01

    We present Uberon, an integrated cross-species ontology consisting of over 6,500 classes representing a variety of anatomical entities, organized according to traditional anatomical classification criteria. The ontology represents structures in a species-neutral way and includes extensive associations to existing species-centric anatomical ontologies, allowing integration of model organism and human data. Uberon provides a necessary bridge between anatomical structures in different taxa for cross-species inference. It uses novel methods for representing taxonomic variation, and has proved to be essential for translational phenotype analyses. Uberon is available at http://uberon.org PMID:22293552

  2. Multispectral scanner system parameter study and analysis software system description, volume 2

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.

    1978-01-01

    The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators that evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, a data correlation analyzer, the scanner IFOV, and a random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.

  3. Combining the boundary shift integral and tensor-based morphometry for brain atrophy estimation

    NASA Astrophysics Data System (ADS)

    Michalkiewicz, Mateusz; Pai, Akshay; Leung, Kelvin K.; Sommer, Stefan; Darkner, Sune; Sørensen, Lauge; Sporring, Jon; Nielsen, Mads

    2016-03-01

    Brain atrophy measured from structural magnetic resonance images (MRIs) is widely used as an imaging surrogate marker for Alzheimer's disease. Its utility has been limited by a large degree of variance and consequently high sample size estimates. The only consistent and reasonably powerful atrophy estimation method has been the boundary shift integral (BSI). In this paper, we first propose a tensor-based morphometry (TBM) method to measure voxel-wise atrophy that we combine with BSI. The combined model decreases the sample size estimates significantly when compared to BSI and TBM alone.
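
    The idea behind the BSI can be conveyed with a toy one-dimensional version (illustrative only; the real method involves image registration and careful intensity-window selection): intensity differences between baseline and follow-up scans, clamped to a window, are summed over the boundary region and normalised by the window width, giving the boundary shift in voxel units.

```python
def boundary_shift_integral(baseline, followup, region, i_lo, i_hi):
    """Toy 1-D BSI: sum window-clamped intensity differences over the
    brain-boundary region, normalised by the intensity window width."""
    clamp = lambda v: max(i_lo, min(i_hi, v))
    total = sum(clamp(baseline[i]) - clamp(followup[i]) for i in region)
    return total / (i_hi - i_lo)

base = [100, 90, 80, 20, 10]
follow = [100, 60, 40, 15, 10]   # boundary voxels darkened => atrophy
print(boundary_shift_integral(base, follow, region=[1, 2, 3], i_lo=20, i_hi=80))
# 1.0, i.e. the boundary shifted by one voxel in this toy example
```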

  4. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently, software engineers are the essential integrating factor tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools, rather than the tools empowering the engineers. Among the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address them is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  5. SEM analysis of ionizing radiation effects in linear integrated circuits. [Scanning Electron Microscope

    NASA Technical Reports Server (NTRS)

    Stanley, A. G.; Gauthier, M. K.

    1977-01-01

    A successful diagnostic technique was developed using a scanning electron microscope (SEM) as a precision tool to determine ionization effects in integrated circuits. Previous SEM methods irradiated the entire semiconductor chip or major areas of it. These large-area exposure methods do not reveal exactly which components are sensitive to radiation. To locate these sensitive components a new method was developed, which consisted of successively irradiating selected components on the device chip with equal doses of electrons (10⁶ rad(Si)), while the whole device was subjected to representative bias conditions. A suitable device parameter was measured in situ after each successive irradiation with the beam off.

  6. Fast and reliable symplectic integration for planetary system N-body problems

    NASA Astrophysics Data System (ADS)

    Hernandez, David M.

    2016-06-01

    We apply one of the exactly symplectic integrators, which we call HB15, of Hernandez & Bertschinger, along with the Kepler problem solver of Wisdom & Hernandez, to solve planetary system N-body problems. We compare the method to Wisdom-Holman (WH) methods in the MERCURY software package, the MERCURY switching integrator, and others and find HB15 to be the most efficient method or tied for the most efficient method in many cases. Unlike WH, HB15 solved N-body problems exhibiting close encounters with small, acceptable error, although frequent encounters slowed the code. Switching maps like MERCURY change between two methods and are not exactly symplectic. We carry out careful tests on their properties and suggest that they must be used with caution. We then use different integrators to solve a three-body problem consisting of a binary planet orbiting a star. For all tested tolerances and time steps, MERCURY unbinds the binary after 0 to 25 years. However, in the solutions of HB15, a time-symmetric HERMITE code, and a symplectic Yoshida method, the binary remains bound for >1000 years. The methods' solutions are qualitatively different, despite small errors in the first integrals in most cases. Several checks suggest that the qualitative binary behaviour of HB15's solution is correct. The Bulirsch-Stoer and Radau methods in the MERCURY package also unbind the binary before a time of 50 years, suggesting that this dynamical error is due to a MERCURY bug.
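
    For readers unfamiliar with symplectic maps, the kick-drift-kick leapfrog below integrates a circular Kepler orbit. It is a far simpler scheme than HB15 or Wisdom-Holman and is shown only to illustrate the bounded-energy-error behaviour that makes symplectic integrators attractive for long N-body runs:

```python
import math

def acc(x, y, gm=1.0):
    """Two-body gravitational acceleration toward the origin."""
    r3 = (x * x + y * y) ** 1.5
    return -gm * x / r3, -gm * y / r3

def leapfrog_kepler(x, y, vx, vy, dt, steps, gm=1.0):
    """Kick-drift-kick leapfrog: a basic symplectic map for the Kepler problem."""
    for _ in range(steps):
        ax, ay = acc(x, y, gm)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
        x += dt * vx; y += dt * vy                 # drift
        ax, ay = acc(x, y, gm)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    return x, y, vx, vy

def energy(x, y, vx, vy, gm=1.0):
    return 0.5 * (vx * vx + vy * vy) - gm / math.hypot(x, y)

# Circular orbit: r = 1, v = 1, so E = -0.5.
e0 = energy(1.0, 0.0, 0.0, 1.0)
xf, yf, vxf, vyf = leapfrog_kepler(1.0, 0.0, 0.0, 1.0, dt=0.01, steps=10000)
print(abs(energy(xf, yf, vxf, vyf) - e0))  # small: energy error stays bounded
```

    Over ~16 orbits the energy error oscillates at the O(dt²) level with no secular drift, which is the qualitative property the abstract's comparisons hinge on.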

  7. Advanced optical manufacturing digital integrated system

    NASA Astrophysics Data System (ADS)

    Tao, Yizheng; Li, Xinglan; Li, Wei; Tang, Dingyong

    2012-10-01

    Advanced optical manufacturing technology must be developed in step with modern science and technology. This work addresses the problems of low efficiency, low yield, and poor repeatability and consistency encountered when manufacturing large, high-precision optical components. Applying a business-driven approach and the Rational Unified Process method, this paper investigates the advanced optical manufacturing process flow and the requirements of an integrated advanced optical manufacturing system, and puts forward its architecture and key technologies. The optical-component core and the manufacturing-process-driven design of the Advanced Optical Manufacturing Digital Integrated System are presented. The results show that the system works effectively: it realizes dynamic planning of the manufacturing process and information integration, and improves the production yield of the manufactory.

  8. SLEPR: A Sample-Level Enrichment-Based Pathway Ranking Method — Seeking Biological Themes through Pathway-Level Consistency

    PubMed Central

    Yi, Ming; Stephens, Robert M.

    2008-01-01

    Analysis of microarray and other high throughput data often involves identification of genes consistently up or down-regulated across samples as the first step in extraction of biological meaning. This gene-level paradigm can be limited as a result of valid sample fluctuations and biological complexities. In this report, we describe a novel method, SLEPR, which eliminates this limitation by relying on pathway-level consistencies. Our method first selects the sample-level differentiated genes from each individual sample, capturing genes missed by other analysis methods, ascertains the enrichment levels of associated pathways from each of those lists, and then ranks annotated pathways based on the consistency of enrichment levels of individual samples from both sample classes. As a proof of concept, we have used this method to analyze three public microarray datasets with a direct comparison with the GSEA method, one of the most popular pathway-level analysis methods in the field. We found that our method was able to reproduce the earlier observations with significant improvements in depth of coverage for validated or expected biological themes, but also produced additional insights that make biological sense. This new method extends existing analyses approaches and facilitates integration of different types of HTP data. PMID:18818771

  9. Integrated use of surface geophysical methods for site characterization — A case study in North Kingstown, Rhode Island

    USGS Publications Warehouse

    Johnson, Carole D.; Lane, John W.; Brandon, William C.; Williams, Christine A.P.; White, Eric A.

    2010-01-01

    A suite of complementary, non‐invasive surface geophysical methods was used to assess their utility for site characterization in a pilot investigation at a former defense site in North Kingstown, Rhode Island. The methods included frequency‐domain electromagnetics (FDEM), ground‐penetrating radar (GPR), electrical resistivity tomography (ERT), and multi‐channel analysis of surface‐wave (MASW) seismic. The results of each method were compared to each other and to drive‐point data from the site. FDEM was used as a reconnaissance method to assess buried utilities and anthropogenic structures; to identify near‐surface changes in water chemistry related to conductive leachate from road‐salt storage; and to investigate a resistive signature possibly caused by groundwater discharge. Shallow anomalies observed in the GPR and ERT data were caused by near‐surface infrastructure and were consistent with anomalies observed in the FDEM data. Several parabolic reflectors were observed in the upper part of the GPR profiles, and a fairly continuous reflector that was interpreted as bedrock could be traced across the lower part of the profiles. MASW seismic data showed a sharp break in shear wave velocity at depth, which was interpreted as the overburden/bedrock interface. The MASW profile indicates the presence of a trough in the bedrock surface in the same location where the ERT data indicate lateral variations in resistivity. Depths to bedrock interpreted from the ERT, MASW, and GPR profiles were similar and consistent with the depths of refusal identified in the direct‐push wells. The interpretations of data collected using the individual methods yielded non‐unique solutions with considerable uncertainty. Integrated interpretation of the electrical, electromagnetic, and seismic geophysical profiles produced a more consistent and unique estimation of depth to bedrock that is consistent with ground‐truth data at the site. 
This test case shows that using complementary techniques that measure different properties can be more effective for site characterization than a single‐method investigation.

  10. Bayesian Integration and Classification of Composition C-4 Plastic Explosives Based on Time-of-Flight-Secondary Ion Mass Spectrometry and Laser Ablation-Inductively Coupled Plasma Mass Spectrometry.

    PubMed

    Mahoney, Christine M; Kelly, Ryan T; Alexander, Liz; Newburn, Matt; Bader, Sydney; Ewing, Robert G; Fahey, Albert J; Atkinson, David A; Beagley, Nathaniel

    2016-04-05

    Time-of-flight-secondary ion mass spectrometry (TOF-SIMS) and laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) were used for characterization and identification of unique signatures from a series of 18 Composition C-4 plastic explosives. The samples were obtained from various commercial and military sources around the country. Positive and negative ion TOF-SIMS data were acquired directly from the C-4 residue on Si surfaces, where the positive ion mass spectra obtained were consistent with the major composition of organic additives, and the negative ion mass spectra were more consistent with explosive content in the C-4 samples. Each series of mass spectra was subjected to partial least squares-discriminant analysis (PLS-DA), a multivariate statistical analysis approach which serves to first find the areas of maximum variance within different classes of C-4 and subsequently to classify unknown samples based on correlations between the unknown data set and the original data set (often referred to as a training data set). This method was able to successfully classify test samples of C-4, though with a limited degree of certainty. The classification accuracy of the method was further improved by integrating the positive and negative ion data using a Bayesian approach. The TOF-SIMS data was combined with a second analytical method, LA-ICPMS, which was used to analyze elemental signatures in the C-4. The integrated data were able to classify test samples with a high degree of certainty. Results indicate that this Bayesian integrated approach constitutes a robust classification method that should be employable even in dirty samples collected in the field.
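
    The Bayesian integration step can be illustrated with a minimal probability-fusion sketch. Assuming the two analyses are treated as independent evidence (the published model is more elaborate than this, and the numbers below are hypothetical), the posterior over C-4 classes is proportional to the prior times each method's likelihood:

```python
def bayes_fuse(priors, likelihoods_a, likelihoods_b):
    """Fuse class probabilities from two independent analyses:
    posterior ∝ prior * P(data_A | class) * P(data_B | class)."""
    unnorm = [p * a * b for p, a, b in zip(priors, likelihoods_a, likelihoods_b)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two weakly discriminating methods combine into a more confident call:
post = bayes_fuse([0.5, 0.5], [0.7, 0.3], [0.8, 0.2])
print(post)  # first class dominates (≈ 0.90 vs ≈ 0.10)
```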

  11. Best Merge Region Growing with Integrated Probabilistic Classification for Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2011-01-01

    A new method for spectral-spatial classification of hyperspectral images is proposed. The method is based on the integration of probabilistic classification within the hierarchical best merge region growing algorithm. For this purpose, a preliminary probabilistic support vector machine classification is performed. Then, a hierarchical step-wise optimization algorithm is applied, iteratively merging the regions with the smallest Dissimilarity Criterion (DC). The main novelty of this method consists in defining the DC between regions as a function of region statistical and geometrical features along with classification probabilities. Experimental results are presented for a 200-band AVIRIS image of a vegetation area in Northwestern Indiana and compared with those obtained by recently proposed spectral-spatial classification techniques. The proposed method improves classification accuracies when compared to other classification approaches.
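
    The best-merge loop itself is easy to sketch. The toy below is deliberately simplified (a region is a flat list of values and the DC is just a difference of means, standing in for the paper's statistical-plus-geometrical criterion with classification probabilities): repeatedly merge the pair with the smallest DC until a target region count is reached.

```python
def merge_regions(regions, dc, n_final):
    """Hierarchical step-wise optimization sketch: repeatedly merge the
    pair of regions with the smallest dissimilarity criterion (DC)."""
    regions = [list(r) for r in regions]
    while len(regions) > n_final:
        best = None
        for i in range(len(regions)):
            for j in range(i + 1, len(regions)):
                d = dc(regions[i], regions[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        regions[i] = regions[i] + regions.pop(j)  # merge j into i
    return regions

mean = lambda r: sum(r) / len(r)
dc = lambda a, b: abs(mean(a) - mean(b))          # toy stand-in for the DC
out = merge_regions([[1.0], [1.2], [9.0], [8.8]], dc, n_final=2)
print(sorted(len(r) for r in out))  # [2, 2]: the two similar pairs merged
```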

  12. Preventing syndemic Zika virus, HIV/STIs and unintended pregnancy: dual method use and consistent condom use among Brazilian women in marital and civil unions.

    PubMed

    Tsuyuki, Kiyomi; Gipson, Jessica D; Barbosa, Regina Maria; Urada, Lianne A; Morisky, Donald E

    2017-12-12

    Syndemic Zika virus, HIV and unintended pregnancy call for an urgent understanding of dual method (condoms with another modern non-barrier contraceptive) and consistent condom use. Multinomial and logistic regression analysis using data from the Pesquisa Nacional de Demografia e Saúde da Criança e da Mulher (PNDS), a nationally representative household survey of reproductive-aged women in Brazil, identified the socio-demographic, fertility and relationship context correlates of exclusive non-barrier contraception, dual method use and condom use consistency. Among women in marital and civil unions, half reported dual protection (30% condoms, 20% dual methods). In adjusted models, condom use was associated with older age and living in the northern region of Brazil or in urban areas, whereas dual method use (versus condom use) was associated with younger age, living in the southern region of Brazil, living in non-urban areas and relationship age homogamy. Among condom users, consistent condom use was associated with reporting Afro-religion or other religion, not wanting (more) children and using condoms only (versus dual methods). Findings highlight that integrated STI prevention and family planning services should target young married/in union women, couples not wanting (more) children and heterogamous relationships to increase dual method use and consistent condom use.

  13. AN INTEGRATED PERSPECTIVE ON THE ASSESSMENT OF TECHNOLOGIES: INTEGRATE-HTA.

    PubMed

    Wahlster, Philip; Brereton, Louise; Burns, Jacob; Hofmann, Björn; Mozygemba, Kati; Oortwijn, Wija; Pfadenhauer, Lisa; Polus, Stephanie; Rehfuess, Eva; Schilling, Imke; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2017-01-01

    Current health technology assessment (HTA) is not well equipped to assess complex technologies as insufficient attention is being paid to the diversity in patient characteristics and preferences, context, and implementation. Strategies to integrate these and several other aspects, such as ethical considerations, in a comprehensive assessment are missing. The aim of the European research project INTEGRATE-HTA was to develop a model for an integrated HTA of complex technologies. A multi-method, four-stage approach guided the development of the INTEGRATE-HTA Model: (i) definition of the different dimensions of information to be integrated, (ii) literature review of existing methods for integration, (iii) adjustment of concepts and methods for assessing distinct aspects of complex technologies in the frame of an integrated process, and (iv) application of the model in a case study and subsequent revisions. The INTEGRATE-HTA Model consists of five steps, each involving stakeholders: (i) definition of the technology and the objective of the HTA; (ii) development of a logic model to provide a structured overview of the technology and the system in which it is embedded; (iii) evidence assessment on effectiveness, economic, ethical, legal, and socio-cultural aspects, taking variability of participants, context, implementation issues, and their interactions into account; (iv) populating the logic model with the data generated in step 3; (v) structured process of decision-making. The INTEGRATE-HTA Model provides a structured process for integrated HTAs of complex technologies. Stakeholder involvement in all steps is essential as a means of ensuring relevance and meaningful interpretation of the evidence.

  14. A Kinematically Consistent Two-Point Correlation Function

    NASA Technical Reports Server (NTRS)

    Ristorcelli, J. R.

    1998-01-01

    A simple kinematically consistent expression for the longitudinal two-point correlation function related to both the integral length scale and the Taylor microscale is obtained. On the inner scale, in a region of width inversely proportional to the turbulent Reynolds number, the function has the appropriate curvature at the origin. The expression for two-point correlation is related to the nonlinear cascade rate, or dissipation epsilon, a quantity that is carried as part of a typical single-point turbulence closure simulation. Constructing an expression for the two-point correlation whose curvature at the origin is the Taylor microscale incorporates one of the fundamental quantities characterizing turbulence, epsilon, into a model for the two-point correlation function. The integral of the function also gives, as is required, an outer integral length scale of the turbulence independent of viscosity. The proposed expression is obtained by kinematic arguments; the intention is to produce a practically applicable expression in terms of simple elementary functions that allow an analytical evaluation, by asymptotic methods, of diverse functionals relevant to single-point turbulence closures. Using the expression devised an example of the asymptotic method by which functionals of the two-point correlation can be evaluated is given.
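
    The constraints such an expression must satisfy can be written out explicitly. In one common convention for the longitudinal correlation f(r) and the Taylor microscale λ (the paper's specific closed form is not reproduced in the abstract, so only the generic kinematic requirements are shown):

```latex
f(0) = 1, \qquad
\left.\frac{d^{2}f}{dr^{2}}\right|_{r=0} = -\frac{1}{\lambda^{2}}, \qquad
\int_{0}^{\infty} f(r)\,dr = L,
\qquad \text{with} \quad \epsilon = \frac{15\,\nu\,u'^{2}}{\lambda^{2}} .
```

    The last relation, exact for isotropic turbulence in this convention, is what lets the curvature of f at the origin be expressed through ε, tying the model to a quantity carried by single-point closures.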

  15. Self-Consistent Sources for Integrable Equations Via Deformations of Binary Darboux Transformations

    NASA Astrophysics Data System (ADS)

    Chvartatskyi, Oleksandr; Dimakis, Aristophanes; Müller-Hoissen, Folkert

    2016-08-01

    We reveal the origin and structure of self-consistent source extensions of integrable equations from the perspective of binary Darboux transformations. They arise via a deformation of the potential that is central in this method. As examples, we obtain in particular matrix versions of self-consistent source extensions of the KdV, Boussinesq, sine-Gordon, nonlinear Schrödinger, KP, Davey-Stewartson, two-dimensional Toda lattice and discrete KP equation. We also recover a (2+1)-dimensional version of the Yajima-Oikawa system from a deformation of the pKP hierarchy. By construction, these systems are accompanied by a hetero binary Darboux transformation, which generates solutions of such a system from a solution of the source-free system and additionally solutions of an associated linear system and its adjoint. The essence of all this is encoded in universal equations in the framework of bidifferential calculus.

  16. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    DOEpatents

    Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
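
    The "derive-if-missing" accessor pattern described above can be sketched as follows. The class, attributes, and the average-residue-mass transformation are all hypothetical illustrations, not taken from the patent:

```python
class MediatorRecord:
    """Sketch of the translation-library idea: a "get" method calls a
    transformation to derive an attribute when the warehouse value is missing."""

    def __init__(self, length_aa=None, mass_da=None):
        self._length_aa = length_aa   # protein length in amino acids
        self._mass_da = mass_da       # molecular mass in daltons

    def get_mass_da(self):
        # Derive the missing value with a crude average residue mass
        # (110 Da per residue -- an illustrative transformation).
        if self._mass_da is None and self._length_aa is not None:
            self._mass_da = self._length_aa * 110.0
        return self._mass_da

    def set_mass_da(self, value):
        self._mass_da = value

rec = MediatorRecord(length_aa=300)
print(rec.get_mass_da())  # 33000.0, derived because the stored mass was missing
```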

  17. Towards a taxonomy for integrated care: a mixed-methods study.

    PubMed

    Valentijn, Pim P; Boesveld, Inge C; van der Klauw, Denise M; Ruwaard, Dirk; Struijs, Jeroen N; Molema, Johanna J W; Bruijnzeels, Marc A; Vrijhoef, Hubertus Jm

    2015-01-01

    Building integrated services in a primary care setting is considered an essential strategy for establishing a high-quality and affordable health care system. The theoretical foundations of such integrated service models are described by the Rainbow Model of Integrated Care, which distinguishes six integration dimensions (clinical, professional, organisational, system, functional and normative integration). The aim of the present study is to refine the Rainbow Model of Integrated Care by developing a taxonomy that specifies the underlying key features of the six dimensions. First, a literature review was conducted to identify features for achieving integrated service delivery. Second, a thematic analysis method was used to develop a taxonomy of key features organised into the dimensions of the Rainbow Model of Integrated Care. Finally, the appropriateness of the key features was tested in a Delphi study among Dutch experts. The taxonomy consists of 59 key features distributed across the six integration dimensions of the Rainbow Model of Integrated Care. Key features associated with the clinical, professional, organisational and normative dimensions were considered appropriate by the experts. Key features linked to the functional and system dimensions were considered less appropriate. This study contributes to the ongoing debate on defining the concept and typology of integrated care. The taxonomy provides a development agenda for establishing an accepted scientific framework of integrated care from an end-user, professional, managerial and policy perspective.

  18. The Kadomtsev–Petviashvili equation as a source of integrable model equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maccari, A.

    1996-12-01

    A new integrable and nonlinear partial differential equation (PDE) in 2+1 dimensions is obtained, by an asymptotically exact reduction method based on Fourier expansion and spatiotemporal rescaling, from the Kadomtsev–Petviashvili equation. The integrability property is explicitly demonstrated, by exhibiting the corresponding Lax pair, that is obtained by applying the reduction technique to the Lax pair of the Kadomtsev–Petviashvili equation. This model equation is likely to be of applicative relevance, because it may be considered a consistent approximation of a large class of nonlinear evolution PDEs. © 1996 American Institute of Physics.

  19. New methods in the Newtonian potential theory. I - The representation of the potential energy of homogeneous gravitating bodies by converging series

    NASA Astrophysics Data System (ADS)

    Kondrat'ev, B. P.

    1993-06-01

    A method is developed for the representation of the potential energy of homogeneous gravitating, as well as electrically charged, bodies in the form of special series. These series contain members consisting of products of the corresponding coefficients appearing in the expansion of external and internal Newtonian potentials in Legendre polynomial series. Several versions of the representation of potential energy through these series are possible. A formula which expresses potential energy not as a volume integral, as is the convention, but as an integral over the body surface is derived. The method is tested for the particular cases of the sphere and the ellipsoid, and the convergence of the resulting series is demonstrated.

  20. Instantaneous Coastline Extraction from LIDAR Point Cloud and High Resolution Remote Sensing Imagery

    NASA Astrophysics Data System (ADS)

    Li, Y.; Zhoing, L.; Lai, Z.; Gan, Z.

    2018-04-01

    A new method for instantaneous waterline extraction is proposed in this paper, which combines point cloud geometry features and image spectral characteristics of the coastal zone. The proposed method consists of the following steps: the Mean Shift algorithm is used to segment the coastal zone of high-resolution remote sensing images into small regions containing semantic information; region features are extracted by integrating LiDAR data and the surface area of the image; initial waterlines are extracted by the α-shape algorithm; a region-growing algorithm is applied to refine the coastline, with a growth rule integrating the intensity and topography of the LiDAR data; finally, the coastline is smoothed. Experiments are conducted to demonstrate the efficiency of the proposed method.
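    The region-growing refinement step described above can be sketched as a flood fill over gridded cells; the growth rule combining LiDAR intensity and topography (elevation) is only illustrative, and the thresholds and toy grid below are hypothetical, not taken from the paper:

```python
from collections import deque

def grow_region(grid, seed, max_intensity, max_elevation):
    """Flood-fill region growing: a neighbor cell joins the region when both
    its LiDAR intensity and its elevation stay below the given thresholds."""
    rows, cols = len(grid), len(grid[0])
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                intensity, elevation = grid[nr][nc]
                if intensity <= max_intensity and elevation <= max_elevation:
                    region.add((nr, nc))
                    frontier.append((nr, nc))
    return region

# Toy 3x3 grid of (intensity, elevation) pairs: water cells are low in both.
grid = [[(0.1, 0.0), (0.2, 0.1), (0.9, 2.0)],
        [(0.1, 0.1), (0.8, 1.5), (0.9, 2.1)],
        [(0.2, 0.0), (0.1, 0.1), (0.9, 1.8)]]
water = grow_region(grid, (0, 0), max_intensity=0.5, max_elevation=0.5)
```

In a real pipeline the thresholds would be derived from the segmented image regions rather than fixed constants.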

  1. Predicting Flory-Huggins χ from Simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Wenlin; Gomez, Enrique D.; Milner, Scott T.

    2017-07-01

    We introduce a method, based on a novel thermodynamic integration scheme, to extract Flory-Huggins χ parameters as small as 10⁻³ kT for polymer blends from molecular dynamics (MD) simulations. We obtain χ for the archetypical coarse-grained model of nonpolar polymer blends: flexible bead-spring chains with different Lennard-Jones interactions between A and B monomers. Using these χ values and a lattice version of self-consistent field theory (SCFT), we predict the shape of planar interfaces for phase-separated binary blends. Our SCFT results agree with MD simulations, validating both the predicted χ values and our thermodynamic integration method. Combined with atomistic simulations, our method can be applied to predict χ for new polymers from their chemical structures.
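    Numerically, thermodynamic integration reduces to quadrature of the ensemble average ⟨dU/dλ⟩ over the coupling parameter λ. A minimal trapezoid-rule sketch follows; the paper's actual novel scheme is not reproduced here, and the sample data (a linear ⟨dU/dλ⟩) are purely illustrative:

```python
def thermodynamic_integration(lambdas, dU_dlambda):
    """Trapezoid-rule estimate of dF = integral over [0,1] of <dU/dlambda>,
    given ensemble averages sampled at a grid of coupling values lambda."""
    dF = 0.0
    for i in range(len(lambdas) - 1):
        h = lambdas[i + 1] - lambdas[i]
        dF += 0.5 * h * (dU_dlambda[i] + dU_dlambda[i + 1])
    return dF

# Illustrative data: <dU/dlambda> = 2*lambda, so dF = 1 exactly,
# and the trapezoid rule is exact for a linear integrand.
lams = [0.0, 0.25, 0.5, 0.75, 1.0]
dF = thermodynamic_integration(lams, [2 * l for l in lams])
```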

  2. Advancement of Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Emiley, Mark S.; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    2000-01-01

    Bi-Level Integrated System Synthesis (BLISS) is a method for optimization of an engineering system, e.g., an aerospace vehicle. BLISS consists of optimizations at the subsystem (module) and system levels to divide the overall large optimization task into sets of smaller ones that can be executed concurrently. In the initial version of BLISS that was introduced and documented in previous publications, analysis in the modules was kept at the early conceptual design level. This paper reports on the next step in the BLISS development in which the fidelity of the aerodynamic drag and structural stress and displacement analyses were upgraded while the method's satisfactory convergence rate was retained.

  3. Integrative missing value estimation for microarray data.

    PubMed

    Hu, Jianjun; Li, Haifeng; Waterman, Michael S; Zhou, Xianghong Jasmine

    2006-10-12

    Missing value estimation is an important preprocessing step in microarray analysis. Although several methods have been developed to solve this problem, their performance is unsatisfactory for datasets with high rates of missing data, high measurement noise, or limited numbers of samples. In fact, more than 80% of the time-series datasets in the Stanford Microarray Database contain fewer than eight samples. We present the integrative Missing Value Estimation method (iMISS), which incorporates information from multiple reference microarray datasets to improve missing value estimation. For each gene with missing data, we derive a consistent neighbor-gene list by taking reference datasets into consideration. To determine whether the given reference datasets are sufficiently informative for integration, we use a submatrix imputation approach. Our experiments showed that iMISS can significantly and consistently improve the accuracy of the state-of-the-art Local Least Squares (LLS) imputation algorithm by up to 15% in our benchmark tests. We demonstrated that the order-statistics-based integrative imputation algorithms can achieve significant improvements over state-of-the-art missing value estimation approaches such as LLS and are especially good for imputing microarray datasets with a limited number of samples, high rates of missing data, or very noisy measurements. With the rapid accumulation of microarray datasets, the performance of our approach can be further improved by incorporating larger and more appropriate reference datasets.
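    The neighbor-gene idea can be sketched as follows: rank candidate reference rows by their Pearson correlation with the target gene over its observed entries, then impute each missing entry from the top-k rows. This is a simplified stand-in for the published iMISS algorithm, and all data below are illustrative:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def impute(target, references, k=2):
    """Fill None entries in `target` with the mean of the k reference rows
    most correlated with it over the observed (non-None) positions."""
    obs = [i for i, v in enumerate(target) if v is not None]
    ranked = sorted(references,
                    key=lambda r: pearson([target[i] for i in obs],
                                          [r[i] for i in obs]),
                    reverse=True)[:k]
    return [v if v is not None else sum(r[i] for r in ranked) / k
            for i, v in enumerate(target)]

# Toy gene with one missing value and three complete reference genes.
filled = impute([1.0, 2.0, None, 4.0],
                [[1.0, 2.0, 3.0, 4.0],
                 [2.0, 4.0, 6.0, 8.0],
                 [4.0, 2.0, 1.0, 1.0]], k=2)
```

The two positively correlated references drive the imputation; the anticorrelated third row is excluded by the ranking.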

  4. An integrated approach to evaluate food antioxidant capacity.

    PubMed

    Sun, T; Tanumihardjo, S A

    2007-11-01

    Many methods are available for determining food antioxidant capacity, which is an important topic in food and nutrition research and marketing. However, the results and inferences from different methods may vary substantially because each complex chemical reaction generates unique values. To get a complete and dynamic picture of the ranking of food antioxidant capacity, relative antioxidant capacity index (RACI), a hypothetical concept, is created from the perspective of statistics by integrating the antioxidant capacity values generated from different in vitro methods. RACI is the mean value of standard scores transformed from the initial data generated with different methods for each food item. By comparing the antioxidant capacity of 20 commonly consumed vegetables in the U.S. market that were measured with 7 chemical methods, we demonstrated that the RACI correlated strongly with each method. The significant correlation of RACI with an independent data set further confirmed that RACI is a valid tool to assess food antioxidant capacity. The key advantage of this integrated approach is that RACI is in a numerical scale with no units and has consistent agreement with chemical methods. Although it is a relative index and may not represent a specific antioxidant property of different food items, RACI provides a reasonably accurate rank of antioxidant capacity among foods. Therefore, it can be used as an integrated approach to evaluate food antioxidant capacity.
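    The construction of RACI as described (transform each method's raw values into standard z-scores, then average across methods for each food item) can be sketched directly; the assay names and numbers below are hypothetical:

```python
def raci(scores_by_method):
    """RACI for each food item: the mean of its standard (z-)scores across
    methods. `scores_by_method` maps method name -> raw scores, one per food."""
    n_foods = len(next(iter(scores_by_method.values())))
    z_by_method = {}
    for method, vals in scores_by_method.items():
        mean = sum(vals) / len(vals)
        sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
        z_by_method[method] = [(v - mean) / sd for v in vals]
    return [sum(z_by_method[m][i] for m in scores_by_method) / len(scores_by_method)
            for i in range(n_foods)]

# Three foods measured by two hypothetical assays on very different scales;
# z-scoring makes the scales commensurable before averaging.
scores = {"assay_A": [10.0, 20.0, 30.0], "assay_B": [1.0, 5.0, 9.0]}
index = raci(scores)
```

Because RACI is built from standard scores it is unitless and relative, exactly as the abstract notes: it ranks foods but does not measure any single antioxidant property.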

  5. An integrated approach to monitoring the calibration stability of operational dual-polarization radars

    DOE PAGES

    Vaccarono, Mattia; Bechini, Renzo; Chandrasekar, Chandra V.; ...

    2016-11-08

    The stability of weather radar calibration is a mandatory aspect for quantitative applications, such as rainfall estimation, short-term weather prediction and initialization of numerical atmospheric and hydrological models. Over the years, calibration monitoring techniques based on external sources have been developed, specifically calibration using the Sun and calibration based on ground clutter returns. In this paper, these two techniques are integrated and complemented with a self-consistency procedure and an intercalibration technique. The aim of the integrated approach is to implement a robust method for online monitoring, able to detect significant changes in the radar calibration. The physical consistency of polarimetric radar observables is exploited using the self-consistency approach, based on the expected correspondence between dual-polarization power and phase measurements in rain. This technique allows a reference absolute value to be provided for the radar calibration, from which any deviations may be detected using the other procedures. In particular, the ground clutter calibration is implemented on both polarization channels (horizontal and vertical) for each radar scan, allowing the polarimetric variables to be monitored and hardware failures to be promptly recognized. The Sun calibration allows monitoring the calibration and sensitivity of the radar receiver, in addition to the antenna pointing accuracy. It is applied using observations collected during the standard operational scans but requires long integration times (several days) in order to accumulate a sufficient amount of useful data. Finally, an intercalibration technique is developed and performed to compare colocated measurements collected in rain by two radars in overlapping regions. The integrated approach is performed on the C-band weather radar network in northwestern Italy, during July–October 2014. The set of methods considered appears suitable to establish an online tool to monitor the stability of the radar calibration with an accuracy of about 2 dB. In conclusion, this is considered adequate to automatically detect any unexpected change in the radar system requiring further data analysis or on-site measurements.
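    The role of the self-consistency reference can be illustrated by a simple offset check: compare measured reflectivity against the value implied in rain by the polarimetric self-consistency relation, and flag the radar when the mean deviation exceeds the roughly 2 dB accuracy quoted above. The function and data are a hypothetical sketch, not the operational procedure:

```python
def calibration_offset(z_measured_dbz, z_consistent_dbz, tolerance_db=2.0):
    """Mean offset (dB) between measured reflectivity and the value implied
    by polarimetric self-consistency in rain; flags the radar when the
    absolute offset exceeds the tolerance."""
    diffs = [m - c for m, c in zip(z_measured_dbz, z_consistent_dbz)]
    bias = sum(diffs) / len(diffs)
    return bias, abs(bias) > tolerance_db

# Illustrative samples: measured reflectivity runs ~2.7 dB hot.
bias, flagged = calibration_offset([30.0, 35.0, 40.0], [27.0, 32.5, 37.4])
```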

  6. Data integration for inference about spatial processes: A model-based approach to test and account for data inconsistency

    PubMed Central

    Pedrini, Paolo; Bragalanti, Natalia; Groff, Claudio

    2017-01-01

    Recently developed methods that integrate multiple data sources arising from the same ecological processes have typically utilized structured data from well-defined sampling protocols (e.g., capture-recapture and telemetry). Despite this new methodological focus, the value of opportunistic data for improving inference about spatial ecological processes is unclear and, perhaps more importantly, no procedures are available to formally test whether parameter estimates are consistent across data sources and whether they are suitable for integration. Using data collected on the reintroduced brown bear population in the Italian Alps, a population of conservation importance, we combined data from three sources: traditional spatial capture-recapture data, telemetry data, and opportunistic data. We developed a fully integrated spatial capture-recapture (SCR) model that included a model-based test for data consistency to first compare model estimates using different combinations of data, and then, by acknowledging data-type differences, evaluate parameter consistency. We demonstrate that opportunistic data lends itself naturally to integration within the SCR framework and highlight the value of opportunistic data for improving inference about space use and population size. This is particularly relevant in studies of rare or elusive species, where the number of spatial encounters is usually small and where additional observations are of high value. In addition, our results highlight the importance of testing and accounting for inconsistencies in spatial information from structured and unstructured data so as to avoid the risk of spurious or averaged estimates of space use and consequently, of population size. Our work supports the use of a single modeling framework to combine spatially-referenced data while also accounting for parameter consistency. PMID:28973034

  7. Harvesting small stems -- A Southern USA perspective

    Treesearch

    William F. Watson; Bryce J. Stokes

    1989-01-01

    Operations that harvest small stems using conventional equipment are discussed. A typical operation consists of rubber-tired feller-bunchers with shear heads, rubber-tired grapple skidders, and in-woods chippers. These systems harvest the small stems either in a pre-harvest, postharvest, or integrated-harvest method.

  8. [How timely are the methods taught in psychotherapy training and practice?].

    PubMed

    Beutel, Manfred E; Michal, Matthias; Wiltink, Jörg; Subic-Wrana, Claudia

    2015-01-01

    Even though many psychotherapists consider themselves to be eclectic or integrative, training and reimbursement in the modern healthcare system are clearly oriented toward the model of distinct psychotherapy approaches. Prompted by the proposition to favor general, disorder-oriented psychotherapy, we investigate how timely distinctive methods are that are taught in training and practice. We reviewed the pertinent literature regarding general and specific factors, the effectiveness of integrative and eclectic treatments, orientation toward specific disorders, manualization and psychotherapeutic training. There is a lack of systematic studies on the efficacy of combining therapy methods from different approaches. The first empirical findings reveal that a superiority of combined versus single treatment methods has yet to be demonstrated. The development of transnosological manuals shows the limits of disorder-specific treatment. General factors such as therapeutic alliance or education about the model of disease and treatment rationale require specific definitions. Taking reference to a specific treatment approach provides important consistency of theory, training therapy and supervision, though this does not preclude an openness toward other therapy concepts. Current manualized examples show that methods and techniques can indeed be integrated from other approaches. Integrating different methods can also be seen as a developmental task for practitioners and researchers which may be mastered increasingly better with more experience.

  9. Method used to test the imaging consistency of binocular camera's left-right optical system

    NASA Astrophysics Data System (ADS)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained based on the multiple threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, the constraint of gray level based on the corresponding coordinates of the left-right images is established, and the imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for carrying out imaging consistency testing for binocular cameras. When the 3σ value of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method could be used effectively and paves the way for imaging consistency testing of binocular cameras.
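    The evaluation criterion (standard deviation σ of the grayscale difference D(x, y), with 3σ required to stay within 5%) can be sketched on toy normalized images; the array values below are illustrative:

```python
def imaging_consistency(left, right, threshold=0.05):
    """Standard deviation of the per-pixel grayscale difference D(x, y)
    between left and right images (values normalized to [0, 1]), plus a
    pass/fail flag for the 3*sigma <= threshold criterion."""
    diffs = [l - r for lrow, rrow in zip(left, right)
                   for l, r in zip(lrow, rrow)]
    n = len(diffs)
    mean = sum(diffs) / n
    sigma = (sum((d - mean) ** 2 for d in diffs) / n) ** 0.5
    return sigma, 3 * sigma <= threshold

# Two toy 2x2 "images" from the left and right optical systems.
left  = [[0.50, 0.52], [0.48, 0.51]]
right = [[0.50, 0.51], [0.49, 0.51]]
sigma, ok = imaging_consistency(left, right)
```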

  10. Lie symmetry analysis, Bäcklund transformations, and exact solutions of a (2 + 1)-dimensional Boiti-Leon-Pempinelli system

    NASA Astrophysics Data System (ADS)

    Zhao, Zhonglong; Han, Bo

    2017-10-01

    In this paper, the Lie symmetry analysis method is employed to investigate the Lie point symmetries and the one-parameter transformation groups of a (2 + 1)-dimensional Boiti-Leon-Pempinelli system. By using Ibragimov's method, the optimal system of one-dimensional subalgebras of this system is constructed. Truncated Painlevé analysis is used for deriving the Bäcklund transformation. The method of constructing lump-type solutions of integrable equations by means of Bäcklund transformation is first presented. Meanwhile, the lump-type solutions of the (2 + 1)-dimensional Boiti-Leon-Pempinelli system are obtained. The lump-type wave is one kind of rogue wave. The fusion-type N-solitary wave solutions are also constructed. In addition, this system is integrable in terms of the consistent Riccati expansion method.

  11. Analysis of crack propagation in roller bearings using the boundary integral equation method - A mixed-mode loading problem

    NASA Technical Reports Server (NTRS)

    Ghosn, L. J.

    1988-01-01

    Crack propagation in a rotating inner raceway of a high-speed roller bearing is analyzed using the boundary integral method. The model consists of an edge plate under plane strain condition upon which varying Hertzian stress fields are superimposed. A multidomain boundary integral equation using quadratic elements was written to determine the stress intensity factors KI and KII at the crack tip for various roller positions. The multidomain formulation allows the two faces of the crack to be modeled in two different subregions, making it possible to analyze crack closure when the roller is positioned on or close to the crack line. KI and KII stress intensity factors along any direction were computed. These calculations permit determination of crack growth direction along which the average KI times the alternating KI is maximum.

  12. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements.

    PubMed

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-04-09

    In order to meet the requirements of autonomy and reliability for the navigation system, combined with the method of measuring speed by using the spectral redshift information of the natural celestial bodies, a new scheme, consisting of Strapdown Inertial Navigation System (SINS)/Spectral Redshift (SRS)/Geomagnetic Navigation System (GNS), is designed for autonomous integrated navigation systems. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. The simulation experiments are conducted and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology.

  13. The Space-Time Conservation Element and Solution Element Method: A New High-Resolution and Genuinely Multidimensional Paradigm for Solving Conservation Laws. 1; The Two Dimensional Time Marching Schemes

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Wang, Xiao-Yen; Chow, Chuen-Yen

    1998-01-01

    A new high resolution and genuinely multidimensional numerical method for solving conservation laws is being developed. It was designed to avoid the limitations of the traditional methods, and was built from the ground up with extensive physics considerations. Nevertheless, its foundation is mathematically simple enough that one can build from it a coherent, robust, efficient and accurate numerical framework. Two basic beliefs that set the new method apart from the established methods are at the core of its development. The first belief is that, in order to capture physics more efficiently and realistically, the modeling focus should be placed on the original integral form of the physical conservation laws, rather than the differential form. The latter form follows from the integral form under the additional assumption that the physical solution is smooth, an assumption that is difficult to realize numerically in a region of rapid change, such as a boundary layer or a shock. The second belief is that, with proper modeling of the integral and differential forms themselves, the resulting numerical solution should automatically be consistent with the properties derived from the integral and differential forms, e.g., the jump conditions across a shock and the properties of characteristics. Therefore a much simpler and more robust method can be developed by not using the above derived properties explicitly.
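    For a 1-D scalar conservation law (standard textbook notation, not the paper's own), the integral form that the method models directly, and the differential form that follows from it only where the solution is smooth, are:

```latex
% Integral form over an arbitrary space-time rectangle [x_1,x_2] \times [t_1,t_2]:
\int_{x_1}^{x_2} u(x,t_2)\,dx - \int_{x_1}^{x_2} u(x,t_1)\,dx
  + \int_{t_1}^{t_2} f\bigl(u(x_2,t)\bigr)\,dt - \int_{t_1}^{t_2} f\bigl(u(x_1,t)\bigr)\,dt = 0
% Differential form, valid only where u is smooth:
\partial_t u + \partial_x f(u) = 0
```

The integral form remains valid across shocks, which is why the abstract argues for modeling it directly.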

  14. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11

    PubMed Central

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-01-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671

  15. The Role of Applied Epidemiology Methods in the Disaster Management Cycle

    PubMed Central

    Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.

    2014-01-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748

  16. The reduced basis method for the electric field integral equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fares, M., E-mail: fares@cerfacs.f; Hesthaven, J.S., E-mail: Jan_Hesthaven@Brown.ed; Maday, Y., E-mail: maday@ann.jussieu.f

    We introduce the reduced basis method (RBM) as an efficient tool for parametrized scattering problems in computational electromagnetics for problems where field solutions are computed using a standard Boundary Element Method (BEM) for the parametrized electric field integral equation (EFIE). This combination enables an algorithmic cooperation which results in a two-step procedure. The first step consists of a computationally intense assembling of the reduced basis, that needs to be effected only once. In the second step, we compute output functionals of the solution, such as the Radar Cross Section (RCS), independently of the dimension of the discretization space, for many different parameter values in a many-query context at very little cost. Parameters include the wavenumber, the angle of the incident plane wave and its polarization.

  17. Kernel-PCA data integration with enhanced interpretability

    PubMed Central

    2014-01-01

    Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly the right kernel is chosen for each data set; secondly the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of the underlying biology. PMID:25032747
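    The two basic steps (choose a kernel per source, then combine) can be sketched with precomputed kernel matrices: sum them, double-center the sum, and extract the leading principal direction by power iteration. The toy kernels below are hypothetical linear kernels, and this is a minimal sketch of the general technique, not the paper's implementation:

```python
def top_kernel_pca_component(kernels, iters=200):
    """Integrate data sources by summing their kernel matrices, double-center
    the sum, and extract the leading eigenvector (the sample loadings on the
    first kernel-PCA component) by power iteration."""
    n = len(kernels[0])
    K = [[sum(k[i][j] for k in kernels) for j in range(n)] for i in range(n)]
    # Double-centering: Kc = K - row means - column means + grand mean.
    row = [sum(K[i]) / n for i in range(n)]
    col = [sum(K[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n
    Kc = [[K[i][j] - row[i] - col[j] + grand for j in range(n)] for i in range(n)]
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = [sum(Kc[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two hypothetical sources over the same four samples, each summarized by a
# precomputed linear kernel (Gram matrix).
K1 = [[0, 0, 0, 0], [0, 1, 0, 1], [0, 0, 1, 1], [0, 1, 1, 2]]
K2 = [[0, 0, 0, 0], [0, 1, 1, 2], [0, 1, 1, 2], [0, 2, 2, 4]]
v = top_kernel_pca_component([K1, K2])
```

In practice one would keep several components and weight the kernels, but the centering-plus-eigendecomposition core is the same.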

  18. Integrating the Advanced Human Eye Model (AHEM) and optical instrument models to model complete visual optical systems inclusive of the typical or atypical eye

    NASA Astrophysics Data System (ADS)

    Donnelly, William J., III

    2012-06-01

    PURPOSE: To present a commercially available optical modeling software tool to assist the development of optical instrumentation and systems that utilize and/or integrate with the human eye. METHODS: A commercially available flexible eye modeling system is presented, the Advanced Human Eye Model (AHEM). AHEM is a module that the engineer can use to perform rapid development and test scenarios on systems that integrate with the eye. Methods include merging modeled systems initially developed outside of AHEM and performing a series of wizard-type operations that relieve the user from requiring an optometric or ophthalmic background to produce a complete eye inclusive system. Scenarios consist of retinal imaging of targets and sources through integrated systems. Uses include, but are not limited to, optimization, telescopes, microscopes, spectacles, contact and intraocular lenses, ocular aberrations, cataract simulation and scattering, and twin eye model (binocular) systems. RESULTS: Metrics, graphical data, and exportable CAD geometry are generated from the various modeling scenarios.

  19. A numerical scheme to solve unstable boundary value problems

    NASA Technical Reports Server (NTRS)

    Kalnay Derivas, E.

    1975-01-01

    A new iterative scheme for solving boundary value problems is presented. It consists of the introduction of an artificial time dependence into a modified version of the system of equations. Then explicit forward integrations in time are followed by explicit integrations backwards in time. The method converges under much more general conditions than schemes based on forward time integrations (false transient schemes). In particular it can attain a steady state solution of an elliptic system of equations even if the solution is unstable, in which case other iterative schemes fail to converge. The simplicity of its use makes it attractive for solving large systems of nonlinear equations.

  20. Models Extracted from Text for System-Software Safety Analyses

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2010-01-01

    This presentation describes extraction and integration of requirements information and safety information in visualizations to support early review of completeness, correctness, and consistency of lengthy and diverse system safety analyses. Software tools have been developed and extended to perform the following tasks: 1) extract model parts and safety information from text in interface requirements documents, failure modes and effects analyses and hazard reports; 2) map and integrate the information to develop system architecture models and visualizations for safety analysts; and 3) provide model output to support virtual system integration testing. This presentation illustrates the methods and products with a rocket motor initiation case.

  1. Integrating Multimedia Instructional Design Principles with Complex Physiological Concepts in Reproductive Science

    ERIC Educational Resources Information Center

    Oki, Angela Christine

    2011-01-01

    This dissertation examines the effect of digital multimedia presentations as a method to teach complex concepts in reproductive physiology. The digital presentations developed for this research consisted of two-dimensional (2-D) and three-dimensional (3-D) animations, script messaging and narration. The topics were "Mammalian Ovarian…

  2. A MATLAB-Aided Method for Teaching Calculus-Based Business Mathematics

    ERIC Educational Resources Information Center

    Liang, Jiajuan; Pan, William S. Y.

    2009-01-01

    MATLAB is a powerful package for numerical computation. MATLAB contains a rich pool of mathematical functions and provides flexible plotting functions for illustrating mathematical solutions. The course of calculus-based business mathematics consists of two major topics: 1) derivative and its applications in business; and 2) integration and its…

  3. Courseware Authoring and Delivering System for Chinese Language Instruction. Final Report.

    ERIC Educational Resources Information Center

    Mao, Tang

    A study investigated technical methods for simplifying and improving the creation of software for teaching uncommonly taught languages such as Chinese. Research consisted of assessment of existing authoring systems, domestic and overseas, available hardware, peripherals, and software packages that could be integrated into this project. Then some…

  4. Time Series ARIMA Models of Undergraduate Grade Point Average.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    The Auto-Regressive Integrated Moving Average (ARIMA) Models, often referred to as Box-Jenkins models, are regression methods for analyzing sequential dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…

  5. Deep 3D Convolutional Encoder Networks With Shortcuts for Multiscale Feature Integration Applied to Multiple Sclerosis Lesion Segmentation.

    PubMed

    Brosch, Tom; Tang, Lisa Y W; Youngjin Yoo; Li, David K B; Traboulsee, Anthony; Tam, Roger

    2016-05-01

    We propose a novel segmentation approach based on deep 3D convolutional encoder networks with shortcut connections and apply it to the segmentation of multiple sclerosis (MS) lesions in magnetic resonance images. Our model is a neural network that consists of two interconnected pathways, a convolutional pathway, which learns increasingly more abstract and higher-level image features, and a deconvolutional pathway, which predicts the final segmentation at the voxel level. The joint training of the feature extraction and prediction pathways allows for the automatic learning of features at different scales that are optimized for accuracy for any given combination of image types and segmentation task. In addition, shortcut connections between the two pathways allow high- and low-level features to be integrated, which enables the segmentation of lesions across a wide range of sizes. We have evaluated our method on two publicly available data sets (MICCAI 2008 and ISBI 2015 challenges) with the results showing that our method performs comparably to the top-ranked state-of-the-art methods, even when only relatively small data sets are available for training. In addition, we have compared our method with five freely available and widely used MS lesion segmentation methods (EMS, LST-LPA, LST-LGA, Lesion-TOADS, and SLS) on a large data set from an MS clinical trial. The results show that our method consistently outperforms these other methods across a wide range of lesion sizes.

  6. Reasoning about energy in qualitative simulation

    NASA Technical Reports Server (NTRS)

    Fouche, Pierre; Kuipers, Benjamin J.

    1992-01-01

    While possible behaviors of a mechanism that are consistent with an incomplete state of knowledge can be predicted through qualitative modeling and simulation, spurious behaviors corresponding to no solution of any ordinary differential equation consistent with the model may be generated. The present method for energy-related reasoning eliminates an important source of spurious behaviors, as demonstrated by its application to a nonlinear, proportional-integral controller. It is shown that qualitative properties of such a system, such as stability and zero-offset control, are captured by the simulation.

  7. Improving the result of forecasting using reservoir and surface network simulation

    NASA Astrophysics Data System (ADS)

    Hendri, R. S.; Winarta, J.

    2018-01-01

    This study aimed to obtain more representative production forecasts by using integrated simulation of the pipeline gathering system of the X field. There are five main scenarios, consisting of production forecasts for the existing condition, workover, and infill drilling, from which the best development scenario is determined. The method of this study couples a reservoir simulator with a pipeline simulator, so-called Integrated Reservoir and Surface Network Simulation. Well data from the reservoir simulator were integrated with the pipeline network simulator's to construct a new schedule, which served as input for the whole simulation procedure. The well design produced by the well modeling simulator was exported into the pipeline simulator. The reservoir prediction depends on the minimum Tubing Head Pressure (THP) of each well, where the pressure drop in the gathering network is not explicitly calculated. The same scenarios were also run as single-reservoir simulations. The integrated simulation produces results approaching the actual condition of the reservoir, as confirmed by the THP profile, which differs between the two methods. The difference between the integrated simulation and the single-model simulation is 6-9%. The aim of solving the back-pressure problem in the pipeline gathering system of the X field is achieved.

  8. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
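    In its simplest classical form, the force-matching variational principle mentioned above reduces to a least-squares fit of force-field parameters against reference forces. The sketch below assumes a hypothetical one-parameter harmonic pair force and synthetic reference data; it illustrates the idea only, not the qMS-CG machinery.

```python
def match_spring_constant(distances, ref_forces, r0):
    """Least-squares fit of k in F(r) = -k * (r - r0) to reference forces.

    Minimizing sum((F_model - F_ref)^2) over k gives this closed form.
    """
    num = sum(-(r - r0) * f for r, f in zip(distances, ref_forces))
    den = sum((r - r0) ** 2 for r in distances)
    return num / den

# Synthetic 'reference' forces generated from k = 3.0 (illustrative data).
r0 = 1.0
rs = [0.8, 0.9, 1.1, 1.2, 1.3]
fs = [-3.0 * (r - r0) for r in rs]
k = match_spring_constant(rs, fs, r0)   # recovers k = 3.0
```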

  9. Numerical simulation of pseudoelastic shape memory alloys using the large time increment method

    NASA Astrophysics Data System (ADS)

    Gu, Xiaojun; Zhang, Weihong; Zaki, Wael; Moumni, Ziad

    2017-04-01

    The paper presents a numerical implementation of the large time increment (LATIN) method for the simulation of shape memory alloys (SMAs) in the pseudoelastic range. The method was initially proposed as an alternative to the conventional incremental approach for the integration of nonlinear constitutive models. It is adapted here for the simulation of pseudoelastic SMA behavior using the Zaki-Moumni model and is shown to be especially useful in situations where the phase transformation process exhibits little or no hardening. In these situations, a slight stress variation in a load increment can result in large variations of strain and local state variables, which may lead to difficulties in numerical convergence. In contrast to the conventional incremental method, the LATIN method solves the global equilibrium and local consistency conditions sequentially for the entire loading path. The achieved solution must satisfy the conditions of static and kinematic admissibility and consistency simultaneously after several iterations. The 3D numerical implementation is accomplished using an implicit algorithm and is then used for finite element simulation using the software Abaqus. Computational tests demonstrate the ability of this approach to simulate SMAs presenting flat phase transformation plateaus and subjected to complex loading cases, such as the quasi-static behavior of a stent structure. Some numerical results are contrasted with those obtained using step-by-step incremental integration.

  10. Integrative Structure Determination of Protein Assemblies by Satisfaction of Spatial Restraints

    NASA Astrophysics Data System (ADS)

    Alber, Frank; Chait, Brian T.; Rout, Michael P.; Sali, Andrej

    To understand the cell, we need to determine the structures of macromolecular assemblies, many of which consist of tens to hundreds of components. A great variety of experimental data can be used to characterize the assemblies at several levels of resolution, from atomic structures to component configurations. To maximize completeness, resolution, accuracy, precision and efficiency of the structure determination, a computational approach is needed that can use spatial information from a variety of experimental methods. We propose such an approach, defined by its three main components: a hierarchical representation of the assembly, a scoring function consisting of spatial restraints derived from experimental data, and an optimization method that generates structures consistent with the data. We illustrate the approach by determining the configuration of the 456 proteins in the nuclear pore complex from Baker's yeast.
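    The three components the authors name (a representation of the assembly, a scoring function of spatial restraints, and an optimizer that satisfies them) can be caricatured in a few lines: beads on a line, a score that sums squared restraint violations, and a greedy random search. Everything below (the 1-D representation, the restraint list, the optimizer) is an illustrative assumption, far simpler than the published approach.

```python
import random

def score(coords, restraints):
    """Sum of squared violations of (i, j, target_distance) restraints."""
    return sum((abs(coords[i] - coords[j]) - d) ** 2 for i, j, d in restraints)

def optimize(coords, restraints, steps=2000, step_size=0.1, seed=0):
    """Greedy random search: perturb one bead at a time, keep improvements."""
    rng = random.Random(seed)
    best = score(coords, restraints)
    for _ in range(steps):
        i = rng.randrange(len(coords))
        trial = list(coords)
        trial[i] += rng.uniform(-step_size, step_size)
        s = score(trial, restraints)
        if s < best:
            coords, best = trial, s
    return coords, best

restraints = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.0)]   # a consistent chain
coords, final = optimize([0.0, 0.3, 0.6], restraints)  # final score near zero
```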

  11. Team-based learning in therapeutics workshop sessions.

    PubMed

    Beatty, Stuart J; Kelley, Katherine A; Metzger, Anne H; Bellebaum, Katherine L; McAuley, James W

    2009-10-01

    To implement team-based learning in the workshop portion of a pathophysiology and therapeutics sequence of courses to promote integration of concepts across the pharmacy curriculum, provide a consistent problem-solving approach to patient care, and determine the impact on student perceptions of professionalism and teamwork. Team-based learning was incorporated into the workshop portion of 3 of 6 pathophysiology and therapeutics courses. Assignments that promoted team-building and application of key concepts were created. Readiness assurance tests were used to assess individual and team understanding of course materials. Students consistently scored 20% higher on team assessments compared with individual assessments. Mean professionalism and teamwork scores were significantly higher after implementation of team-based learning; however, this improvement was not considered educationally significant. Approximately 91% of students felt team-based learning improved understanding of course materials and 93% of students felt teamwork should continue in workshops. Team-based learning is an effective teaching method to ensure a consistent approach to problem-solving and curriculum integration in workshop sessions for a pathophysiology and therapeutics course sequence.

  12. Shared or Integrated: Which Type of Integration More Effectively Improves Students’ Creativity?

    NASA Astrophysics Data System (ADS)

    Mariyam, M.; Kaniawati, I.; Sriyati, S.

    2017-09-01

    Integrated science learning can take various types of integration. This study applies the shared and integrated types of integration with the project-based learning (PjBL) model to improve students’ creativity on a waste recycling theme. The research method is a quasi-experiment with a matching-only pretest-posttest design. The sample consists of 108 students: 36 students (experiment class 1), 35 students (experiment class 2) and 37 students (control class) at a junior high school in Tanggamus, Lampung. The results show differences in creativity improvement between classes taught with the PjBL model with the shared type of integration, with the integrated type of integration, and without any integration on the waste recycling theme. The class taught with the PjBL model and the shared type of integration showed higher creativity improvement than the classes with the integrated type of integration or without any integration. Integrated science learning using the shared type combines only two lessons, so an intact concept results. Thus, the PjBL model with the shared type of integration improves students’ creativity more effectively than the integrated type.

  13. Combined Feature Based and Shape Based Visual Tracker for Robot Navigation

    NASA Technical Reports Server (NTRS)

    Deans, J.; Kunz, C.; Sargent, R.; Park, E.; Pedersen, L.

    2005-01-01

    We have developed a combined feature based and shape based visual tracking system designed to enable a planetary rover to visually track and servo to specific points chosen by a user with centimeter precision. The feature based tracker uses invariant feature detection and matching across a stereo pair, as well as matching pairs before and after robot movement in order to compute an incremental 6-DOF motion at each tracker update. This tracking method is subject to drift over time, which can be compensated by the shape based method. The shape based tracking method consists of 3D model registration, which recovers 6-DOF motion given sufficient shape and proper initialization. By integrating complementary algorithms, the combined tracker leverages the efficiency and robustness of feature based methods with the precision and accuracy of model registration. In this paper, we present the algorithms and their integration into a combined visual tracking system.

  14. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  15. A SINS/SRS/GNS Autonomous Integrated Navigation System Based on Spectral Redshift Velocity Measurements

    PubMed Central

    Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke

    2018-01-01

    In order to meet the autonomy and reliability requirements of navigation systems, and drawing on a method of measuring velocity from the spectral redshift information of natural celestial bodies, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) navigation, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments were conducted, and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology. PMID:29642549
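    The record above uses a robust adaptive central difference particle filter; the sketch below implements only the plain bootstrap particle filter (predict, weight, resample) on a 1-D toy model, to show the base idea that the paper's algorithm builds on. All model parameters and the observation sequence are illustrative assumptions.

```python
import math
import random

def particle_filter(observations, n=500, proc_std=0.5, obs_std=0.5, seed=1):
    """Plain bootstrap particle filter for a 1-D random-walk state."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the process model.
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Update: weight particles by the observation likelihood.
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_std ** 2))
                   for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Resample: draw particles in proportion to their weights.
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates

est = particle_filter([1.0, 1.2, 0.9, 1.1])  # estimates settle near 1.0
```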

  16. The Riemann-Lanczos equations in general relativity and their integrability

    NASA Astrophysics Data System (ADS)

    Dolan, P.; Gerber, A.

    2008-06-01

    The aim of this paper is to examine the Riemann-Lanczos equations and how they can be made integrable. They consist of a system of linear first-order partial differential equations that arise in general relativity, whereby the Riemann curvature tensor is generated by an unknown third-order tensor potential field called the Lanczos tensor. Our approach is based on the theory of jet bundles, where all field variables and all their partial derivatives of all relevant orders are treated as independent variables alongside the local manifold coordinates (xa) on the given space-time manifold M. This approach is adopted in (a) Cartan's method of exterior differential systems, (b) Vessiot's dual method using vector field systems, and (c) the Janet-Riquier theory of systems of partial differential equations. All three methods allow for the most general situations under which integrability conditions can be found. They give equivalent results, namely, that involutivity is always achieved at all generic points of the jet manifold M after a finite number of prolongations. Two alternative methods that appear in the general relativity literature to find integrability conditions for the Riemann-Lanczos equations generate new partial differential equations for the Lanczos potential that introduce a source term, which is nonlinear in the components of the Riemann tensor. We show that such sources do not occur when any of methods (a), (b), or (c) is used.

  17. Integration of an OWL-DL knowledge base with an EHR prototype and providing customized information.

    PubMed

    Jing, Xia; Kay, Stephen; Marley, Tom; Hardiker, Nicholas R

    2014-09-01

    When clinicians use electronic health record (EHR) systems, their ability to obtain general knowledge is often an important contribution to their ability to make more informed decisions. In this paper we describe a method by which an external, formal representation of clinical and molecular genetic knowledge can be integrated into an EHR such that customized knowledge can be delivered to clinicians in a context-appropriate manner. Web Ontology Language-Description Logic (OWL-DL) is a formal knowledge representation language that is widely used for creating, organizing and managing biomedical knowledge through the use of explicit definitions, consistent structure and a computer-processable format, particularly in biomedical fields. In this paper we describe: 1) integration of an OWL-DL knowledge base with a standards-based EHR prototype, 2) presentation of customized information from the knowledge base via the EHR interface, and 3) lessons learned via the process. The integration was achieved through a combination of manual and automatic methods. Our method has advantages for scaling up to and maintaining knowledge bases of any size, with the goal of assisting clinicians and other EHR users in making better informed health care decisions.

  18. Two-dimensional phase unwrapping using robust derivative estimation and adaptive integration.

    PubMed

    Strand, Jarle; Taxt, Torfinn

    2002-01-01

    The adaptive integration (ADI) method for two-dimensional (2-D) phase unwrapping is presented. The method uses an algorithm for noise robust estimation of partial derivatives, followed by a noise robust adaptive integration process. The ADI method can easily unwrap phase images with moderate noise levels, and the resulting images are congruent modulo 2pi with the observed, wrapped, input images. In a quantitative evaluation, both the ADI and the BLS methods (Strand et al.) were better than the least-squares methods of Ghiglia and Romero (GR), and of Marroquin and Rivera (MRM). In a qualitative evaluation, the ADI, the BLS, and a conjugate gradient version of the MRM method (MRMCG), were all compared using a synthetic image with shear, using 115 magnetic resonance images, and using 22 fiber-optic interferometry images. For the synthetic image and the interferometry images, the ADI method gave consistently visually better results than the other methods. For the MR images, the MRMCG method was best, and the ADI method second best. The ADI method was less sensitive to the mask definition and the block size than the BLS method, and successfully unwrapped images with shears that were not marked in the masks. The computational requirements of the ADI method for images of nonrectangular objects were comparable to only two iterations of many least-squares-based methods (e.g., GR). We believe the ADI method provides a powerful addition to the ensemble of tools available for 2-D phase unwrapping.
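    The ADI method itself is two-dimensional (robust derivative estimation followed by adaptive integration); the 1-D routine below only illustrates the elementary unwrapping step such methods build on: wrap each neighbouring phase difference into (-pi, pi] and re-integrate. The test signal is an illustrative assumption.

```python
import math

def unwrap_1d(phases):
    """Unwrap a 1-D phase signal so it varies smoothly (congruent mod 2*pi)."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        # Wrap the difference to the nearest equivalent in (-pi, pi].
        d -= 2 * math.pi * round(d / (2 * math.pi))
        out.append(out[-1] + d)
    return out

# A linear ramp wrapped into (-pi, pi] is recovered as a straight line.
true = [0.9 * i for i in range(10)]
wrapped = [math.atan2(math.sin(t), math.cos(t)) for t in true]
recovered = unwrap_1d(wrapped)
```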

  19. Concordant integrative gene set enrichment analysis of multiple large-scale two-sample expression data sets.

    PubMed

    Lai, Yinglei; Zhang, Fanni; Nayak, Tapan K; Modarres, Reza; Lee, Norman H; McCaffrey, Timothy A

    2014-01-01

    Gene set enrichment analysis (GSEA) is an important approach to the analysis of coordinate expression changes at a pathway level. Although many statistical and computational methods have been proposed for GSEA, the issue of a concordant integrative GSEA of multiple expression data sets has not been well addressed. Among different related data sets collected for the same or similar study purposes, it is important to identify pathways or gene sets with concordant enrichment. We categorize the underlying true states of differential expression into three representative categories: no change, positive change and negative change. Due to data noise, what we observe from experiments may not indicate the underlying truth. Although these categories are not observed in practice, they can be considered in a mixture model framework. Then, we define the mathematical concept of concordant gene set enrichment and calculate its related probability based on a three-component multivariate normal mixture model. The related false discovery rate can be calculated and used to rank different gene sets. We used three published lung cancer microarray gene expression data sets to illustrate our proposed method. One analysis based on the first two data sets was conducted to compare our result with a previous published result based on a GSEA conducted separately for each individual data set. This comparison illustrates the advantage of our proposed concordant integrative gene set enrichment analysis. Then, with a relatively new and larger pathway collection, we used our method to conduct an integrative analysis of the first two data sets and also all three data sets. Both results showed that many gene sets could be identified with low false discovery rates. A consistency between both results was also observed. A further exploration based on the KEGG cancer pathway collection showed that a majority of these pathways could be identified by our proposed method. 
This study illustrates that we can improve detection power and discovery consistency through a concordant integrative analysis of multiple large-scale two-sample gene expression data sets.
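    As a deliberately oversimplified illustration of "concordant enrichment" (the paper uses a three-component normal mixture model with posterior probabilities and false discovery rates, not thresholding), the sketch below classifies each data set's enrichment score into one of the three states named above and calls a gene set concordant when all data sets agree on a non-null state. The scores and the cutoff are invented for illustration.

```python
def state(score, cutoff=1.0):
    """Map an enrichment score to -1 (negative change), 0 (no change),
    or +1 (positive change) by simple thresholding."""
    if score > cutoff:
        return +1
    if score < -cutoff:
        return -1
    return 0

def concordant(scores_per_dataset, cutoff=1.0):
    """True when every data set agrees on the same non-null state."""
    states = [state(s, cutoff) for s in scores_per_dataset]
    return states[0] != 0 and all(s == states[0] for s in states)

gene_sets = {
    "cell_cycle": [2.3, 1.8, 2.9],     # consistently up
    "apoptosis":  [-1.6, -2.1, -1.3],  # consistently down
    "ribosome":   [1.9, -0.2, 2.4],    # discordant across data sets
}
hits = sorted(name for name, s in gene_sets.items() if concordant(s))
```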

  20. Development of Mobile Electronic Health Records Application in a Secondary General Hospital in Korea

    PubMed Central

    Park, Min Ah; Hong, Eunseok; Kim, Sunhyu; Ahn, Ryeok; Hong, Jungseok; Song, Seungyeol; Kim, Tak; Kim, Jeongkeun; Yeo, Seongwoon

    2013-01-01

    Objectives The recent evolution of mobile devices has opened new possibilities of providing strongly integrated mobile services in healthcare. The objective of this paper is to describe the decision driver, development, and implementation of an integrated mobile Electronic Health Record (EHR) application at Ulsan University Hospital. This application helps healthcare providers view patients' medical records and information without a stationary computer workstation. Methods We developed an integrated mobile application prototype that aimed to improve the mobility and usability of healthcare providers during their daily medical activities. The Android and iOS platforms were used to create the mobile EHR application. The first working version was completed in 5 months and required 1,080 development hours. Results The mobile EHR application provides patient vital signs, patient data, text communication, and integrated EHR. The application allows our healthcare providers to know the status of patients within and outside the hospital environment. The application provides a consistent user environment on several compatible Android and iOS devices. A group of 10 beta testers has consistently used and maintained our copy of the application, suggesting user acceptance. Conclusions We are developing the integrated mobile EHR application with the goals of implementing an environment that is user-friendly, implementing a patient-centered system, and increasing the hospital's competitiveness. PMID:24523996

  1. The Implications of Using Integrated Software Support Environment for Design of Guidance and Control Systems Software

    DTIC Science & Technology

    1990-02-01

    inspections are performed before each formal review of each software life cycle phase. * Required software audits are performed. * The software is acceptable… Audits: Software audits are performed by SQA consistent with the general audit rules and an audit report is prepared. Software Quality Inspection (SQI…DSD Software Development Method 3-34 DEFINITION OF ACRONYMS Acronym Full Name or Description MACH Méthode d'Analyse et de Conception Hiérarchisée

  2. Regional and National Grid Integration Studies Consistently Show Higher

    Science.gov Websites

    Levels of Renewables Are Possible | Energy Analysis | NREL. Analysis Insights: April 2015

  3. Monitoring and evaluating the quality consistency of Compound Bismuth Aluminate tablets by a simple quantified ratio fingerprint method combined with simultaneous determination of five compounds and correlated with antioxidant activities.

    PubMed

    Liu, Yingchun; Liu, Zhongbo; Sun, Guoxiang; Wang, Yan; Ling, Junhong; Gao, Jiayue; Huang, Jiahao

    2015-01-01

    A combination method of multi-wavelength fingerprinting and multi-component quantification by high performance liquid chromatography (HPLC) coupled with diode array detector (DAD) was developed and validated to monitor and evaluate the quality consistency of herbal medicines (HM) in the classical preparation Compound Bismuth Aluminate tablets (CBAT). The validation results demonstrated that our method met the requirements of fingerprint analysis and quantification analysis with suitable linearity, precision, accuracy, limits of detection (LOD) and limits of quantification (LOQ). In the fingerprint assessments, rather than using conventional qualitative "Similarity" as a criterion, the simple quantified ratio fingerprint method (SQRFM) was recommended, which has an important quantified fingerprint advantage over the "Similarity" approach. SQRFM qualitatively and quantitatively offers the scientific criteria for traditional Chinese medicines (TCM)/HM quality pyramid and warning gate in terms of three parameters. In order to combine the comprehensive characterization of multi-wavelength fingerprints, an integrated fingerprint assessment strategy based on information entropy was set up involving a super-information characteristic digitized parameter of fingerprints, which reveals the total entropy value and absolute information amount about the fingerprints and, thus, offers an excellent method for fingerprint integration. The correlation results between quantified fingerprints and quantitative determination of 5 marker compounds, including glycyrrhizic acid (GLY), liquiritin (LQ), isoliquiritigenin (ILG), isoliquiritin (ILQ) and isoliquiritin apioside (ILA), indicated that multi-component quantification could be replaced by quantified fingerprints. The Fenton reaction was employed to determine the antioxidant activities of CBAT samples in vitro, and they were correlated with HPLC fingerprint components using the partial least squares regression (PLSR) method. 
In summary, the method of multi-wavelength fingerprints combined with antioxidant activities has been proved to be a feasible and scientific procedure for monitoring and evaluating the quality consistency of CBAT.

  4. A Spiking Neural Simulator Integrating Event-Driven and Time-Driven Computation Schemes Using Parallel CPU-GPU Co-Processing: A Case Study.

    PubMed

    Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo

    2015-07-01

    Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.
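    The efficiency argument for event-driven simulation is that quiet neurons cost nothing: work is driven by a time-ordered spike queue rather than a global clock. The toy loop below, with an invented integrate-and-fire rule and a fixed synaptic delay, is a hedged illustration of that scheme, not the simulator described above.

```python
import heapq

def simulate(spikes, connections, threshold=2.0, delay=1.0, t_max=10.0):
    """Event-driven toy: process (time, neuron) spike events from a heap.

    spikes: initial (time, neuron) events; connections: neuron -> target list.
    Each delivered event adds 1.0 to the target's potential; on reaching the
    threshold the neuron fires, resets, and schedules delayed output events.
    """
    queue = list(spikes)
    heapq.heapify(queue)
    potential, fired = {}, []
    while queue:
        t, n = heapq.heappop(queue)
        if t > t_max:
            break
        potential[n] = potential.get(n, 0.0) + 1.0
        if potential[n] >= threshold:
            potential[n] = 0.0
            fired.append((t, n))
            for target in connections.get(n, []):
                heapq.heappush(queue, (t + delay, target))
    return fired

# 'a' receives two input spikes, fires once, and drives 'b' twice.
fired = simulate([(0.0, 'a'), (0.5, 'a')], {'a': ['b', 'b']})
```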

  5. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    NASA Astrophysics Data System (ADS)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing data security sciences, and combining them is one way to improve data integrity. New techniques are created by combining several algorithms, one of which is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field; it consists of dots and lines. This combination is a new concept joining modern and classical methods to maintain data integrity. The combination of these three methods is expected to yield a new algorithm that improves the security of data, especially images.
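    Of the three combined techniques, the Hill cipher is the most compactly illustrated. The sketch below implements a standard 2x2 Hill cipher over the 26-letter alphabet; the key matrix is a textbook example chosen so its determinant is invertible mod 26 (a requirement for decryption), and the Morse and LSB stages of the combined scheme are not shown.

```python
KEY = [[3, 3], [2, 5]]  # det = 9; gcd(9, 26) = 1, so the key is invertible

def hill_encrypt(plaintext, key=KEY):
    """Encrypt A-Z text with a 2x2 Hill cipher (ciphertext = K * block mod 26)."""
    nums = [ord(c) - ord('A') for c in plaintext.upper()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))  # pad odd-length input with 'X'
    out = []
    for i in range(0, len(nums), 2):
        a, b = nums[i], nums[i + 1]
        out.append((key[0][0] * a + key[0][1] * b) % 26)
        out.append((key[1][0] * a + key[1][1] * b) % 26)
    return ''.join(chr(n + ord('A')) for n in out)

cipher = hill_encrypt("HELP")  # the classic textbook example
```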

  6. A Hybrid 3D Indoor Space Model

    NASA Astrophysics Data System (ADS)

    Jamali, Ali; Rahman, Alias Abdul; Boguslawski, Pawel

    2016-10-01

    GIS integrates spatial information and spatial analysis. An important example of such integration is emergency response, which requires route planning inside and outside of a building. Route planning requires detailed information on the indoor and outdoor environment. Indoor navigation network models, including the Geometric Network Model (GNM), the Navigable Space Model, the sub-division model and the regular-grid model, lack indoor data sources and abstraction methods. In this paper, a hybrid indoor space model is proposed. In the proposed method, 3D modeling of the indoor navigation network is based on surveying control points and is less dependent on a 3D geometrical building model. This research proposes a method of indoor space modeling for buildings that do not have proper 2D/3D geometrical models or that lack semantic or topological information. The proposed hybrid model consists of topological, geometrical and semantic spaces.
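    Once an indoor space is abstracted into a navigation network, route planning reduces to shortest-path search on a graph of rooms and corridor junctions. The tiny floor-plan graph below is an invented example, and Dijkstra's algorithm is a standard choice rather than one prescribed by the paper.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over an adjacency-list graph of (node, weight)."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float('inf'), []

# Hypothetical floor plan: edge weights are walking distances in metres.
floor = {
    "lobby":   [("hall", 5.0), ("stairs", 8.0)],
    "hall":    [("room101", 3.0), ("stairs", 2.0)],
    "stairs":  [("room101", 6.0)],
    "room101": [],
}
cost, route = shortest_path(floor, "lobby", "room101")
```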

  7. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2016-09-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complimentary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques of Protein Structure prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.
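    One generic way to combine many quality assessment methods, sketched below with invented scores, is rank aggregation: each method ranks the candidate models, and models are ordered by total rank. This illustrates only the ensemble idea; MULTICOM's actual integration and clustering scheme is more elaborate.

```python
from collections import defaultdict

def consensus_rank(scores_by_method):
    """scores_by_method: {method: {model: score}}, higher score = better.
    Returns model names ordered by best (lowest) total rank."""
    totals = defaultdict(float)
    for scores in scores_by_method.values():
        ordered = sorted(scores, key=scores.get, reverse=True)
        for rank, model in enumerate(ordered):
            totals[model] += rank
    return sorted(totals, key=totals.get)

# Hypothetical scores from three quality assessment methods for three models.
methods = {
    "qa_single_1": {"m1": 0.81, "m2": 0.78, "m3": 0.60},
    "qa_single_2": {"m1": 0.70, "m2": 0.74, "m3": 0.55},
    "qa_multi":    {"m1": 0.65, "m2": 0.77, "m3": 0.52},
}
ranking = consensus_rank(methods)  # m2 wins two of three methods
```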

  8. A multiscale Bayesian data integration approach for mapping air dose rates around the Fukushima Daiichi Nuclear Power Plant.

    PubMed

    Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki

    2017-02-01

    This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities as well as large discrepancy among different surveys due to such heterogeneity. Copyright © 2016 Elsevier Ltd. All rights reserved.
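    The full geostatistical model is beyond the scope of an abstract, but one core Bayesian ingredient, weighting co-located measurements by their precision, can be sketched as follows. All dose rates and uncertainties here are hypothetical:

```python
import numpy as np

def fuse(means, sigmas):
    """Precision-weighted (inverse-variance) fusion of co-located
    dose-rate estimates from surveys with different accuracies.
    Returns the fused mean and its standard deviation."""
    means = np.asarray(means, float)
    w = 1.0 / np.asarray(sigmas, float) ** 2   # precisions
    fused_mean = np.sum(w * means) / np.sum(w)
    fused_sigma = np.sqrt(1.0 / np.sum(w))
    return fused_mean, fused_sigma

# Hypothetical air dose rates (uSv/h) at one grid cell from
# airborne (coarse, least accurate), car, and walk surveys:
m, s = fuse([0.50, 0.42, 0.40], [0.10, 0.05, 0.02])
```

    The fused estimate is pulled toward the most accurate survey, and the fused uncertainty is smaller than any single survey's, which is the mechanism behind the confidence intervals mentioned above.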

  9. Design Study: Integer Subtraction Operation Teaching Learning Using Multimedia in Primary School

    ERIC Educational Resources Information Center

    Aris, Rendi Muhammad; Putri, Ratu Ilma Indra

    2017-01-01

    This study aims to develop a learning trajectory to help students understand the concept of subtraction of integers using multimedia in the fourth grade. The study applies thematic integrative learning under Curriculum 2013, based on PMRI. The method used is design research, consisting of three stages: preparing for the experiment, the design experiment, and retrospective…

  10. Aligned Carbon Nanotubes for Highly Efficient Energy Generation and Storage Devices

    DTIC Science & Technology

    2012-01-24

    solution processing methods, including filtration, solution-casting, electrophoretic deposition, and Langmuir-Blodgett deposition. However, most...supercapacitors with environmentally friendly ionic liquid electrolytes. These new nanocomposite electrodes consist of the high-surface-area activated...carbons, carbon nanotubes, and ionic liquids as the integrated constituent components. The resultant composites show significantly improved charge

  11. Validating the Alcohol Use Disorders Identification Test with Persons Who Have a Serious Mental Illness

    ERIC Educational Resources Information Center

    O'Hare, Thomas; Sherrer, Margaret V.; LaButti, Annamaria; Emrick, Kelly

    2004-01-01

    Objective/Method: The use of brief, reliable, valid, and practical measures of substance use is critical for conducting individual assessments and program evaluation for integrated mental health-substance abuse services for persons with serious mental illness. This investigation examines the internal consistency reliability, concurrent validity,…

  12. APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System

    NASA Technical Reports Server (NTRS)

    Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren

    2004-01-01

    The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: a) Flight Data Importation and b) Flight Data Analysis.

  13. Airfoil Design Using a Coupled Euler and Integral Boundary Layer Method with Adjoint Based Sensitivities

    NASA Technical Reports Server (NTRS)

    Edwards, S.; Reuther, J.; Chattot, J. J.

    1997-01-01

    The objective of this paper is to present a control theory approach for the design of airfoils in the presence of viscous compressible flows. A coupled system of the integral boundary layer and the Euler equations is solved to provide rapid flow simulations. An adjoint approach consistent with the complete coupled state equations is employed to obtain the sensitivities needed to drive a numerical optimization algorithm. Design to a target pressure distribution is demonstrated on an RAE 2822 airfoil at transonic speed.

  14. Steady and unsteady three-dimensional transonic flow computations by integral equation method

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1994-01-01

    This is the final technical report of the research performed under the grant: NAG1-1170, from the National Aeronautics and Space Administration. The report consists of three parts. The first part presents the work on unsteady flows around a zero-thickness wing. The second part presents the work on steady flows around non-zero thickness wings. The third part presents the massively parallel processing implementation and performance analysis of integral equation computations. At the end of the report, publications resulting from this grant are listed and attached.

  15. COMPREHENSIVE ASSESSMENT OF COMPLEX TECHNOLOGIES: INTEGRATING VARIOUS ASPECTS IN HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Lysdahl, Kristin Bakke; Mozygemba, Kati; Burns, Jacob; Brönneke, Jan Benedikt; Chilcott, James B; Ward, Sue; Hofmann, Bjørn

    2017-01-01

    Despite recent development of health technology assessment (HTA) methods, there are still methodological gaps for the assessment of complex health technologies. The INTEGRATE-HTA guidance for effectiveness, economic, ethical, socio-cultural, and legal aspects deals with challenges when assessing complex technologies, such as heterogeneous study designs, multiple stakeholder perspectives, and unpredictable outcomes. The objective of this article is to outline this guidance and describe the added value of integrating these assessment aspects. Different methods were used to develop the various parts of the guidance, but all draw on existing, published knowledge and were supported by stakeholder involvement. The guidance was modified after application in a case study and in response to feedback from internal and external reviewers. The guidance consists of five parts, addressing five core aspects of HTA, all presenting stepwise approaches based on the assessment of complexity, context, and stakeholder involvement. The guidance on the effectiveness, health-economic, and ethical aspects focuses on helping users choose appropriate existing methods, or develop them further. The recommendations are based on existing methods' applicability for dealing with problems arising with complex interventions. The guidance offers new frameworks to identify socio-cultural and legal issues, along with overviews of relevant methods and sources. The INTEGRATE-HTA guidance outlines a wide range of methods and facilitates appropriate choices among them. The guidance enables understanding of how complexity matters for HTA and brings together assessments from disciplines such as epidemiology, economics, ethics, law, and social theory. This indicates relevance for a broad range of technologies.

  16. Contributions of cultural services to the ecosystem services agenda

    PubMed Central

    Daniel, Terry C.; Muhar, Andreas; Arnberger, Arne; Aznar, Olivier; Boyd, James W.; Chan, Kai M. A.; Costanza, Robert; Elmqvist, Thomas; Flint, Courtney G.; Gobster, Paul H.; Grêt-Regamey, Adrienne; Lave, Rebecca; Muhar, Susanne; Penker, Marianne; Ribe, Robert G.; Schauppenlehner, Thomas; Sikor, Thomas; Soloviy, Ihor; Spierenburg, Marja; Taczanowska, Karolina; Tam, Jordan; von der Dunk, Andreas

    2012-01-01

    Cultural ecosystem services (ES) are consistently recognized but not yet adequately defined or integrated within the ES framework. A substantial body of models, methods, and data relevant to cultural services has been developed within the social and behavioral sciences before and outside of the ES approach. A selective review of work in landscape aesthetics, cultural heritage, outdoor recreation, and spiritual significance demonstrates opportunities for operationally defining cultural services in terms of socioecological models, consistent with the larger set of ES. Such models explicitly link ecological structures and functions with cultural values and benefits, facilitating communication between scientists and stakeholders and enabling economic, multicriterion, deliberative evaluation and other methods that can clarify tradeoffs and synergies involving cultural ES. Based on this approach, a common representation is offered that frames cultural services, along with all ES, by the relative contribution of relevant ecological structures and functions and by applicable social evaluation approaches. This perspective provides a foundation for merging ecological and social science epistemologies to define and integrate cultural services better within the broader ES framework. PMID:22615401

  17. High temperature integrated ultrasonic shear and longitudinal wave probes

    NASA Astrophysics Data System (ADS)

    Ono, Y.; Jen, C.-K.; Kobayashi, M.

    2007-02-01

    Integrated ultrasonic shear wave probes have been designed and developed using a mode conversion theory for nondestructive testing and characterization at elevated temperatures. The probes consisted of metallic substrates and high-temperature piezoelectric thick (>40 μm) films deposited by a paint-on method. Shear waves are generated by mode conversion from longitudinal waves upon reflection inside a substrate with a specific shape. A novel design scheme is proposed to reduce the machining time of substrates and the difficulty of thick film fabrication. A probe simultaneously generating and receiving both longitudinal and shear waves is also developed and demonstrated. In addition, a shear wave probe using a clad buffer rod consisting of an aluminum core and stainless steel cladding has been developed. All the probes were tested and successfully operated at 150°C.

  18. Dimensionality and integrals of motion of the Trappist-1 planetary system

    NASA Astrophysics Data System (ADS)

    Floß, Johannes; Rein, Hanno; Brumer, Paul

    2018-04-01

    The number of isolating integrals of motion of the Trappist-1 system - a late M-dwarf orbited by seven Earth-sized planets - was determined numerically, using an adapted version of the correlation dimension method. It was found that over the investigated time-scales of up to 20 000 years the number of isolating integrals of motion is the same as one would find for a system of seven non-interacting planets - despite the fact that the planets in the Trappist-1 system are strongly interacting. Considering perturbed versions of the Trappist-1 system shows that the system may occupy an atypical part of phase-space with high stability. These findings are consistent with earlier studies.
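    A minimal sketch of the underlying correlation dimension estimator (the Grassberger-Procaccia correlation sum, whose log-log slope gives the dimension), applied here to a synthetic curve rather than orbital data:

```python
import numpy as np

def correlation_dimension(points, r_lo, r_hi, n_r=10):
    """Grassberger-Procaccia estimate of the correlation dimension:
    the slope of log C(r) versus log r, where C(r) is the fraction
    of point pairs separated by less than r."""
    pts = np.asarray(points, float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    dists = d[np.triu_indices(len(pts), k=1)]   # unique pairs
    rs = np.logspace(np.log10(r_lo), np.log10(r_hi), n_r)
    C = np.array([(dists < r).mean() for r in rs])
    slope, _ = np.polyfit(np.log(rs), np.log(C), 1)
    return slope

# Points sampled on a 1D curve embedded in 3D: dimension should be ~1.
rng = np.random.default_rng(0)
t = rng.uniform(0, 1, 500)
line3d = np.stack([t, 2 * t, -t], axis=1)
dim = correlation_dimension(line3d, 0.05, 0.5)
```

    For a trajectory of a planetary system, the points would be phase-space samples, and the number of isolating integrals follows from the gap between the phase-space dimension and the measured correlation dimension.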

  20. A method to deconvolve stellar rotational velocities II. The probability distribution function via Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia

    2016-10-01

    Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin I, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, Lucy estimation lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin I data directly without the need for any convergence criteria.
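    The core computation, discretizing the Fredholm integral and solving the Tikhonov-regularized least-squares problem, can be sketched as follows. The kernel, noise level, and regularization parameter are illustrative choices, not those of the paper:

```python
import numpy as np

def tikhonov_solve(K, g, lam):
    """Solve the discretized Fredholm equation K f = g by Tikhonov
    regularization: minimize ||K f - g||^2 + lam * ||f||^2, whose
    normal equations are (K^T K + lam I) f = K^T g."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)

# Toy ill-posed problem: a Gaussian smoothing kernel and noisy data.
n = 80
x = np.linspace(0, 1, n)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05 ** 2))
K *= x[1] - x[0]                          # quadrature weight
f_true = np.exp(-((x - 0.5) ** 2) / (2 * 0.1 ** 2))
rng = np.random.default_rng(1)
g = K @ f_true + 1e-5 * rng.standard_normal(n)
f_reg = tikhonov_solve(K, g, lam=1e-6)
err = np.linalg.norm(f_reg - f_true) / np.linalg.norm(f_true)
```

    Without the lam term the inversion amplifies the noise catastrophically; with it, the smooth underlying distribution is recovered, which is the sense in which the estimator is consistent.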

  1. Integral Equation Method for Electromagnetic Wave Propagation in Stratified Anisotropic Dielectric-Magnetic Materials

    NASA Astrophysics Data System (ADS)

    Shu, Wei-Xing; Fu, Na; Lü, Xiao-Fang; Luo, Hai-Lu; Wen, Shuang-Chun; Fan, Dian-Yuan

    2010-11-01

    We investigate the propagation of electromagnetic waves in stratified anisotropic dielectric-magnetic materials using the integral equation method (IEM). Based on the superposition principle, we use Hertz vector formulations of radiated fields to study the interaction of wave with matter. We derive in a new way the dispersion relation, Snell's law and reflection/transmission coefficients by self-consistent analyses. Moreover, we find two new forms of the generalized extinction theorem. Applying the IEM, we investigate the wave propagation through a slab and disclose the underlying physics, which are further verified by numerical simulations. The results lead to a unified framework of the IEM for the propagation of wave incident either from a medium or vacuum in stratified dielectric-magnetic materials.

  2. FAST NEUTRON DOSIMETER FOR HIGH TEMPERATURE OPERATION BY MEASUREMENT OF THE AMOUNT OF CESIUM 137 FORMED FROM A THORIUM WIRE

    DOEpatents

    McCune, D.A.

    1964-03-17

    A method and device for measurement of integrated fast neutron flux in the presence of a large thermal neutron field are described. The device comprises a thorium wire surrounded by a thermal neutron attenuator that is, in turn, enclosed by heat-resistant material. The method consists of irradiating the device in a neutron field whereby neutrons with energies in excess of 1.1 Mev cause fast fissions in the thorium, then removing the thorium wire, separating the cesium-137 fission product by chemical means from the thorium, and finally counting the radioactivity of the cesium to determine the number of fissions which have occurred so that the integrated fast flux may be obtained. (AEC)
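    The final conversion in the method follows from fissions = fluence × N_Th × σ_f, so the integrated fast flux is the fission count divided by the number of thorium atoms times the fast-fission cross section. A back-of-envelope sketch in which the wire mass, cross section, and fission count are all hypothetical:

```python
# All numbers below (wire mass, cross section, fission count) are
# hypothetical illustrations, not values from the patent.
AVOGADRO = 6.022e23
mass_g = 0.10                          # thorium wire mass (g)
atoms = mass_g / 232.0 * AVOGADRO      # Th-232 atoms in the wire
sigma_f = 0.08e-24                     # assumed effective fast-fission
                                       # cross section (cm^2)
fissions = 1.0e9                       # inferred from Cs-137 counting
fluence = fissions / (atoms * sigma_f)  # integrated fast flux (n/cm^2)
```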

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaccarono, Mattia; Bechini, Renzo; Chandrasekar, Chandra V.

    The stability of weather radar calibration is a mandatory aspect for quantitative applications, such as rainfall estimation, short-term weather prediction and initialization of numerical atmospheric and hydrological models. Over the years, calibration monitoring techniques based on external sources have been developed, specifically calibration using the Sun and calibration based on ground clutter returns. In this paper, these two techniques are integrated and complemented with a self-consistency procedure and an intercalibration technique. The aim of the integrated approach is to implement a robust method for online monitoring, able to detect significant changes in the radar calibration. The physical consistency of polarimetric radar observables is exploited using the self-consistency approach, based on the expected correspondence between dual-polarization power and phase measurements in rain. This technique allows a reference absolute value to be provided for the radar calibration, from which any deviations may be detected using the other procedures. In particular, the ground clutter calibration is implemented on both polarization channels (horizontal and vertical) for each radar scan, allowing the polarimetric variables to be monitored and hardware failures to be promptly recognized. The Sun calibration allows monitoring the calibration and sensitivity of the radar receiver, in addition to the antenna pointing accuracy. It is applied using observations collected during the standard operational scans but requires long integration times (several days) in order to accumulate a sufficient amount of useful data. Finally, an intercalibration technique is developed and performed to compare colocated measurements collected in rain by two radars in overlapping regions. The integrated approach is performed on the C-band weather radar network in northwestern Italy, during July–October 2014. The set of methods considered appears suitable to establish an online tool to monitor the stability of the radar calibration with an accuracy of about 2 dB. In conclusion, this is considered adequate to automatically detect any unexpected change in the radar system requiring further data analysis or on-site measurements.

  4. Self-consistency in Bicultural Persons: Dialectical Self-beliefs Mediate the Relation between Identity Integration and Self-consistency

    PubMed Central

    Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.

    2017-01-01

    Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052

  6. Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems

    NASA Astrophysics Data System (ADS)

    Igaki, Hiroshi; Nakamura, Masahide

    This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
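    The offline detection of appliance interactions can be sketched directly from this model: a service is a sequence of (appliance, method) invocations, and two services interact when they invoke conflicting methods on the same appliance. The conflict table and services below are hypothetical:

```python
from itertools import product

# Hypothetical table of method pairs that conflict on one appliance.
CONFLICTS = {("power_on", "power_off"), ("power_off", "power_on"),
             ("heat", "cool"), ("cool", "heat")}

def appliance_interactions(service_a, service_b):
    """Return pairs of invocations from two services that invoke
    conflicting methods on the same appliance (an 'appliance
    interaction' in the paper's terminology)."""
    return [(ia, ib)
            for ia, ib in product(service_a, service_b)
            if ia[0] == ib[0] and (ia[1], ib[1]) in CONFLICTS]

# Two hypothetical integrated services:
movie_service = [("tv", "power_on"), ("aircon", "cool"), ("light", "power_off")]
sleep_service = [("light", "power_off"), ("aircon", "heat")]
fis = appliance_interactions(movie_service, sleep_service)
```

    Environment interactions would additionally require a model of how each method reads and writes shared environment properties (e.g., temperature), with conflicts detected on those properties instead of on the appliance itself.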

  7. Translation and Initial Validation of the Chinese (Cantonese) Version of Community Integration Measure for Use in Patients with Chronic Stroke

    PubMed Central

    Ng, Shamay S. M.; Ng, Gabriel Y. F.

    2014-01-01

    Objectives. To (1) translate and culturally adapt the English version Community Integration Measure into Chinese (Cantonese), (2) report the results of initial validation of the Chinese (Cantonese) version of CIM (CIM-C) including the content validity, internal consistency, test-retest reliability, and factor structure of CIM-C for use in stroke survivors in a Chinese community setting, and (3) investigate the level of community integration of stroke survivors living in Hong Kong. Design. Cross-sectional study. Setting. University-based rehabilitation centre. Participants. 62 (n = 62) subjects with chronic stroke. Methods. The CIM-C was produced after forward-backward translation, expert panel review, and pretesting. 25 (n = 25) of the same subjects were reassessed after a 1-week interval. Results. The items of the CIM-C demonstrated high internal consistency with a Cronbach's α of 0.84. The CIM-C showed good test-retest reliability with an intraclass correlation coefficient (ICC) of 0.84 (95% confidence interval, 0.64–0.93). A 3-factor structure of the CIM-C including “relationship and engagement,” “sense of knowing,” and “independent living,” was consistent with the original theoretical model. Hong Kong stroke survivors revealed a high level of community integration as measured by the CIM-C (mean (SD): 43.48 (5.79)). Conclusions. The CIM-C is a valid and reliable measure for clinical use. PMID:24995317

  8. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  9. Research of the orbital evolution of asteroid 2012 DA14 (in Russian)

    NASA Astrophysics Data System (ADS)

    Zausaev, A. F.; Denisov, S. S.; Derevyanka, A. E.

    The orbital evolution of asteroid 2012 DA14 is studied over the time interval from 1800 to 2206; close approaches of the object to Earth and the Moon are detected, and the probability of an impact with Earth is calculated. The mathematical model used is consistent with DE405; the integration was performed using a modified 27th-order Everhart method, and the collision probability was calculated using the Monte Carlo method.
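    The Monte Carlo step has a simple generic structure: propagate many orbital clones drawn from the observational uncertainty and report the fraction whose minimum Earth-approach distance falls below a threshold. In this sketch the propagation is a stand-in random model, not a real high-order integrator such as Everhart's method:

```python
import random

def min_approach_distance(clone_index, rng):
    # Stand-in for numerical propagation of one orbital clone:
    # returns a hypothetical minimum approach distance in km.
    return rng.lognormvariate(10.5, 1.0)

def impact_probability(n_clones, threshold_km=6371.0, seed=42):
    """Fraction of propagated clones approaching closer than the
    threshold (here the Earth's radius, i.e., an impact)."""
    rng = random.Random(seed)
    hits = sum(min_approach_distance(i, rng) < threshold_km
               for i in range(n_clones))
    return hits / n_clones

p = impact_probability(100_000)
```

    In a real study, each clone's initial state would be sampled from the orbit-determination covariance and propagated with the full dynamical model; only the counting step is as simple as shown here.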

  10. A nudging-based data assimilation method: the Back and Forth Nudging (BFN) algorithm

    NASA Astrophysics Data System (ADS)

    Auroux, D.; Blum, J.

    2008-03-01

    This paper deals with a new data assimilation algorithm, called Back and Forth Nudging. The standard nudging technique consists in adding to the equations of the model a relaxation term that is supposed to force the model towards the observations. The BFN algorithm consists in repeatedly performing forward and backward integrations of the model with relaxation (or nudging) terms, using opposite signs in the direct and inverse integrations, so as to make the backward evolution numerically stable. This algorithm has first been tested on the standard Lorenz model with discrete observations (perfect or noisy) and compared with the variational assimilation method. The same type of study has then been performed on the viscous Burgers equation, comparing again with the variational method and focusing on the time evolution of the reconstruction error, i.e. the difference between the reference trajectory and the identified one over a time period composed of an assimilation period followed by a prediction period. The possible use of the BFN algorithm as an initialization for the variational method has also been investigated. Finally the algorithm has been tested on a layered quasi-geostrophic model with sea-surface height observations. The behaviours of the two algorithms have been compared in the presence of perfect or noisy observations, and also for imperfect models. This has allowed us to reach a conclusion concerning the relative performances of the two algorithms.
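    The forward/backward structure of BFN, with the nudging sign flipped on the backward sweep so that the reversed integration stays stable, can be sketched on a scalar toy model dx/dt = alpha*x. This illustrates the sign convention only, not the paper's Lorenz or quasi-geostrophic experiments:

```python
import numpy as np

def bfn_scalar(y_obs, t, alpha=0.5, K=2.0, x0_guess=5.0, n_iter=3):
    """Back and Forth Nudging on dx/dt = alpha * x.
    Forward sweep adds +K*(y - x); backward sweep uses the opposite
    sign so the error still decays when integrating in reverse.
    Returns the estimate of the initial condition x(0)."""
    dt = t[1] - t[0]
    x0 = x0_guess
    for _ in range(n_iter):
        x = x0
        for k in range(len(t) - 1):              # forward (Euler)
            x += dt * (alpha * x + K * (y_obs[k] - x))
        for k in range(len(t) - 1, 0, -1):       # backward (Euler)
            x += -dt * (alpha * x - K * (y_obs[k] - x))
        x0 = x
    return x0

t = np.linspace(0.0, 2.0, 201)
truth0 = 1.0
y = truth0 * np.exp(0.5 * t)   # perfect observations of the truth
x0_est = bfn_scalar(y, t)
```

    Starting from a badly wrong first guess (5.0), a few back-and-forth sweeps recover the true initial condition (1.0), which is exactly the role variational assimilation plays in the comparison above.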

  11. Alignment method for parabolic trough solar concentrators

    DOEpatents

    Diver, Richard B [Albuquerque, NM

    2010-02-23

    A Theoretical Overlay Photographic (TOP) alignment method uses the overlay of a theoretical projected image of a perfectly aligned concentrator on a photographic image of the concentrator to align the mirror facets of a parabolic trough solar concentrator. The alignment method is practical and straightforward, and inherently aligns the mirror facets to the receiver. When integrated with clinometer measurements for which gravity and mechanical drag effects have been accounted for and which are made in a manner and location consistent with the alignment method, all of the mirrors on a common drive can be aligned and optimized for any concentrator orientation.

  12. Analysis of real-time numerical integration methods applied to dynamic clamp experiments.

    PubMed

    Butera, Robert J; McCarthy, Maeve L

    2004-12-01

    Real-time systems are frequently used as an experimental tool, whereby simulated models interact in real time with neurophysiological experiments. The most demanding of these techniques is known as the dynamic clamp, where simulated ion channel conductances are artificially injected into a neuron via intracellular electrodes for measurement and stimulation. Methodologies for implementing the numerical integration of the gating variables in real time typically employ first-order numerical methods, either Euler or exponential Euler (EE). EE is often used for rapidly integrating ion channel gating variables. We find via simulation studies that for small time steps, both methods are comparable, but at larger time steps, EE performs worse than Euler. We derive error bounds for both methods, and find that the error can be characterized in terms of two ratios: time step over time constant, and voltage measurement error over the slope factor of the steady-state activation curve of the voltage-dependent gating variable. These ratios reliably bound the simulation error and yield results consistent with the simulation analysis. Our bounds quantitatively illustrate how measurement error restricts the accuracy that can be obtained by using smaller step sizes. Finally, we demonstrate that Euler can be computed with identical computational efficiency as EE.
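    The two updates being compared have simple closed forms for a gating variable relaxing toward x_inf with time constant tau. Note that for a constant target the exponential Euler step is exact by construction; the degradation reported above arises in the dynamic clamp setting, where the measured voltage, and hence x_inf and tau, change within a step:

```python
import math

def euler_step(x, xinf, tau, dt):
    # forward Euler update of dx/dt = (xinf - x) / tau
    return x + dt * (xinf - x) / tau

def exp_euler_step(x, xinf, tau, dt):
    # exponential Euler: exact when xinf and tau are constant
    # over the step
    return xinf + (x - xinf) * math.exp(-dt / tau)

# Relax x from 0 toward xinf = 1 with tau = 5 ms, dt = 1 ms.
tau, dt, xinf = 5.0, 1.0, 1.0
x_e = x_ee = 0.0
for _ in range(10):
    x_e = euler_step(x_e, xinf, tau, dt)
    x_ee = exp_euler_step(x_ee, xinf, tau, dt)
exact = 1.0 - math.exp(-10 * dt / tau)
```

    At this fairly large step (dt/tau = 0.2), forward Euler visibly overshoots the analytic solution while EE reproduces it; the error bounds in the paper characterize how voltage measurement error changes this picture.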

  13. Analysis of mixed-mode crack propagation using the boundary integral method

    NASA Technical Reports Server (NTRS)

    Mendelson, A.; Ghosn, L. J.

    1986-01-01

    Crack propagation in a rotating inner raceway of a high speed roller bearing is analyzed using the boundary integral equation method. The method consists of an edge crack in a plate under tension, upon which varying Hertzian stress fields are superimposed. A computer program for the boundary integral equation method was written using quadratic elements to determine the stress and displacement fields for discrete roller positions. Mode I and Mode II stress intensity factors and crack extension forces G_θθ (energy release rate due to the tensile opening mode) and G_rθ (energy release rate due to the shear displacement mode) were computed. These calculations permit determination of the crack growth angle for which the change in the crack extension forces is maximum. The crack driving force was found to be the alternating mixed-mode loading that occurs with each passage of the most heavily loaded roller. The crack is predicted to propagate in a step-like fashion alternating between radial and inclined segments, and this pattern was observed experimentally. The maximum changes ΔG_θθ and ΔG_rθ of the crack extension forces are found to be good measures of the crack propagation rate and direction.

  14. Integrating Depth and Image Sequences for Planetary Rover Mapping Using Rgb-D Sensor

    NASA Astrophysics Data System (ADS)

    Peng, M.; Wan, W.; Xing, Y.; Wang, Y.; Liu, Z.; Di, K.; Zhao, Q.; Teng, B.; Mao, X.

    2018-04-01

    An RGB-D camera captures depth and color information at high data rates, which makes it possible and beneficial to integrate depth and image sequences for planetary rover mapping. The proposed mapping method consists of three steps. First, the strict projection relationship among 3D space, depth data and visual texture data is established based on the imaging principle of the RGB-D camera; then, an extended bundle adjustment (BA) based SLAM method with integrated 2D and 3D measurements is applied to the image network for high-precision pose estimation. Next, with the interior and exterior orientation elements of the RGB image sequence available, dense matching is completed with the CMPMVS tool. Finally, using the registration parameters obtained from ICP, the 3D scene from the RGB images is registered to the 3D scene from the depth images, and the fused point cloud is obtained. An experiment was performed in an outdoor field to simulate the lunar surface. The experimental results demonstrate the feasibility of the proposed method.
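
    The final registration step, mapping the RGB-derived cloud into the depth cloud's frame with the rotation and translation returned by ICP, can be sketched as follows. This is a pure-Python illustration with hypothetical transform values, not the authors' pipeline:

```python
import math

def apply_rigid_transform(points, R, t):
    # Map each point p -> R @ p + t into the target frame
    return [tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))
            for p in points]

# Hypothetical ICP result: a 90-degree rotation about z plus a small shift
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.5]

rgb_cloud = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]   # cloud from dense matching
depth_cloud = [(0.0, 1.0, 0.5)]                   # cloud from the depth sensor

# Fused cloud: depth points plus registered RGB-derived points
fused = depth_cloud + apply_rigid_transform(rgb_cloud, R, t)
```

    In practice R and t come out of the ICP alignment between the two scenes; only the application of the transform is shown here.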

  15. An algorithm for charge-integration, pulse-shape discrimination and estimation of neutron/photon misclassification in organic scintillators

    NASA Astrophysics Data System (ADS)

    Polack, J. K.; Flaska, M.; Enqvist, A.; Sosa, C. S.; Lawrence, C. C.; Pozzi, S. A.

    2015-09-01

    Organic scintillators are frequently used for measurements that require sensitivity to both photons and fast neutrons because of their pulse shape discrimination capabilities. In these measurement scenarios, particle identification is commonly handled using the charge-integration pulse shape discrimination method. This method works particularly well for high-energy depositions, but is prone to misclassification for relatively low-energy depositions. A novel algorithm has been developed for automatically performing charge-integration pulse shape discrimination in a consistent and repeatable manner. The algorithm is able to estimate the photon and neutron misclassification corresponding to the calculated discrimination parameters, and is capable of doing so using only the information measured by a single organic scintillator. This paper describes the algorithm and assesses its performance by comparing algorithm-estimated misclassification to values computed via a more traditional time-of-flight (TOF) estimation. A single data set was processed using four different low-energy thresholds: 40, 60, 90, and 120 keVee. Overall, the results compared well between the two methods; in most cases, the algorithm-estimated values fell within the uncertainties of the TOF-estimated values.
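
    The charge-integration discrimination parameter itself is a tail-to-total integral ratio; a minimal sketch on synthetic pulse shapes (hypothetical decay constants and integration windows, not measured detector data) is:

```python
import math

def psd_ratio(pulse, peak_idx, tail_start, total_len):
    # Tail-to-total charge ratio: larger tails indicate neutron-like pulses
    total = sum(pulse[peak_idx : peak_idx + total_len])
    tail = sum(pulse[peak_idx + tail_start : peak_idx + total_len])
    return tail / total

def pulse(amplitude, slow_frac, n=200, fast=7.0, slow=60.0):
    # Two-exponential toy pulse: the "neutron" carries a larger slow component
    return [amplitude * ((1 - slow_frac) * math.exp(-t / fast)
                         + slow_frac * math.exp(-t / slow)) for t in range(n)]

photon = pulse(1.0, slow_frac=0.05)
neutron = pulse(1.0, slow_frac=0.25)

r_p = psd_ratio(photon, peak_idx=0, tail_start=30, total_len=200)
r_n = psd_ratio(neutron, peak_idx=0, tail_start=30, total_len=200)
```

    At low deposited energies the two ratio distributions overlap, which is exactly the misclassification regime the record's algorithm quantifies.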

  16. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. 
The second element comprises three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466

  17. An integrated fiberoptic-microfluidic device for agglutination detection and blood typing.

    PubMed

    Ramasubramanian, Melur K; Alexander, Stewart P

    2009-02-01

    In this paper, an integrated fiberoptic-microfluidic device for the detection of agglutination for blood type cross-matching is described. The device consists of a straight microfluidic channel through which a reacted RBC suspension is pumped with the help of a syringe pump. The flow intersects an optical path created by an emitter-receiver fiber optic pair integrated into the microfluidic device. A 650 nm laser diode is used as the light source and a silicon photodiode is used to detect the light intensity. The spacing between the tips of the two optical fibers can be adjusted. When the fiber spacing is large and the concentration of the suspension is high, scattering becomes the dominant mechanism for agglutination detection, while at low concentrations and small spacing, optointerruption becomes dominant. An agglutination strength factor (ASF) is calculated from the data. Studies with a variety of blood types indicate that the sensing method correctly identifies the agglutination reaction in all cases. A disposable integrated device can be designed for future implementation of the method for near-bedside pre-transfusion checks.

  18. Integration of Geodata in Documenting Castle Ruins

    NASA Astrophysics Data System (ADS)

    Delis, P.; Wojtkowska, M.; Nerc, P.; Ewiak, I.; Lada, A.

    2016-06-01

    Textured three-dimensional models are currently one of the standard methods of representing the results of photogrammetric work. A realistic 3D model combines the geometrical relations between the structure's elements with realistic textures of each element. Data used to create 3D models of structures can be derived from many different sources; the most commonly used tools for documentation purposes are the digital camera and, nowadays, terrestrial laser scanning (TLS). Integration of data acquired from different sources allows modelling and visualization of 3D models of historical structures. An additional aspect of data integration is the possibility of filling in missing points, for example in point clouds. The paper shows the possibility of integrating data from terrestrial laser scanning with digital imagery and presents an analysis of the accuracy of the presented methods. The paper describes results obtained from raw data consisting of a point cloud measured using a Leica ScanStation2 terrestrial laser scanner and digital imagery taken with a Kodak DCS Pro 14N camera. The studied structure is the ruins of the Ilza castle in Poland.

  19. Integration of SAR and DEM data: Geometrical considerations

    NASA Technical Reports Server (NTRS)

    Kropatsch, Walter G.

    1991-01-01

    General principles for integrating data from different sources are derived from the experience of registering SAR images with digital elevation model (DEM) data. The integration consists of establishing geometrical relations between the data sets that allow us to accumulate information from both data sets for any given object point (e.g., elevation, slope, backscatter of ground cover, etc.). Since the geometries of the two data sets are completely different, they cannot be compared on a pixel-by-pixel basis. The presented approach detects instances of higher-level features in both data sets independently and performs the matching at the high level. Besides the efficiency of this general strategy, it further allows the integration of additional knowledge sources: world knowledge and sensor characteristics are also useful sources of information. The SAR features layover and shadow can be detected easily in SAR images. An analytical method to find such regions in a DEM additionally requires the parameters of the flight path of the SAR sensor and the range projection model. The generation of the SAR layover and shadow maps is summarized, and new extensions to this method are proposed.
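
    The DEM-based shadow computation mentioned here can be illustrated with a 1D ray-occlusion sketch. This is a simplification: real processing works on 2D DEMs in the sensor's range geometry, and layover detection additionally compares the local terrain slope with the incidence angle:

```python
import math

def shadow_mask(heights, dx, incidence_deg):
    """Mark shadowed cells along a ground-range DEM profile.

    The radar illuminates from the left with incidence angle incidence_deg
    (measured from vertical); a cell is shadowed when an earlier, higher
    cell blocks the descending radar ray through it.
    """
    cot = 1.0 / math.tan(math.radians(incidence_deg))
    mask, running_max = [], -math.inf
    for i, h in enumerate(heights):
        metric = h + i * dx * cot   # compares cell height against the grazing ray
        mask.append(metric < running_max)
        running_max = max(running_max, metric)
    return mask

# A single peak on flat terrain casts a shadow behind it (toward far range)
heights = [0, 0, 0, 5, 0, 0, 0, 0, 0, 0]
mask = shadow_mask(heights, dx=1.0, incidence_deg=45.0)
```

    With a 45-degree incidence the 5-unit peak at index 3 shadows exactly the four cells behind it, after which the terrain re-emerges into illumination.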

  20. Combining multi-atlas segmentation with brain surface estimation

    NASA Astrophysics Data System (ADS)

    Huo, Yuankai; Carass, Aaron; Resnick, Susan M.; Pham, Dzung L.; Prince, Jerry L.; Landman, Bennett A.

    2016-03-01

    Whole brain segmentation (with comprehensive cortical and subcortical labels) and cortical surface reconstruction are two essential techniques for investigating the human brain. The two tasks are typically conducted independently, however, which leads to spatial inconsistencies and hinders further integrated cortical analyses. To obtain self-consistent whole brain segmentations and surfaces, FreeSurfer segregates the subcortical and cortical segmentations before and after the cortical surface reconstruction. However, this "segmentation to surface to parcellation" strategy has shown limitations in various situations. In this work, we propose a novel "multi-atlas segmentation to surface" method called Multi-atlas CRUISE (MaCRUISE), which achieves self-consistent whole brain segmentations and cortical surfaces by combining multi-atlas segmentation with the cortical reconstruction method CRUISE. To our knowledge, this is the first work that achieves the reliability of state-of-the-art multi-atlas segmentation and labeling methods together with accurate and consistent cortical surface reconstruction. Compared with previous methods, MaCRUISE has three features: (1) MaCRUISE obtains 132 cortical/subcortical labels simultaneously from a single multi-atlas segmentation before reconstructing volume consistent surfaces; (2) Fuzzy tissue memberships are combined with multi-atlas segmentations to address partial volume effects; (3) MaCRUISE reconstructs topologically consistent cortical surfaces by using the sulci locations from multi-atlas segmentation. Two data sets, one consisting of five subjects with expertly traced landmarks and the other consisting of 100 volumes from elderly subjects are used for validation. Compared with CRUISE, MaCRUISE achieves self-consistent whole brain segmentation and cortical reconstruction without compromising on surface accuracy. MaCRUISE is comparably accurate to FreeSurfer while achieving greater robustness across an elderly population.

  1. Combining Multi-atlas Segmentation with Brain Surface Estimation.

    PubMed

    Huo, Yuankai; Carass, Aaron; Resnick, Susan M; Pham, Dzung L; Prince, Jerry L; Landman, Bennett A

    2016-02-27

    Whole brain segmentation (with comprehensive cortical and subcortical labels) and cortical surface reconstruction are two essential techniques for investigating the human brain. The two tasks are typically conducted independently, however, which leads to spatial inconsistencies and hinders further integrated cortical analyses. To obtain self-consistent whole brain segmentations and surfaces, FreeSurfer segregates the subcortical and cortical segmentations before and after the cortical surface reconstruction. However, this "segmentation to surface to parcellation" strategy has shown limitations in various situations. In this work, we propose a novel "multi-atlas segmentation to surface" method called Multi-atlas CRUISE (MaCRUISE), which achieves self-consistent whole brain segmentations and cortical surfaces by combining multi-atlas segmentation with the cortical reconstruction method CRUISE. To our knowledge, this is the first work that achieves the reliability of state-of-the-art multi-atlas segmentation and labeling methods together with accurate and consistent cortical surface reconstruction. Compared with previous methods, MaCRUISE has three features: (1) MaCRUISE obtains 132 cortical/subcortical labels simultaneously from a single multi-atlas segmentation before reconstructing volume consistent surfaces; (2) Fuzzy tissue memberships are combined with multi-atlas segmentations to address partial volume effects; (3) MaCRUISE reconstructs topologically consistent cortical surfaces by using the sulci locations from multi-atlas segmentation. Two data sets, one consisting of five subjects with expertly traced landmarks and the other consisting of 100 volumes from elderly subjects are used for validation. Compared with CRUISE, MaCRUISE achieves self-consistent whole brain segmentation and cortical reconstruction without compromising on surface accuracy. MaCRUISE is comparably accurate to FreeSurfer while achieving greater robustness across an elderly population.

  2. Design and fabrication of planar structures with graded electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Good, Brandon Lowell

    Successfully integrating electromagnetic properties in planar structures offers numerous benefits to the microwave and optical communities. This work aims at formulating new analytic and optimized design methods, creating new fabrication techniques for achieving those methods, and matching appropriate implementations of the methods to fabrication techniques. The analytic method consists of modifying an approach that realizes perfect antireflective properties from graded profiles. This method is shown for all-dielectric and magneto-dielectric grading profiles. The optimized design methods are applied to transformer (discrete) or taper (continuous) designs. From these methods, a subtractive and an additive manufacturing technique were established and are described. The additive method, dry powder dot deposition, enables three-dimensionally varying electromagnetic properties in a structural composite. The combination of methods and fabrication is shown in two applied methodologies. The first uses dry powder dot deposition to design one-dimensionally graded electromagnetic profiles in a planar fiberglass composite. The second simultaneously applies antireflective properties and adjusts directivity through a slab by using subwavelength structures to achieve a flat antireflective lens. The end result of this work is a complete set of methods, formulations, and fabrication techniques to achieve integrated electromagnetic properties in planar structures.

  3. Numerical integration of KPZ equation with restrictions

    NASA Astrophysics Data System (ADS)

    Torres, M. F.; Buceta, R. C.

    2018-03-01

    In this paper, we introduce a novel integration method for the Kardar–Parisi–Zhang (KPZ) equation. It is known that if, during the discrete integration of the KPZ equation, the nearest-neighbor height difference exceeds a critical value, instabilities appear and the integration diverges. One way to avoid these instabilities is to replace the KPZ nonlinear term by a function of the same term that depends on a single adjustable parameter able to control pillars or grooves growing on the interface. Here, we propose a different integration method, which consists of directly limiting the value taken by the KPZ nonlinearity, thereby imposing a restriction rule that is applied in each integration time step, as if it were the growth rule of a restricted discrete model, e.g. restricted solid-on-solid (RSOS). Taking the discrete KPZ equation with restrictions to its dimensionless version, the integration depends on three parameters: the coupling constant g, the inverse of the time step k, and the restriction constant ε, which is chosen to eliminate divergences while keeping all the properties of the continuous KPZ equation. We study in detail the conditions in parameter space that avoid divergences in the 1-dimensional integration and reproduce the scaling properties of the continuous KPZ equation with a particular parameter set. We apply the tested methodology to the d-dimensional case (d = 3, 4) with the purpose of obtaining the growth exponent β, establishing the conditions on the coupling constant g under which we recover known values reached by other authors, particularly for the RSOS model. This method allows us to infer that d = 4 is not the critical dimension of the KPZ universality class, where the strong-coupling phase disappears.
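
    One plausible reading of the restriction rule, clamping the discrete nonlinear term to [−ε, ε] at every time step, can be sketched in 1D as follows (illustrative parameter values, not the authors' dimensionless parametrization in terms of g, k and ε):

```python
import math
import random

def kpz_step_restricted(h, nu, lam, eps, dt, noise_amp, rng):
    """One Euler step of the discretized 1D KPZ equation with periodic
    boundaries, clamping the nonlinear term to [-eps, eps] (one plausible
    form of the restriction rule described in the record)."""
    n = len(h)
    new = [0.0] * n
    for i in range(n):
        lap = h[(i + 1) % n] - 2 * h[i] + h[(i - 1) % n]
        grad = 0.5 * (h[(i + 1) % n] - h[(i - 1) % n])
        nonlin = 0.5 * lam * grad * grad
        nonlin = max(-eps, min(eps, nonlin))   # restriction applied each step
        new[i] = (h[i] + dt * (nu * lap + nonlin)
                  + noise_amp * math.sqrt(dt) * rng.gauss(0.0, 1.0))
    return new

rng = random.Random(42)
h = [0.0] * 64
for _ in range(500):
    h = kpz_step_restricted(h, nu=1.0, lam=4.0, eps=1.0, dt=0.05,
                            noise_amp=1.0, rng=rng)
```

    Without the clamp, a large λ combined with a coarse time step can make the squared-gradient term blow up; with it, the update stays bounded while the linear and noise terms are untouched.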

  4. 78 FR 32179 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Reef Fish Fishery of the Gulf of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... considered to be the best scientific information available, consistent with National Standard 2 of the... upon the best scientific information available.'' MRIP has slowly been integrated into NMFS's... method for calculating these dates can be found in SERO-LAPP-2013-02 at http://sero.nmfs.noaa.gov...

  5. Emerging Issues in the Utilization of Weblogs in Higher Education Classrooms

    ERIC Educational Resources Information Center

    Ayao-ao, Shirley

    2014-01-01

    This paper examines the emerging issues in the utilization of weblogs in Philippine higher education and how these issues affect the performance of students. This study used a modified Delphi method. The Delphi panel consisted of 12 experts in the integration of technology, particularly blogs, in their teaching. The study yielded the following…

  6. Contributions of cultural services to the ecosystem services agenda

    Treesearch

    Terry C. Daniel; Andreas Muhar; Arne Arnberger; Olivier Aznar; James W. Boyd; Kai M.A. Chan; Robert Costanza; Thomas Elmqvist; Courtney G. Flint; Paul H. Gobster; A. Gret-Regamey; R. Lave; S. Muhar; M. Penker; R.G. Ribe; T. Schauppenlehner; T. Sikor; I. Soloviy; M. Spierenburg; K. Taczanowska; J. Tam; A. von der Dunk

    2012-01-01

    Cultural ecosystem services (ES) are consistently recognized but not yet adequately defined or integrated within the ES framework. A substantial body of models, methods, and data relevant to cultural services has been developed within the social and behavioral sciences before and outside of the ES approach. A selective review of work in landscape aesthetics, cultural...

  7. Establishing Consistent Fish Sampling Methods for Biological Assessments on Inter-state Great Rivers: A Case Study on the Upper Mississippi River.

    EPA Science Inventory

    The use of Indices of Biotic Integrity (IBI) to assess aquatic waters has become an acceptable practice for many Clean Water Act (CWA) agencies. For states that share waters such as Minnesota and Wisconsin along the Mississippi River, the states’ respective IBIs may show vastly d...

  8. Integrating Research Methods into Substantive Courses: A Class Project to Identify Social Backgrounds of Political Elites.

    ERIC Educational Resources Information Center

    Johnson, Margaret A.; Steward, Gary Jr.

    1997-01-01

    Reports on a class project that combined an examination of social class and political power with an introduction to sociological research. The project consisted of compiling biographical profiles of cabinet members from the Ronald Reagan, George Bush, and Bill Clinton administrations. Introduces students to issues of conceptualization,…

  9. Self-Concept Structure and the Quality of Self-Knowledge

    PubMed Central

    Showers, Carolin J.; Ditzfeld, Christopher P.; Zeigler-Hill, Virgil

    2014-01-01

    Objective Explores the hidden vulnerability of individuals with compartmentalized self-concept structures by linking research on self-organization to related models of self functioning. Method Across three studies, college students completed self-descriptive card sorts as a measure of self-concept structure and either the Contingencies of Self-Worth Scale; Likert ratings of perceived authenticity of self-aspects; or a response latency measure of self-esteem accessibility. In all, there were 382 participants (247 females; 77% White, 6% Hispanic, 5% Black, 5% Asian, 4% Native American, and 3% Other). Results Consistent with their unstable self-evaluations, compartmentalized individuals report greater contingencies of self-worth and describe their experience of multiple self-aspects as less authentic than do individuals with integrative self-organization. Compartmentalized individuals also make global self-evaluations more slowly than do integrative individuals. Conclusions Together with previous findings on self-clarity, these results suggest that compartmentalized individuals may experience difficulties in how they know the self, whereas individuals with integrative self-organization may display greater continuity and evaluative consistency across self-aspects, with easier access to evaluative self-knowledge. PMID:25180616

  10. Suppression of planar defects in the molecular beam epitaxy of GaAs/ErAs/GaAs heterostructures

    NASA Astrophysics Data System (ADS)

    Crook, Adam M.; Nair, Hari P.; Ferrer, Domingo A.; Bank, Seth R.

    2011-08-01

    We present a growth method that overcomes the mismatch in rotational symmetry of ErAs and conventional III-V semiconductors, allowing for epitaxially integrated semimetal/semiconductor heterostructures. Transmission electron microscopy and reflection high-energy electron diffraction reveal defect-free overgrowth of ErAs layers, consisting of >2× the total amount of ErAs that can be embedded with conventional layer-by-layer growth methods. We utilize epitaxial ErAs nanoparticles, overgrown with GaAs, as a seed to grow full films of ErAs. Growth proceeds by diffusion of erbium atoms through the GaAs spacer, which remains registered to the underlying substrate, preventing planar defect formation during subsequent GaAs growth. This growth method is promising for metal/semiconductor heterostructures that serve as embedded Ohmic contacts to epitaxial layers and epitaxially integrated active plasmonic devices.

  11. Performance assessment of static lead-lag feedforward controllers for disturbance rejection in PID control loops.

    PubMed

    Yu, Zhenpeng; Wang, Jiandong

    2016-09-01

    This paper assesses the performance of feedforward controllers for disturbance rejection in univariate feedback plus feedforward control loops. The structures of feedback and feedforward controllers are confined to proportional-integral-derivative and static-lead-lag forms, respectively, and the effects of feedback controllers are not considered. The integral squared error (ISE) and total squared variation (TSV) are used as performance metrics. A performance index is formulated by comparing the current ISE and TSV metrics to their own lower bounds as performance benchmarks. A controller performance assessment (CPA) method is proposed to calculate the performance index from measurements. The proposed CPA method resolves two critical limitations in the existing CPA methods, in order to be consistent with industrial scenarios. Numerical and experimental examples illustrate the effectiveness of the obtained results. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
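
    The two performance metrics admit simple discretizations for sampled data (the discretized forms below are assumptions for illustration; the paper's exact definitions may differ):

```python
def ise(errors, dt):
    # Integral squared error, discretized as a Riemann sum over samples
    return sum(e * e for e in errors) * dt

def tsv(u):
    # Total squared variation of a signal: sum of squared increments
    return sum((u[k] - u[k - 1]) ** 2 for k in range(1, len(u)))

# Toy disturbance-response error and controller-output sequences
errors = [1.0, 0.5, 0.25, 0.0]
u = [0.0, 1.0, 1.0, 0.5]
```

    A performance index in the spirit of the record would then compare each metric against its lower-bound benchmark, e.g. benchmark_ise / ise(errors, dt).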

  12. Integrating simulated teaching/learning strategies in undergraduate nursing education.

    PubMed

    Sinclair, Barbara; Ferguson, Karen

    2009-01-01

    In this article, the results of a mixed-methods study integrating the use of simulations in a nursing theory course in order to assess students' perceptions of self-efficacy for nursing practice are presented. Nursing students in an intervention group were exposed to a combination of lecture and simulation, and then asked to rate their perceptions of self-efficacy, satisfaction and effectiveness of this combined teaching and learning strategy. Based on Bandura's (1977, 1986) theory of self-efficacy, this study provides data to suggest that students' self-confidence for nursing practice may be increased through the use of simulation as a method of teaching and learning. Students also reported higher levels of satisfaction, effectiveness and consistency with their learning style when exposed to the combination of lecture and simulation than the control group, who were exposed to lecture as the only method of teaching and learning.

  13. STEM integration in middle school career and technical education programs: A Delphi design study

    NASA Astrophysics Data System (ADS)

    Wu-Rorrer, Billy Ray

    This qualitative study with a Delphi research design sought to determine how STEM programs can be effectively integrated into middle school career and technical education (CTE) programs, drawing on local, state, and national educators, administrators, directors, specialists, and curriculum writers. The significance of the study is to provide leaders in CTE with greater awareness, insight, and strategies for how CTE programs can more effectively integrate academics through STEM-related programming. The findings add to the limited literature on best-practice strategies for integrating STEM curriculum into middle school CTE programs. One basic question guided this research: How can STEM programs be effectively integrated into middle school career and technical education programs? A total of twelve strategies were identified. The strategies of real-world applications and administrative buy-in were the two predominant strategies consistently addressed throughout the review of literature and all three sub-questions in the research study. The Delphi design study consisted of a pilot round and three rounds of data collection on barriers, strategies, and professional development for STEM integration in middle school career and technical education programs. Four panelists participated in the pilot round, and 16 panel members not involved in the pilot round participated in the three rounds of questioning and consensus building. In the future, more comprehensive studies can build upon this foundational investigation of middle school CTE programs.

  14. Unified framework to evaluate panmixia and migration direction among multiple sampling locations.

    PubMed

    Beerli, Peter; Palczewski, Michal

    2010-05-01

    For many biological investigations, groups of individuals are genetically sampled from several geographic locations. These sampling locations often do not reflect the genetic population structure. We describe a framework using marginal likelihoods to compare and order structured population models, such as testing whether the sampling locations belong to the same randomly mating population or comparing unidirectional and multidirectional gene flow models. In the context of inferences employing Markov chain Monte Carlo methods, the accuracy of the marginal likelihoods depends heavily on the approximation method used to calculate the marginal likelihood. Two methods, modified thermodynamic integration and a stabilized harmonic mean estimator, are compared. With finite Markov chain Monte Carlo run lengths, the harmonic mean estimator may not be consistent. Thermodynamic integration, in contrast, delivers considerably better estimates of the marginal likelihood. The choice of prior distributions does not influence the order and choice of the better models when the marginal likelihood is estimated using thermodynamic integration, whereas with the harmonic mean estimator the influence of the prior is pronounced and the order of the models changes. The approximation of marginal likelihood using thermodynamic integration in MIGRATE allows the evaluation of complex population genetic models, not only of whether sampling locations belong to a single panmictic population, but also of competing complex structured population models.
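
    The contrast drawn here can be checked on a toy conjugate model where every quantity is closed form: y ~ N(θ, 1) with prior θ ~ N(0, 1), so the power posterior at inverse temperature b is Gaussian and the exact marginal likelihood is N(y; 0, 2). The sketch below performs trapezoidal thermodynamic integration on this toy model (not MIGRATE's sampler; here the per-temperature expectation is analytic rather than estimated by MCMC):

```python
import math

def expected_loglik(b, y):
    # Power posterior for y ~ N(theta, 1), theta ~ N(0, 1) at temperature b:
    # Gaussian with precision (1 + b) and mean b*y/(1 + b).
    var = 1.0 / (1.0 + b)
    mean = b * y * var
    second_moment = (y - mean) ** 2 + var   # E[(y - theta)^2]
    return -0.5 * math.log(2 * math.pi) - 0.5 * second_moment

def thermodynamic_log_evidence(y, n_grid=2001):
    # log Z = integral over b in [0, 1] of E_b[log L], via the trapezoid rule
    bs = [i / (n_grid - 1) for i in range(n_grid)]
    vals = [expected_loglik(b, y) for b in bs]
    db = 1.0 / (n_grid - 1)
    return db * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])

y = 1.3
ti = thermodynamic_log_evidence(y)
exact = -0.5 * math.log(2 * math.pi * 2.0) - y * y / 4.0   # log N(y; 0, 2)
```

    In a real MCMC setting each E_b[log L] is a sample average over the chain run at temperature b, and the harmonic mean estimator would instead reuse only the b = 1 chain, which is where its inconsistency problems arise.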

  15. Tackling higher derivative ghosts with the Euclidean path integral

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fontanini, Michele; Department of Physics, Syracuse University, Syracuse, New York 13244; Trodden, Mark

    2011-05-15

    An alternative to the effective field theory approach to treat ghosts in higher derivative theories is to attempt to integrate them out via the Euclidean path integral formalism. It has been suggested that this method could provide a consistent framework within which we might tolerate the ghost degrees of freedom that plague, among other theories, the higher derivative gravity models that have been proposed to explain cosmic acceleration. We consider the extension of this idea to treating a class of terms with order six derivatives, and find that for a general term the Euclidean path integral approach works in the most trivial background, Minkowski. Moreover, we see that even in a de Sitter background, despite some difficulties, it is possible to define a probability distribution for tensorial perturbations of the metric.

  16. Integrated GNSS Attitude Determination and Positioning for Direct Geo-Referencing

    PubMed Central

    Nadarajah, Nandakumaran; Paffenholz, Jens-André; Teunissen, Peter J. G.

    2014-01-01

    Direct geo-referencing is an efficient methodology for the fast acquisition of 3D spatial data. It requires the fusion of spatial data acquisition sensors with navigation sensors, such as Global Navigation Satellite System (GNSS) receivers. In this contribution, we consider an integrated GNSS navigation system to provide estimates of the position and attitude (orientation) of a 3D laser scanner. The proposed multi-sensor system (MSS) consists of multiple GNSS antennas rigidly mounted on the frame of a rotating laser scanner and a reference GNSS station with known coordinates. Precise GNSS navigation requires the resolution of the carrier phase ambiguities. The proposed method uses the multivariate constrained integer least-squares (MC-LAMBDA) method for the estimation of rotating frame ambiguities and attitude angles. MC-LAMBDA makes use of the known antenna geometry to strengthen the underlying attitude model and, hence, to enhance the reliability of rotating frame ambiguity resolution and attitude determination. The reliable estimation of rotating frame ambiguities is consequently utilized to enhance the relative positioning of the rotating frame with respect to the reference station. This integrated (array-aided) method improves ambiguity resolution, as well as positioning accuracy between the rotating frame and the reference station. Numerical analyses of GNSS data from a real-data campaign confirm the improved performance of the proposed method over the existing method. In particular, the integrated method yields reliable ambiguity resolution and reduces position standard deviation by a factor of about 0.8, matching the theoretical gain of 3/4 for two antennas on the rotating frame and a single antenna at the reference station. PMID:25036330

  17. Higher order explicit symmetric integrators for inseparable forms of coordinates and momenta

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Wu, Xin; Huang, Guoqing; Liu, Fuyao

    2016-06-01

    Pihajoki proposed the extended phase-space second-order explicit symmetric leapfrog methods for inseparable Hamiltonian systems. On the basis of this work, we survey a critical problem on how to mix the variables in the extended phase space. Numerical tests show that sequent permutations of coordinates and momenta can make the leapfrog-like methods yield the most accurate results and the optimal long-term stabilized error behaviour. We also present a novel method to construct many fourth-order extended phase-space explicit symmetric integration schemes. Each scheme represents the symmetric product of six usual second-order leapfrogs without any permutations. This construction consists of four segments: the permuted coordinates, the triple product of the usual second-order leapfrog without permutations, the permuted momenta, and the triple product of the usual second-order leapfrog without permutations. Similarly, extended phase-space sixth-, eighth- and other higher-order explicit symmetric algorithms are available. We used several inseparable Hamiltonian examples, such as the post-Newtonian approach of non-spinning compact binaries, to show that one of the proposed fourth-order methods is more efficient than the existing methods; examples include the fourth-order explicit symplectic integrators of Chin and the fourth-order explicit and implicit mixed symplectic integrators of Zhong et al. Given a moderate choice for the related mixing and projection maps, the extended phase-space explicit symplectic-like methods are well suited for various inseparable Hamiltonian problems. Samples of these problems involve the algorithmic regularization of gravitational systems with velocity-dependent perturbations in the Solar system and post-Newtonian Hamiltonian formulations of spinning compact objects.

  18. Integrated GNSS attitude determination and positioning for direct geo-referencing.

    PubMed

    Nadarajah, Nandakumaran; Paffenholz, Jens-André; Teunissen, Peter J G

    2014-07-17

    Direct geo-referencing is an efficient methodology for the fast acquisition of 3D spatial data. It requires the fusion of spatial data acquisition sensors with navigation sensors, such as Global Navigation Satellite System (GNSS) receivers. In this contribution, we consider an integrated GNSS navigation system to provide estimates of the position and attitude (orientation) of a 3D laser scanner. The proposed multi-sensor system (MSS) consists of multiple GNSS antennas rigidly mounted on the frame of a rotating laser scanner and a reference GNSS station with known coordinates. Precise GNSS navigation requires the resolution of the carrier phase ambiguities. The proposed method uses the multivariate constrained integer least-squares (MC-LAMBDA) method for the estimation of rotating frame ambiguities and attitude angles. MC-LAMBDA makes use of the known antenna geometry to strengthen the underlying attitude model and, hence, to enhance the reliability of rotating frame ambiguity resolution and attitude determination. The reliable estimation of rotating frame ambiguities is consequently utilized to enhance the relative positioning of the rotating frame with respect to the reference station. This integrated (array-aided) method improves ambiguity resolution, as well as positioning accuracy between the rotating frame and the reference station. Numerical analyses of GNSS data from a real-data campaign confirm the improved performance of the proposed method over the existing method. In particular, the integrated method yields reliable ambiguity resolution and reduces position standard deviation by a factor of about 0.8, matching the theoretical gain of √(3/4) for two antennas on the rotating frame and a single antenna at the reference station.
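    The quoted √(3/4) gain is what elementary error propagation predicts when the two rover antennas carry independent noise of the same variance as the reference antenna: the single-antenna baseline has variance 2σ², the averaged pair 1.5σ², and the standard-deviation ratio is √(1.5/2) = √(3/4) ≈ 0.87. A quick Monte Carlo check of that textbook factor (with white Gaussian noise, an idealization not claimed by the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n = 1.0, 200_000

# Per-antenna position noise: one reference antenna, two rover antennas.
ref = rng.normal(0, sigma, n)
rov1 = rng.normal(0, sigma, n)
rov2 = rng.normal(0, sigma, n)

# Single-antenna baseline vs. array-aided (two rover antennas averaged).
single = rov1 - ref                 # variance 2 * sigma^2
aided = 0.5 * (rov1 + rov2) - ref   # variance 1.5 * sigma^2

ratio = aided.std() / single.std()  # expected sqrt(3/4) ~ 0.866
```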

  19. Team-based Learning in Therapeutics Workshop Sessions

    PubMed Central

    Kelley, Katherine A.; Metzger, Anne H.; Bellebaum, Katherine L.; McAuley, James W.

    2009-01-01

    Objectives To implement team-based learning in the workshop portion of a pathophysiology and therapeutics sequence of courses to promote integration of concepts across the pharmacy curriculum, provide a consistent problem-solving approach to patient care, and determine the impact on student perceptions of professionalism and teamwork. Design Team-based learning was incorporated into the workshop portion of 3 of 6 pathophysiology and therapeutics courses. Assignments that promoted team-building and application of key concepts were created. Assessment Readiness assurance tests were used to assess individual and team understanding of course materials. Students consistently scored 20% higher on team assessments compared with individual assessments. Mean professionalism and teamwork scores were significantly higher after implementation of team-based learning; however, this improvement was not considered educationally significant. Approximately 91% of students felt team-based learning improved understanding of course materials and 93% of students felt teamwork should continue in workshops. Conclusion Team-based learning is an effective teaching method to ensure a consistent approach to problem-solving and curriculum integration in workshop sessions for a pathophysiology and therapeutics course sequence. PMID:19885069

  20. Introduction to Field Water-Quality Methods for the Collection of Metals - 2007 Project Summary

    USGS Publications Warehouse

    Allen, Monica L.

    2008-01-01

    The U.S. Geological Survey (USGS), Region VI of the U.S. Environmental Protection Agency (USEPA), and the Osage Nation presented three 3-day workshops, in June-August 2007, entitled "Introduction to Field Water-Quality Methods for the Collection of Metals." The purpose of the workshops was to provide instruction to tribes within USEPA Region VI on various USGS surface-water measurement methods and water-quality sampling protocols for the collection of surface-water samples for metals analysis. Workshop attendees included members from over 22 tribes and pueblos. USGS instructors came from Oklahoma, New Mexico, and Georgia. Workshops were held in eastern and south-central Oklahoma and New Mexico and covered many topics including presampling preparation, water-quality monitors, and sampling for metals in surface water. Attendees spent one full classroom day learning the field methods used by the USGS Water Resources Discipline and learning about the complexity of obtaining valid water-quality and quality-assurance data. Lectures included (1) a description of metal contamination sources in surface water; (2) an introduction on how to select field sites, equipment, and laboratories for sample analysis; (3) collection of sediment in surface water; and (4) utilization of proper protocol and methodology for sampling metals in surface water. Attendees also were provided USGS sampling equipment for use during the field portion of the class so they had actual "hands-on" experience to take back to their own organizations. The final 2 days of the workshop consisted of field demonstrations of current USGS water-quality sample-collection methods. The hands-on training ensured that attendees were exposed to and experienced proper sampling procedures. Attendees learned integrated-flow techniques during sample collection, field-property documentation, and discharge measurements and calculations.
They also used enclosed chambers for sample processing and collected quality-assurance samples to verify their techniques. Benefits of integrated water-quality sample-collection methods are varied. Tribal environmental programs now have the ability to collect data that are comparable across watersheds. The use of consistent sample collection, manipulation, and storage techniques will provide consistent quality data that will enhance the understanding of local water resources. The improved data quality also will help the USEPA better document the condition of the region's water. Ultimately, these workshops equipped tribes to use uniform sampling methods and to provide consistent quality data that are comparable across the region.

  1. Chaotic Time Series Analysis Method Developed for Stall Precursor Identification in High-Speed Compressors

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A new technique for rotating stall precursor identification in high-speed compressors has been developed at the NASA Lewis Research Center. This pseudo correlation integral method uses a mathematical algorithm based on chaos theory to identify nonlinear dynamic changes in the compressor. Through a study of four different configurations of a high-speed compressor stage, a multistage compressor rig, and an axi-centrifugal engine test, this algorithm, using only a single pressure sensor, has consistently predicted the onset of rotating stall.
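    The abstract does not spell out the "pseudo correlation integral" algorithm; the classical Grassberger-Procaccia correlation integral that it presumably modifies counts delay-embedded point pairs closer than a radius r, and can be sketched as follows (our generic implementation, not the NASA code):

```python
import numpy as np

def correlation_integral(series, r, dim=2, tau=1):
    """Fraction of delay-embedded point pairs closer than r.
    This is the classical Grassberger-Procaccia C(r); the stall-precursor
    'pseudo' variant is presumably a lightweight modification of it."""
    n = len(series) - (dim - 1) * tau
    # Build the delay embedding: rows are points in dim-dimensional space.
    emb = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    diff = emb[:, None, :] - emb[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(n, k=1)          # each unordered pair once
    return float((dist[iu] < r).mean())
```

    Tracking C(r) over sliding windows of the pressure signal is one plausible way such a statistic could flag the nonlinear dynamic changes that precede stall.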

  2. Stochastic many-body perturbation theory for anharmonic molecular vibrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hermes, Matthew R.; Hirata, So, E-mail: sohirata@illinois.edu; CREST, Japan Science and Technology Agency, 4-1-8 Honcho, Kawaguchi, Saitama 332-0012

    2014-08-28

    A new quantum Monte Carlo (QMC) method for anharmonic vibrational zero-point energies and transition frequencies is developed, which combines the diagrammatic vibrational many-body perturbation theory based on the Dyson equation with Monte Carlo integration. The infinite sums of the diagrammatic and thus size-consistent first- and second-order anharmonic corrections to the energy and self-energy are expressed as sums of a few m- or 2m-dimensional integrals of wave functions and a potential energy surface (PES) (m is the number of vibrational degrees of freedom). Each of these integrals is computed as the integrand (including the value of the PES) divided by the value of a judiciously chosen weight function evaluated on demand at geometries distributed randomly but according to the weight function via the Metropolis algorithm. In this way, the method completely avoids cumbersome evaluation and storage of high-order force constants necessary in the original formulation of the vibrational perturbation theory; it furthermore allows even higher-order force constants essentially up to an infinite order to be taken into account in a scalable, memory-efficient algorithm. The diagrammatic contributions to the frequency-dependent self-energies that are stochastically evaluated at discrete frequencies can be reliably interpolated, allowing the self-consistent solutions to the Dyson equation to be obtained. This method, therefore, can compute directly and stochastically the transition frequencies of fundamentals and overtones as well as their relative intensities as pole strengths, without the fixed-node errors that plague some QMC methods. It is shown that, for an identical PES, the new method reproduces the correct deterministic values of the energies and frequencies within a few cm⁻¹ and pole strengths within a few thousandths. With the values of a PES evaluated on the fly at random geometries, the new method captures a noticeably greater proportion of anharmonic effects.
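    The central device here, dividing the integrand by a normalized weight function and averaging over Metropolis samples drawn from that weight, is ordinary importance sampling. A one-dimensional sketch with a Gaussian weight (our own choice of integrand and weight, far simpler than a PES):

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    """Integrand: the integral of exp(-x^2) over R is sqrt(pi)."""
    return np.exp(-x**2)

def w(x):
    """Weight: standard normal density (normalized, everywhere positive)."""
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

# Metropolis random walk sampling from w; averaging f/w estimates the integral.
x, total, nsamp = 0.0, 0.0, 100_000
for _ in range(nsamp):
    prop = x + rng.normal(0, 1.0)
    if rng.random() < min(1.0, w(prop) / w(x)):
        x = prop
    total += f(x) / w(x)

estimate = total / nsamp   # should approach sqrt(pi) ~ 1.7725
```

    The paper's scheme applies the same ratio trick to m- and 2m-dimensional integrals, with the PES evaluated on demand at each sampled geometry instead of being expanded in force constants.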

  3. Evaluation of integration methods for hybrid simulation of complex structural systems through collapse

    NASA Astrophysics Data System (ADS)

    Del Carpio R., Maikol; Hashemi, M. Javad; Mosqueda, Gilberto

    2017-10-01

    This study examines the performance of integration methods for hybrid simulation of large and complex structural systems in the context of structural collapse due to seismic excitations. The target application is not necessarily real-time testing, but rather models that involve large-scale physical sub-structures and highly nonlinear numerical models. Four case studies are presented and discussed. In the first case study, the accuracy of integration schemes, including two widely used methods, namely a modified version of the implicit Newmark method with a fixed number of iterations (iterative) and the operator-splitting method (non-iterative), is examined through pure numerical simulations. The second case study presents the results of 10 hybrid simulations repeated with the two aforementioned integration methods considering various time steps and fixed numbers of iterations for the iterative integration method. The physical sub-structure in these tests consists of a single-degree-of-freedom (SDOF) cantilever column with replaceable steel coupons that provide repeatable, highly nonlinear behavior including fracture-type strength and stiffness degradation. In case study three, the implicit Newmark method with a fixed number of iterations is applied to hybrid simulations of a 1:2 scale steel moment frame that includes a relatively complex nonlinear numerical substructure. Lastly, a more complex numerical substructure is considered by constructing a nonlinear computational model of a moment frame coupled to a hybrid model of a 1:2 scale steel gravity frame. The last two case studies are conducted on the same prototype structure, and the selection of time steps and the fixed number of iterations is closely examined in pre-test simulations. The generated unbalanced forces are used as an index to track the equilibrium error and predict the accuracy and stability of the simulations.
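    For the linear SDOF case, the implicit Newmark scheme reduces to one linear solve per step; the nonlinear hybrid tests in the paper iterate that solve a fixed number of times. A sketch with the average-acceleration parameters (γ = 1/2, β = 1/4), assuming a linear spring rather than the paper's degrading specimens:

```python
def newmark_sdof(m, c, k, u0, v0, h, nsteps, beta=0.25, gamma=0.5):
    """Implicit Newmark time stepping for m*u'' + c*u' + k*u = 0.
    For a linear system each step is a single solve against the
    effective stiffness; nonlinear hybrid tests iterate this solve."""
    u, v = u0, v0
    a = (-c * v - k * u) / m                         # initial acceleration
    keff = k + gamma * c / (beta * h) + m / (beta * h**2)
    for _ in range(nsteps):
        rhs = (m * (u / (beta * h**2) + v / (beta * h) + (0.5 / beta - 1) * a)
               + c * (gamma * u / (beta * h) + (gamma / beta - 1) * v
                      + h * (0.5 * gamma / beta - 1) * a))
        u_new = rhs / keff
        a_new = (u_new - u) / (beta * h**2) - v / (beta * h) - (0.5 / beta - 1) * a
        v = v + h * ((1 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
    return u, v
```

    With γ = 1/2 and β = 1/4 the scheme is unconditionally stable and conserves the energy of an undamped oscillator, which is why it is a common baseline against operator-splitting variants.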

  4. Simulation of a steady-state integrated human thermal system.

    NASA Technical Reports Server (NTRS)

    Hsu, F. T.; Fan, L. T.; Hwang, C. L.

    1972-01-01

    The mathematical model of an integrated human thermal system is formulated. The system consists of an external thermal regulation device on the human body. The purpose of the device (a network of cooling tubes held in contact with the surface of the skin) is to maintain the human body in a state of thermoneutrality. The device is controlled by varying the inlet coolant temperature and coolant mass flow rate. The differential equations of the model are approximated by a set of algebraic equations which result from the application of the explicit forward finite difference method to the differential equations. The integrated human thermal system is simulated for a variety of combinations of the inlet coolant temperature, coolant mass flow rate, and metabolic rates.
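    The explicit forward finite-difference update the abstract describes can be illustrated on a single lumped thermal node exchanging heat with the coolant; the parameter values below are invented for the sketch and are not those of the paper's model:

```python
def simulate_skin_node(metabolic=100.0, h_cool=10.0, t_coolant=20.0,
                       c_node=1.0, t0=37.0, dt=0.01, nsteps=5000):
    """Explicit forward finite-difference update of one thermal node:
    C * dT/dt = M - h*(T - Tc)   ->   T += dt/C * (M - h*(T - Tc)).
    Stable only for dt < 2*C/h, the usual explicit-scheme restriction.
    All parameter values here are hypothetical."""
    t = t0
    for _ in range(nsteps):
        t += dt / c_node * (metabolic - h_cool * (t - t_coolant))
    return t
```

    The steady state is Tc + M/h (30 degrees for these numbers); sweeping the coolant temperature and flow rate, as the paper does, amounts to varying Tc and h in this balance.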

  5. Multilayered photonic integration on SOI platform using waveguide-based bridge structure

    NASA Astrophysics Data System (ADS)

    Majumder, Saikat; Chakraborty, Rajib

    2018-06-01

    A waveguide-based structure on a silicon-on-insulator platform is proposed for vertical integration in photonic integrated circuits. The structure consists of two multimode interference couplers connected by a single-mode (SM) section which can act as a bridge over any other underlying device. Two more SM sections act as the input and output of the first and second multimode couplers, respectively. A potential application of this structure is in multilayered photonic links. It is shown that the efficiency of the structure can be improved by making some design modifications. The entire simulation is done using an effective-index based matrix method. The feature sizes chosen are comparable to those of previously fabricated waveguides, so the proposed structure can be fabricated easily.
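    The effective-index step that underlies such a matrix method starts from the slab dispersion relation. A self-contained sketch solving the even-TE-mode relation u·tan(u) = w for a symmetric slab; the 220 nm silicon core, silica cladding, and 1.55 μm wavelength are typical SOI values we assume for illustration, not dimensions from the paper:

```python
import math

def slab_te0_neff(n_core=3.476, n_clad=1.444, thickness=0.22, wl=1.55):
    """Effective index of the fundamental (even) TE mode of a symmetric
    slab, from u*tan(u) = w with u = (k0*d/2)*sqrt(n1^2 - neff^2) and
    w = (k0*d/2)*sqrt(neff^2 - n2^2); solved by bisection on neff.
    Lengths in micrometres; parameters are assumed, not from the paper."""
    k0 = 2 * math.pi / wl
    half = 0.5 * thickness

    def resid(neff):
        u = k0 * half * math.sqrt(n_core**2 - neff**2)
        w = k0 * half * math.sqrt(neff**2 - n_clad**2)
        return u * math.tan(u) - w

    # resid > 0 near the cladding index, < 0 near the core index.
    lo, hi = n_clad + 1e-9, n_core - 1e-9
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if resid(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    The resulting effective index then replaces the slab in a 2D model, after which transfer-matrix propagation through the coupler sections gives the device response.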

  6. Maximum Temperature Detection System for Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Frankiewicz, Maciej; Kos, Andrzej

    2015-03-01

    The paper describes the structure and measurement results of a system detecting the present maximum temperature on the surface of an integrated circuit. The system consists of a set of proportional-to-absolute-temperature sensors, a temperature processing path, and a digital part designed in VHDL. Analogue parts of the circuit were designed with a full-custom technique. The system is part of a temperature-controlled oscillator circuit - a power management system based on the dynamic frequency scaling method. The oscillator cooperates with a microprocessor dedicated to thermal experiments. The whole system is implemented in UMC CMOS 0.18 μm (1.8 V) technology.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, S.

    This report describes the use of several subroutines from the CORLIB core mathematical subroutine library for the solution of a model fluid flow problem. The model consists of the Euler partial differential equations. The equations are spatially discretized using the method of pseudo-characteristics. The resulting system of ordinary differential equations is then integrated using the method of lines. The stiff ordinary differential equation solver LSODE (2) from CORLIB is used to perform the time integration. The non-stiff solver ODE (4) is used to perform a related integration. The linear equation solver subroutines DECOMP and SOLVE are used to solve linear systems whose solutions are required in the calculation of the time derivatives. The monotone cubic spline interpolation subroutines PCHIM and PCHFE are used to approximate water properties. The report describes the use of each of these subroutines in detail. It illustrates the manner in which modules from a standard mathematical software library such as CORLIB can be used as building blocks in the solution of complex problems of practical interest. 9 refs., 2 figs., 4 tabs.
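    The same method-of-lines pipeline, discretize in space and hand the resulting stiff ODE system to an LSODE-family solver, can be sketched with SciPy's LSODA on a 1-D heat equation; this is a stand-in model problem of ours, not the report's Euler system:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Heat equation u_t = u_xx on (0, 1) with u = 0 at both walls:
# discretize in space, then integrate the stiff ODE system with LSODA.
nx = 50
x = np.linspace(0, 1, nx + 2)[1:-1]     # interior nodes only
dx = x[1] - x[0]
u0 = np.sin(np.pi * x)                  # exact solution decays as exp(-pi^2 t)

def rhs(t, u):
    lap = np.empty_like(u)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    lap[0] = u[1] - 2 * u[0]            # zero boundary value on the left
    lap[-1] = u[-2] - 2 * u[-1]         # and on the right
    return lap / dx**2

sol = solve_ivp(rhs, (0.0, 0.1), u0, method="LSODA", rtol=1e-8, atol=1e-10)
u_end = sol.y[:, -1]
```

    LSODA, like LSODE, switches automatically between stiff and non-stiff methods, which is exactly the role the report assigns to the CORLIB time integrators.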

  8. Density-functional expansion methods: evaluation of LDA, GGA, and meta-GGA functionals and different integral approximations.

    PubMed

    Giese, Timothy J; York, Darrin M

    2010-12-28

    We extend the Kohn-Sham potential energy expansion (VE) to include variations of the kinetic energy density and use the VE formulation with a 6-31G* basis to perform a "Jacob's ladder" comparison of small molecule properties using density functionals classified as being either LDA, GGA, or meta-GGA. We show that the VE reproduces standard Kohn-Sham DFT results well if all integrals are performed without further approximation, and there is no substantial improvement in using meta-GGA functionals relative to GGA functionals. The advantages of using GGA versus LDA functionals become apparent when modeling hydrogen bonds. We furthermore examine the effect of using integral approximations to compute the zeroth-order energy and first-order matrix elements, and the results suggest that the origin of the short-range repulsive potential within self-consistent charge density-functional tight-binding methods mainly arises from the approximations made to the first-order matrix elements.

  9. Psychometric Characteristics of a Patient Reported Outcome Measure on Ego-Integrity and Despair among Cancer Patients

    PubMed Central

    Kleijn, Gitta; Post, Lenneke; Witte, Birgit I.; Bohlmeijer, Ernst T.; Westerhof, Gerben J.; Cuijpers, Pim; Verdonck-de Leeuw, Irma M.

    2016-01-01

    Purpose To evaluate psychometric characteristics of a questionnaire (the Northwestern Ego-integrity Scale (NEIS)) on ego-integrity (the experience of wholeness and meaning in life, even in spite of negative experiences) and despair (the experience of regret about the life one has led, and feelings of sadness, failure and hopelessness) among cancer patients. Methods Cancer patients (n = 164) completed patient reported outcome measures on ego-integrity and despair (NEIS), psychological distress, anxiety and depression (Hospital Anxiety and Depression Scale (HADS)), and quality of life (EORTC QLQ-C30 (cancer survivors, n = 57) or EORTC QLQ-C15-PAL (advanced cancer patients, n = 107)). Confirmatory Factor Analysis was used to assess construct validity. Cronbach’s alpha was used to assess internal consistency. Convergent validity was tested based on a priori defined hypotheses: a higher level of ego-integrity was expected to be related to a higher level of quality of life, and lower levels of distress, depression and anxiety; a higher level of despair was expected to be related to a lower level of quality of life, and higher levels of distress, depression and anxiety. Results The majority of all items (94.5%) of the NEIS were completed by patients and single item missing rate was below 2%. The two subscales, labeled as Ego-integrity (5 items) and Despair (4 items) had acceptable internal consistency (Cronbach’s alpha .72 and .61, respectively). The Ego-integrity subscale was not significantly associated with quality of life, distress, anxiety, or depression. The Despair subscale correlated significantly (p <.001) with quality of life (r = -.29), distress (r = .44), anxiety (r = .47) and depression (r = .32). Conclusion The NEIS has good psychometric characteristics to assess ego-integrity and despair among cancer patients. PMID:27195750
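    Cronbach's alpha, the internal-consistency statistic reported above (.72 and .61 for the two subscales), is simple to compute from an item-score matrix; the 4-respondent, 3-item data in the check below are made up for illustration:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents x items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
    The ratio is unchanged whether ddof=0 or ddof=1, as long as it is
    used consistently for both variances."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```

    Values around .7 or above are conventionally read as acceptable internal consistency, which is the criterion the paper applies to its two subscales.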

  10. Second-order variational equations for N-body simulations

    NASA Astrophysics Data System (ADS)

    Rein, Hanno; Tamayo, Daniel

    2016-07-01

    First-order variational equations are widely used in N-body simulations to study how nearby trajectories diverge from one another. These allow for efficient and reliable determinations of chaos indicators such as the Maximal Lyapunov characteristic Exponent (MLE) and the Mean Exponential Growth factor of Nearby Orbits (MEGNO). In this paper we lay out the theoretical framework to extend the idea of variational equations to higher order. We explicitly derive the differential equations that govern the evolution of second-order variations in the N-body problem. Going to second order opens the door to new applications, including optimization algorithms that require the first and second derivatives of the solution, like the classical Newton's method. Typically, these methods have faster convergence rates than derivative-free methods. Derivatives are also required for Riemann manifold Langevin and Hamiltonian Monte Carlo methods which provide significantly shorter correlation times than standard methods. Such improved optimization methods can be applied to anything from radial-velocity/transit-timing-variation fitting to spacecraft trajectory optimization to asteroid deflection. We provide an implementation of first- and second-order variational equations for the publicly available REBOUND integrator package. Our implementation allows the simultaneous integration of any number of first- and second-order variational equations with the high-accuracy IAS15 integrator. We also provide routines to generate consistent and accurate initial conditions without the need for finite differencing.
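    Integrating variational equations alongside the trajectory, the technique REBOUND implements to first and second order, can be shown at first order for a pendulum with a plain RK4 stepper; this is our toy system and integrator, not the IAS15 scheme or the N-body equations of the paper:

```python
import math

def deriv(state):
    """Pendulum x'' = -sin(x) together with its first-order variational
    equation (dx)'' = -cos(x)*dx, integrated as one 4-dimensional system."""
    x, v, dx, dv = state
    return (v, -math.sin(x), dv, -math.cos(x) * dx)

def rk4_step(state, h):
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def mle_estimate(x0=0.5, t_end=100.0, h=0.01, d0=1e-8):
    """Maximal Lyapunov exponent estimate from the growth of the variation;
    close to zero for this regular (non-chaotic) librating orbit."""
    state = (x0, 0.0, d0, 0.0)
    for _ in range(int(t_end / h)):
        state = rk4_step(state, h)
    d = math.hypot(state[2], state[3])
    return math.log(d / d0) / t_end
```

    Second-order variations extend this by also transporting the Hessian of the flow, which is what enables the Newton-type fitting applications the paper discusses.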

  11. BIOZON: a system for unification, management and analysis of heterogeneous biological data.

    PubMed

    Birkland, Aaron; Yona, Golan

    2006-02-15

    Integration of heterogeneous data types is a challenging problem, especially in biology, where the number of databases and data types increase rapidly. Amongst the problems that one has to face are integrity, consistency, redundancy, connectivity, expressiveness and updatability. Here we present a system (Biozon) that addresses these problems, and offers biologists a new knowledge resource to navigate through and explore. Biozon unifies multiple biological databases consisting of a variety of data types (such as DNA sequences, proteins, interactions and cellular pathways). It is fundamentally different from previous efforts as it uses a single extensive and tightly connected graph schema wrapped with hierarchical ontology of documents and relations. Beyond warehousing existing data, Biozon computes and stores novel derived data, such as similarity relationships and functional predictions. The integration of similarity data allows propagation of knowledge through inference and fuzzy searches. Sophisticated methods of query that span multiple data types were implemented and first-of-a-kind biological ranking systems were explored and integrated. The Biozon system is an extensive knowledge resource of heterogeneous biological data. Currently, it holds more than 100 million biological documents and 6.5 billion relations between them. The database is accessible through an advanced web interface that supports complex queries, "fuzzy" searches, data materialization and more, online at http://biozon.org.

  12. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in the designs, analysis methods and metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological and statistical experts, developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831

  13. Classic versus millennial medical lab anatomy.

    PubMed

    Benninger, Brion; Matsler, Nik; Delamarter, Taylor

    2014-10-01

    This study investigated the integration, implementation, and use of cadaver dissection, hospital radiology modalities, surgical tools, and AV technology during a 12-week contemporary anatomy course suggesting a millennial laboratory. The teaching of anatomy has undergone the greatest fluctuation of any of the basic sciences during the past 100 years in order to make room for the meteoric rise in molecular sciences. Classically, anatomy consisted of a 2-year methodical, horizontal anatomy course; anatomy has now morphed into a 12-week accelerated course in a vertical curriculum at most institutions. Surface and radiological anatomy is the language for all clinicians regardless of specialty. The objective of this study was to investigate whether integration of full-body dissection anatomy and modern hospital technology, during the anatomy laboratory, could be accomplished in a 12-week anatomy course. A literature search was conducted on anatomy texts, journals, and websites regarding contemporary hospital technology, integrating multiple imaging mediums of 37 embalmed cadavers, surgical suite tools and technology, and audio/visual technology. Surgical and radiology professionals were contracted to teach during the anatomy laboratory. The literature search revealed no contemporary studies integrating full-body dissection with hospital technology and behavior. All 37 cadavers were successfully imaged with roentgenograms, CT, and MRI scans. Students were in favor of the dynamic laboratory consisting of multiple activity sessions occurring simultaneously. Objectively, examination scores proved to be a positive outcome and, subjectively, feedback from students was overwhelmingly positive. Despite the surging molecular based sciences consuming much of the curricula, full-body dissection anatomy is irreplaceable regarding both surface and architectural, radiological anatomy.
Radiology should not be a small adjunct to understand full-body dissection, but rather, full-body dissection aids the understanding of radiology mediums. The millennial anatomy dissection laboratory should consist of, at least, 50% radiology integration during full-body dissection. This pilot study is an example of the most comprehensive integration of full-body dissection, radiology, and hospital technology. © 2014 Wiley Periodicals, Inc.

  14. A pilot systematic genomic comparison of recurrence risks of hepatitis B virus-associated hepatocellular carcinoma with low- and high-degree liver fibrosis.

    PubMed

    Yoo, Seungyeul; Wang, Wenhui; Wang, Qin; Fiel, M Isabel; Lee, Eunjee; Hiotis, Spiros P; Zhu, Jun

    2017-12-07

    Chronic hepatitis B virus (HBV) infection leads to liver fibrosis, which is a major risk factor in hepatocellular carcinoma (HCC) and an independent risk factor of recurrence after HCC tumor resection. The HBV genome can be inserted into the human genome, and chronic inflammation may trigger somatic mutations. However, how HBV integration and other genomic changes contribute to the risk of tumor recurrence with regards to the different degree of liver fibrosis is not clearly understood. We sequenced mRNAs of 21 pairs of tumor and distant non-neoplastic liver tissues of HBV-HCC patients and performed comprehensive genomic analyses of our RNAseq data and public available HBV-HCC sequencing data. We developed a robust pipeline for sensitively identifying HBV integration sites based on sequencing data. Simulations showed that our method outperformed existing methods. Applying it to our data, 374 and 106 HBV host genes were identified in non-neoplastic liver and tumor tissues, respectively. When applying it to other RNA sequencing datasets, consistently more HBV integrations were identified in non-neoplastic liver than in tumor tissues. HBV host genes identified in non-neoplastic liver samples significantly overlapped with known tumor suppressor genes. More significant enrichment of tumor suppressor genes was observed among HBV host genes identified from patients with tumor recurrence, indicating the potential risk of tumor recurrence driven by HBV integration in non-neoplastic liver tissues. We also compared SNPs of each sample with SNPs in a cancer census database and inferred samples' pathogenic SNP loads. Pathogenic SNP loads in non-neoplastic liver tissues were consistently higher than those in normal liver tissues. Additionally, HBV host genes identified in non-neoplastic liver tissues significantly overlapped with pathogenic somatic mutations, suggesting that HBV integration and somatic mutations targeting the same set of genes are important to tumorigenesis. 
HBV integrations and pathogenic mutations showed distinct patterns between patients with low and high degrees of liver fibrosis with regard to tumor recurrence. The results suggest that HBV integrations and pathogenic SNPs in non-neoplastic tissues are important for tumorigenesis, and that different recurrence risk models are needed for patients with low and high degrees of liver fibrosis.

  15. Finite-difference computations of rotor loads

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Tung, C.

    1985-01-01

    This paper demonstrates the current and future potential of finite-difference methods for solving real rotor problems which now rely largely on empiricism. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.

  16. Finite-difference computations of rotor loads

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Tung, C.

    1985-01-01

    The current and future potential of finite-difference methods for solving real rotor problems which now rely largely on empiricism are demonstrated. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.

  17. At the Heart of Education: Portfolios as a Learning Tool.

    ERIC Educational Resources Information Center

    Gordon, Rick; Julius, Thomas

    This chapter consists of a conversation between a third-grade teacher and a teacher educator about the advantages of the portfolio method of assessment. The advantages of portfolios are that they are a powerful learning tool as well as an assessment tool, they can make the separate subjects in a curriculum come together in an integrated way, and…

  18. Preliminary Steps Towards Integrating Climate and Land Use (ICLUS): the Development of Land-Use Scenarios Consistent with Climate Change Emissions Storylines (2008, External Review Draft)

    EPA Science Inventory

    This report was prepared by the Global Change Research Program (GCRP) in the National Center for Environmental Assessment (NCEA) of the Office of Research and Development (ORD) at the U.S. Environmental Protection Agency (EPA). This draft report is a description of the methods u...

  19. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: Multi-center molecular Ornstein-Zernike self-consistent field approach

    NASA Astrophysics Data System (ADS)

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-01

    In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl- + CH3Cl → ClCH3 + Cl-) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  20. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: multi-center molecular Ornstein-Zernike self-consistent field approach.

    PubMed

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-07

    In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl(-) + CH3Cl → ClCH3 + Cl(-)) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  1. Unorthodox theoretical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nedd, Sean

    2012-01-01

    The use of the ReaxFF force field to correlate with NMR mobilities of amine catalytic substituents on a mesoporous silica nanosphere surface is considered. The interfacing of the ReaxFF force field within the Surface Integrated Molecular Orbital/Molecular Mechanics (SIMOMM) method, in order to replicate earlier published SIMOMM data and to compare with the ReaxFF data, is discussed. The development of a new correlation consistent Composite Approach (ccCA) is presented, which incorporates the completely renormalized coupled cluster method with singles, doubles and non-iterative triples corrections toward the determination of heats of formation and reaction pathways that contain biradical species.

  2. Sandia Corporation (Albuquerque, NM)

    DOEpatents

    Diver, Richard B.

    2010-02-23

    A Theoretical Overlay Photographic (TOP) alignment method uses the overlay of a theoretical projected image of a perfectly aligned concentrator on a photographic image of the concentrator to align the mirror facets of a parabolic trough solar concentrator. The alignment method is practical and straightforward, and inherently aligns the mirror facets to the receiver. When integrated with clinometer measurements for which gravity and mechanical drag effects have been accounted for and which are made in a manner and location consistent with the alignment method, all of the mirrors on a common drive can be aligned and optimized for any concentrator orientation.

  3. Research on the Method of Big Data Collecting, Storing and Analyzing of Tongue Diagnosis System

    NASA Astrophysics Data System (ADS)

    Chen, Xiaowei; Wu, Qingfeng

    2018-03-01

    This paper analyzes the contents of clinical tongue-diagnosis data in TCM (Traditional Chinese Medicine) and puts forward a method to collect, store and analyze these data. Under the guidance of the TCM theory of syndrome differentiation and treatment, the method builds on Hadoop, a distributed computing system with strong scalability, and integrates functions for analyzing and converting big data from clinical tongue diagnosis. At the same time, the consistency, scalability and security of the tongue-diagnosis big data are ensured.

  4. Quantifying the process and outcomes of person-centered planning.

    PubMed

    Holburn, S; Jacobson, J W; Vietze, P M; Schwartz, A A; Sersen, E

    2000-09-01

    Although person-centered planning is a popular approach in the field of developmental disabilities, there has been little systematic assessment of its process and outcomes. To measure person-centered planning, we developed three instruments designed to assess its various aspects. We then constructed variables comprising both a Process and an Outcome Index using a combined rational-empirical method. Test-retest reliability and measures of internal consistency appeared adequate. Variable correlations and factor analysis were generally consistent with our conceptualization and resulting item and variable classifications. Practical implications for intervention integrity, program evaluation, and organizational performance are discussed.
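
The internal consistency reported above is conventionally quantified with Cronbach's alpha; the abstract does not name the statistic, so the following is only an illustrative sketch of how such a coefficient is computed from item scores:

```python
# Illustrative computation of Cronbach's alpha (internal consistency).
def cronbach_alpha(items):
    """Cronbach's alpha for k items; `items` is a list of k columns,
    each a list of the n respondents' scores on that item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))
```

Values near 1 indicate that the items move together; two perfectly correlated items on identical scales yield alpha = 1.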

  5. The observation of biology implemented by integrated religion values in integrated Islamic school (Descriptive Study in X Integrated Senior High School, Tasikmalaya)

    NASA Astrophysics Data System (ADS)

    Nurjanah, E.; Adisendjaja, Y. H.; Kusumastuti, M. N.

    2018-05-01

    Learning integrated with religious values is one effort to increase learning motivation and build student character. This study aims to describe the application of biology learning integrated with religious values in an Integrated Islamic School. The research method used is descriptive. Participants included the headmaster, the head of curriculum, biology teachers, boarding school teachers, the head of the boarding school, and students. The instruments used were interviews, observation, and a student questionnaire about learning biology. The results showed that learning in school X follows two curricula: the national education curriculum and the boarding school curriculum. The national education curriculum refers to the 2013 curriculum, and the boarding school curriculum refers to the Salafi boarding school curriculum (Kitab Kuning). However, the learning process is not delivered in an integrated way. The main obstacles to implementing learning integrated with religious values are: 1) teachers with general education backgrounds do not know of any connection between the biology subject and the subjects studied in the boarding school; 2) the school has not formed a teaching team; 3) materials integrating religious values are unavailable.

  6. When integration fails: Prokaryote phylogeny and the tree of life.

    PubMed

    O'Malley, Maureen A

    2013-12-01

    Much is being written these days about integration, its desirability and even its necessity when complex research problems are to be addressed. Seldom, however, do we hear much about the failure of such efforts. Because integration is an ongoing activity rather than a final achievement, and because today's literature about integration consists mostly of manifesto statements rather than precise descriptions, an examination of unsuccessful integration could be illuminating to understand better how it works. This paper will examine the case of prokaryote phylogeny and its apparent failure to achieve integration within broader tree-of-life accounts of evolutionary history (often called 'universal phylogeny'). Despite the fact that integrated databases exist of molecules pertinent to the phylogenetic reconstruction of all lineages of life, and even though the same methods can be used to construct phylogenies wherever the organisms fall on the tree of life, prokaryote phylogeny remains at best only partly integrated within tree-of-life efforts. I will examine why integration does not occur, compare it with integrative practices in animal and other eukaryote phylogeny, and reflect on whether there might be different expectations of what integration should achieve. Finally, I will draw some general conclusions about integration and its function as a 'meta-heuristic' in the normative commitments guiding scientific practice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.

    PubMed

    Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis

    2018-03-01

    Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
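
The Sun-position step described above can be approximated in closed form; the sketch below uses a common simplified declination/hour-angle model, not necessarily the authors' exact formulation, and it ignores refraction and the equation of time:

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation (degrees) from latitude, day of year,
    and local *solar* time in hours. Simplified model: cosine fit for the
    declination, no atmospheric refraction, no equation of time."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)  # degrees away from solar noon
    lat, d, h = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)))
```

At 40° N on the June solstice (day 172) this gives a noon elevation near 90° − 40° + 23.4° ≈ 73.4°; azimuth and the gyroscope/compass fusion described in the abstract are separate steps.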

  8. A practical salient region feature based 3D multi-modality registration method for medical images

    NASA Astrophysics Data System (ADS)

    Hahn, Dieter A.; Wolz, Gabriele; Sun, Yiyong; Hornegger, Joachim; Sauer, Frank; Kuwert, Torsten; Xu, Chenyang

    2006-03-01

    We present a novel representation of 3D salient region features and its integration into a hybrid rigid-body registration framework. We adopt the scale, translation and rotation invariance properties of these intrinsic 3D features to estimate a transform between underlying mono- or multi-modal 3D medical images. Our method combines advantageous aspects of both feature- and intensity-based approaches and consists of three steps: an automatic extraction of a set of 3D salient region features on each image, a robust estimation of correspondences, and their sub-pixel-accurate refinement with outlier elimination. We propose a region-growing based approach for the extraction of 3D salient region features, a solution to the problem of feature clustering, and a reduction of the correspondence search space complexity. Results of the developed algorithm are presented for both mono- and multi-modal intra-patient 3D image pairs (CT, PET and SPECT) that have been acquired for change detection, tumor localization, and time-based intra-person studies. The accuracy of the method is clinically evaluated by a medical expert with an approach that measures the distance between a set of selected corresponding points consisting of both anatomical and functional structures or lesion sites. This demonstrates the robustness of the proposed method to image overlap, missing information and artefacts. We conclude by discussing potential medical applications and possibilities for integration into a non-rigid registration framework.

  9. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    PubMed

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is demonstrated through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.

  10. Harmonics analysis of the ITER poloidal field converter based on a piecewise method

    NASA Astrophysics Data System (ADS)

    Xudong, WANG; Liuwei, XU; Peng, FU; Ji, LI; Yanan, WU

    2017-12-01

    Poloidal field (PF) converters provide controlled DC voltage and current to PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced. The grid current is decomposed into the sum of some simple functions. By calculating simple function harmonics based on the piecewise method, the harmonics of the PF converter under different operation modes are obtained. In order to examine the validity of the method, a simulation model is established based on Matlab/Simulink and a relevant experiment is implemented in the ITER PF integration test platform. Comparative results are given. The calculated results are found to be consistent with simulation and experiment. The piecewise method is proved correct and valid for calculating the system harmonics.
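
The piecewise idea, evaluating the Fourier integral exactly on each segment and summing, can be illustrated for a generic piecewise-constant waveform; this is a simplified stand-in for the actual converter current, not the ITER PF model itself:

```python
import math

def harmonic_amplitude(segments, n, period):
    """Amplitude of the n-th harmonic of a piecewise-constant waveform.
    `segments` lists (t_start, t_end, value) pieces covering one period;
    the Fourier integrals are evaluated in closed form piece by piece."""
    w = 2.0 * math.pi / period
    a = b = 0.0
    for t0, t1, v in segments:
        # closed-form integrals of v*cos(n*w*t) and v*sin(n*w*t) on [t0, t1]
        a += v / (n * w) * (math.sin(n * w * t1) - math.sin(n * w * t0))
        b -= v / (n * w) * (math.cos(n * w * t1) - math.cos(n * w * t0))
    return math.hypot(2.0 * a / period, 2.0 * b / period)
```

For a ±1 square wave this reproduces the classical 4/(nπ) amplitudes of the odd harmonics; an ideal six-pulse converter current (120° conduction blocks) analogously yields harmonics only at orders 6k ± 1.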

  11. A comparison of two methods of teaching. Computer managed instruction and keypad questions versus traditional classroom lecture.

    PubMed

    Halloran, L

    1995-01-01

    Computers increasingly are being integrated into nursing education. One method of integration is computer managed instruction (CMI). Recently, technology has become available that allows the integration of keypad questions into CMI, bringing a new type of interactivity between students and teachers into the classroom. The purpose of this study was to evaluate differences in achievement between a control group taught by traditional classroom lecture (TCL) and an experimental group taught using CMI and keypad questions. Both the control and experimental groups consisted of convenience samples of junior nursing students in a baccalaureate program taking a medical/surgical nursing course. Achievement was measured by three instructor-developed multiple-choice examinations. Findings demonstrated that although the experimental group showed increasingly higher test scores as the semester progressed, no statistical difference in achievement was found between the two groups. One reason for this may be the phenomenon of "vampire video": initially, the method of presentation overshadowed the content; as students became desensitized to the method, they were able to focus on and absorb more content. This study suggests that CMI and keypads are a viable teaching option for nursing education, equal to TCL in student achievement while providing a new level of interaction in the classroom setting.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dale, Virginia H; Kline, Keith L; Kaffka, Stephen R

    Landscape sustainability of agricultural systems considers effects of farm activities on social, economic, and ecosystem services at local and regional scales. Sustainable agriculture entails: defining sustainability, developing easily measured indicators of sustainability, moving toward integrated agricultural systems, and offering incentives or imposing regulations to affect farmer behavior. A landscape perspective is useful because landscape ecology provides theory and methods for dealing with spatial heterogeneity, scaling, integration, and complexity. To implement agricultural sustainability, we propose adopting a systems perspective, recognizing spatial heterogeneity, addressing the influences of context, and integrating landscape-design principles. Topics that need further attention at local and regional scales include (1) protocols for quantifying material and energy flows; (2) effects of management practices; (3) incentives for enhancing social, economic, and ecosystem services; (4) integrated landscape planning and management; (5) monitoring and assessment; (6) effects of societal demand; and (7) consistent and holistic policies for promoting agricultural sustainability.

  13. A Diagram Editor for Efficient Biomedical Knowledge Capture and Integration

    PubMed Central

    Yu, Bohua; Jakupovic, Elvis; Wilson, Justin; Dai, Manhong; Xuan, Weijian; Mirel, Barbara; Athey, Brian; Watson, Stanley; Meng, Fan

    2008-01-01

    Understanding the molecular mechanisms underlying complex disorders requires the integration of data and knowledge from different sources including free text literature and various biomedical databases. To facilitate this process, we created the Biomedical Concept Diagram Editor (BCDE) to help researchers distill knowledge from data and literature and aid the process of hypothesis development. A key feature of BCDE is the ability to capture information with a simple drag-and-drop. This is a vast improvement over manual methods of knowledge and data recording and greatly increases the efficiency of the biomedical researcher. BCDE also provides a unique concept matching function to enforce consistent terminology, which enables conceptual relationships deposited by different researchers in the BCDE database to be mined and integrated for intelligible and useful results. We hope BCDE will promote the sharing and integration of knowledge from different researchers for effective hypothesis development. PMID:21347131

  14. Mitigation for one & all: An integrated framework for mitigation of development impacts on biodiversity and ecosystem services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallis, Heather, E-mail: htallis@tnc.org; Kennedy, Christina M., E-mail: ckennedy@tnc.org; Ruckelshaus, Mary

    Emerging development policies and lending standards call for consideration of ecosystem services when mitigating impacts from development, yet little guidance exists to inform this process. Here we propose a comprehensive framework for advancing both biodiversity and ecosystem service mitigation. We have clarified a means for choosing representative ecosystem service targets alongside biodiversity targets, identified servicesheds as a useful spatial unit for assessing ecosystem service avoidance, impact, and offset options, and discuss methods for consistent calculation of biodiversity and ecosystem service mitigation ratios. We emphasize the need to move away from area- and habitat-based assessment methods for both biodiversity and ecosystem services towards functional assessments at landscape or seascape scales. Such comprehensive assessments more accurately reflect cumulative impacts and variation in environmental quality, social needs and value preferences. The integrated framework builds on the experience of biodiversity mitigation while addressing the unique opportunities and challenges presented by ecosystem service mitigation. These advances contribute to growing potential for economic development planning and execution that will minimize impacts on nature and maximize human wellbeing. - Highlights: • This is the first framework for biodiversity and ecosystem service mitigation. • Functional, landscape scale assessments are ideal for avoidance and offsets. • Servicesheds define the appropriate spatial extent for ecosystem service mitigation. • Mitigation ratios should be calculated consistently and based on standard factors. • Our framework meets the needs of integrated mitigation assessment requirements.

  15. Integration of electro-anatomical and imaging data of the left ventricle: An evaluation framework.

    PubMed

    Soto-Iglesias, David; Butakoff, Constantine; Andreu, David; Fernández-Armenta, Juan; Berruezo, Antonio; Camara, Oscar

    2016-08-01

    Integration of electrical and structural information for scar characterization in the left ventricle (LV) is a crucial step to better guide radio-frequency ablation therapies, which are usually performed in complex ventricular tachycardia (VT) cases. This integration requires finding a common representation where to map the electrical information from the electro-anatomical map (EAM) surfaces and tissue viability information from delay-enhancement magnetic resonance images (DE-MRI). However, the development of a consistent integration method is still an open problem due to the lack of a proper evaluation framework to assess its accuracy. In this paper we present both: (i) an evaluation framework to assess the accuracy of EAM and imaging integration strategies with simulated EAM data and a set of global and local measures; and (ii) a new integration methodology based on a planar disk representation where the LV surface meshes are quasi-conformally mapped (QCM) by flattening, allowing for simultaneous visualization and joint analysis of the multi-modal data. The developed evaluation framework was applied to estimate the accuracy of the QCM-based integration strategy on a benchmark dataset of 128 synthetically generated ground-truth cases presenting different scar configurations and EAM characteristics. The obtained results demonstrate a significant reduction in global overlap errors (50-100%) with respect to state-of-the-art integration techniques, also better preserving the local topology of small structures such as conduction channels in scars. Data from seventeen VT patients were also used to study the feasibility of the QCM technique in a clinical setting, consistently outperforming the alternative integration techniques in the presence of sparse and noisy clinical data. The proposed evaluation framework has allowed a rigorous comparison of different EAM and imaging data integration strategies, providing useful information to better guide clinical practice in complex cardiac interventions. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Quantifying complexity in translational research: an integrated approach.

    PubMed

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research, and a case study in obesity was used to assess usability. Generally, the evidence generated was valuable for understanding various components in translational research. In particular, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some may argue that the results are biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue for assessing complexity. The proposed method aims to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
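
The consistency ratio mentioned above is a standard AHP quantity: CR = CI/RI, with CI = (λmax − n)/(n − 1) and RI taken from Saaty's random-index table. A minimal sketch using the normalized-column approximation for the weights (the paper's exact variant may differ):

```python
def ahp_consistency_ratio(matrix):
    """Consistency ratio CR = CI / RI of an n x n pairwise-comparison
    matrix (3 <= n <= 10), with CI = (lambda_max - n) / (n - 1) and
    weights from the normalized-column approximation to the principal
    eigenvector. CR < 0.1 is the usual acceptability threshold."""
    n = len(matrix)
    RI = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]  # Saaty
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    w = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
    # lambda_max estimated as the mean of (A w)_i / w_i
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    return (lam - n) / (n - 1) / RI[n - 1]
```

A perfectly consistent matrix (all judgments transitive) gives CR = 0; a cyclic, contradictory matrix gives a CR well above the 0.1 threshold.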

  17. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard that consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resource management. The results show that address coding based on the postal code is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code enhances the extensibility and flexibility of the address geocoding.
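
The three-part code (absolute position from the postal zone, relative position from direction and distance, optional extended code) can be sketched as a simple string composition; the field widths and separators below are hypothetical illustrations, not the paper's specification:

```python
def rural_address_code(postal_code, direction, distance_m, ext=""):
    """Compose a rural address geocode from three parts:
      absolute position  - postal code of the cadastral zone,
      relative position  - compass direction plus distance (m) from the
                           zone reference point,
      extended code      - optional suffix for sub-parcel disambiguation.
    Field widths and '-' separators are hypothetical, for illustration."""
    rel = "{:<2}{:05d}".format(direction, int(distance_m))
    code = "{}-{}".format(postal_code, rel)
    return code + "-" + ext if ext else code
```

For example, a parcel 1250 m northeast of the reference point of postal zone 100101 would be encoded "100101-NE01250", with an extended code appended only where sub-parcel disambiguation is needed.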

  18. An adhesive contact mechanics formulation based on atomistically induced surface traction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Houfu; Ren, Bo; Li, Shaofan, E-mail: shaofan@berkeley.edu

    2015-12-01

    In this work, we have developed a novel multiscale computational contact formulation based on the generalized Derjaguin approximation for continua that are characterized by atomistically enriched constitutive relations, in order to study macroscopic interaction between arbitrarily shaped deformable continua. The proposed adhesive contact formulation makes use of the microscopic interaction forces between individual particles in the interacting bodies. In particular, the double-layer volume integral describing the contact interaction (energy, force vector, matrix) is converted into a double-layer surface integral through a mathematically consistent approach that employs the divergence theorem and a special partitioning technique. The proposed contact model is formulated in the nonlinear continuum mechanics framework and implemented using the standard finite element method. With no large penalty constant, the stiffness matrix of the system will in general be well-conditioned, which is of great significance for quasi-static analysis. Three numerical examples are presented to illustrate the capability of the proposed method. Results indicate that with the same mesh configuration, the finite element computation based on the surface integral approach is faster and more accurate than the volume integral based approach. In addition, the proposed approach is energy preserving even in a very long dynamic simulation.

  19. Changes in contraceptive use following integration of family planning into ART Services in Cross River State, Nigeria.

    PubMed

    McCarraher, Donna R; Vance, Gwyneth; Gwarzo, Usman; Taylor, Douglas; Chabikuli, Otto Nzapfurundi

    2011-12-01

    One strategy for meeting the contraceptive needs of HIV-positive women is to integrate family planning into HIV services. In 2008 in Cross River State, Nigeria, family planning was integrated into antiretroviral (ART) services in five local government areas. A basic family planning/HIV integration model was implemented in three of these areas, and an enhanced model in the other two. We conducted baseline interviews in 2008 and follow-up interviews 12-14 months later, in 2009, with 274 female ART clients aged 18-45 across the five areas. Unmet need for contraception was high at baseline (28-35 percent). We found that modern contraceptive use rose in both the enhanced and basic groups; most of the increase was in consistent condom use. Despite an increase in family planning counseling by ART providers, referrals to family planning services for noncondom methods were low. We conclude by presenting alternative strategies for family planning/HIV integration in settings where large families and low contraceptive use are normative.

  20. Integrating care for high-risk patients in England using the virtual ward model: lessons in the process of care integration from three case sites

    PubMed Central

    Lewis, Geraint; Vaithianathan, Rhema; Wright, Lorraine; Brice, Mary R; Lovell, Paul; Rankin, Seth; Bardsley, Martin

    2013-01-01

    Background Patients at high risk of emergency hospitalisation are particularly likely to experience fragmentation in care. The virtual ward model attempts to integrate health and social care by offering multidisciplinary case management to people at high predicted risk of unplanned hospitalisation. Objective To describe the care practice in three virtual ward sites in England and to explore how well each site had achieved meaningful integration. Method Case studies conducted in Croydon, Devon and Wandsworth during 2011–2012, consisting of semi-structured interviews, workshops, and site visits. Results Different versions of the virtual wards intervention had been implemented in each site. In Croydon, multidisciplinary care had reverted back to one-to-one case management. Conclusions To integrate successfully, virtual ward projects should safeguard the multidisciplinary nature of the intervention, ensure the active involvement of General Practitioners, and establish feedback processes to monitor performance such as the number of professions represented at each team meeting. PMID:24250284

  1. Research on efficiency evaluation model of integrated energy system based on hybrid multi-attribute decision-making.

    PubMed

    Li, Yan

    2017-05-25

    Efficiency evaluation of an integrated energy system involves many influencing factors, and the attribute values are heterogeneous and non-deterministic: specific numerical values or accurate probability distributions usually cannot be given, which biases the final evaluation result. According to the characteristics of the integrated energy system, a hybrid multi-attribute decision-making model is constructed that takes the decision maker's risk preference into account. In evaluating the efficiency of an integrated energy system, the values of some evaluation indexes are linguistic, or the evaluation experts disagree; for these reasons the decision information is ambiguous, usually taking the form of uncertain linguistic values and numerical interval values. An interval-valued multiple-attribute decision-making method and a fuzzy linguistic multiple-attribute decision-making model are therefore proposed. Finally, the mathematical model for efficiency evaluation of an integrated energy system is constructed.
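    A minimal sketch of how interval-valued attributes and a risk preference can enter such a ranking (this is an illustrative Hurwicz-style scheme, not the paper's exact model; the alternatives, weights, and values below are hypothetical):

```python
# Illustrative sketch (not the paper's exact model): scoring alternatives whose
# attribute values are intervals [lo, hi], using a Hurwicz-style risk-preference
# coefficient lam (1.0 = fully optimistic, 0.0 = fully pessimistic).

def interval_score(lo, hi, lam):
    """Crisp score of an interval value under risk preference lam in [0, 1]."""
    return lam * hi + (1.0 - lam) * lo

def aggregate(alternative, weights, lam):
    """Weighted sum of crisp scores over all attributes of one alternative."""
    return sum(w * interval_score(lo, hi, lam)
               for (lo, hi), w in zip(alternative, weights))

# Two hypothetical energy-system alternatives, three attributes each.
alts = {
    "system_A": [(0.6, 0.8), (0.5, 0.9), (0.7, 0.7)],
    "system_B": [(0.4, 0.95), (0.6, 0.7), (0.5, 0.8)],
}
weights = [0.5, 0.3, 0.2]  # assumed given, e.g. from an AHP-style weighting

for lam in (0.2, 0.8):  # risk-averse vs. risk-seeking decision maker
    ranked = sorted(alts, key=lambda a: aggregate(alts[a], weights, lam),
                    reverse=True)
    print(lam, ranked)
```

    With these numbers the ranking flips between the risk-averse and risk-seeking settings, which is exactly why the risk preference belongs in the model.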

  2. Quantifying biological integrity by taxonomic completeness: its utility in regional and global assessments.

    PubMed

    Hawkins, Charles P

    2006-08-01

    Water resources managers and conservation biologists need reliable, quantitative, and directly comparable methods for assessing the biological integrity of the world's aquatic ecosystems. Large-scale assessments are constrained by the lack of consistency in the indicators used to assess biological integrity and our current inability to translate between indicators. In theory, assessments based on estimates of taxonomic completeness, i.e., the proportion of expected taxa that were observed (observed/expected, O/E) are directly comparable to one another and should therefore allow regionally and globally consistent summaries of the biological integrity of freshwater ecosystems. However, we know little about the true comparability of O/E assessments derived from different data sets or how well O/E assessments perform relative to other indicators in use. I compared the performance (precision, bias, and sensitivity to stressors) of O/E assessments based on five different data sets with the performance of the indicators previously applied to these data (three multimetric indices, a biotic index, and a hybrid method used by the state of Maine). Analyses were based on data collected from U.S. stream ecosystems in North Carolina, the Mid-Atlantic Highlands, Maine, and Ohio. O/E assessments resulted in very similar estimates of mean regional conditions compared with most other indicators once these indicators' values were standardized relative to reference-site means. However, other indicators tended to be biased estimators of O/E, a consequence of differences in their response to natural environmental gradients and sensitivity to stressors. These results imply that, in some cases, it may be possible to compare assessments derived from different indicators by standardizing their values (a statistical approach to data harmonization). 
In situations where it is difficult to standardize or otherwise harmonize two or more indicators, O/E values can easily be derived from existing raw sample data. With some caveats, O/E should provide more directly comparable assessments of biological integrity across regions than is possible by harmonizing values of a mix of indicators.
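    The O/E index described above reduces to a short computation. A minimal sketch, assuming the reference-site model supplies per-taxon capture probabilities and using the common convention of restricting to taxa with probability >= 0.5 (taxon names and values below are made up):

```python
# Minimal sketch of an observed/expected (O/E) taxonomic-completeness index.
# Inputs: predicted capture probabilities per taxon from a reference-site model,
# and the set of taxa actually observed at the test site.

def oe_index(capture_prob, observed, p_min=0.5):
    expected_taxa = {t: p for t, p in capture_prob.items() if p >= p_min}
    E = sum(expected_taxa.values())                      # expected taxon count
    O = sum(1 for t in expected_taxa if t in observed)   # of those, how many seen
    return O / E

probs = {"Baetis": 0.9, "Heptagenia": 0.8, "Simulium": 0.6, "Rare_sp": 0.2}
site_taxa = {"Baetis", "Simulium", "Chironomus"}
print(round(oe_index(probs, site_taxa), 2))   # 2 observed / 2.3 expected = 0.87
```

    An O/E near 1 indicates the site retains the taxa expected under reference conditions; values well below 1 indicate loss of expected taxa.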

  3. Self-consistent projection operator theory in nonlinear quantum optical systems: A case study on degenerate optical parametric oscillators

    NASA Astrophysics Data System (ADS)

    Degenfeld-Schonburg, Peter; Navarrete-Benlloch, Carlos; Hartmann, Michael J.

    2015-05-01

    Nonlinear quantum optical systems are of paramount relevance for modern quantum technologies, as well as for the study of dissipative phase transitions. Their nonlinear nature makes their theoretical study very challenging and hence they have always served as great motivation to develop new techniques for the analysis of open quantum systems. We apply the recently developed self-consistent projection operator theory to the degenerate optical parametric oscillator to exemplify its general applicability to quantum optical systems. We show that this theory provides an efficient method to calculate the full quantum state of each mode with a high degree of accuracy, even at the critical point. It is equally successful in describing both the stationary limit and the dynamics, including regions of the parameter space where the numerical integration of the full problem is significantly less efficient. We further develop a Gaussian approach consistent with our theory, which yields sensibly better results than the previous Gaussian methods developed for this system, most notably standard linearization techniques.

  4. HIV integration sites in latently infected cell lines: evidence of ongoing replication.

    PubMed

    Symons, Jori; Chopra, Abha; Malatinkova, Eva; De Spiegelaere, Ward; Leary, Shay; Cooper, Don; Abana, Chike O; Rhodes, Ajantha; Rezaei, Simin D; Vandekerckhove, Linos; Mallal, Simon; Lewin, Sharon R; Cameron, Paul U

    2017-01-13

    Assessing the location and frequency of HIV integration sites in latently infected cells can potentially inform our understanding of how HIV persists during combination antiretroviral therapy. We developed a novel high throughput sequencing method to evaluate HIV integration sites in latently infected cell lines to determine whether there was virus replication or clonal expansion in these cell lines observed as multiple integration events at the same position. We modified a previously reported method using random DNA shearing and PCR to allow for high throughput robotic processing to identify the site and frequency of HIV integration in latently infected cell lines. Latently infected cell lines infected with intact virus demonstrated multiple distinct HIV integration sites (28 different sites in U1, 110 in ACH-2 and 117 in J1.1 per 150,000 cells). In contrast, cell lines infected with replication-incompetent viruses (J-Lat cells) demonstrated single integration sites. Following in vitro passaging of the ACH-2 cell line, we observed a significant increase in the frequency of unique HIV integration sites and there were multiple mutations and large deletions in the proviral DNA. When the ACH-2 cell line was cultured with the integrase inhibitor raltegravir, there was a significant decrease in the number of unique HIV integration sites and a transient increase in the frequency of 2-LTR circles consistent with virus replication in these cells. Cell lines latently infected with intact HIV demonstrated multiple unique HIV integration sites indicating that these cell lines are not clonal and in the ACH-2 cell line there was evidence of low level virus replication. These findings have implications for the use of latently infected cell lines as models of HIV latency and for the use of these cells as standards.
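    The counting step behind this analysis can be sketched simply. A hypothetical illustration (not the authors' pipeline): given mapped integration sites as (chromosome, position, strand) tuples, count unique sites and flag sites seen repeatedly, since repeats at one position suggest clonal expansion while many distinct sites point to ongoing replication:

```python
# Hypothetical sketch of integration-site tallying from mapped reads.
from collections import Counter

def summarize_sites(reads):
    counts = Counter(reads)
    unique = len(counts)                                    # distinct sites
    expanded = {site: n for site, n in counts.items() if n > 1}
    return unique, expanded

reads = [
    ("chr1", 1204567, "+"),
    ("chr1", 1204567, "+"),    # same site twice -> possible clonal expansion
    ("chr7", 98113002, "-"),
    ("chr19", 5332871, "+"),
]
unique, expanded = summarize_sites(reads)
print(unique, expanded)
```

    Coordinates here are invented for illustration; in practice the tuples would come from aligned sequencing reads after random DNA shearing.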

  5. A study of structural properties of gene network graphs for mathematical modeling of integrated mosaic gene networks.

    PubMed

    Petrovskaya, Olga V; Petrovskiy, Evgeny D; Lavrik, Inna N; Ivanisenko, Vladimir A

    2017-04-01

    Gene network modeling is one of the widely used approaches in systems biology. It allows the function of complex genetic systems to be studied, including so-called mosaic gene networks, which consist of functionally interacting subnetworks. We conducted a study of a mosaic gene network modeling method based on integration of models of gene subnetworks by linear control functionals. Automatic modeling of 10,000 synthetic mosaic gene regulatory networks was carried out using computer experiments on gene knockdowns/knockouts. Structural analysis of the graphs of the generated mosaic gene regulatory networks revealed that the most important factor for building accurate integrated mathematical models, among those analyzed in the study, is data on the expression of genes corresponding to vertices with high centrality.
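    As a toy illustration of the centrality criterion (degree centrality on a small made-up graph; the paper's analysis is far richer), one can rank vertices and pick the genes whose expression data would matter most:

```python
# Illustrative sketch: normalized degree centrality of gene-network vertices.
# Graph, gene names, and edges are invented for demonstration.

def degree_centrality(edges, genes):
    deg = {g: 0 for g in genes}
    for src, dst in edges:
        deg[src] += 1
        deg[dst] += 1
    n = len(genes)
    return {g: d / (n - 1) for g, d in deg.items()}  # normalize by max degree

genes = ["g1", "g2", "g3", "g4"]
edges = [("g1", "g2"), ("g1", "g3"), ("g2", "g3"), ("g3", "g4")]
cent = degree_centrality(edges, genes)
top = max(cent, key=cent.get)
print(top, cent[top])   # g3 touches every other vertex
```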

  6. Integrated and differential accuracy in resummed cross sections

    DOE PAGES

    Bertolini, Daniele; Solon, Mikhail P.; Walsh, Jonathan R.

    2017-03-30

    Standard QCD resummation techniques provide precise predictions for the spectrum and the cumulant of a given observable. The integrated spectrum and the cumulant differ by higher-order terms which, however, can be numerically significant. In this paper we propose a method, which we call the σ-improved scheme, to resolve this issue. It consists of two steps: (i) include higher-order terms in the spectrum to improve the agreement with the cumulant central value, and (ii) employ profile scales that encode correlations between different points to give robust uncertainty estimates for the integrated spectrum. We provide a generic algorithm for determining such profile scales, and show the application to the thrust distribution in e+e- collisions at NLL'+NLO and NNLL'+NNLO.

  7. Reducing Stiffness and Electrical Losses of High Channel Hybrid Nerve Cuff Electrodes

    DTIC Science & Technology

    2001-10-25

    Electrodes were developed. These electrodes consisted of a micromachined polyimide-based thin-film structure with integrated electrode contacts and...electrodes, mechanical properties were enhanced by changing the method of joining silicone and polyimide from using one part silicone adhesive to...gold, platinum, platinum black, polyimide, silicone, polymer bonding I. INTRODUCTION Cuff-type electrodes are probably the most commonly used neural

  8. Man and Environment for the Intermediate Grades; A Curriculum Guide for Environmental Studies for Grades 4-8.

    ERIC Educational Resources Information Center

    National Association for Environmental Education, Miami, FL.

    This curriculum guide consists of environmental studies modules for grades 4-8. The curriculum, which is organized around major concepts, is intended to serve as a guide for program development and as a framework for compiling and sharing ideas on methods and application on a national basis. Each module may be utilized as an integral part of the…

  9. Experiences Obtained with Integration of Student Response Systems for iPod Touch and iPhone into e-Learning Environments

    ERIC Educational Resources Information Center

    Stav, John; Nielsen, Kjetil; Hansen-Nygard, Gabrielle; Thorseth, Trond

    2010-01-01

    A new type of Student Response System (SRS) based upon the latest wireless technologies and hand held mobile devices has been developed to enhance active learning methods and assess students' understanding. The key services involve a set of XML technologies, web services and modern mobile devices. A group consisting of engineers, scientists and…

  10. Classical field configurations and infrared slavery

    NASA Astrophysics Data System (ADS)

    Swanson, Mark S.

    1987-09-01

    The problem of determining the energy of two spinor particles interacting through massless-particle exchange is analyzed using the path-integral method. A form for the long-range interaction energy is obtained by analyzing an abridged vertex derived from the parent theory. This abridged vertex describes the radiation of zero-momentum particles by pointlike sources. A path-integral formalism for calculating the energy of the radiation field associated with this abridged vertex is developed and applications are made to determine the energy necessary for adiabatic separation of two sources in quantum electrodynamics and for an SU(2) Yang-Mills theory. The latter theory is shown to be consistent with confinement via infrared slavery.

  11. Video quality assessment using motion-compensated temporal filtering and manifold feature similarity

    PubMed Central

    Yu, Mei; Jiang, Gangyi; Shao, Feng; Peng, Zongju

    2017-01-01

    A well-performing video quality assessment (VQA) method should be consistent with the human visual system for better prediction accuracy. In this paper, we propose a VQA method using motion-compensated temporal filtering (MCTF) and manifold feature similarity. To be more specific, a group of frames (GoF) is first decomposed into a temporal high-pass component (HPC) and a temporal low-pass component (LPC) by MCTF. Following this, manifold feature learning (MFL) and phase congruency (PC) are used to predict the quality of the temporal LPC and the temporal HPC, respectively. The quality measures of the LPC and the HPC are then combined as the GoF quality. A temporal pooling strategy is subsequently used to integrate GoF qualities into an overall video quality. The proposed VQA method appropriately processes temporal information in video by MCTF and the temporal pooling strategy, and simulates human visual perception by MFL. Experiments on a publicly available video quality database showed that, in comparison with several state-of-the-art VQA methods, the proposed method achieves better consistency with subjective video quality and can predict video quality more accurately. PMID:28445489

  12. METHOD OF APPLYING NICKEL COATINGS ON URANIUM

    DOEpatents

    Gray, A.G.

    1959-07-14

    A method is presented for protectively coating uranium which comprises etching the uranium in an aqueous etching solution containing chloride ions, electroplating a coating of nickel on the etched uranium and heating the nickel plated uranium by immersion thereof in a molten bath composed of a material selected from the group consisting of sodium chloride, potassium chloride, lithium chloride, and mixtures thereof, maintained at a temperature of between 700 and 800 deg C, for a time sufficient to alloy the nickel and uranium and form an integral protective coating of corrosion-resistant uranium-nickel alloy.

  13. Introducing Hurst exponent in pair trading

    NASA Astrophysics Data System (ADS)

    Ramos-Requena, J. P.; Trinidad-Segovia, J. E.; Sánchez-Granero, M. A.

    2017-12-01

    In this paper we introduce a new methodology for pair trading, based on the calculation of the Hurst exponent of a pair. Our approach is inspired by the classical concepts of co-integration and mean reversion, but joins them under a single strategy. We show that the Hurst approach yields better results than the classical Distance Method and Correlation strategies in different scenarios. The results obtained show that the new methodology is consistent and suitable: it reduces trading drawdown relative to the classical approaches and thereby achieves better performance.
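    One simple way to estimate a Hurst exponent is the diffusion (variance-of-increments) method; the paper does not prescribe this particular estimator, so the sketch below is only illustrative. The standard deviation of lag-tau increments scales as tau**H, so H is the slope of log(std) versus log(tau); H < 0.5 signals the mean-reverting behavior sought in a tradable pair:

```python
import math
import random

def hurst(series, lags=range(2, 20)):
    """Estimate H from the scaling of lagged-increment standard deviations."""
    xs, ys = [], []
    for lag in lags:
        diffs = [series[i + lag] - series[i] for i in range(len(series) - lag)]
        mean = sum(diffs) / len(diffs)
        var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
        xs.append(math.log(lag))
        ys.append(0.5 * math.log(var))    # log of the standard deviation
    # least-squares slope of ys against xs
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(0)
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + random.gauss(0, 1))
print(round(hurst(walk), 2))   # ~0.5 for a random walk; a pair spread below 0.5 mean-reverts
```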

  14. Bringing global gyrokinetic turbulence simulations to the transport timescale using a multiscale approach

    NASA Astrophysics Data System (ADS)

    Parker, Jeffrey B.; LoDestro, Lynda L.; Told, Daniel; Merlo, Gabriele; Ricketson, Lee F.; Campos, Alejandro; Jenko, Frank; Hittinger, Jeffrey A. F.

    2018-05-01

    The vast separation dividing the characteristic times of energy confinement and turbulence in the core of toroidal plasmas makes first-principles prediction on long timescales extremely challenging. Here we report the demonstration of a multiple-timescale method that enables coupling global gyrokinetic simulations with a transport solver to calculate the evolution of the self-consistent temperature profile. This method, which exhibits resiliency to the intrinsic fluctuations arising in turbulence simulations, holds potential for integrating nonlocal gyrokinetic turbulence simulations into predictive, whole-device models.

  15. Interchip link system using an optical wiring method.

    PubMed

    Cho, In-Kui; Ryu, Jin-Hwa; Jeong, Myung-Yung

    2008-08-15

    A chip-scale optical link system is presented with a transmitter/receiver and optical wire link. The interchip link system consists of a metal optical bench, a printed circuit board module, a driver/receiver integrated circuit, a vertical cavity surface-emitting laser/photodiode array, and an optical wire link composed of plastic optical fibers (POFs). As optical wiring technologies for achieving high-density optical interchip interconnection within such devices, we have developed a downsized POF and an optical wiring method that allows on-site installation with a simple annealing step. Successful data transfer measurements are presented.

  16. 2.3 µm range InP-based type-II quantum well Fabry-Perot lasers heterogeneously integrated on a silicon photonic integrated circuit.

    PubMed

    Wang, Ruijun; Sprengel, Stephan; Boehm, Gerhard; Muneeb, Muhammad; Baets, Roel; Amann, Markus-Christian; Roelkens, Gunther

    2016-09-05

    Heterogeneously integrated InP-based type-II quantum well Fabry-Perot lasers on a silicon waveguide circuit emitting in the 2.3 µm wavelength range are demonstrated. The devices consist of a "W"-shaped InGaAs/GaAsSb multi-quantum-well gain section, III-V/silicon spot size converters and two silicon Bragg grating reflectors to form the laser cavity. In continuous-wave (CW) operation, we obtain a threshold current density of 2.7 kA/cm2 and output power of 1.3 mW at 5 °C for 2.35 μm lasers. The lasers emit over 3.7 mW of peak power with a threshold current density of 1.6 kA/cm2 in pulsed regime at room temperature. This demonstration of heterogeneously integrated lasers indicates that the material system and heterogeneous integration method are promising to realize fully integrated III-V/silicon photonics spectroscopic sensors in the 2 µm wavelength range.

  17. The Virtual Hospital: an IAIMS integrating continuing education into the work flow.

    PubMed

    D'Alessandro, M P; Galvin, J R; Erkonen, W E; Curry, D S; Flanagan, J R; D'Alessandro, D M; Lacey, D L; Wagner, J R

    1996-01-01

    Researchers at the University of Iowa are developing an integrated academic information management system (IAIMS) for use on the World Wide Web. The focus is on integrating continuing medical education (CME) into clinicians' daily work and incorporating consumer health information into patients' lifestyles. Phase I of the project consists of loosely integrating patient data, printed library information, and digital library information. Phase II consists of more tightly integrating the three types of information, and Phase III consists of awarding CME credits for reviewing educational material at the point of patient care, when it has the most potential for improving outcomes. This IAIMS serves a statewide population. Its design and evolution have been heavily influenced by user-centered evaluation.

  18. A Least-Squares Commutator in the Iterative Subspace Method for Accelerating Self-Consistent Field Convergence.

    PubMed

    Li, Haichen; Yaron, David J

    2016-11-08

    A least-squares commutator in the iterative subspace (LCIIS) approach is explored for accelerating self-consistent field (SCF) calculations. LCIIS is similar to direct inversion of the iterative subspace (DIIS) methods in that the next iterate of the density matrix is obtained as a linear combination of past iterates. However, whereas DIIS methods find the linear combination by minimizing a sum of error vectors, LCIIS minimizes the Frobenius norm of the commutator between the density matrix and the Fock matrix. This minimization leads to a quartic problem that can be solved iteratively through a constrained Newton's method. The relationship between LCIIS and DIIS is discussed. Numerical experiments suggest that LCIIS leads to faster convergence than other SCF convergence accelerating methods in a statistically significant sense, and in a number of cases LCIIS leads to stable SCF solutions that are not found by other methods. The computational cost involved in solving the quartic minimization problem is small compared to the typical cost of SCF iterations and the approach is easily integrated into existing codes. LCIIS can therefore serve as a powerful addition to SCF convergence accelerating methods in computational quantum chemistry packages.
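    The quantity LCIIS minimizes is easy to state concretely: the Frobenius norm of the commutator [F, D] = FD - DF between the Fock matrix F and the density matrix D, which vanishes at SCF convergence. A minimal sketch in an assumed orthonormal basis (the matrices below are toy values, and real implementations include the overlap matrix):

```python
# Sketch of the LCIIS error measure: ||[F, D]||_F = ||FD - DF||_F.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commutator_norm(F, D):
    FD, DF = matmul(F, D), matmul(D, F)
    return sum((FD[i][j] - DF[i][j]) ** 2
               for i in range(len(F)) for j in range(len(F))) ** 0.5

F = [[1.0, 0.2], [0.2, 2.0]]
D_bad = [[1.0, 0.0], [0.0, 0.0]]      # not built from eigenvectors of F
print(commutator_norm(F, D_bad) > 0)  # nonzero: SCF not yet converged
I = [[1.0, 0.0], [0.0, 1.0]]
print(commutator_norm(F, I))          # 0.0: commuting matrices
```

    DIIS-family methods form the next density matrix as the linear combination of past iterates that makes this residual (or a related error vector) as small as possible.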

  19. Evaluation for the ecological quality status of coastal waters in East China Sea using fuzzy integrated assessment method.

    PubMed

    Wu, H Y; Chen, K L; Chen, Z H; Chen, Q H; Qiu, Y P; Wu, J C; Zhang, J F

    2012-03-01

    This research presents an evaluation of the ecological quality status (EcoQS) of three semi-enclosed coastal areas using a fuzzy integrated assessment method (FIAM). With this method, the hierarchy structure was clarified by an index system of 11 indicators selected from biotic and physicochemical elements, and the weight vector of the index system was calculated with a Delphi-Analytic Hierarchy Process (AHP) procedure. The FIAM was then used to produce an EcoQS assessment. Most of the sampling stations demonstrated a clear gradient in EcoQS, ranging from high to poor status. Among the four statuses, high and good were dominant, accounting for 55.9% and 26.5% of stations, respectively, especially in Sansha Bay and Luoyuan Bay. The assessment results were consistent with the pressure information and parameters obtained at most stations. In addition, the sources of uncertainty in the classification of EcoQS are also discussed.
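    The aggregation step of fuzzy synthetic evaluation can be sketched in a few lines. This is a generic illustration, not the paper's exact index system: a weight vector w (e.g. from Delphi-AHP) multiplies a membership matrix R whose rows give each indicator's membership in the status classes, and the class with maximum aggregated membership is taken as the EcoQS:

```python
# Sketch of weighted fuzzy synthetic evaluation with invented numbers.

CLASSES = ["high", "good", "moderate", "poor"]

def fuzzy_evaluate(w, R):
    """B = w * R; return the max-membership class and the full vector B."""
    b = [sum(wi * row[j] for wi, row in zip(w, R)) for j in range(len(CLASSES))]
    return CLASSES[b.index(max(b))], b

# Three hypothetical indicators; each row sums to 1 over the four classes.
w = [0.5, 0.3, 0.2]
R = [
    [0.6, 0.3, 0.1, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.4, 0.3],
]
status, b = fuzzy_evaluate(w, R)
print(status)   # "high" dominates for these memberships
```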

  20. Characterization of airborne transducers by optical tomography

    PubMed

    Bou Matar O; Pizarro; Certon; Remenieras; Patat

    2000-03-01

    This paper describes the application of an acousto-optic method to the measurement of airborne ultrasound. The method consists of heterodyne interferometric probing of the pressure emitted by the transducer, combined with a tomographic algorithm. The heterodyne interferometer measures the optical phase shift of the probe laser beam, which is proportional to the acoustic pressure integrated along the light path. A number of projections of the sound field, i.e. sets of ray integrals obtained along parallel paths, are made by moving the transducer under test. The main advantage of the method is its very high sensitivity in air (2 x 10(-4) Pa Hz-1/2), combined with a large bandwidth. Using the same principle as X-ray tomography, the ultrasonic pressure in a plane perpendicular to the transducer axis can be reconstructed. Several ultrasonic fields emitted by wide-band home-made electrostatic transducers, with operating frequencies between 200 and 700 kHz, have been measured. The sensitivities compared favorably with those of commercial airborne transducers.

  1. Tire Force Estimation using a Proportional Integral Observer

    NASA Astrophysics Data System (ADS)

    Farhat, Ahmad; Koenig, Damien; Hernandez-Alcantara, Diana; Morales-Menendez, Ruben

    2017-01-01

    This paper addresses a method for detecting critical stability situations in the lateral vehicle dynamics by estimating the nonlinear part of the tire forces. These forces indicate the road-holding performance of the vehicle. The estimation method is based on a robust fault detection and estimation approach which minimizes the sensitivity of the residual to disturbances and uncertainties. It consists of the design of a Proportional Integral Observer (PIO), minimizing the well-known H ∞ norm for worst-case uncertainty and disturbance attenuation while meeting a transient-response specification. This multi-objective problem is formulated as a Linear Matrix Inequality (LMI) feasibility problem in which a cost function subject to LMI constraints is minimized. The approach is employed to generate a set of switched robust observers for uncertain switched systems, where the convergence of the observer is ensured using a Multiple Lyapunov Function (MLF). Since the forces to be estimated cannot be physically measured, a simulation scenario with CarSim is presented to illustrate the developed method.
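    The core idea of a PIO, stripped of the LMI machinery, is that the integral state lets the observer estimate an unknown input. A simplified scalar sketch (gains and system are illustrative, not the LMI-optimized design from the paper; the constant d stands in for the unmeasured force contribution):

```python
# Scalar proportional-integral observer: estimate an unknown constant input d.

def run_pio(a=0.9, d=2.0, Lp=0.5, Li=0.2, steps=300):
    x, xhat, dhat = 0.0, 0.0, 0.0
    for _ in range(steps):
        y = x                              # measured state
        e = y - xhat                       # output residual
        x = a * x + d                      # plant driven by unknown input d
        xhat = a * xhat + dhat + Lp * e    # proportional correction
        dhat = dhat + Li * e               # integral action estimates d
    return xhat, dhat

xhat, dhat = run_pio()
print(round(dhat, 3))   # converges to the true d = 2.0
```

    The error dynamics here have spectral radius sqrt(0.6) < 1, so the estimate of d converges geometrically; the paper's LMI design picks gains with such guarantees under uncertainty.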

  2. Time-resolved flowmetering of gas-liquid two-phase pipe flow by ultrasound pulse Doppler method

    NASA Astrophysics Data System (ADS)

    Murai, Yuichi; Tasaka, Yuji; Takeda, Yasushi

    2012-03-01

    The ultrasound pulse Doppler method is applied to componential volumetric flow rate measurement in multiphase pipe flow consisting of gas and liquid phases. The flowmetering is realized by integrating the measured velocity profile over the liquid-phase portion of the pipe cross section. The spatio-temporal position of the interface is also detected with the same ultrasound pulse, which further gives the cross-sectional void fraction. A series of experimental demonstrations is shown by applying this measurement principle to air-water two-phase flow in a horizontal tube 40 mm in diameter, with void fraction ranging from 0 to 90% at superficial velocities from 0 to 15 m/s. The measurement accuracy is verified against a volumetric-type flowmeter. We also analyze the accuracy of area integration of the liquid velocity distribution for many different patterns of ultrasound measurement lines assigned on the cross section of the tube. The present method also acts as a sensor of flow-rate pulsation, which fluctuates with complex gas-liquid interface behavior.
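    The integration step can be illustrated for the simplest case: a single-phase axisymmetric profile, where Q = ∫ v(r) 2πr dr over the pipe radius. This sketch (with an assumed Poiseuille profile, not the measured Doppler data) checks the numerical integral against the exact answer Q = π R² v_max / 2:

```python
import math

# Volumetric flow rate from an axisymmetric velocity profile by trapezoidal rule.

def flow_rate(v, R, n=1000):
    dr = R / n
    rs = [i * dr for i in range(n + 1)]
    f = [v(r) * 2 * math.pi * r for r in rs]   # integrand v(r) * 2*pi*r
    return dr * (f[0] / 2 + sum(f[1:-1]) + f[-1] / 2)

R, v_max = 0.020, 1.5                          # 40 mm pipe; metres and m/s
v = lambda r: v_max * (1 - (r / R) ** 2)       # assumed Poiseuille profile
Q = flow_rate(v, R)
exact = math.pi * R**2 * v_max / 2
print(abs(Q - exact) / exact < 1e-5)           # numerical result matches
```

    In the actual method the profile comes from the Doppler measurement and the integration domain is clipped to the instantaneous liquid region below the detected interface.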

  3. One Solution of the Forward Problem of DC Resistivity Well Logging by the Method of Volume Integral Equations with Allowance for Induced Polarization

    NASA Astrophysics Data System (ADS)

    Kevorkyants, S. S.

    2018-03-01

    To study theoretically how strongly rock polarization influences the results of direct-current (DC) well logging, a solution is suggested for the inner forward problem of DC electric logging in a polarizable plane-layered medium containing a heterogeneity, taking a three-layer model of the host medium as an example. Initially, the solution is presented in the form of a traditional vector volume integral equation of the second kind (IE2) for the electric current density vector. The vector IE2 is solved by the modified iteration-dissipation method. By transformations, the initial IE2 is reduced to an equation with a contraction integral operator for an axisymmetric model of electrical well logging of the three-layer polarizable medium intersected by an infinitely long circular cylinder. The latter simulates the borehole with a zone of penetration, where the sought vector consists of the radial J_r and axial J_z components (relative to the cylinder's axis). Decomposition of the obtained vector IE2 into scalar components and discretization in the coordinates r and z lead to an inhomogeneous system of linear algebraic equations whose block coefficient matrix consists of 2 x 2 matrices, the elements of which are triple integrals of mixed second-order derivatives of the Green's function with respect to the parameters r, z, r', and z'. With the use of analytical transformations and standard integrals, the integrals over the areas of the partition cells and the azimuthal coordinate are reduced to single integrals (with respect to the variable t = cos ϕ on the interval [-1, 1]) calculated by Gaussian quadrature. For estimating the effective polarization coefficient of the complex medium, it is suggested to use the Siegel-Komarov formula.

  4. An Integration of Geophysical Methods to Explore Buried Structures on the Bench and in the Field

    NASA Astrophysics Data System (ADS)

    Booterbaugh, A. P.; Lachhab, A.

    2011-12-01

    In the following study, an integration of geophysical methods and devices was implemented on the bench and in the field to accurately identify buried structures. Electrical resistivity and ground penetrating radar methods, including both a fabricated electrical resistivity apparatus and an electrical resistivity device, were used in this study. The primary goal of the study was to test the accuracy and reliability of the apparatus, which costs a fraction of the price of a commercially sold resistivity instrument. The apparatus consists of four electrodes, two multimeters, a 12-volt battery, a DC to AC inverter, and wires. Using this apparatus, an electrical current is injected into earth material through the outer electrodes and the potential voltage is measured across the inner electrodes using a multimeter. The recorded potential and the intensity of the current can then be used to calculate the apparent resistivity of a given material. In this study the Wenner array, which consists of four equally spaced electrodes, was used due to its higher accuracy and greater resolution when investigating lateral variations of resistivity at shallow depths. In addition, the apparatus was used with an electrical resistivity device and a ground penetrating radar unit to explore the buried building foundation of Gustavus Adolphus Hall located on the Susquehanna University campus, Selinsgrove, PA. The apparatus successfully produced consistent results at the bench level, revealing the location of small bricks buried under soil material. In the summer of 2010, seventeen electrical resistivity transects were conducted on the Gustavus Adolphus site and revealed remnants of the foundation. In the summer of 2011, a ground penetrating radar survey and an electrical resistivity tomography survey were conducted to further explore the site.
Together these methods identified the location of the foundation and proved that the apparatus was a reliable tool for regular use on the bench and in the field.
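    The apparent-resistivity computation for a Wenner array is a one-line formula, rho_a = 2 * pi * a * V / I, with electrode spacing a, measured potential difference V across the inner electrodes, and injected current I. A sketch with illustrative values (not the survey's measurements):

```python
import math

def wenner_resistivity(a, V, I):
    """Apparent resistivity (ohm-m) for a Wenner array with spacing a (m),
    inner-electrode voltage V (volts), and injected current I (amperes)."""
    return 2 * math.pi * a * V / I

rho = wenner_resistivity(a=0.5, V=0.12, I=0.01)
print(round(rho, 1))   # 2*pi*0.5*0.12/0.01 = 12*pi ≈ 37.7 ohm-m
```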

  5. A computerized symbolic integration technique for development of triangular and quadrilateral composite shallow-shell finite elements

    NASA Technical Reports Server (NTRS)

    Anderson, C. M.; Noor, A. K.

    1975-01-01

    Computerized symbolic integration was used in conjunction with group-theoretic techniques to obtain analytic expressions for the stiffness, geometric stiffness, consistent mass, and consistent load matrices of composite shallow shell structural elements. The elements are shear flexible and have variable curvature. A stiffness (displacement) formulation was used with the fundamental unknowns consisting of both the displacement and rotation components of the reference surface of the shell. The triangular elements have six and ten nodes; the quadrilateral elements have four and eight nodes and can have internal degrees of freedom associated with displacement modes which vanish along the edges of the element (bubble modes). The stiffness, geometric stiffness, consistent mass, and consistent load coefficients are expressed as linear combinations of integrals (over the element domain) whose integrands are products of shape functions and their derivatives. The evaluation of the elemental matrices is divided into two separate problems - determination of the coefficients in the linear combination and evaluation of the integrals. The integrals are performed symbolically by using the symbolic-and-algebraic-manipulation language MACSYMA. The efficiency of using symbolic integration in the element development is demonstrated by comparing the number of floating-point arithmetic operations required in this approach with those required by a commonly used numerical quadrature technique.
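    The advantage of symbolic over numerical quadrature here is exactness. A toy sketch of the same idea using exact rational arithmetic instead of MACSYMA (a much simpler element than the shell elements above, assumed only for illustration): with linear shape functions N1 = 1 - x, N2 = x on [0, 1], the consistent mass matrix comes out exactly as [[1/3, 1/6], [1/6, 1/3]] (up to the rho*A*L factor):

```python
from fractions import Fraction

# Polynomials as coefficient lists [c0, c1, ...] for c0 + c1*x + ...

def poly_mul(p, q):
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_integral_01(p):
    """Exact integral of sum(p[k] * x**k) over [0, 1]."""
    return sum(c / (k + 1) for k, c in enumerate(p))

N = [[Fraction(1), Fraction(-1)],   # N1 = 1 - x
     [Fraction(0), Fraction(1)]]    # N2 = x
M = [[poly_integral_01(poly_mul(Ni, Nj)) for Nj in N] for Ni in N]
print(M)   # [[1/3, 1/6], [1/6, 1/3]] as exact Fractions
```

    Because every entry is an exact rational, no quadrature error enters the element matrices; the paper applies the same principle, via MACSYMA, to far more complicated shape-function products.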

  6. Staining Methods for Normal and Regenerative Myelin in the Nervous System.

    PubMed

    Carriel, Víctor; Campos, Antonio; Alaminos, Miguel; Raimondo, Stefania; Geuna, Stefano

    2017-01-01

    Histochemical techniques enable the specific identification of myelin by light microscopy. Here we describe three histochemical methods for the staining of myelin suitable for formalin-fixed and paraffin-embedded materials. The first is the conventional luxol fast blue (LFB) method, which stains myelin in blue and Nissl bodies and mast cells in purple. The second is an LFB-based method called MCOLL, which specifically stains myelin as well as collagen fibers and cells, giving an integrated overview of the histology and myelin content of the tissue. Finally, we describe the osmium tetroxide method, which consists of osmication of previously fixed tissues. Osmication is performed prior to embedding the tissues in paraffin, giving a permanent positive reaction for myelin as well as other lipids present in the tissue.

  7. A review method for UML requirements analysis model employing system-side prototyping.

    PubMed

    Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype offers weak support for analysts to verify the consistency of specifications about internal aspects of a system, such as business logic. As a result, such inconsistencies cause substantial rework costs, because they often make it impossible for developers to realize the system from the specifications. Functional prototyping is an effective way for analysts to verify such consistency, but it requires considerable cost and more detailed specifications. In this paper, we propose a review method that lets analysts efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not implement any business logic; instead, it visualizes the results of integrating the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory, conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.

  8. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bovy, Jo, E-mail: bovy@ias.edu

    I describe the design, implementation, and usage of galpy, a python package for galactic-dynamics calculations. At its core, galpy consists of a general framework for representing galactic potentials both in python and in C (for accelerated computations); galpy functions, objects, and methods can generally take arbitrary combinations of these as arguments. Numerical orbit integration is supported with a variety of Runge-Kutta-type and symplectic integrators. For planar orbits, integration of the phase-space volume is also possible. galpy supports the calculation of action-angle coordinates and orbital frequencies for a given phase-space point for general spherical potentials, using state-of-the-art numerical approximations for axisymmetric potentials, and making use of a recent general approximation for any static potential. A number of different distribution functions (DFs) are also included in the current release; currently, these consist of two-dimensional axisymmetric and non-axisymmetric disk DFs, a three-dimensional disk DF, and a DF framework for tidal streams. I provide several examples to illustrate the use of the code. I present a simple model for the Milky Way's gravitational potential consistent with the latest observations. I also numerically calculate the Oort functions for different tracer populations of stars and compare them to a new analytical approximation. Additionally, I characterize the response of a kinematically warm disk to an elliptical m = 2 perturbation in detail. Overall, galpy consists of about 54,000 lines, including 23,000 lines of code in the module, 11,000 lines of test code, and about 20,000 lines of documentation. The test suite covers 99.6% of the code. galpy is available at http://github.com/jobovy/galpy with extensive documentation available at http://galpy.readthedocs.org/en/latest.
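
    The symplectic integration mentioned above can be illustrated with a minimal, self-contained sketch (plain numpy, not galpy's own API): a kick-drift-kick leapfrog step in a point-mass (Kepler) potential with G·M = 1, whose hallmark is long-term energy conservation.

    ```python
    # Minimal sketch of symplectic (leapfrog) orbit integration -- NOT galpy's
    # API, just an illustration of the integrator family it supports.
    import numpy as np

    def accel(x):
        r = np.linalg.norm(x)
        return -x / r**3              # Kepler acceleration, G*M = 1

    def leapfrog(x, v, dt, nsteps):
        """Kick-drift-kick leapfrog; energy error stays bounded at O(dt^2)."""
        for _ in range(nsteps):
            v = v + 0.5 * dt * accel(x)   # half kick
            x = x + dt * v                # drift
            v = v + 0.5 * dt * accel(x)   # half kick
        return x, v

    def energy(x, v):
        return 0.5 * np.dot(v, v) - 1.0 / np.linalg.norm(x)

    x0, v0 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # circular orbit
    x1, v1 = leapfrog(x0, v0, dt=0.01, nsteps=10000)
    print(abs(energy(x1, v1) - energy(x0, v0)))  # drift stays small (< 1e-4)
    ```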

  10. Matrix method for acoustic levitation simulation.

    PubMed

    Andrade, Marco A B; Perez, Nicolas; Buiochi, Flavio; Adamowski, Julio C

    2011-08-01

    A matrix method is presented for simulating acoustic levitators. A typical acoustic levitator consists of an ultrasonic transducer and a reflector. The matrix method is used to determine the potential for acoustic radiation force that acts on a small sphere in the standing wave field produced by the levitator. The method is based on the Rayleigh integral and it takes into account the multiple reflections that occur between the transducer and the reflector. The potential for acoustic radiation force obtained by the matrix method is validated by comparing the matrix method results with those obtained by the finite element method when using an axisymmetric model of a single-axis acoustic levitator. After validation, the method is applied in the simulation of a noncontact manipulation system consisting of two 37.9-kHz Langevin-type transducers and a plane reflector. The manipulation system allows control of the horizontal position of a small levitated sphere from -6 mm to 6 mm, which is done by changing the phase difference between the two transducers. The horizontal position of the sphere predicted by the matrix method agrees with the horizontal positions measured experimentally with a charge-coupled device camera. The main advantage of the matrix method is that it allows simulation of non-symmetric acoustic levitators without requiring much computational effort.
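
    The reflection series at the heart of such a matrix method can be sketched as follows (a hypothetical toy geometry, not the authors' implementation): propagation matrices built from the Rayleigh-integral kernel e^{ikr}/(2πr) carry the source distribution back and forth between transducer and reflector, and the field at an observation point is the direct term plus successive bounces.

    ```python
    # Hedged sketch of the matrix method's reflection series; geometry and
    # source model are illustrative assumptions, not the paper's levitator.
    import numpy as np

    f, c = 37.9e3, 343.0                  # drive frequency (Hz), sound speed (m/s)
    k = 2.0 * np.pi * f / c               # wavenumber

    def prop_matrix(src, dst, dA):
        """Rayleigh-integral propagation between element centroids."""
        r = np.linalg.norm(dst[:, None, :] - src[None, :, :], axis=2)
        return dA * np.exp(1j * k * r) / (2.0 * np.pi * r)

    # toy geometry: 10x10 grids of 1 mm^2 elements,
    # transducer face at z = 0, reflector at z = 20 mm
    g = np.linspace(-4.5e-3, 4.5e-3, 10)
    X, Y = np.meshgrid(g, g)
    tx = np.column_stack([X.ravel(), Y.ravel(), np.zeros(X.size)])
    rf = np.column_stack([X.ravel(), Y.ravel(), np.full(X.size, 20e-3)])
    dA = 1e-3 ** 2

    T = prop_matrix(tx, rf, dA)           # transducer -> reflector
    R = prop_matrix(rf, tx, dA)           # reflector  -> transducer

    v = np.ones(tx.shape[0], dtype=complex)   # uniform source distribution
    obs = np.array([[0.0, 0.0, 10e-3]])       # observation point between surfaces

    # pressure = direct term + contributions from successive reflections
    terms, s = [], v
    for bounce in range(4):
        surface = tx if bounce % 2 == 0 else rf
        terms.append(prop_matrix(surface, obs, dA) @ s)
        s = (T if bounce % 2 == 0 else R) @ s
    p = sum(terms)                        # higher-order bounces shrink rapidly
    ```

    Truncating the series once the bounce terms become negligible is what makes the approach cheap compared with a full field solve.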

  11. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y. B.; Zhu, X. W., E-mail: xiaowuzhu1026@znufe.edu.cn; Dai, H. H.

    Though widely used in modelling nano- and micro-structures, Eringen's differential model shows some inconsistencies, and a recent study has demonstrated its differences from the integral model, which implies the necessity of using the latter model. In this paper, an analytical study is undertaken to analyze the static bending of nonlocal Euler-Bernoulli beams using Eringen's two-phase local/nonlocal model. Firstly, a reduction method is proved rigorously, with which the integral equation in consideration can be reduced to a differential equation with mixed boundary value conditions. Then, the static bending problem is formulated and four types of boundary conditions with various loadings are considered. By solving the corresponding differential equations, exact solutions are obtained explicitly in all of the cases, especially for the paradoxical cantilever beam problem. Finally, asymptotic analysis of the exact solutions reveals clearly that, unlike the differential model, the integral model adopted herein has a consistent softening effect. Comparisons are also made with existing analytical and numerical results, which further show the advantages of the analytical results obtained. Additionally, the once controversial nonlocal bar problem in the literature seems to be well resolved by the reduction method.

  13. Study on the Algorithm of Judgment Matrix in Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Lu, Zhiyong; Qin, Futong; Jin, Yican

    2017-10-01

    A new algorithm is proposed for handling non-consistent judgment matrices in the Analytic Hierarchy Process (AHP). First, a primary judgment matrix is generated by pre-ordering the targeted factor set, and a compared matrix is built through the top integral function. A relative error matrix is then created by comparing the compared matrix with the primary judgment matrix, which is regulated step by step under the control of the relative error matrix and the dissimilarity degree of the matrix. Finally, the targeted judgment matrix is generated so as to satisfy the consistency requirement with the least dissimilarity degree. The feasibility and validity of the proposed method are verified by simulation results.
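
    For context, the standard consistency test that a corrected judgment matrix must pass can be sketched in a few lines (this is the generic Saaty consistency-ratio check, not the paper's top-integral algorithm):

    ```python
    # Generic AHP consistency check: CR = CI / RI with
    # CI = (lambda_max - n) / (n - 1); RI values are Saaty's random indices.
    import numpy as np

    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

    def consistency_ratio(A):
        """CR < 0.1 is conventionally taken as acceptable consistency."""
        n = A.shape[0]
        lam_max = np.max(np.linalg.eigvals(A).real)
        return ((lam_max - n) / (n - 1)) / RI[n]

    # A perfectly consistent judgment matrix (a_ij = w_i / w_j) has CR = 0
    w = np.array([1.0, 2.0, 4.0])
    A = np.outer(w, 1.0 / w)

    # Perturbing one reciprocal pair of judgments raises CR above zero
    B = A.copy()
    B[0, 2], B[2, 0] = 9.0, 1.0 / 9.0
    ```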

  14. Cause and Effect: Testing a Mechanism and Method for the Cognitive Integration of Basic Science.

    PubMed

    Kulasegaram, Kulamakan; Manzone, Julian C; Ku, Cheryl; Skye, Aimee; Wadey, Veronica; Woods, Nicole N

    2015-11-01

    Methods of integrating basic science with clinical knowledge are still debated in medical training. One possibility is increasing the spatial and temporal proximity of clinical content to basic science. An alternative model argues that teaching must purposefully expose relationships between the domains. The authors compared different methods of integrating basic science: causal explanations linking basic science to clinical features, presenting both domains separately but in proximity, and simply presenting clinical features. First-year undergraduate health professions students were randomized to four conditions: (1) science-causal explanations (SC), (2) basic science before clinical concepts (BC), (3) clinical concepts before basic science (CB), and (4) clinical features list only (FL). Based on their assigned conditions, participants were given explanations for four disorders in neurology or rheumatology, followed by a memory quiz and a diagnostic test consisting of 12 cases, which were repeated after one week. Ninety-four participants completed the study. No difference was found in memory test performance, but on the diagnostic test, a condition-by-time interaction was found (F[3,88] = 3.05, P < .03, ηp² = 0.10). Although all groups had similar immediate performance, the SC group had a minimal decrease in performance on delayed testing; the CB and FL groups had the greatest decreases. These results suggest that creating proximity between basic science and clinical concepts may not guarantee cognitive integration. Although cause-and-effect explanations may not be possible for all domains, making explicit and specific connections between domains will likely facilitate the benefits of integration for learners.

  15. A single EBV-based vector for stable episomal maintenance and expression of GFP in human embryonic stem cells.

    PubMed

    Thyagarajan, Bhaskar; Scheyhing, Kelly; Xue, Haipeng; Fontes, Andrew; Chesnut, Jon; Rao, Mahendra; Lakshmipathy, Uma

    2009-03-01

    Stable expression of transgenes in stem cells has been a challenge due to the nonavailability of efficient transfection methods and the inability of transgenes to support sustained gene expression. Several methods have been reported to stably modify both embryonic and adult stem cells. These methods rely on integration of the transgene into the genome of the host cell, which could result in an expression pattern dependent on the number of integrations and the genomic locus of integration. To overcome this issue, site-specific integration methods mediated by integrase, adeno-associated virus or via homologous recombination have been used to generate stable human embryonic stem cell (hESC) lines. In this study, we describe a vector that is maintained episomally in hESCs. The vector used in this study is based on components derived from the Epstein-Barr virus, containing the Epstein-Barr virus nuclear antigen 1 expression cassette and the OriP origin of replication. The vector also expresses the drug-resistance marker gene hygromycin, which allows for selection and long-term maintenance of cells harboring the plasmid. Using this vector system, we show sustained expression of green fluorescent protein in undifferentiated hESCs and their differentiating embryoid bodies. In addition, the stable hESC clones show comparable expression with and without drug selection. Consistent with this observation, bulk-transfected adipose tissue-derived mesenchymal stem cells showed persistent marker gene expression as they differentiate into adipocytes, osteoblasts and chondroblasts. Episomal vectors offer a fast and efficient method to create hESC reporter lines, which in turn allows one to test the effect of overexpression of various genes on stem cell growth, proliferation and differentiation.

  16. Psychosocial environment for the integrated education opportunities of the disabled in Lithuania

    PubMed Central

    Samsoniene, Laimute; Juozulynas, Algirdas; Surkiene, Gene; Jankauskiene, Konstancija; Lukšiene, Aloyza

    2006-01-01

    Background The diminution of the social isolation of the disabled is the main objective of the EU's new policy strategy concerning the disabled, and Lithuanian society faces this objective as well. For this reason, this study, which aims to provide a theoretical basis for and to predict the possible psychosocial environment in an integrated education system, and to evaluate the reasons for the formation of a positive attitude toward the disabled, is especially relevant, since it creates the prerequisites for optimising the integration of disabled schoolchildren into the general system of education. Method The sample of the study consisted of 2471 children from the same schools: not integrated (1958), integrated (126), and special schools (382). Empirical methods: questionnaire poll, comparative analysis. The statistical analysis was carried out using SAS. Results Our study showed that the majority of schoolchildren without disabilities and disabled schoolchildren have positive intentions for interpersonal interactions (>82%) and positive emotions (>69%), independently of the discrepant character of interpersonal contacts and the different conditions of education and family life, and despite a low level of knowledge. Conclusion The results of the study confirmed positive intentions for interpersonal interaction between disabled schoolchildren and schoolchildren without disabilities, as well as a positive character of emotions, and disprove the unfounded myth of the opponents of the social integration of the disabled that disabled children in comprehensive schools would inevitably experience offence from their peers without disabilities. PMID:17173706

  17. Precise and Fast Computation of the Gravitational Field of a General Finite Body and Its Application to the Gravitational Study of Asteroid Eros

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fukushima, Toshio, E-mail: Toshio.Fukushima@nao.ac.jp

    In order to obtain the gravitational field of a general finite body inside its Brillouin sphere, we developed a new method to compute the field accurately. First, the body is assumed to consist of some layers in a certain spherical polar coordinate system and the volume mass density of each layer is expanded as a Maclaurin series of the radial coordinate. Second, the line integral with respect to the radial coordinate is analytically evaluated in a closed form. Third, the resulting surface integrals are numerically integrated by the split quadrature method using the double exponential rule. Finally, the associated gravitational acceleration vector is obtained by numerically differentiating the numerically integrated potential. Numerical experiments confirmed that the new method is capable of computing the gravitational field independently of the location of the evaluation point, namely whether inside, on the surface of, or outside the body. It can also provide sufficiently precise field values, say of 14–15 digits for the potential and of 9–10 digits for the acceleration. Furthermore, its computational efficiency is better than that of the polyhedron approximation. This is because the computational error of the new method decreases much faster than that of the polyhedron models when the number of required transcendental function calls increases. As an application, we obtained the gravitational field of 433 Eros from its shape model expressed as the 24 × 24 spherical harmonic expansion by assuming homogeneity of the object.
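
    The double exponential (tanh-sinh) rule used for the surface integrals can be sketched in one dimension (a generic illustration, not the authors' code): the substitution x = tanh((π/2) sinh t) maps (-1, 1) to the whole t-axis and makes the plain trapezoidal rule in t converge extremely fast.

    ```python
    # Generic sketch of double exponential (tanh-sinh) quadrature over (-1, 1).
    import numpy as np

    def de_quad(f, h=0.05, tmax=3.5):
        """Approximate the integral of f over (-1, 1) by the tanh-sinh rule."""
        t = np.arange(-tmax, tmax + h, h)
        x = np.tanh(0.5 * np.pi * np.sinh(t))
        # transformed trapezoidal weights: dx/dt evaluated on the grid
        w = h * 0.5 * np.pi * np.cosh(t) / np.cosh(0.5 * np.pi * np.sinh(t)) ** 2
        return np.sum(w * f(x))

    val = de_quad(np.exp)                 # integral of e^x over (-1, 1)
    print(abs(val - (np.e - 1.0 / np.e)))   # error is near machine precision
    ```

    The double-exponential decay of the transformed integrand is also what tames endpoint singularities, which is why the rule suits surface integrals over irregular bodies.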

  18. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation

    PubMed Central

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541

  19. Photon-number statistics of twin beams: Self-consistent measurement, reconstruction, and properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peřina, Jan Jr.; Haderka, Ondřej; Michálek, Václav

    2014-12-04

    A method for the determination of the photon-number statistics of twin beams using the joint signal-idler photocount statistics obtained by an iCCD camera is described. It also provides the absolute quantum detection efficiency of the camera. Using the measured photocount statistics, quasi-distributions of integrated intensities are obtained. They attain negative values occurring in characteristic strips as a consequence of the pairing of photons in twin beams.

  20. Integrated mass transportation system study/definition/implementation program definition

    NASA Technical Reports Server (NTRS)

    Ransone, R. K.; Deptula, D. A.; Yorke, G. G.

    1975-01-01

    Specific actions needed to plan and effect transportation system improvements are identified within the constraints of limited financial, energy and land use resources, and diverse community requirements. A specific program is described which would develop the necessary generalized methodology for devising improved transportation systems and evaluate them against specific criteria for intermodal and intramodal optimization. A consistent, generalized method is provided for study and evaluation of transportation system improvements.

  1. Obstacle Avoidance for Quadcopter using Ultrasonic Sensor

    NASA Astrophysics Data System (ADS)

    Fazlur Rahman, Muhammad; Adhy Sasongko, Rianto

    2018-04-01

    An obstacle avoidance system is proposed. The system combines an available flight controller with a proposed avoidance method as a proof of concept. A quadcopter UAV is integrated with the system, which consists of several modes for performing avoidance. As in the previous study, obstacles are detected using an ultrasonic sensor and a servo. As a result, the quadcopter moves according to its mode and successfully avoids obstacles.
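
    The mode-based avoidance logic described above can be sketched as a simple state machine (an illustrative reconstruction; the mode names and threshold are hypothetical, not taken from the paper):

    ```python
    # Hypothetical sketch of mode-based obstacle avoidance driven by an
    # ultrasonic range reading; names and threshold are illustrative only.
    SAFE_DISTANCE_CM = 100

    def next_mode(current_mode, range_cm):
        """Return the avoidance mode given the latest ultrasonic reading."""
        if current_mode == "CRUISE":
            return "BRAKE" if range_cm < SAFE_DISTANCE_CM else "CRUISE"
        if current_mode == "BRAKE":
            return "SIDESTEP"             # stop, then translate sideways
        if current_mode == "SIDESTEP":
            # resume course once the sensor no longer sees the obstacle
            return "CRUISE" if range_cm >= SAFE_DISTANCE_CM else "SIDESTEP"
        return current_mode

    mode = "CRUISE"
    for reading in [250, 180, 80, 60, 60, 140, 300]:   # simulated ranges (cm)
        mode = next_mode(mode, reading)
    print(mode)  # -> CRUISE
    ```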

  2. CONVEYING AN EMPATHIC UNDERSTANDING OF THE CIVILIZATION OF THE INDIAN-PAKISTANI SUBCONTINENT THROUGH THE USE OF AN INTEGRATED SERIES OF SELECT FILMS. FINAL REPORT.

    ERIC Educational Resources Information Center

    LEVISON, MELVIN E.

    THIS PROJECT TESTED A METHOD FOR DEVELOPING "AUDIO-VISUAL LITERACY" AND, AT THE SAME TIME, AN EMPATHIC UNDERSTANDING OF ANOTHER CIVILIZATION THROUGH THE USE OF A SERIES OF SELECT FILMS. THE POPULATION CONSISTED OF 28 TEACHERS IN AN IN-SERVICE COURSE AND CLASSES LATER TAUGHT BY IN-SERVICE TRAINED TEACHERS IN FIVE SECONDARY SCHOOLS--THREE…

  3. Generalized emission functions for photon emission from quark-gluon plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suryanarayana, S. V.

    The Landau-Pomeranchuk-Migdal effects on photon emission from the quark-gluon plasma have been studied as a function of photon mass at a fixed plasma temperature. The integral equations for the transverse vector function f̃(p̃⊥) and the longitudinal function g̃(p̃⊥), which incorporate multiple scattering effects, are solved by the self-consistent iteration method and also by the variational method for the variable set {p0, q0, Q²}. We considered the bremsstrahlung and the off-shell annihilation (aws) processes. We define two new dynamical scaling variables, x_T and x_L, for the bremsstrahlung and aws processes, which are functions of the variables p0, q0, Q². We define four new emission functions for massive photon emission, denoted g_T^b, g_T^a, g_L^b, and g_L^a, and construct them using the exact numerical solutions of the integral equations. These four emission functions have been parametrized by suitable simple empirical fits. Using the empirical emission functions, we calculated the imaginary part of the photon polarization tensor as a function of photon mass and energy.

  4. Techniques for control of long-term reliability of complex integrated circuits. I - Reliability assurance by test vehicle qualification.

    NASA Technical Reports Server (NTRS)

    Van Vonno, N. W.

    1972-01-01

    Development of an alternate approach to the conventional methods of reliability assurance for large-scale integrated circuits. The product treated is a large-scale T²L array designed for space applications. The concept used is that of qualification of product by evaluation of the basic processing used in fabricating the product, providing an insight into its potential reliability. Test vehicles are described which enable evaluation of device characteristics, surface condition, and various parameters of the two-level metallization system used. Evaluation of these test vehicles is performed on a lot qualification basis, with a lot consisting of one wafer. Assembled test vehicles are evaluated by high-temperature stress at 300°C for short durations. Stressing at these temperatures provides a rapid method of evaluation and permits a go/no-go decision to be made on the wafer lot in a timely fashion.

  5. Real Time Monitoring System of Pollution Waste on Musi River Using Support Vector Machine (SVM) Method

    NASA Astrophysics Data System (ADS)

    Fachrurrozi, Muhammad; Saparudin; Erwin

    2017-04-01

    A real-time monitoring and early detection system that measures the waste quality standard of the Musi River, Palembang, Indonesia, is presented for determining air and water pollution levels. The system was designed to create an integrated monitoring system and provide readable real-time information. It is designed to measure acidity and water turbidity caused by industrial waste, as well as to show and provide conditional data integrated in one system. The system consists of inputting the data, processing the data, and giving output based on the processed data. Turbidity, substance, and pH sensors are used as detectors that produce an analog direct-current (DC) voltage. The early detection system works by determining threshold values for ammonia, acidity, and turbidity of the water in the Musi River. The results are then classified into pollution-level groups by the support vector machine (SVM) classification method.
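
    The SVM classification step can be sketched with a minimal linear soft-margin SVM trained by the Pegasos sub-gradient method (a generic illustration on synthetic data, not the authors' implementation or their actual sensor features):

    ```python
    # Minimal linear soft-margin SVM (Pegasos-style sub-gradient training) on
    # synthetic "water quality" feature vectors; labels +1 = polluted, -1 = clean.
    # This is a generic sketch, not the paper's classifier or data.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2.0, 0.5, (50, 2)),     # polluted samples
                   rng.normal(-2.0, 0.5, (50, 2))])   # clean samples
    y = np.hstack([np.ones(50), -np.ones(50)])

    def pegasos(X, y, lam=0.01, epochs=200):
        w, b = np.zeros(X.shape[1]), 0.0
        t = 0
        for _ in range(epochs):
            for i in rng.permutation(len(y)):
                t += 1
                eta = 1.0 / (lam * t)
                if y[i] * (X[i] @ w + b) < 1:   # margin violated: hinge update
                    w = (1 - eta * lam) * w + eta * y[i] * X[i]
                    b += eta * y[i]
                else:
                    w = (1 - eta * lam) * w
        return w, b

    w, b = pegasos(X, y)
    pred = np.sign(X @ w + b)
    print((pred == y).mean())   # training accuracy (1.0 on this separable data)
    ```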

  6. Review of Integrated Noise Model (INM) Equations and Processes

    NASA Technical Reports Server (NTRS)

    Shepherd, Kevin P. (Technical Monitor); Forsyth, David W.; Gulding, John; DiPardo, Joseph

    2003-01-01

    The FAA's Integrated Noise Model (INM) relies on the methods of the SAE AIR-1845 'Procedure for the Calculation of Airplane Noise in the Vicinity of Airports' issued in 1986. Simplifying assumptions for aerodynamics and noise calculation were made in the SAE standard and the INM based on the limited computing power commonly available then. The key objectives of this study are 1) to test some of those assumptions against Boeing source data, and 2) to automate the manufacturer's methods of data development to enable the maintenance of a consistent INM database over time. These new automated tools were used to generate INM database submissions for six airplane types: 737-700 (CFM56-7 24K), 767-400ER (CF6-80C2BF), 777-300 (Trent 892), 717-200 (BR715), 757-300 (RR535E4B), and 737-800 (CFM56-7 26K).

  7. A Computer Program for the Computation of Running Gear Temperatures Using Green's Function

    NASA Technical Reports Server (NTRS)

    Koshigoe, S.; Murdock, J. W.; Akin, L. S.; Townsend, D. P.

    1996-01-01

    A new technique has been developed to study two-dimensional heat transfer problems in gears. This technique consists of transforming the heat equation into a line integral equation with the use of Green's theorem. The equation is then expressed in terms of eigenfunctions that satisfy the Helmholtz equation, and their corresponding eigenvalues, for an arbitrarily shaped region of interest. The eigenfunctions are obtained by solving an integral equation. Once the eigenfunctions are found, the temperature is expanded in terms of the eigenfunctions with unknown time-dependent coefficients that can be solved for by using Runge-Kutta methods. The time integration is extremely efficient. Therefore, any changes in the time-dependent coefficients or source terms in the boundary conditions do not impose a great computational burden on the user. The method is demonstrated by applying it to a sample gear tooth. Temperature histories at representative surface locations are given.
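
    The scheme described above, eigenfunction expansion with time-dependent coefficients advanced by Runge-Kutta, can be sketched on a toy problem (a 1-D rod with a known sine eigenbasis, not the gear geometry of the paper):

    ```python
    # Toy sketch of the abstract's scheme: expand the temperature of a 1-D rod
    # (ends held at 0) in Helmholtz eigenfunctions sin(n*pi*x/L) and integrate
    # the modal coefficients a_n(t) with classical RK4.
    import numpy as np

    alpha, L = 1.0, 1.0
    n = np.arange(1, 6)
    lam = (n * np.pi / L) ** 2          # eigenvalues of the Helmholtz problem

    def rhs(a):
        return -alpha * lam * a         # da_n/dt = -alpha * lambda_n * a_n

    def rk4_step(a, dt):
        k1 = rhs(a)
        k2 = rhs(a + 0.5 * dt * k1)
        k3 = rhs(a + 0.5 * dt * k2)
        k4 = rhs(a + dt * k3)
        return a + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

    a = np.ones_like(n, dtype=float)    # initial modal coefficients
    dt, steps = 1e-3, 100
    for _ in range(steps):
        a = rk4_step(a, dt)

    exact = np.exp(-alpha * lam * dt * steps)   # analytic modal decay
    print(np.max(np.abs(a - exact)))            # RK4 error ~ O(dt^4)
    ```

    Because only the coefficient ODEs are re-integrated, changing boundary source terms is cheap, which is the efficiency point the abstract makes.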

  8. Bayesian tomography and integrated data analysis in fusion diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei

    2016-11-15

    In this article, a Bayesian tomography method using a non-stationary Gaussian process prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data fall reasonably within the assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method in a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
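
    The Gaussian-process machinery behind the method can be sketched with a stationary squared-exponential kernel (the paper's non-stationary prior is more involved; this is a generic illustration, not the authors' code): the posterior mean gives the reconstruction and the posterior variance gives the confidence interval.

    ```python
    # Generic GP regression sketch: posterior mean and confidence band from a
    # stationary squared-exponential kernel; data and scales are illustrative.
    import numpy as np

    def sqexp(xa, xb, ell=0.3, sig=1.0):
        return sig**2 * np.exp(-0.5 * (xa[:, None] - xb[None, :])**2 / ell**2)

    # noisy "measurements" of a smooth emission profile
    xt = np.linspace(0.0, 1.0, 8)
    yt = np.sin(2 * np.pi * xt)
    noise = 1e-4

    xs = np.linspace(0.0, 1.0, 50)                   # reconstruction grid
    K = sqexp(xt, xt) + noise * np.eye(len(xt))
    Ks = sqexp(xs, xt)
    mean = Ks @ np.linalg.solve(K, yt)               # posterior mean
    cov = sqexp(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # confidence band

    # fit at the measurement points (nearly interpolates for small noise)
    mt = sqexp(xt, xt) @ np.linalg.solve(K, yt)
    ```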

  9. Quasi-one dimensional (Q1D) nanostructures: Synthesis, integration and device application

    NASA Astrophysics Data System (ADS)

    Chien, Chung-Jen

    Quasi-one-dimensional (Q1D) nanostructures such as nanotubes and nanowires have been widely regarded as potential building blocks for nanoscale electronic, optoelectronic, and sensing devices. In this work, the content can be divided into three categories: nano-material synthesis and characterization, alignment and integration, and physical properties and applications. The dissertation consists of seven chapters, as follows. Chapter 1 gives an introduction to low-dimensional nano-materials. Chapter 2 explains the mechanisms by which Q1D nanostructures grow. Chapter 3 describes the methods by which we horizontally and vertically align Q1D nanostructures. Chapters 4 and 5 cover electrical and optical device characterization, respectively. Chapter 6 demonstrates the integration of Q1D nanostructures and their device applications. The last chapter discusses future work and the conclusions of the thesis.

  10. An improved method for calculating power density in the Fresnel region of circular parabolic reflector antennas

    NASA Astrophysics Data System (ADS)

    Mize, Johnnie E.

    1988-03-01

    A computer program is presented which calculates power density in the Fresnel region of circular parabolic reflector antennas. The aperture illumination model is the one-parameter circular distribution developed by Hansen. The program is applicable to the analysis of electrically large, center-fed (or Cassegrain) paraboloids with linearly polarized feeds. The scalar Kirchhoff diffraction integral is solved numerically by Romberg integration for points both on and perpendicular to the antenna boresight. Axial results cannot be directly compared to any others obtained with this illumination model, but they are consistent with what is expected in the Fresnel region, where a quadratic term must be added to the linear phase term of the integral expression. Graphical results are presented for uniform illumination and for cases where the first sidelobe ratio is 20, 25, 30, and 35 dB.
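
    The Romberg scheme used to evaluate the diffraction integral can be sketched in a few lines (a generic real-valued illustration, not the program itself): trapezoidal estimates at successively halved step sizes are combined by Richardson extrapolation.

    ```python
    # Generic Romberg integration: refine the trapezoidal rule by halving the
    # step and extrapolate the results; converges rapidly for smooth integrands.
    import math

    def romberg(f, a, b, levels=6):
        R = [[0.5 * (b - a) * (f(a) + f(b))]]           # coarsest trapezoid
        for i in range(1, levels):
            h = (b - a) / 2**i
            # refine using only the newly introduced midpoints
            new_pts = sum(f(a + (2*j - 1) * h) for j in range(1, 2**(i - 1) + 1))
            R.append([0.5 * R[i-1][0] + h * new_pts])
            for m in range(1, i + 1):                   # Richardson extrapolation
                R[i].append(R[i][m-1] + (R[i][m-1] - R[i-1][m-1]) / (4**m - 1))
        return R[-1][-1]

    val = romberg(math.sin, 0.0, math.pi)
    print(abs(val - 2.0))   # error well below 1e-10 with only 33 samples
    ```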

  11. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. 
Conclusions Nursing champions can facilitate change in genomic nursing capacity through varied strategies but require substantial training in order to design and implement interventions. Clinical Relevance Genomics is critical to the practice of all nurses. There is a great opportunity and interest to address genomic knowledge deficits in the practicing nurse workforce as a strategy to improve patient outcomes. Exemplars of champion dyad interventions designed to increase nursing capacity focus on improving education, policy, and healthcare services. PMID:25808828

  12. Integrated geophysical investigations in a fault zone located on southwestern part of İzmir city, Western Anatolia, Turkey

    NASA Astrophysics Data System (ADS)

    Drahor, Mahmut G.; Berge, Meriç A.

    2017-01-01

    Integrated geophysical investigations consisting of the joint application of various geophysical techniques have become a major tool of active tectonic investigations. The choice of techniques depends on the geological features, tectonic and fault characteristics of the study area, the required resolution and penetration depth, and the available financial support. Fault geometry and offsets, sediment thickness and properties, features of folded strata, and the tectonic characteristics of near-surface sections of the subsurface can therefore be thoroughly determined using integrated geophysical approaches. Although Ground Penetrating Radar (GPR), Electrical Resistivity Tomography (ERT) and Seismic Refraction Tomography (SRT) are the methods most commonly used in active tectonic investigations, other geophysical techniques also contribute to characterizing the complex geological environments of tectonically active sites. In this study, six different geophysical methods were used to locate and characterize faulting in the study area: GPR, ERT, SRT, very low frequency electromagnetics (VLF), magnetics and self-potential (SP). Overall, the integrated geophysical approach yielded important results on the near-surface geological properties and faulting characteristics of the investigation area. After integrated interpretation of the geophysical surveys, we determined an optimal trench location for paleoseismological studies, and the main geological properties associated with the faulting process were obtained from the subsequent trenching. In addition, the geophysical results gave some indications concerning the active faulting mechanism in the area investigated.
    Consequently, the trenching studies indicate that the integrated geophysical approach applied to the fault problem yields very useful and interpretable results describing the various properties of the fault zone at the investigation site.

  13. Generalized fourier analyses of the advection-diffusion equation - Part II: two-dimensional domains

    NASA Astrophysics Data System (ADS)

    Voth, Thomas E.; Martinez, Mario J.; Christon, Mark A.

    2004-07-01

    Part I of this work presents a detailed multi-methods comparison of the spatial errors associated with the one-dimensional finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. In Part II we extend the analysis to two-dimensional domains and also consider the effects of wave propagation direction and grid aspect ratio on the phase speed, and the discrete and artificial diffusivities. The observed dependence of dispersive and diffusive behaviour on propagation direction makes comparison of methods more difficult relative to the one-dimensional results. For this reason, integrated (over propagation direction and wave number) error and anisotropy metrics are introduced to facilitate comparison among the various methods. With respect to these metrics, the consistent mass Galerkin and consistent mass control-volume finite element methods, and their streamline upwind derivatives, exhibit comparable accuracy, and generally out-perform their lumped mass counterparts and finite-difference based schemes. While this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common mathematical framework. Published in 2004 by John Wiley & Sons, Ltd.
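    The kind of Fourier analysis the paper performs can be illustrated on the simplest one-dimensional case: substituting a Fourier mode into the second-order central difference yields a modified wavenumber, and hence a wavenumber-dependent phase-speed error for pure advection. This is a minimal sketch of that idea, not the paper's multi-method, two-dimensional analysis:

```python
import math

# Modified-wavenumber analysis of the second-order central difference
#   du/dx ~ (u[j+1] - u[j-1]) / (2h)
# applied to a Fourier mode u_j = exp(i k x_j). Substituting the mode gives
# an effective wavenumber k* with  k* h = sin(k h),  so the discrete phase
# speed for the advection equation u_t + c u_x = 0 is  c* = c sin(kh)/(kh).

def phase_speed_ratio(kh):
    """Ratio c*/c of discrete to exact phase speed for central differencing."""
    return math.sin(kh) / kh

# Long waves (small kh) advect almost exactly; short waves lag badly,
# which is the dispersive error the integrated metrics in the paper quantify.
ratios = {kh: phase_speed_ratio(kh) for kh in (0.1, math.pi / 4, math.pi / 2)}
```

    In two dimensions the same substitution makes the error depend on propagation direction as well, which motivates the direction-integrated error metrics described in the abstract.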

  14. Fully integrated carbon nanotube composite thin film strain sensors on flexible substrates for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Burton, A. R.; Lynch, J. P.; Kurata, M.; Law, K. H.

    2017-09-01

    Multifunctional thin film materials have opened many opportunities for novel sensing strategies for structural health monitoring. While past work has established methods of optimizing multifunctional materials to exhibit sensing properties, comparatively less work has focused on their integration into fully functional sensing systems capable of being deployed in the field. This study focuses on the advancement of a scalable fabrication process for the integration of multifunctional thin films into a fully integrated sensing system. This is achieved through the development of an optimized fabrication process that can create a broad range of sensing systems using multifunctional materials. A layer-by-layer deposited multifunctional composite consisting of single walled carbon nanotubes (SWNT) in a polyvinyl alcohol and poly(sodium 4-styrenesulfonate) matrix is combined with a lithography process to produce a fully integrated sensing system deposited on a flexible substrate. To illustrate the process, a strain sensing platform consisting of a patterned SWNT-composite thin film as a strain-sensitive element within an amplified Wheatstone bridge sensing circuit is presented. Strain sensing is selected because it presents many of the design and processing challenges that are core to patterning multifunctional thin film materials into sensing systems. Strain sensors fabricated on a flexible polyimide substrate are experimentally tested under cyclic loading using standard four-point bending coupons and a partial-scale steel frame assembly under lateral loading. The study reveals the material process is highly repeatable, producing fully integrated strain sensors with linearity and sensitivity exceeding 0.99 and 5 V/ε, respectively. The thin film strain sensors are robust and capable of high strain measurements beyond 3000 με.

  15. The enhancement of heavy metal removal from polluted river water treatment by integrated carbon-aluminium electrodes using electrochemical method

    NASA Astrophysics Data System (ADS)

    Yussuf, N. M.; Embong, Z.; Abdullah, S.; Masirin, M. I. M.; Tajudin, S. A. A.; Ahmad, S.; Sahari, S. K.; Anuar, A. A.; Maxwell, O.

    2018-01-01

    The enhancement of heavy metal removal from polluted river water was investigated in laboratory-scale experiments using two types of electrodes: an integrated carbon-aluminium electrode and a conventional aluminium plate electrode. In the integrated electrode system, the aluminium electrode surface was coated with carbon using a mixed slurry containing carbon black, polyvinyl acetate and methanol. The electrochemical treatment was conducted with an applied voltage of 90 V, an electrode distance of 3 cm and an electrolysis time of 60 minutes. The surfaces of both electrodes were examined before and after electrolysis using the SEM-EDX analytical technique. Comparison of the two electrode configurations showed that more metals accumulated on the carbon-integrated electrode surfaces for both anode and cathode, and that more heavy metals were detected on the cathode. The atomic percentages of metals on the conventional cathode surface were Al (94.62%), Zn (1.19%), Mn (0.73%), Fe (2.81%) and Cu (0.64%), while the anode contained O (12.08%), Al (87.63%) and Zn (0.29%). The cathode surface of the integrated electrode accumulated more metals: O (75.40%), Al (21.06%), Zn (0.45%), Mn (0.22%), Fe (0.29%), Cu (0.84%), Pb (0.47%), Na (0.94%), Cr (0.08%), Ni (0.02%) and Ag (0.22%), while its anode contained Al (3.48%), Fe (0.49%), C (95.77%) and Pb (0.26%). These experiments show that integrated carbon-aluminium electrodes have great potential to accumulate more heavy metal species from polluted water than the conventional aluminium electrode, and that heavy metal accumulation is most significant on the cathode surface.

  16. Transition of Premature Infants From Hospital to Home Life

    PubMed Central

    Lopez, Greta L.; Anderson, Kathryn Hoehn; Feutchinger, Johanna

    2013-01-01

    Purpose To conduct an integrative review of the literature on the transition of premature infants from the neonatal intensive care unit (NICU) to home. Method A literature search was performed in the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PubMed, and MEDLINE to identify studies focusing on the transition of premature infants from hospital to home life. Results The search yielded seven articles that emphasized the need for home visits, child and family assessment methods, methods of keeping contact with health care providers, and educational and support groups, and that described the nurse’s role in the transition program. The strategy to ease the transition differed in each article. Conclusion Home visits by a nurse were a key component, providing education, support, and nursing care. A program should therefore provide parents of premature infants with home visits implemented by a nurse, or ongoing contact with a nurse (e.g., via video-conference). PMID:22763247

  17. Hydrological change: Towards a consistent approach to assess changes on both floods and droughts

    NASA Astrophysics Data System (ADS)

    Quesada-Montano, Beatriz; Di Baldassarre, Giuliano; Rangecroft, Sally; Van Loon, Anne F.

    2018-01-01

    Several studies have found that the frequency, magnitude and spatio-temporal distribution of droughts and floods have significantly increased in many regions of the world. Yet, most of the methods used to detect trends in hydrological extremes 1) focus on either floods or droughts, and/or 2) base their assessment on characteristics that, even though useful for trend identification, cannot be directly used in decision making, e.g. integrated water resources management and disaster risk reduction. In this paper, we first discuss the need for a consistent approach to assess changes in both floods and droughts, and then propose a method based on the theory of runs and threshold levels. Flood and drought changes were assessed in terms of frequency, length and surplus/deficit volumes. This paper also presents an example application using streamflow data from two hydrometric stations along the Po River basin (Italy), Piacenza and Pontelagoscuro, and then discusses opportunities and challenges of the proposed method.
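    The threshold-level (theory of runs) characterization of droughts described above can be sketched in a few lines. This is an illustrative Python sketch with made-up streamflow values and a fixed threshold, not the authors' implementation; flood surpluses are handled symmetrically by counting values above the threshold instead:

```python
def runs_below(series, threshold):
    """Identify drought runs (consecutive values below a threshold) and their
    lengths and deficit volumes, per the threshold-level / theory-of-runs idea."""
    events, start, deficit = [], None, 0.0
    for i, q in enumerate(series):
        if q < threshold:
            if start is None:
                start, deficit = i, 0.0
            deficit += threshold - q          # accumulate deficit volume
        elif start is not None:
            events.append({"start": start, "length": i - start, "deficit": deficit})
            start = None
    if start is not None:                     # run that reaches the series end
        events.append({"start": start, "length": len(series) - start, "deficit": deficit})
    return events

# Hypothetical daily streamflow (m^3/s) against a fixed threshold of 10:
flow = [12, 9, 8, 11, 13, 7, 6, 6, 12]
droughts = runs_below(flow, 10)
```

    Trends can then be assessed directly on the event frequency, length, and deficit/surplus volumes, which is what makes these characteristics usable in water management decisions.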

  18. Spectral Element Method for the Simulation of Unsteady Compressible Flows

    NASA Technical Reports Server (NTRS)

    Diosady, Laslo Tibor; Murman, Scott M.

    2013-01-01

    This work uses a discontinuous-Galerkin spectral-element method (DGSEM) to solve the compressible Navier-Stokes equations [1-3]. The inviscid flux is computed using the approximate Riemann solver of Roe [4]. The viscous fluxes are computed using the second form of Bassi and Rebay (BR2) [5] in a manner consistent with the spectral-element approximation. The method of lines with the classical 4th-order explicit Runge-Kutta scheme is used for time integration. Results for polynomial orders up to p = 15 (16th order) are presented. The code is parallelized using the Message Passing Interface (MPI). The computations presented in this work are performed using the Sandy Bridge nodes of the NASA Pleiades supercomputer at NASA Ames Research Center. Each Sandy Bridge node consists of two eight-core Intel Xeon E5-2670 processors with a clock speed of 2.6 GHz and 2 GB of memory per core. On a Sandy Bridge node the Tau Benchmark [6] runs in a time of 7.6 s.
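    The time integrator named above, the classical 4th-order explicit Runge-Kutta scheme, can be sketched for a generic method-of-lines system. This is an illustrative Python version on a scalar model problem; the actual DGSEM code advances large parallel state vectors:

```python
def rk4_step(f, t, y, dt):
    """One step of the classical fourth-order Runge-Kutta scheme for dy/dt = f(t, y),
    as used for time integration in a method-of-lines semi-discretization."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, y + 0.5 * dt * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Model problem dy/dt = -y, y(0) = 1, integrated to t = 1 with dt = 0.01;
# the exact solution is exp(-1) ~ 0.3678794.
y, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt
```

    The fourth-order accuracy of this scheme pairs naturally with the high spatial order of the spectral-element discretization.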

  19. GC-ASM: Synergistic Integration of Graph-Cut and Active Shape Model Strategies for Medical Image Segmentation

    PubMed Central

    Chen, Xinjian; Udupa, Jayaram K.; Alavi, Abass; Torigian, Drew A.

    2013-01-01

    Image segmentation methods may be classified into two categories: purely image based and model based. Each of these two classes has its own advantages and disadvantages. In this paper, we propose a novel synergistic combination of the image based graph-cut (GC) method with the model based ASM method to arrive at the GC-ASM method for medical image segmentation. A multi-object GC cost function is proposed which effectively integrates the ASM shape information into the GC framework. The proposed method consists of two phases: model building and segmentation. In the model building phase, the ASM model is built and the parameters of the GC are estimated. The segmentation phase consists of two main steps: initialization (recognition) and delineation. For initialization, an automatic method is proposed which estimates the pose (translation, orientation, and scale) of the model, and obtains a rough segmentation result which also provides the shape information for the GC method. For delineation, an iterative GC-ASM algorithm is proposed which performs finer delineation based on the initialization results. The proposed methods are implemented to operate on 2D images and evaluated on clinical chest CT, abdominal CT, and foot MRI data sets. The results show the following: (a) An overall delineation accuracy of TPVF > 96%, FPVF < 0.6% can be achieved via GC-ASM for different objects, modalities, and body regions. (b) GC-ASM improves over ASM in its accuracy and precision to search region. (c) GC-ASM requires far fewer landmarks (about 1/3 of ASM) than ASM. (d) GC-ASM achieves full automation in the segmentation step compared to GC which requires seed specification and improves on the accuracy of GC. (e) One disadvantage of GC-ASM is its increased computational expense owing to the iterative nature of the algorithm. PMID:23585712

  20. GC-ASM: Synergistic Integration of Graph-Cut and Active Shape Model Strategies for Medical Image Segmentation.

    PubMed

    Chen, Xinjian; Udupa, Jayaram K; Alavi, Abass; Torigian, Drew A

    2013-05-01

    Image segmentation methods may be classified into two categories: purely image based and model based. Each of these two classes has its own advantages and disadvantages. In this paper, we propose a novel synergistic combination of the image based graph-cut (GC) method with the model based ASM method to arrive at the GC-ASM method for medical image segmentation. A multi-object GC cost function is proposed which effectively integrates the ASM shape information into the GC framework. The proposed method consists of two phases: model building and segmentation. In the model building phase, the ASM model is built and the parameters of the GC are estimated. The segmentation phase consists of two main steps: initialization (recognition) and delineation. For initialization, an automatic method is proposed which estimates the pose (translation, orientation, and scale) of the model, and obtains a rough segmentation result which also provides the shape information for the GC method. For delineation, an iterative GC-ASM algorithm is proposed which performs finer delineation based on the initialization results. The proposed methods are implemented to operate on 2D images and evaluated on clinical chest CT, abdominal CT, and foot MRI data sets. The results show the following: (a) An overall delineation accuracy of TPVF > 96%, FPVF < 0.6% can be achieved via GC-ASM for different objects, modalities, and body regions. (b) GC-ASM improves over ASM in its accuracy and precision to search region. (c) GC-ASM requires far fewer landmarks (about 1/3 of ASM) than ASM. (d) GC-ASM achieves full automation in the segmentation step compared to GC which requires seed specification and improves on the accuracy of GC. (e) One disadvantage of GC-ASM is its increased computational expense owing to the iterative nature of the algorithm.

  1. Testing for Divergent Transmission Histories among Cultural Characters: A Study Using Bayesian Phylogenetic Methods and Iranian Tribal Textile Data

    PubMed Central

    Matthews, Luke J.; Tehrani, Jamie J.; Jordan, Fiona M.; Collard, Mark; Nunn, Charles L.

    2011-01-01

    Background Archaeologists and anthropologists have long recognized that different cultural complexes may have distinct descent histories, but they have lacked analytical techniques capable of easily identifying such incongruence. Here, we show how Bayesian phylogenetic analysis can be used to identify incongruent cultural histories. We employ the approach to investigate Iranian tribal textile traditions. Methods We used Bayes factor comparisons in a phylogenetic framework to test two models of cultural evolution: the hierarchically integrated system hypothesis and the multiple coherent units hypothesis. In the hierarchically integrated system hypothesis, a core tradition of characters evolves through descent with modification and characters peripheral to the core are exchanged among contemporaneous populations. In the multiple coherent units hypothesis, a core tradition does not exist. Rather, there are several cultural units consisting of sets of characters that have different histories of descent. Results For the Iranian textiles, the Bayesian phylogenetic analyses supported the multiple coherent units hypothesis over the hierarchically integrated system hypothesis. Our analyses suggest that pile-weave designs represent a distinct cultural unit that has a different phylogenetic history compared to other textile characters. Conclusions The results from the Iranian textiles are consistent with the available ethnographic evidence, which suggests that the commercial rug market has influenced pile-rug designs but not the techniques or designs incorporated in the other textiles produced by the tribes. We anticipate that Bayesian phylogenetic tests for inferring cultural units will be of great value for researchers interested in studying the evolution of cultural traits including language, behavior, and material culture. PMID:21559083

  2. Intercomparison of active and passive instruments for radon and radon progeny in North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, A.C.; Tu, Keng-Wu; Knutson, E.O.

    1995-02-01

    An intercomparison exercise for radon and radon progeny instruments and methods was held at the Environmental Measurements Laboratory (EML) from April 22 to May 2, 1994. The exercise was conducted in the new EML radon test and calibration facility, in which exposure conditions are very well controlled. The detection systems of the intercompared instruments consisted of: (1) pulse ionization chambers, (2) electret ionization chambers, (3) scintillation detectors, (4) alpha particle spectrometers with silicon diode, surface barrier or diffused junction detectors, (5) registration of nuclear tracks in solid-state materials, and (6) activated carbon collectors counted by gamma-ray spectrometry or by alpha- and beta-liquid scintillation counting. Twenty-three private firms, government laboratories and universities participated with 165 passive integrating devices consisting of activated carbon collectors, nuclear alpha track detectors and electret ionization chambers, and 11 active and passive continuous radon monitors. Five portable integrating and continuous instruments were intercompared for radon progeny. Forty grab samples for radon progeny were taken by five groups that participated in person to test and evaluate their primary instruments and methods for measuring individual radon progeny and the potential alpha energy concentration (PAEC) in indoor air. Results indicate that more than 80% of the radon measurements, performed with a variety of instruments, are within ±10% of the actual value. The majority of the instruments that measure individual radon progeny and the PAEC gave results in good agreement with the EML reference value. Radon progeny measurements made with continuous and integrating instruments are satisfactory, with room for improvement.

  3. Time-integrated passive sampling as a complement to conventional point-in-time sampling for investigating drinking-water quality, McKenzie River Basin, Oregon, 2007 and 2010-11

    USGS Publications Warehouse

    McCarthy, Kathleen A.; Alvarez, David A.

    2014-01-01

    The Eugene Water & Electric Board (EWEB) supplies drinking water to approximately 200,000 people in Eugene, Oregon. The sole source of this water is the McKenzie River, which has consistently excellent water quality relative to established drinking-water standards. To ensure that this quality is maintained as land use in the source basin changes and water demands increase, EWEB has developed a proactive management strategy that includes a combination of conventional point-in-time discrete water sampling and time‑integrated passive sampling with a combination of chemical analyses and bioassays to explore water quality and identify where vulnerabilities may lie. In this report, we present the results from six passive‑sampling deployments at six sites in the basin, including the intake and outflow from the EWEB drinking‑water treatment plant (DWTP). This is the first known use of passive samplers to investigate both the source and finished water of a municipal DWTP. Results indicate that low concentrations of several polycyclic aromatic hydrocarbons and organohalogen compounds are consistently present in source waters, and that many of these compounds are also present in finished drinking water. The nature and patterns of compounds detected suggest that land-surface runoff and atmospheric deposition act as ongoing sources of polycyclic aromatic hydrocarbons, some currently used pesticides, and several legacy organochlorine pesticides. Comparison of results from point-in-time and time-integrated sampling indicate that these two methods are complementary and, when used together, provide a clearer understanding of contaminant sources than either method alone.

  4. Laser cutting sandwich structure glass-silicon-glass wafer with laser induced thermal-crack propagation

    NASA Astrophysics Data System (ADS)

    Cai, Yecheng; Wang, Maolu; Zhang, Hongzhi; Yang, Lijun; Fu, Xihong; Wang, Yang

    2017-08-01

    Silicon-glass devices are widely used in the IC industry, MEMS and solar energy systems because of their reliability and the simplicity of the manufacturing process. With the trend toward wafer level chip scale package (WLCSP) technology, a suitable dicing method for silicon-glass bonded wafers has become necessary. In this paper, a combined experimental and computational approach is undertaken to investigate the feasibility of cutting the sandwich structure glass-silicon-glass (SGS) wafer with the laser induced thermal-crack propagation (LITP) method. A 1064 nm semiconductor laser cutting system with double laser beams, which can simultaneously irradiate the top and bottom of the sandwich structure wafer, has been designed. A mathematical model describing the physical interaction between the laser and the SGS wafer, consisting of two surface heating sources and two volumetric heating sources, has been established. The temperature and stress distributions are simulated using the finite element method (FEM) analysis software ABAQUS. The crack propagation process is analyzed using the J-integral method. In the FEM model, a stationary planar crack is embedded in the wafer and the J-integral values around the crack front edge are determined using the FEM. A verification experiment under typical parameters is conducted; the crack propagation profile on the fracture surface is examined under an optical microscope and explained from the stress distribution and J-integral values.

  5. The effectiveness of an integrated conceptual approach to teaching middle school science: A mixed methods investigation

    NASA Astrophysics Data System (ADS)

    Frampton, Susan K.

    This study was designed to compare the effectiveness of using traditional and integrated instructional strategies to increase student understanding of the core concepts of energy. There are mixed messages in the literature as to the success of using an integrated approach to teach science content, despite suggestions that its use improves student achievement and attitudes toward science. This study used a mixed-method approach. The quantitative portion was a quasi-experimental non-equivalent control group design, and the qualitative portion included teacher journals, teacher interviews and student journals. There were three teacher participants, two in the treatment group from a district in Sussex County, Delaware. The third teacher participant, from a different district in Sussex County, Delaware, was in the control group. The treatment group consisted of 180 students and the control group consisted of 124 students. The results of this study show that the treatment group had significantly less anxiety in science following the treatment than students in the control group. The F value of 10.89 was significant at p = 0.001. Students in the treatment group also had more enjoyment of and motivation in science than did students in the control group. The F value of 25.025 was significant at p = 0.000 for the subscale enjoyment of science. The F value of 14.1 was significant at p = 0.000 for the subscale of motivation in science. Students in the treatment group performed significantly better on the achievement tests, the Integrated Summative Energy Assessment (ISEA) and the science portion of the Delaware State Testing Program (DSTP). The treatment group performed significantly better on the ISEA than did the control group, with an F value of 407.7 significant at p = 0.000. The treatment group performed significantly better on the science DSTP than the control group, with an F value of 65.81, which was significant at p = 0.000.
The use of an integrated approach to science instruction decreases student anxiety, increases enjoyment of science and motivation in science more than a traditional approach to teaching science. The use of an integrated approach to science instruction increases student achievement on extended response assessments that measure integrated conceptual knowledge and increases student achievement on traditional summative assessments.

  6. Nonlinear fractional order proportion-integral-derivative active disturbance rejection control method design for hypersonic vehicle attitude control

    NASA Astrophysics Data System (ADS)

    Song, Jia; Wang, Lun; Cai, Guobiao; Qi, Xiaoqiang

    2015-06-01

    The near space hypersonic vehicle model is nonlinear, multivariable and coupled in the reentry process, which is challenging for controller design. In this paper, a nonlinear fractional order proportion integral derivative (NFOPIλDμ) active disturbance rejection control (ADRC) strategy based on a natural selection particle swarm (NSPSO) algorithm is proposed for hypersonic vehicle flight control. The NFOPIλDμ ADRC method consists of a tracking-differentiator (TD), an NFOPIλDμ controller and an extended state observer (ESO). The NFOPIλDμ controller, designed by combining an FOPIλDμ method and a nonlinear states error feedback control law (NLSEF), overcomes the chattering caused by the NLSEF and conversely compensates for the relatively simple and rough signal processing of the FOPIλDμ method. The TD is applied to coordinate the contradiction between rapidity and overshoot. By attributing all uncertain factors to unknown disturbances, the ESO can achieve dynamic feedback compensation for these disturbances and thus reduce their effects. Simulation results show that the NFOPIλDμ ADRC method can make the hypersonic vehicle six-degree-of-freedom nonlinear model track desired nominal signals quickly and accurately, and that it has good stability, good dynamic properties and strong robustness against external environmental disturbances.
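    The fractional-order operators behind an FOPIλDμ controller are commonly discretized with the Grünwald-Letnikov approximation. The sketch below is a generic Python illustration of that approximation (function names hypothetical), not the paper's NFOPIλDμ ADRC implementation:

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_j = (-1)^j * binom(alpha, j), computed via
    the standard recurrence w_j = w_{j-1} * (1 - (alpha + 1) / j)."""
    w = [1.0]
    for j in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
    return w

def gl_derivative(samples, alpha, h):
    """Approximate the order-alpha fractional derivative at the latest sample
    from a uniformly spaced history (newest sample last)."""
    w = gl_weights(alpha, len(samples) - 1)
    return sum(wj * samples[-1 - j] for j, wj in enumerate(w)) / h ** alpha

# Sanity check: for alpha = 1 the weights reduce to (1, -1, 0, 0, ...),
# recovering the ordinary first-order backward difference.
weights_order1 = gl_weights(1.0, 3)
slope = gl_derivative([0.0, 0.1, 0.2, 0.3], 1.0, 0.1)
```

    Non-integer λ and μ give the controller the extra tuning freedom the abstract refers to, at the cost of a memory term over the sample history.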

  7. A comprehensive framework for data quality assessment in CER.

    PubMed

    Holve, Erin; Kahn, Michael; Nahm, Meredith; Ryan, Patrick; Weiskopf, Nicole

    2013-01-01

    The panel addresses the urgent need to ensure that comparative effectiveness research (CER) findings derived from diverse and distributed data sources are based on credible, high-quality data, and that the methods used to assess and report data quality are consistent, comprehensive, and available to data consumers. The panel consists of representatives from four teams leveraging electronic clinical data for CER, patient-centered outcomes research (PCOR), and quality improvement (QI), and seeks to change the current paradigm in which data quality assessment (DQA) is performed "behind the scenes" using one-off, project-specific methods. The panelists will present their process of harmonizing existing models for describing and measuring clinical data quality and will describe a comprehensive integrated framework for assessing and reporting DQA findings. The collaborative project is supported by the Electronic Data Methods (EDM) Forum, a three-year grant from the Agency for Healthcare Research and Quality (AHRQ) to facilitate learning and foster collaboration across a set of CER, PCOR, and QI projects designed to build infrastructure and methods for collecting and analyzing prospective electronic clinical data.

  8. Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms.

    PubMed

    Li, Le; Yip, Kevin Y

    2016-12-15

    Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means 1) they cannot use the information contained in existing GO, 2) the way they integrate biological networks may not optimize the accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as the training set to learn parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a new version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well-supported by the literature. Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/.

  9. Quantifying complexity in translational research: an integrated approach

    PubMed Central

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380

  10. Highly-ordered supportless three-dimensional nanowire networks with tunable complexity and interwire connectivity for device integration.

    PubMed

    Rauber, Markus; Alber, Ina; Müller, Sven; Neumann, Reinhard; Picht, Oliver; Roth, Christina; Schökel, Alexander; Toimil-Molares, Maria Eugenia; Ensinger, Wolfgang

    2011-06-08

    The fabrication of three-dimensional assemblies consisting of large quantities of nanowires is of great technological importance for various applications including (electro-)catalysis, sensitive sensing, and improvement of electronic devices. Because the spatial distribution of the nanostructured material can strongly influence the properties, architectural design is required in order to use assembled nanowires to their full potential. In addition, special effort has to be dedicated to the development of efficient methods that allow precise control over structural parameters of the nanoscale building blocks as a means of tuning their characteristics. This paper reports the direct synthesis of highly ordered large-area nanowire networks by a method based on hard templates using electrodeposition within nanochannels of ion track-etched polymer membranes. Control over the complexity of the networks and the dimensions of the integrated nanostructures are achieved by a modified template fabrication. The networks possess high surface area and excellent transport properties, turning them into a promising electrocatalyst material as demonstrated by cyclic voltammetry studies on platinum nanowire networks catalyzing methanol oxidation. Our method opens up a new general route for interconnecting nanowires to stable macroscopic network structures of very high integration level that allow easy handling of nanowires while maintaining their connectivity.

  11. Implications of Responsive Space on the Flight Software Architecture

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but the development infrastructure and software life-cycle process elements as well. The runtime element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight quality software. Very rapid response times go even further, and imply little or no new software development, requiring instead, using only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually with significant benefits, but it is when they are combined that they can have the greatest impact to Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure and process elements needed for rapid integration with the Core Flight software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.

  12. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  13. Systems Epidemiology: What’s in a Name?

    PubMed Central

    Dammann, O.; Gray, P.; Gressens, P.; Wolkenhauer, O.; Leviton, A.

    2014-01-01

    Systems biology is an interdisciplinary effort to integrate molecular, cellular, tissue, organ, and organism levels of function into computational models that facilitate the identification of general principles. Systems medicine adds a disease focus. Systems epidemiology adds yet another level consisting of antecedents that might contribute to the disease process in populations. In etiologic and prevention research, systems-type thinking about multiple levels of causation will allow epidemiologists to identify contributors to disease at multiple levels as well as their interactions. In public health, systems epidemiology will contribute to the improvement of syndromic surveillance methods. We encourage the creation of computational simulation models that integrate information about disease etiology, pathogenetic data, and the expertise of investigators from different disciplines. PMID:25598870

  14. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
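    The flavour of such a probabilistic structural analysis can be shown on a much simpler structure than the paper's examples. The sketch below (illustrative geometry and load statistics, not the paper's models) estimates an exceedance probability for a cantilever tip deflection by Monte Carlo simulation, the reference method the authors verify against:

```python
import random

def tip_deflection(P, L=1.0, E=210e9, I=1.0e-6):
    """Tip deflection of a cantilever under an end load P:
    delta = P * L^3 / (3 * E * I)."""
    return P * L**3 / (3.0 * E * I)

def mc_exceedance_probability(n=200_000, seed=0):
    """Monte Carlo estimate of P(delta > limit) for a normally
    distributed end load P ~ N(1000 N, 100 N); the limit is set at
    the deflection produced by the 2-sigma load, so the analytic
    answer is 1 - Phi(2), about 0.0228."""
    rng = random.Random(seed)
    limit = tip_deflection(1200.0)
    hits = sum(1 for _ in range(n)
               if tip_deflection(rng.gauss(1000.0, 100.0)) > limit)
    return hits / n

print(mc_exceedance_probability())  # expected near 0.023
```

    Fast probability integration methods reach comparable accuracy for such tail probabilities with orders of magnitude fewer response evaluations, which is the efficiency gain the abstract reports.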

  15. Flexible, Photopatterned, Colloidal CdSe Semiconductor Nanocrystal Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Stinner, F. Scott

    As semiconductor manufacturing pushes towards smaller and faster transistors, a parallel goal exists to create transistors which are not nearly as small. These transistors are not intended to match the performance of traditional crystalline semiconductors; they are designed to be significantly lower in cost and manufactured using methods that can make them physically flexible for applications where form is more important than speed. One of the developing technologies for this application is semiconductor nanocrystals. We first explore methods to develop CdSe nanocrystal semiconducting "inks" into large-scale, high-speed integrated circuits. We demonstrate photopatterned transistors with mobilities of 10 cm2/Vs on Kapton substrates. We develop new methods for vertical interconnect access holes to demonstrate multi-device integrated circuits including inverting amplifiers with 7 kHz bandwidths, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates. In order to produce higher performance and more consistent transistors, we develop a new hybrid procedure for processing the CdSe nanocrystals. This procedure produces transistors with repeatable performance exceeding 40 cm2/Vs when fabricated on silicon wafers and 16 cm2/Vs when fabricated as part of photopatterned integrated circuits on Kapton substrates. In order to demonstrate the full potential of these transistors, methods to create high-frequency oscillators were developed. These methods allow for transistors to operate at higher voltages as well as provide a means for wirebonding to the Kapton substrate, both of which are required for operating and probing high-frequency oscillators. Simulations of this system show the potential for operation at MHz frequencies. Demonstration of these transistors in this frequency range would open the door for development of CdSe integrated circuits for high-performance sensor, display, and audio applications.
    To develop further applications of electronics on flexible substrates, procedures are developed for the integration of polychromatic displays on polyethylene terephthalate (PET) substrates and a commercial near field communication (NFC) link. The device draws its power from the NFC transmitter common on smartphones and eliminates the need for a fixed battery. This allows for the mass deployment of flexible, interactive displays on product packaging.

  16. On the comparison of the strength of morphological integration across morphometric datasets.

    PubMed

    Adams, Dean C; Collyer, Michael L

    2016-11-01

    Evolutionary morphologists frequently wish to understand the extent to which organisms are integrated, and whether the strength of morphological integration among subsets of phenotypic variables differ among taxa or other groups. However, comparisons of the strength of integration across datasets are difficult, in part because the summary measures that characterize these patterns (the RV coefficient and rPLS) are dependent both on sample size and on the number of variables. As a solution to this issue, we propose a standardized test statistic (a z-score) for measuring the degree of morphological integration between sets of variables. The approach is based on a partial least squares analysis of trait covariation, and its permutation-based sampling distribution. Under the null hypothesis of a random association of variables, the method displays a constant expected value and confidence intervals for datasets of differing sample sizes and variable number, thereby providing a consistent measure of integration suitable for comparisons across datasets. A two-sample test is also proposed to statistically determine whether levels of integration differ between datasets, and an empirical example examining cranial shape integration in Mediterranean wall lizards illustrates its use. Some extensions of the procedure are also discussed. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
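    The permutation-based z-score idea can be illustrated with a toy calculation. The sketch below is pure Python on hypothetical data, and it uses a simple sum of squared cross-covariances as the association statistic rather than the paper's full partial least squares analysis; the observed statistic is standardized against its permutation null:

```python
import random, statistics

def cross_cov_stat(X, Y):
    """Sum of squared cross-covariances between the columns of two
    trait blocks X and Y (lists of rows) -- a simple stand-in for the
    covariation summarized by a PLS analysis."""
    n, p, q = len(X), len(X[0]), len(Y[0])
    mx = [sum(r[j] for r in X)/n for j in range(p)]
    my = [sum(r[k] for r in Y)/n for k in range(q)]
    total = 0.0
    for j in range(p):
        for k in range(q):
            c = sum((X[i][j]-mx[j])*(Y[i][k]-my[k]) for i in range(n))/(n-1)
            total += c*c
    return total

def integration_z_score(X, Y, n_perm=500, seed=1):
    """Standardized effect size: (observed - mean of permuted) / sd of
    permuted, where each permutation shuffles the rows of Y to break
    the association between the two blocks."""
    rng = random.Random(seed)
    obs = cross_cov_stat(X, Y)
    Yp = list(Y)
    null = []
    for _ in range(n_perm):
        rng.shuffle(Yp)
        null.append(cross_cov_stat(X, Yp))
    return (obs - statistics.mean(null)) / statistics.stdev(null)

# synthetic example: the rows of Y covary strongly with those of X
rng = random.Random(0)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(60)]
Y = [[x1 + 0.3*rng.gauss(0, 1), x2 + 0.3*rng.gauss(0, 1)] for x1, x2 in X]
print(round(integration_z_score(X, Y), 2))  # clearly positive for integrated data
```

    Because the z-score is measured against the statistic's own permutation distribution, it stays comparable across datasets with different sample sizes and variable counts, which is the paper's central point.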

  17. New faces of porous Prussian blue: interfacial assembly of integrated hetero-structures for sensing applications.

    PubMed

    Kong, Biao; Selomulya, Cordelia; Zheng, Gengfeng; Zhao, Dongyuan

    2015-11-21

    Prussian blue (PB), the oldest synthetic coordination compound, is a classic and fascinating transition metal coordination material. Prussian blue is based on a three-dimensional (3-D) cubic polymeric porous network consisting of alternating ferric and ferrous ions, which provides facile assembly as well as precise interaction with active sites at functional interfaces. A fundamental understanding of the assembly mechanism of PB hetero-interfaces is essential to enable the full potential applications of PB crystals, including chemical sensing, catalysis, gas storage, drug delivery and electronic displays. Developing controlled assembly methods towards functionally integrated hetero-interfaces with adjustable sizes and morphology of PB crystals is necessary. A key point in the functional interface and device integration of PB nanocrystals is the fabrication of hetero-interfaces in a well-defined and oriented fashion on given substrates. This review will bring together these key aspects of the hetero-interfaces of PB nanocrystals, ranging from structure and properties, interfacial assembly strategies, to integrated hetero-structures for diverse sensing.

  18. Forming of science teacher thinking through integrated laboratory exercises

    NASA Astrophysics Data System (ADS)

    Horváthová, Daniela; Rakovská, Mária; Zelenický, Ľubomír

    2017-01-01

    Within the three-semester optional course Science we have also included into the curricula the subject entitled Science Practicum, consisting of laboratory exercises from complementary natural scientific disciplines whose content exceeds the boundaries of a single scientific discipline (physics, biology, …). The paper presents the structure and selected samples of laboratory exercises of the physical part of the Science Practicum, in which we have processed in an integrated way the knowledge of physics and biology at secondary grammar school. When planning the exercises we have proceeded from those areas of the mentioned disciplines in which we can appropriately apply integration of knowledge and where measurement methods are used. We have focused on the integration of knowledge in the field of human sensory organs (eye, ear), dolphins and bats (spatial orientation) and bees (ommatidium of the faceted eye) and their modelling. Laboratory exercises are designed in such a way that they would motivate future teachers of natural scientific subjects to work independently with specialized literature of the mentioned natural sciences and ICT.

  19. Strategies to promote a climate of academic integrity and minimize student cheating and plagiarism.

    PubMed

    Scanlan, Craig L

    2006-01-01

    Student academic misconduct is a growing problem for colleges and universities, including those responsible for preparing health professionals. Although the implementation of honor codes has had a positive impact on this problem, further reduction in student cheating and plagiarism can be achieved only via a comprehensive strategy that promotes an institutional culture of academic integrity. Such a strategy must combine efforts both to deter and detect academic misconduct, along with fair but rigorous application of sanctions against such behaviors. Methods useful in preventing or deterring dishonest behaviors among students include early integrity training complemented with course-level reinforcement, faculty role-modeling, and the application of selected testing/assignment preventive strategies, including honor pledges and honesty declarations. Giving students more responsibility for oversight of academic integrity also may help address this problem and better promote the culture needed to uphold its principles. Successful enforcement requires that academic administration provide strong and visible support for upholding academic integrity standards, including the provision of a clear and fair process and the consistent application of appropriate sanctions against those whose conduct is found to violate these standards.

  20. Quasi-optical simulation of the electron cyclotron plasma heating in a mirror magnetic trap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shalashov, A. G., E-mail: ags@appl.sci-nnov.ru; Balakin, A. A.; Khusainov, T. A.

    The resonance microwave plasma heating in a large-scale open magnetic trap is simulated taking into account all the basic wave effects during the propagation of short-wavelength wave beams (diffraction, dispersion, and aberration) within the framework of the consistent quasi-optical approximation of Maxwell's equations. The quasi-optical method is generalized to the case of inhomogeneous media with absorption and dispersion, a new form of the quasi-optical equation is obtained, an efficient method for numerical integration is found, and simulation results are verified on the GDT facility (Novosibirsk).

  1. ROLE OF YOGA IN THE TREATMENT OF NEUROTIC DISORDERS: CURRENT STATUS AND FUTURE DIRECTIONS

    PubMed Central

    Grover, Poonam; Varma, V.K.; Pershad, D.; Verma, S.K.

    1994-01-01

    A large number of studies have consistently demonstrated the potential of yoga, not only in the treatment of psychiatric and psychosomatic disorder but also in promoting positive physical and mental health. This paper reviews various studies on the treatment of neurosis with techniques derived from yoga. A few lacunae have been identified and possible directions for future research are outlined. It is hoped that research along these lines will develop a standardized method of yoga therapy which can be utilized and integrated within the existing methods of treatment of neurotic disorders. PMID:21743694

  2. Monte Carlo errors with less errors

    NASA Astrophysics Data System (ADS)

    Wolff, Ulli; Alpha Collaboration

    2004-01-01

    We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more certain error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective integrated autocorrelation time is computed which is suitable to benchmark efficiencies of simulation algorithms with regard to specific observables of interest. A Matlab code is offered for download that implements the method. It can also combine independent runs (replica), allowing one to judge their consistency.
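    The core estimator can be sketched in a few lines. The code below is pure Python rather than the paper's Matlab, and it uses a naive first-negative truncation window instead of the automatic windowing of the paper; it estimates the integrated autocorrelation time for an AR(1) chain whose exact answer is known:

```python
import math, random

def integrated_autocorr_time(chain):
    """Estimate tau_int = 1/2 + sum_t rho(t) by summing the normalized
    autocorrelation function until it first drops below zero (a crude
    truncation; the paper's method chooses the window automatically to
    balance truncation bias against statistical noise)."""
    n = len(chain)
    mean = sum(chain)/n
    c0 = sum((x - mean)**2 for x in chain)/n
    tau = 0.5
    for t in range(1, n//2):
        ct = sum((chain[i]-mean)*(chain[i+t]-mean) for i in range(n-t))/(n-t)
        rho = ct/c0
        if rho <= 0.0:
            break
        tau += rho
    return tau

# AR(1) test chain: tau_int = (1+r)/(2*(1-r)) = 4.5 exactly for r = 0.8
rng = random.Random(42)
r, x, chain = 0.8, 0.0, []
for _ in range(50_000):
    x = r*x + math.sqrt(1 - r*r)*rng.gauss(0, 1)
    chain.append(x)
print(round(integrated_autocorr_time(chain), 2))  # expected near 4.5
```

    The error bar on a mean is then inflated by a factor sqrt(2*tau_int) relative to the naive independent-sample estimate, which is what binning techniques approximate less directly.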

  3. Extension of CE/SE method to non-equilibrium dissociating flows

    NASA Astrophysics Data System (ADS)

    Wen, C. Y.; Saldivar Massimi, H.; Shen, H.

    2018-03-01

    In this study, the hypersonic non-equilibrium flows over rounded nose geometries are numerically investigated by a robust conservation element and solution element (CE/SE) code, which is based on hybrid meshes consisting of triangular and quadrilateral elements. The dissociation and recombination chemical reactions as well as the vibrational energy relaxation are taken into account. The stiff source terms are solved by an implicit trapezoidal method of integration. Comparisons with laboratory and numerical cases are provided to demonstrate the accuracy and reliability of the present CE/SE code in simulating hypersonic non-equilibrium flows.
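    The implicit trapezoidal treatment of a stiff source term can be illustrated on a scalar model problem; the test equation below is illustrative, not the chemistry source terms of the paper, and the Newton solve is the standard way the implicit equation is handled:

```python
import math

def implicit_trapezoid_step(f, dfdy, t, y, dt, tol=1e-12, max_iter=50):
    """One step of the implicit trapezoidal rule
        y_new = y + dt/2 * (f(t, y) + f(t+dt, y_new)),
    with the implicit equation solved by scalar Newton iteration."""
    fn = f(t, y)
    y_new = y + dt*fn                    # explicit Euler predictor
    for _ in range(max_iter):
        g = y_new - y - 0.5*dt*(fn + f(t + dt, y_new))
        if abs(g) < tol:
            break
        y_new -= g / (1.0 - 0.5*dt*dfdy(t + dt, y_new))
    return y_new

# stiff test problem: y' = -1000*(y - cos t); the solution relaxes
# rapidly onto the slow manifold y ~ cos t
k = 1000.0
f    = lambda t, y: -k*(y - math.cos(t))
dfdy = lambda t, y: -k
t, y, dt = 0.0, 1.0, 0.01   # dt far above the explicit-Euler limit 2/k
while t < 1.0 - 1e-9:
    y = implicit_trapezoid_step(f, dfdy, t, y, dt)
    t += dt
print(round(y, 4), round(math.cos(1.0), 4))  # numerical vs slow-manifold value
```

    An explicit scheme would need dt below about 2/k = 0.002 just for stability; the implicit rule remains stable and second-order accurate at five times that step, which is why it suits stiff reaction sources.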

  4. Discontinuous Skeletal Gradient Discretisation methods on polytopal meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Pietro, Daniele A.; Droniou, Jérôme; Manzini, Gianmarco

    Here, in this work we develop arbitrary-order Discontinuous Skeletal Gradient Discretisations (DSGD) on general polytopal meshes. Discontinuous Skeletal refers to the fact that the globally coupled unknowns are broken polynomials on the mesh skeleton. The key ingredient is a high-order gradient reconstruction composed of two terms: (i) a consistent contribution obtained mimicking an integration by parts formula inside each element and (ii) a stabilising term for which sufficient design conditions are provided. An example of stabilisation that satisfies the design conditions is proposed based on a local lifting of high-order residuals on a Raviart–Thomas–Nédélec subspace. We prove that the novel DSGDs satisfy coercivity, consistency, limit-conformity, and compactness requirements that ensure convergence for a variety of elliptic and parabolic problems. Lastly, links with Hybrid High-Order, non-conforming Mimetic Finite Difference and non-conforming Virtual Element methods are also studied. Numerical examples complete the exposition.

  5. Discontinuous Skeletal Gradient Discretisation methods on polytopal meshes

    DOE PAGES

    Di Pietro, Daniele A.; Droniou, Jérôme; Manzini, Gianmarco

    2017-11-21

    Here, in this work we develop arbitrary-order Discontinuous Skeletal Gradient Discretisations (DSGD) on general polytopal meshes. Discontinuous Skeletal refers to the fact that the globally coupled unknowns are broken polynomials on the mesh skeleton. The key ingredient is a high-order gradient reconstruction composed of two terms: (i) a consistent contribution obtained mimicking an integration by parts formula inside each element and (ii) a stabilising term for which sufficient design conditions are provided. An example of stabilisation that satisfies the design conditions is proposed based on a local lifting of high-order residuals on a Raviart–Thomas–Nédélec subspace. We prove that the novel DSGDs satisfy coercivity, consistency, limit-conformity, and compactness requirements that ensure convergence for a variety of elliptic and parabolic problems. Lastly, links with Hybrid High-Order, non-conforming Mimetic Finite Difference and non-conforming Virtual Element methods are also studied. Numerical examples complete the exposition.

  6. Symbolic integration of a class of algebraic functions. [by an algorithmic approach

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

    An algorithm is presented for the symbolic integration of a class of algebraic functions. This class consists of functions made up of rational expressions of an integration variable x and square roots of polynomials, trigonometric and hyperbolic functions of x. The algorithm is shown to consist of the following components: (1) the reduction of input integrands to canonical form; (2) intermediate internal representations of integrals; (3) classification of outputs; and (4) reduction and simplification of outputs to well-known functions.
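    The table-driven shape of such an integrator can be conveyed with a toy version. The sketch below handles only sums of power terms c*x^p (far narrower than the algebraic-function class of the paper, and all names are illustrative): it normalizes the input to a term list, applies integration rules per term, and checks the result by numerical differentiation:

```python
import math

def integrate_terms(terms):
    """Antiderivative of sum_i c_i * x**p_i, term by term: the power
    rule for p != -1, and a tagged logarithm for p == -1."""
    result = []
    for c, p in terms:
        if p == -1:
            result.append(("log", c))        # c/x integrates to c*log|x|
        else:
            result.append((c/(p + 1), p + 1))
    return result

def evaluate(terms, x):
    """Evaluate a term list (with possible tagged logarithms) at x."""
    total = 0.0
    for term in terms:
        if term[0] == "log":
            total += term[1]*math.log(abs(x))
        else:
            c, p = term
            total += c*x**p
    return total

expr = [(3.0, 2), (2.0, 1), (5.0, -1)]   # 3x^2 + 2x + 5/x
anti = integrate_terms(expr)             # x^3 + x^2 + 5*log|x|
# verify: the numerical derivative of the antiderivative matches the integrand
h, x0 = 1e-6, 2.0
deriv = (evaluate(anti, x0 + h) - evaluate(anti, x0 - h)) / (2*h)
print(round(deriv, 3), evaluate(expr, x0))
```

    The paper's algorithm follows the same canonicalize / transform / classify / simplify pipeline, but over a much richer expression representation.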

  7. Obstacles to the coordination of delivering integrated prenatal HIV, syphilis and hepatitis B testing services in Guangdong: using a needs assessment approach.

    PubMed

    Xia, Jianhong; Rutherford, Shannon; Ma, Yuanzhu; Wu, Li; Gao, Shuang; Chen, Tingting; Lu, Xiao; Zhang, Xiaozhuang; Chu, Cordia

    2015-03-24

    Integration of services for Prevention of Mother-To-Child Transmission of HIV (PMTCT) into routine maternal and child health care is promoted as a priority strategy by the WHO to facilitate the implementation of PMTCT. Integration of services emphasizes inter-sectoral coordination in the health systems to provide convenient services for clients. China has been integrating prenatal HIV, syphilis and hepatitis B testing services since 2009. However, as the individual health systems are complex, effective coordination among different health agencies is challenging. Few studies have examined the factors that affect the coordination of such complex systems. The aim of this study is to assess the effectiveness of and examine challenges for integrated service delivery. Findings will provide the basis for strategy development to enhance the effective delivery of integrated services. The research was conducted in Guangdong province in 2013 using a needs assessment approach that includes qualitative and quantitative methods. Quantitative data was collected through a survey and from routine monitoring for PMTCT and qualitative data was collected through stakeholder interviews. Routine monitoring data used to assess key indicators of coordination suggested numerous coordination problems. The prenatal testing rates for HIV (95%), syphilis (47%) and hepatitis B (47%) were inconsistent. On average, only 20% of HIV-positive mothers were referred within the health systems. There were no regular meetings among different health agencies, and the clients indicated complicated service processes. The major obstacles to the coordination of delivering these integrated services are a lack of service resource integration and a lack of a mechanism for coordination of the health systems, with no uniform guidelines, clear roles or consistent evaluation. The key obstacles that have been identified in this study hinder the coordination of the delivery of integrated services.
    Our recommendations include: 1) Facilitate integration of the funding and information systems by fully combining the service resources of different health agencies into one unit; 2) Establish regular meetings to facilitate exchange of information and address problems; 3) Establish a client referral network between different health agencies with agreed guidelines, clear roles and consistent evaluation.

  8. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    NASA Astrophysics Data System (ADS)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of the structural elements in order to obtain a structural model consisting of beam axes and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damages such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.
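    One core step of such a pipeline, recovering a beam axis from scanned surface points, can be sketched as a principal-direction fit. The routine below is an illustrative toy rather than the authors' method: it centres a 3-D point cluster and extracts the dominant direction of its covariance matrix by power iteration:

```python
import random

def beam_axis(points, n_iter=100):
    """Fit the dominant direction (a candidate beam axis) of a 3-D
    point cluster: centre the points, then take the leading eigenvector
    of the 3x3 covariance matrix via power iteration.  Real pipelines
    also segment the cloud per beam and estimate cross sections."""
    n = len(points)
    c = [sum(p[i] for p in points)/n for i in range(3)]
    centred = [[p[i] - c[i] for i in range(3)] for p in points]
    cov = [[sum(q[i]*q[j] for q in centred)/n for j in range(3)]
           for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(n_iter):
        w = [sum(cov[i][j]*v[j] for j in range(3)) for i in range(3)]
        norm = sum(x*x for x in w) ** 0.5
        v = [x/norm for x in w]
    return c, v

# synthetic beam surface: points spread along (1, 0, 0) plus noise
rng = random.Random(0)
pts = [[i/100 + rng.gauss(0, 0.01), rng.gauss(0, 0.01), rng.gauss(0, 0.01)]
       for i in range(200)]
centre, axis = beam_axis(pts)
print([round(a, 3) for a in axis])  # close to [1, 0, 0] up to sign
```

    From fitted axes, a structural model is then a graph of axis segments plus the connections inferred where axes meet.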

  9. Integrated DEA models and grey system theory to evaluate past-to-future performance: a case of Indian electricity industry.

    PubMed

    Wang, Chia-Nan; Nguyen, Nhu-Ty; Tran, Thanh-Tuyen

    2015-01-01

    The growth of economy and population, together with the higher demand for energy, has created many concerns for the Indian electricity industry, whose capacity stands at 211 gigawatts, mostly in coal-fired plants. Due to insufficient fuel supply, India suffers from a shortage of electricity generation, leading to rolling blackouts; thus, performance evaluation and ranking of the industry become significant issues. In this study, we evaluate the rankings of the companies under the control of the Ministry of Power. This research also tests whether there are any significant differences between the two DEA models: Malmquist nonradial and Malmquist radial. One advanced MPI model is then chosen to examine these companies' performance in recent years and the next few years using forecasting results from Grey system theory. In total, data from 14 companies are considered in this evaluation after strict selection from the whole industry. The results show that no company exhibited abrupt changes in its scores, and that performance is neither consistently good nor consistently outstanding, which demonstrates the high applicability of the integrated methods. This integrated numerical research gives better "past-present-future" insights into performance evaluation in the Indian electricity industry.
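    Grey-system forecasting of the kind referred to here is commonly done with the GM(1,1) model; the sketch below is a minimal pure-Python version (the function name and test series are illustrative, not data from the paper):

```python
import math

def gm11_forecast(x0, steps=1):
    """Grey GM(1,1) forecast: accumulate the series (AGO), fit the
    whitened equation dx1/dt + a*x1 = b by least squares on the
    background values z1, then restore predictions for the original
    series by differencing the fitted accumulated series."""
    n = len(x0)
    x1, s = [], 0.0
    for v in x0:                       # accumulated generating operation
        s += v
        x1.append(s)
    z1 = [0.5*(x1[k] + x1[k-1]) for k in range(1, n)]
    # closed-form least squares for [a, b] in x0(k) = -a*z1(k) + b
    m = n - 1
    szz = sum(z*z for z in z1); sz = sum(z1)
    sy = sum(x0[1:]); szy = sum(z*y for z, y in zip(z1, x0[1:]))
    det = m*szz - sz*sz
    a = -(m*szy - sz*sy)/det
    b = (szz*sy - sz*szy)/det
    def x1_hat(k):                     # fitted accumulated series, k 0-based
        return (x0[0] - b/a)*math.exp(-a*k) + b/a
    return [x1_hat(k) - x1_hat(k-1) for k in range(n, n + steps)]

series = [100.0, 110.0, 121.0, 133.1, 146.41]   # 10% growth per period
print(round(gm11_forecast(series)[0], 1))        # next value, near 161
```

    GM(1,1) suits the short, roughly exponential series typical of capacity and output data, which is why it pairs naturally with the DEA/MPI scores in the "past-present-future" evaluation described above.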

  10. Using dual classifications in the development of avian wetland indices of biological integrity for wetlands in West Virginia, USA.

    PubMed

    Veselka, Walter; Anderson, James T; Kordek, Walter S

    2010-05-01

    Considerable resources are being used to develop and implement bioassessment methods for wetlands to ensure that "biological integrity" is maintained under the United States Clean Water Act. Previous research has demonstrated that avian composition is susceptible to human impairments at multiple spatial scales. Using a site-specific disturbance gradient, we built avian wetland indices of biological integrity (AW-IBI) specific to two wetland classification schemes, one based on vegetative structure and the other based on the wetland's position in the landscape and sources of water. The resulting class-specific AW-IBI comprised one to four metrics that varied in their sensitivity to the disturbance gradient. Some of these metrics were specific to only one of the classification schemes, whereas others could discriminate varying levels of disturbance regardless of classification scheme. Overall, all of the derived biological indices specific to the vegetative structure-based classes of wetlands had a significant relation with the disturbance gradient; however, the biological index derived for floodplain wetlands exhibited a more consistent response to a local disturbance gradient. We suspect that the consistency of this response is due to the inherent nature of the connectivity of available habitat in floodplain wetlands.

  11. Integrated DEA Models and Grey System Theory to Evaluate Past-to-Future Performance: A Case of Indian Electricity Industry

    PubMed Central

    Wang, Chia-Nan; Tran, Thanh-Tuyen

    2015-01-01

    Economic and population growth, together with rising energy demand, have created many concerns for the Indian electricity industry, whose capacity stands at 211 gigawatts, mostly in coal-fired plants. Due to insufficient fuel supply, India suffers from a shortage of electricity generation, leading to rolling blackouts; performance evaluation and ranking of the industry have therefore become significant issues. This study evaluates the rankings of the companies under the control of the Ministry of Power. It also tests whether there are significant differences between two DEA models, the Malmquist nonradial and the Malmquist radial; the more suitable Malmquist productivity index (MPI) model is then chosen to assess these companies' performance in recent years and over the next few years, using forecasting results from Grey system theory. In total, realistic data from 14 companies are included in the evaluation after strict selection from the whole industry. The results show that no company exhibits many abrupt changes in its scores, and none is consistently good or consistently outstanding, which demonstrates the broad applicability of the integrated methods. This integrated numerical research gives better “past-present-future” insight into performance evaluation in the Indian electricity industry. PMID:25821854
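
    The Grey forecasting component referred to above is commonly the GM(1,1) model. Below is a minimal pure-Python sketch of standard GM(1,1), offered purely as an illustration; the abstract does not specify the paper's exact model variant or data.

```python
# Minimal GM(1,1) grey-forecasting sketch in pure Python. Illustrative only:
# the abstract does not give the paper's exact model variant or data.
import math

def gm11_forecast(x0, n_ahead):
    """Fit GM(1,1) to the positive series x0; forecast n_ahead values."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    # Background values: consecutive means of the accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    # Least squares for the grey equation x0[k] + a*z1[k-1] = b
    m = n - 1
    sz, szz = sum(z1), sum(z * z for z in z1)
    sy = sum(x0[1:])
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det    # development coefficient
    b = (szz * sy - sz * szy) / det  # grey input
    # Time-response function, then IAGO to recover x0 forecasts
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + n_ahead)]
```

    For a roughly geometric series the forecast continues the fitted growth; e.g. fitting [100, 110, 121, 133.1] (about 10% growth) yields a one-step-ahead forecast near 146.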

  12. A program to compute three-dimensional subsonic unsteady aerodynamic characteristics using the doublet lattice method, L216 (DUBFLX). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Richard, M.; Harrison, B. A.

    1979-01-01

    The program input presented consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input from a magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.

  13. Integrated Geophysical Measurements for Bioremediation Monitoring: Combining Spectral Induced Polarization, Nuclear Magnetic Resonance and Magnetic Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Kristina; Slater, Lee; Ntarlagiannis, Dimitris

    2015-02-24

    This document contains the final report for the project "Integrated Geophysical Measurements for Bioremediation Monitoring: Combining Spectral Induced Polarization, Nuclear Magnetic Resonance and Magnetic Methods" (DE-SC0007049). Executive Summary: Our research aimed to develop borehole measurement techniques capable of monitoring subsurface processes, such as changes in pore geometry and iron/sulfur geochemistry, associated with remediation of heavy metals and radionuclides. Previous work has demonstrated that the geophysical method of spectral induced polarization (SIP) can be used to assess subsurface contaminant remediation; however, SIP signals can be generated from multiple sources, limiting their interpretive value. Integrating multiple geophysical methods, such as nuclear magnetic resonance (NMR) and magnetic susceptibility (MS), with SIP could reduce the ambiguity of interpretation that might result from a single method. Our research effort entails combining measurements from these methods, each sensitive to different mineral forms and/or mineral-fluid interfaces, to provide better constraints on changes in subsurface biogeochemical processes and pore geometries, significantly improving our understanding of processes impacting contaminant remediation. The Rifle Integrated Field Research Challenge (IFRC) site was used as a test location for our measurements. The Rifle IFRC site is located at a former uranium ore-processing facility in Rifle, Colorado. Leachate from spent mill tailings has resulted in residual uranium contamination of both groundwater and sediments within the local aquifer. Studies at the site include an ongoing acetate amendment strategy, in which native microbial populations are stimulated by the introduction of carbon, intended to alter redox conditions and immobilize uranium. To test the geophysical methods in the field, NMR and MS logging measurements were collected before, during, and after acetate amendment.
Next, laboratory NMR, MS, and SIP measurements were collected on columns of Rifle sediments during acetate amendment. The laboratory experiments were designed to simulate the field experiments; changes in geophysical signals were expected to correlate with changes in redox conditions and iron speciation. Field MS logging measurements revealed vertically stratified magnetic mineralization, likely the result of a detrital magnetic fraction within the bulk alluvium. Little to no change was observed in the MS data, suggesting negligible production of magnetic phases (e.g., magnetite, pyrrhotite) as a result of sulfidogenesis. Borehole NMR measurements contained high levels of noise contamination requiring significant signal processing, and analysis suggests that any changes may be difficult to differentiate from simultaneous changes in water content. Laboratory MS and NMR measurements remained relatively stable throughout the course of the acetate amendment experiment, consistent with field measurements. However, SIP measurements changed during the acetate amendment, associated with the formation of iron-sulfide mineral phases; a finding that is consistent with chemical analysis of the solid-phase materials in the columns.

  14. A computational method for sharp interface advection.

    PubMed

    Roenby, Johan; Bredmose, Henrik; Jasak, Hrvoje

    2016-11-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face-interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM® extension and is published as open source.
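
    The core bookkeeping step, integrating the submerged face area over a time step to obtain the volume transported across a face, can be caricatured as follows. This is a toy sketch assuming the submerged area varies linearly in time, so the integral reduces to a trapezoid; the actual method tracks the face-interface intersection line geometrically, and all names here are illustrative.

```python
# Toy sketch of the isoAdvector face-flux idea: the volume of one fluid
# carried across a mesh face during a time step is the time integral of
# the submerged face area times the face-normal fluid speed. Assuming
# A(t) varies linearly over the step, the integral is a trapezoid.

def transported_volume(area_start, area_end, face_speed, dt):
    """Volume crossing the face during dt.

    area_start, area_end: submerged face area at the start/end of the step;
    face_speed: fluid velocity component normal to the face.
    """
    # ∫ A(t) u dt ≈ u * dt * (A_start + A_end) / 2   (trapezoidal rule)
    return face_speed * dt * 0.5 * (area_start + area_end)
```

    With an interface sweeping a unit-area face from dry to fully submerged at speed 2 over dt = 0.1, the transported volume is 0.1.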

  15. Turbulent MHD transport coefficients - An attempt at self-consistency

    NASA Technical Reports Server (NTRS)

    Chen, H.; Montgomery, D.

    1987-01-01

    In this paper, some multiple scale perturbation calculations of turbulent MHD transport coefficients begun in earlier papers are first completed. These generalize 'alpha effect' calculations by treating the velocity field and magnetic field on the same footing. Then the problem of rendering such calculations self-consistent is addressed, generalizing an eddy-viscosity hypothesis similar to that of Heisenberg for the Navier-Stokes case. The method also borrows from Kraichnan's direct interaction approximation. The output is a set of integral equations relating the spectra and the turbulent transport coefficients. Previous 'alpha effect' and 'beta effect' coefficients emerge as limiting cases. A treatment of the inertial range can also be given, consistent with a -5/3 energy spectrum power law. In the Navier-Stokes limit, a value of 1.72 is extracted for the Kolmogorov constant. Further applications to MHD are possible.

  16. Attractive electron-electron interactions within robust local fitting approximations.

    PubMed

    Merlot, Patrick; Kjærgaard, Thomas; Helgaker, Trygve; Lindh, Roland; Aquilante, Francesco; Reine, Simen; Pedersen, Thomas Bondo

    2013-06-30

    An analysis of Dunlap's robust fitting approach reveals that the resulting two-electron integral matrix is not manifestly positive semidefinite when local fitting domains or non-Coulomb fitting metrics are used. We present a highly local approximate method for evaluating four-center two-electron integrals based on the resolution-of-the-identity (RI) approximation and apply it to the construction of the Coulomb and exchange contributions to the Fock matrix. In this pair-atomic resolution-of-the-identity (PARI) approach, atomic-orbital (AO) products are expanded in auxiliary functions centered on the two atoms associated with each product. Numerical tests indicate that in 1% or less of all Hartree-Fock and Kohn-Sham calculations, the indefinite integral matrix causes nonconvergence in the self-consistent-field iterations. In these cases, the two-electron contribution to the total energy becomes negative, meaning that the electronic interaction is effectively attractive, and the total energy is dramatically lower than that obtained with exact integrals. In the vast majority of our test cases, however, the indefiniteness does not interfere with convergence. The total energy accuracy is comparable to that of the standard Coulomb-metric RI method. The speed-up compared with conventional algorithms is similar to the RI method for Coulomb contributions; exchange contributions are accelerated by a factor of up to eight with a triple-zeta quality basis set. A positive semidefinite integral matrix is recovered within PARI by introducing local auxiliary basis functions spanning the full AO product space, as may be achieved by using Cholesky-decomposition techniques. Local completion, however, slows down the algorithm to a level comparable with or below conventional calculations. Copyright © 2013 Wiley Periodicals, Inc.
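
    A standard way to detect the loss of positive definiteness described above is to attempt a Cholesky factorization, which fails on an indefinite matrix. A pure-Python sketch for small symmetric matrices (illustrative; production quantum-chemistry codes use optimized linear-algebra libraries):

```python
# Diagnostic sketch: a symmetric matrix is positive definite iff a
# Cholesky factorization succeeds with strictly positive pivots. An
# indefinite two-electron integral matrix would fail this test.

def is_positive_definite(a, tol=1e-12):
    """Attempt a Cholesky factorization of the symmetric matrix a
    (list of lists); return False on a non-positive pivot."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                d = a[i][i] - s
                if d <= tol:           # non-positive pivot: not PD
                    return False
                l[i][i] = d ** 0.5
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return True
```

    For example, [[2, 1], [1, 2]] passes, while [[1, 2], [2, 1]] (eigenvalues 3 and -1) fails at the second pivot.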

  17. Land-Use Scenarios: National-Scale Housing-Density ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Land-Use Scenarios: National-Scale Housing-Density Scenarios Consistent with Climate Change Storylines. This report describes the scenarios and models used to generate national-scale housing density scenarios for the conterminous US to the year 2100 as part of the Integrated Climate and Land Use Scenarios (ICLUS) project. The report was prepared by the Global Change Research Program (GCRP) in the National Center for Environmental Assessment (NCEA) of the Office of Research and Development (ORD) at the U.S. Environmental Protection Agency (EPA). The ICLUS report describes the methods used to develop land-use scenarios by decade from the year 2000 to 2100 that are consistent with these storylines.

  18. Study of noise propagation and the effects of insufficient numbers of projection angles and detector samplings for iterative reconstruction using planar-integral data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B.; Zeng, G. L.

    2006-09-15

    A rotating slat collimator can be used to acquire planar-integral data. It achieves higher geometric efficiency than a parallel-hole collimator by accepting more photons, but the planar-integral data contain less tomographic information, which may result in larger noise amplification in the reconstruction. Lodge evaluated the rotating slat system and the parallel-hole system based on noise behavior for an FBP reconstruction. Here, we evaluate the noise propagation properties of the two collimation systems for iterative reconstruction. We extend Huesman's noise propagation analysis of the line-integral system to the planar-integral case, and show that approximately 2.0(D/dp) SPECT angles, 2.5(D/dp) self-spinning angles at each detector position, and a 0.5dp detector sampling interval are required in order for the planar-integral data to be efficiently utilized. Here, D is the diameter of the object and dp is the linear dimension of the voxels that subdivide the object. The noise propagation behaviors of the two systems are then compared based on a least-square reconstruction using the ratio of the SNR in the image reconstructed using a planar-integral system to that reconstructed using a line-integral system. The ratio is found to be proportional to √(F/D), where F is a geometric efficiency factor. This result has been verified by computer simulations. It confirms that for an iterative reconstruction, the noise tradeoff of the two systems is not only dependent on the increase of the geometric efficiency afforded by the planar projection method, but also dependent on the size of the object. The planar-integral system works better for small objects, while the line-integral system performs better for large ones. This result is consistent with Lodge's results based on the FBP method.
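
    The sampling requirements stated above can be packaged in a small helper; the function and argument names are illustrative, but the formulas are those quoted in the abstract.

```python
# Sampling requirements quoted for planar-integral data: ~2.0*(D/dp)
# SPECT angles, ~2.5*(D/dp) self-spinning angles per detector position,
# and a detector sampling interval of 0.5*dp, where D is the object
# diameter and dp the voxel size. Helper names are illustrative.

def planar_integral_sampling(object_diameter, voxel_size):
    ratio = object_diameter / voxel_size
    return {
        "spect_angles": 2.0 * ratio,
        "spin_angles_per_position": 2.5 * ratio,
        "detector_interval": 0.5 * voxel_size,
    }
```

    For a 20 cm object and 0.5 cm voxels this gives 80 SPECT angles, 100 spin angles per position, and a 0.25 cm detector interval.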

  19. Inside help: An integrative review of champions in healthcare-related implementation

    PubMed Central

    Rattray, Nicholas A; Flanagan, Mindy E; Damschroder, Laura; Schmid, Arlene A; Damush, Teresa M

    2018-01-01

    Background/aims: The idea that champions are crucial to effective healthcare-related implementation has gained broad acceptance; yet the champion construct has been hampered by inconsistent use across the published literature. This integrative review sought to establish the current state of the literature on champions in healthcare settings and bring greater clarity to this important construct. Methods: This integrative review was limited to research articles in peer-reviewed, English-language journals published from 1980 to 2016. Searches were conducted on the online MEDLINE database via OVID and PubMed using the keyword “champion.” Several additional terms that often describe champions were also included as keywords: implementation leader, opinion leader, facilitator, and change agent. Bibliographies of full-text articles that met inclusion criteria were reviewed for additional references not yet identified via the main strategy of conducting keyword searches in MEDLINE. A five-member team abstracted all full-text articles meeting inclusion criteria. Results: The final dataset for the integrative review consisted of 199 unique articles. Use of the term champion varied widely across the articles with respect to topic, specific job positions, or broader organizational roles. The most common method for operationalizing champion for purposes of analysis was the use of a dichotomous variable designating champion presence or absence. Four studies randomly allocated the presence or absence of champions. Conclusions: The number of published champion-related articles has increased sharply since 2000; more articles were published during the last two years of this review (i.e. 2015–2016) than during its first 30 years (i.e. 1980–2009). Individual studies consistently found that champions were important positive influences on implementation effectiveness. 
Although few in number, the randomized trials of champions that have been conducted demonstrate the feasibility of using experimental design to study the effects of champions in healthcare. PMID:29796266

  20. Integrating preconcentrator heat controller

    DOEpatents

    Bouchier, Francis A.; Arakaki, Lester H.; Varley, Eric S.

    2007-10-16

    A method and apparatus for controlling the electric resistance heating of a metallic chemical preconcentrator screen, for example, used in portable trace explosives detectors. The length of the heating period is automatically adjusted to compensate for any changes in the voltage driving the heating current across the screen, for example, due to gradual discharge or aging of a battery. The total energy deposited in the screen is proportional to the integral over time of the square of the voltage drop across the screen. Since the net temperature rise of the screen, ΔT_s, from beginning to end of the heating pulse is proportional to the total heat energy deposited during the pulse, this integral can be calculated in real time and used to terminate the heating current when a preset target value has been reached, thereby providing a consistent and reliable screen temperature rise, ΔT_s, from pulse to pulse.
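
    The control law described here, terminating the heating pulse when the running integral of V² reaches a preset target, can be sketched as a sampled loop. All names and numbers are illustrative, not taken from the patent.

```python
# Sketch of the patent's control idea: accumulate V^2 * dt each sample
# and stop heating once the accumulator reaches a preset target, so the
# deposited energy (∝ ∫V²dt / R) is independent of battery droop.

def heating_duration(voltage_samples, dt, target_v2_integral):
    """Return elapsed time when the cumulative ∫V²dt first reaches the
    target, or None if the samples run out before the target is met."""
    acc = 0.0
    for i, v in enumerate(voltage_samples):
        acc += v * v * dt
        if acc >= target_v2_integral:
            return (i + 1) * dt
    return None
```

    A screen driven at 2 V reaches a given target four times faster than one driven at 1 V, which is exactly the compensation the controller provides as the battery sags.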

  1. Sol-gel zinc oxide humidity sensors integrated with a ring oscillator circuit on-a-chip.

    PubMed

    Yang, Ming-Zhi; Dai, Ching-Liang; Wu, Chyan-Chyi

    2014-10-28

    This study develops an integrated humidity microsensor fabricated using the commercial 0.18 μm complementary metal oxide semiconductor (CMOS) process. The integrated humidity sensor consists of a humidity sensor and a ring oscillator circuit on-a-chip. The humidity sensor is composed of a sensitive film and branch interdigitated electrodes. The sensitive film is zinc oxide prepared by the sol-gel method. After completion of the CMOS process, the sensor requires a post-processing step to remove the sacrificial oxide layer and to coat the zinc oxide film on the interdigitated electrodes. The capacitance of the sensor changes when the sensitive film adsorbs water vapor. The circuit converts the capacitance of the humidity sensor into an oscillation frequency output. Experimental results show that the output frequency of the sensor falls from 84.3 to 73.4 MHz at 30 °C as the humidity increases from 40 to 90% RH.
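
    The capacitance-to-frequency conversion can be illustrated with a first-order ring oscillator model, f ≈ 1/(2·N·R·C), where N is the number of stages: a larger sensor capacitance loading each stage lowers the frequency. This model and all values are assumptions for illustration, not the paper's circuit.

```python
# First-order ring oscillator model: each of N inverter stages has an
# effective delay R*C, so the oscillation frequency is f ≈ 1/(2*N*R*C).
# Illustrative sketch; the paper's CMOS circuit is not modeled here.

def ring_oscillator_freq(n_stages, r_eff, c_load):
    return 1.0 / (2.0 * n_stages * r_eff * c_load)
```

    Doubling the load capacitance halves the modeled frequency, reproducing the qualitative trend reported (frequency falls as humidity, and hence capacitance, rises).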

  2. Kinematics of velocity and vorticity correlations in turbulent flow

    NASA Technical Reports Server (NTRS)

    Bernard, P. S.

    1983-01-01

    The kinematic problem of calculating second-order velocity moments from given values of the vorticity covariance is examined. Integral representation formulas for second-order velocity moments in terms of the two-point vorticity correlation tensor are derived. The special relationships existing between velocity moments in isotropic turbulence are expressed in terms of the integral formulas yielding several kinematic constraints on the two-point vorticity correlation tensor in isotropic turbulence. Numerical evaluation of these constraints suggests that a Gaussian curve may be the only form of the longitudinal velocity correlation coefficient which is consistent with the requirement of isotropy. It is shown that if this is the case, then a family of exact solutions to the decay of isotropic turbulence may be obtained which contains Batchelor's final period solution as a special case. In addition, the computed results suggest a method of approximating the integral representation formulas in general turbulent shear flows.

  3. Decadal trends in global pelagic ocean chlorophyll: A new assessment integrating multiple satellites, in situ data, and models.

    PubMed

    Gregg, Watson W; Rousseaux, Cécile S

    2014-09-01

    Quantifying change in ocean biology using satellites is a major scientific objective. We document trends globally for the period 1998-2012 by integrating three diverse methodologies: ocean color data from multiple satellites, bias correction methods based on in situ data, and data assimilation to provide a consistent and complete global representation free of sampling biases. The results indicated no significant trend in global pelagic ocean chlorophyll over the 15-year data record. These results were consistent with previous findings that were based on the first 6 years and first 10 years of the SeaWiFS mission. However, all of the Northern Hemisphere basins (north of 10° latitude), as well as the Equatorial Indian basin, exhibited significant declines in chlorophyll. Trend maps showed the local trends and their change in percent per year. These trend maps were compared with several previous efforts using only a single sensor (SeaWiFS) and more limited time series, showing remarkable consistency. These results suggest that the present effort provides a path forward to quantifying global ocean trends using multiple satellite missions, which is essential if we are to understand the state, variability, and possible changes in the global oceans over longer time scales.
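
    A "percent per year" trend of the kind mapped here reduces, at each location, to an ordinary least-squares slope normalized by the series mean. The sketch below shows only that reduction, not the paper's bias-correction or assimilation machinery.

```python
# Sketch: express a chlorophyll time series' linear trend as percent of
# the series mean per year (OLS slope / mean * 100). Illustrative only.

def trend_percent_per_year(years, values):
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, values))
    slope = sxy / sxx                 # value units per year
    return 100.0 * slope / my         # percent of mean per year
```

    A series rising by 0.1 units per year around a mean of 10.2 maps to a trend of roughly +1% per year.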

  4. Design and analysis of optical waveguide elements in planar geometry

    NASA Astrophysics Data System (ADS)

    Mirkov, Mirko Georgiev

    1998-10-01

    This dissertation presents the theoretical analysis and practical design considerations for planar optical waveguide devices. The analysis takes into account both transverse dimensions of the waveguides and is based on the supermode theory combined with the resonance method for determination of the propagation constants and field profiles of the supermodes. An improved accuracy has been achieved by including the corrections due to the fields in the corner regions of the waveguides using perturbation theory. The following two classes of devices have been analyzed in detail. Curved rectangular waveguides are a common element in an integrated optics circuit. The theoretical analysis in this work shows that some commonly used approximations for determination of the propagation constants of the quasi-modes of the bent waveguides are not necessary. Specifically, the imaginary part of the mode propagation constant, which determines the power loss, is calculated exactly using the resonance method, combined with a two-dimensional optimization routine for determination of the real and imaginary parts of the propagation constants. Subsequently, the results are corrected for the effects of the fields in the corner regions. The latter corrections have not been previously computed and are shown to be significant. Power splitters are another common element of an integrated optical circuit. A new 'bend-free' splitter is suggested and analyzed. The new splitter design consists of only straight parallel channels, which considerably simplifies both the analysis and the fabrication of the device. It is shown that a single design parameter determines the power splitting ratio, which can take any given value. The intrinsic power loss in the proposed splitter is minimal, which makes it an attractive alternative to conventional Y-splitters. 
The accurate methods of analysis of planar optical waveguides developed in the present work can easily be applied to other integrated optic devices consisting of rectangular waveguides.

  5. On-the-fly Locata/inertial navigation system integration for precise maritime application

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Li, Yong; Rizos, Chris

    2013-10-01

    The application of Global Navigation Satellite System (GNSS) technology has meant that marine navigators have greater access to a more consistent and accurate positioning capability than ever before. However, GNSS may not be able to meet all emerging navigation performance requirements for maritime applications with respect to service robustness, accuracy, integrity and availability. In particular, applications in port areas (for example automated docking) and in constricted waterways, have very stringent performance requirements. Even when an integrated inertial navigation system (INS)/GNSS device is used there may still be performance gaps. GNSS signals are easily blocked or interfered with, and sometimes the satellite geometry may not be good enough for high accuracy and high reliability applications. Furthermore, the INS accuracy degrades rapidly during GNSS outages. This paper investigates the use of a portable ground-based positioning system, known as ‘Locata’, which was integrated with an INS, to provide accurate navigation in a marine environment without reliance on GNSS signals. An ‘on-the-fly’ Locata resolution algorithm that takes advantage of geometry change via an extended Kalman filter is proposed in this paper. Single-differenced Locata carrier phase measurements are utilized to achieve accurate and reliable solutions. A ‘loosely coupled’ decentralized Locata/INS integration architecture based on the Kalman filter is used for data processing. In order to evaluate the system performance, a field trial was conducted on Sydney Harbour. A Locata network consisting of eight Locata transmitters was set up near the Sydney Harbour Bridge. The experiment demonstrated that the Locata on-the-fly (OTF) algorithm is effective and can improve the system accuracy in comparison with the conventional ‘known point initialization’ (KPI) method. 
After the OTF and KPI comparison, the OTF Locata/INS integration is then assessed further and its performance improvement on both stand-alone OTF Locata and INS is shown. The Locata/INS integration can achieve centimetre-level accuracy for position solutions, and centimetre-per-second accuracy for velocity determination.
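
    The "loosely coupled" Kalman-filter integration described above can be illustrated in one dimension, with a Locata position fix correcting a drifting INS position estimate. All noise settings below are illustrative assumptions, not the trial's filter tuning.

```python
# One-dimensional sketch of loosely coupled integration: a scalar Kalman
# filter predicts the INS position-error variance forward, then corrects
# the position with an external (Locata-style) fix. Illustrative values.

def kf_fuse(ins_pos, locata_pos, p, q, r):
    """One predict/update cycle on position.

    p: current error variance, q: process noise (INS drift per step),
    r: measurement noise. Returns (fused position, updated variance)."""
    p = p + q                         # predict: INS error grows with time
    k = p / (p + r)                   # Kalman gain
    fused = ins_pos + k * (locata_pos - ins_pos)
    p = (1.0 - k) * p                 # update: variance shrinks
    return fused, p
```

    The fused estimate always lies between the INS and Locata positions, weighted toward whichever source currently has the smaller variance; repeating the cycle keeps the INS drift bounded.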

  6. A discontinuous Galerkin method for the shallow water equations in spherical triangular coordinates

    NASA Astrophysics Data System (ADS)

    Läuter, Matthias; Giraldo, Francis X.; Handorf, Dörthe; Dethloff, Klaus

    2008-12-01

    A global model of the atmosphere is presented, governed by the shallow water equations and discretized by a Runge-Kutta discontinuous Galerkin method on an unstructured triangular grid. The shallow water equations on the sphere, a two-dimensional surface in R3, are locally represented in terms of spherical triangular coordinates, the appropriate local coordinate mappings on triangles. On every triangular grid element, this leads to a two-dimensional representation of tangential momentum and therefore only two discrete momentum equations. The discontinuous Galerkin method consists of an integral formulation which requires both area (element) and line (element face) integrals. Here, we use a Rusanov numerical flux to resolve the discontinuous fluxes at the element faces. A strong stability-preserving third-order Runge-Kutta method is applied for the time discretization. The polynomial space of order k on each curved triangle of the grid is characterized by a Lagrange basis and requires high-order quadrature rules for the integration over elements and element faces. For the presented method no mass matrix inversion is necessary, except in a preprocessing step. The validation of the atmospheric model has been done considering standard tests from Williamson et al. [D.L. Williamson, J.B. Drake, J.J. Hack, R. Jakob, P.N. Swarztrauber, A standard test set for numerical approximations to the shallow water equations in spherical geometry, J. Comput. Phys. 102 (1992) 211-224], unsteady analytical solutions of the nonlinear shallow water equations and a barotropic instability caused by an initial perturbation of a jet stream. A convergence rate of O(Δx) was observed in the model experiments. Furthermore, a numerical experiment is presented for which the third-order time-integration method limits the model error. Thus, the time step Δt is restricted by both the CFL condition and accuracy demands. 
Conservation of mass was shown up to machine precision and energy conservation converges for both increasing grid resolution and increasing polynomial order k.

  7. A Method for Extracting Important Segments from Documents Using Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Suzuki, Daisuke; Utsumi, Akira

    In this paper we propose an extraction-based method for automatic summarization. The proposed method consists of two processes: important segment extraction and sentence compaction. The process of important segment extraction classifies each segment in a document as important or not by Support Vector Machines (SVMs). The process of sentence compaction then determines grammatically appropriate portions of a sentence for a summary according to its dependency structure and the classification result by SVMs. To test the performance of our method, we conducted an evaluation experiment using the Text Summarization Challenge (TSC-1) corpus of human-prepared summaries. The result was that our method achieved better performance than a segment-extraction-only method and the Lead method, especially for sentences only a part of which was included in human summaries. Further analysis of the experimental results suggests that a hybrid method that integrates sentence extraction with segment extraction may generate better summaries.

  8. Analysis and interpretation of geophysical surveys in archaeological sites employing different integrated approach.

    NASA Astrophysics Data System (ADS)

    Piro, Salvatore; Papale, Enrico; Kucukdemirci, Melda; Zamuner, Daniela

    2017-04-01

    Non-destructive ground-surface geophysical prospecting methods are frequently used for the investigation of archaeological sites, where a detailed physical and geometrical reconstruction of hidden volumes is required prior to any excavation work. Each method measures the variation of a single physical parameter, so no method used alone can completely locate and characterize anomalous bodies. The probability of a successful result rapidly increases if a multimethodological approach is adopted, following the logic of objective complementarity of information and of global convergence toward high-quality multiparametric imaging of the buried structures. Representing the static configuration of bodies in the subsoil, and the space-time evolution of the interaction between targets and host materials, must be considered fundamental elements of primary knowledge in archaeological prospecting. The main effort in geophysical prospecting for archaeology is therefore the integration of different, absolutely non-invasive techniques, especially when managed with a view to an ultra-high-resolution three-dimensional (3D) tomographic representation. Following this approach, we have integrated geophysical methods that measure variations of a potential field (gradiometric methods) with active methods that measure variations of physical properties due to a body's geometry and volume (GPR and ERT). In this work, the results obtained during the surveys of three archaeological sites, employing Ground Penetrating Radar (GPR), Electrical Resistivity Tomography (ERT) and Fluxgate Differential Magnetic (FDM) methods to obtain precise and detailed maps of subsurface bodies, are presented and discussed. 
The first site, situated in a suburban area between Itri and Fondi in the Aurunci Natural Regional Park (central Italy), is characterized by remains of past human activity dating from the third century B.C. The second site, also in a suburban area, is part of the acropolis of the ancient Etruscan town of Cerveteri (central Italy). The third site is part of the Aizanoi archaeological park (Cavdarhisar, Kutahya, Turkey). To better understand the subsurface, we applied different integrated approaches to these data, fusing the data from all of the employed methods to obtain a complete visualization of the investigated area. For the processing we used the following techniques: graphical integration (overlay and RGB colour composite), discrete data analysis (binary data analysis and cluster analysis) and continuous data analysis (data sum, product, max, min and PCA).

  9. A new heterogeneous asynchronous explicit-implicit time integrator for nonsmooth dynamics

    NASA Astrophysics Data System (ADS)

    Fekak, Fatima-Ezzahra; Brun, Michael; Gravouil, Anthony; Depale, Bruno

    2017-07-01

    In computational structural dynamics, particularly in the presence of nonsmooth behavior, the choice of the time step and the time integrator has a critical impact on the feasibility of the simulation. Furthermore, in some cases, as for a bridge crane under seismic loading, multiple time scales coexist in the same problem, making multi-time-scale methods suitable. Here, we propose a new explicit-implicit heterogeneous asynchronous time integrator (HATI) for nonsmooth transient dynamics with frictionless unilateral contacts and impacts. We also present a new explicit time integrator for contact/impact problems in which the contact constraints are enforced by a Lagrange multiplier method. In other words, the aim of this paper is to use an explicit time integrator with a fine time scale in the contact area to reproduce high-frequency phenomena, while an implicit time integrator is adopted in the other parts to reproduce the lower-frequency response and to optimize the CPU time. In a first step, the explicit time integrator is tested on a one-dimensional example and compared to Moreau-Jean's event-capturing schemes. The explicit algorithm is found to be very accurate; it generally has a higher order of convergence than Moreau-Jean's schemes and also exhibits excellent energy behavior. Then, the two-time-scale explicit-implicit HATI is applied to the numerical example of a bridge crane under seismic loading. The results are validated against a fine-scale, fully explicit computation. The energy dissipated at the implicit-explicit interface is well controlled, and the computational time is lower than that of a fully explicit simulation.
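The explicit/implicit trade-off the abstract describes can be illustrated on a single-degree-of-freedom oscillator. This is a generic sketch of the two integrator families (explicit central difference and implicit average-acceleration Newmark), not the authors' HATI scheme; the mass, stiffness and step sizes are hypothetical.

```python
import numpy as np

# Single-DOF oscillator m*a + k*u = 0 with natural period T = 1
m, k = 1.0, (2 * np.pi) ** 2

def explicit_cd(u0, v0, dt, nsteps):
    """Explicit central difference (velocity-Verlet form); conditionally stable."""
    u, v = u0, v0
    a = -k / m * u
    for _ in range(nsteps):
        v_half = v + 0.5 * dt * a
        u = u + dt * v_half
        a = -k / m * u
        v = v_half + 0.5 * dt * a
    return u, v

def implicit_newmark(u0, v0, dt, nsteps, beta=0.25, gamma=0.5):
    """Implicit Newmark (average acceleration); unconditionally stable."""
    u, v = u0, v0
    a = -k / m * u
    for _ in range(nsteps):
        # Solve (m + beta*dt^2*k) a_new = -k*(u + dt*v + dt^2*(0.5-beta)*a)
        a_new = -k * (u + dt * v + dt ** 2 * (0.5 - beta) * a) / (m + beta * dt ** 2 * k)
        u = u + dt * v + dt ** 2 * ((0.5 - beta) * a + beta * a_new)
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        a = a_new
    return u, v

# One full period: fine explicit steps vs. 100x coarser implicit steps
ue, _ = explicit_cd(1.0, 0.0, 1e-4, 10000)
ui, _ = implicit_newmark(1.0, 0.0, 1e-2, 100)
```

Both return the displacement close to its initial value after one period; the point of a HATI-style coupling is to pay the fine explicit step only in the subdomain that needs it.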

  10. How credible are the study results? Evaluating and applying internal validity tools to literature-based assessments of environmental health hazards

    PubMed Central

    Rooney, Andrew A.; Cooper, Glinda S.; Jahnke, Gloria D.; Lam, Juleen; Morgan, Rebecca L.; Boyles, Abee L.; Ratcliffe, Jennifer M.; Kraft, Andrew D.; Schünemann, Holger J.; Schwingl, Pamela; Walker, Teneille D.; Thayer, Kristina A.; Lunn, Ruth M.

    2016-01-01

    Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or “risk of bias” — the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program’s (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups has been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or “domains” for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. PMID:26857180

  11. How credible are the study results? Evaluating and applying internal validity tools to literature-based assessments of environmental health hazards.

    PubMed

    Rooney, Andrew A; Cooper, Glinda S; Jahnke, Gloria D; Lam, Juleen; Morgan, Rebecca L; Boyles, Abee L; Ratcliffe, Jennifer M; Kraft, Andrew D; Schünemann, Holger J; Schwingl, Pamela; Walker, Teneille D; Thayer, Kristina A; Lunn, Ruth M

    2016-01-01

    Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or "risk of bias" - the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program's (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups has been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or "domains" for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. Published by Elsevier Ltd.

  12. Data-driven adaptive fractional order PI control for PMSM servo system with measurement noise and data dropouts.

    PubMed

    Xie, Yuanlong; Tang, Xiaoqi; Song, Bao; Zhou, Xiangdong; Guo, Yixuan

    2018-04-01

    In this paper, data-driven adaptive fractional order proportional integral (AFOPI) control is presented for a permanent magnet synchronous motor (PMSM) servo system perturbed by measurement noise and data dropouts. The proposed method directly exploits the closed-loop process data for the AFOPI controller design under unknown noise distribution and data missing probability. Firstly, the proposed method formulates the AFOPI controller tuning problem as a parameter identification problem using modified lp-norm virtual reference feedback tuning (VRFT). Then, iteratively reweighted least squares is integrated into the lp-norm VRFT to give a consistent compensation solution for the AFOPI controller. The measurement noise and data dropouts are estimated and eliminated by periodic feedback compensation, so that the AFOPI controller is updated online to accommodate time-varying operating conditions. Moreover, convergence and stability are guaranteed by mathematical analysis. Finally, the effectiveness of the proposed method is demonstrated in both simulations and experiments on a practical PMSM servo system. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
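A fractional-order PI controller of the form u = Kp·e + Ki·D^(-λ)e can be sketched with a Grünwald-Letnikov approximation of the fractional integral. This is a generic illustration of the controller structure only, not the paper's VRFT-based tuning; the gains and sampling time are hypothetical.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov binomial weights w_j = (-1)^j * C(alpha, j), recursively."""
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1 - (alpha + 1) / j)
    return w

def fopi(e, Kp, Ki, lam, Ts):
    """Apply a PI^lambda controller to an error sequence e sampled at Ts."""
    n = len(e)
    w = gl_weights(-lam, n)            # negative order => fractional integral
    frac_int = Ts ** lam * np.array(
        [np.dot(w[: k + 1], e[k::-1]) for k in range(n)]
    )
    return Kp * e + Ki * frac_int

# Sanity check: lam = 1 must reduce to an ordinary (rectangular) integral,
# so a unit error held for 100 steps of Ts = 0.01 integrates to 1.0.
Ts = 0.01
e = np.ones(100)
u = fopi(e, Kp=0.0, Ki=1.0, lam=1.0, Ts=Ts)
```

For 0 < λ < 1 the same weights give the long-memory response that distinguishes a FOPI from an ordinary PI.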

  13. Integrating habitat status, human population pressure, and protection status into biodiversity conservation priority setting

    USGS Publications Warehouse

    Shi, Hua; Singh, Ashbindu; Kant, S.; Zhu, Zhiliang; Waller, E.

    2005-01-01

    Priority setting is an essential component of biodiversity conservation. Existing methods to identify priority areas for conservation have focused almost entirely on biological factors. We suggest a new relative ranking method for identifying priority conservation areas that integrates both biological and social aspects. It is based on the following criteria: the habitat's status, human population pressure, human efforts to protect habitat, and number of endemic plant and vertebrate species. We used this method to rank 25 hotspots, 17 megadiverse countries, and the hotspots within each megadiverse country. We used consistent, comprehensive, georeferenced, and multiband data sets and analytical remote sensing and geographic information system tools to quantify habitat status, human population pressure, and protection status. The ranking suggests that the Philippines, Atlantic Forest, Mediterranean Basin, Caribbean Islands, Caucasus, and Indo-Burma are the hottest hotspots and that China, the Philippines, and India are the hottest megadiverse countries. The great variation in terms of habitat, protected areas, and population pressure among the hotspots, the megadiverse countries, and the hotspots within the same country suggests the need for hotspot- and country-specific conservation policies.
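A relative ranking that integrates normalized biological and social criteria can be sketched as a min-max scoring exercise. The regions and criterion values below are entirely hypothetical, and the equal weighting is an assumption, not the paper's method.

```python
import numpy as np

criteria = {
    # hypothetical values per region:
    # [habitat_loss_fraction, pop_density, pct_protected, endemic_species]
    "A": [0.80, 300.0, 5.0, 120],
    "B": [0.55, 120.0, 12.0, 400],
    "C": [0.90, 650.0, 2.0, 90],
}
names = list(criteria)
X = np.array([criteria[n] for n in names], dtype=float)

# Orient criteria so larger = higher priority: flip protection, since
# LESS protected habitat raises conservation urgency.
X[:, 2] = -X[:, 2]

# Min-max normalize each criterion, then average into a single score
norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
score = norm.mean(axis=1)
ranking = [names[i] for i in np.argsort(-score)]
```

With these stand-in numbers, region C (most habitat loss, most pressure, least protection) ranks "hottest" despite the fewest endemics, showing how the social criteria shift a purely biological ordering.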

  14. A conservative, thermodynamically consistent numerical approach for low Mach number combustion. Part I: Single-level integration

    NASA Astrophysics Data System (ADS)

    Nonaka, Andrew; Day, Marcus S.; Bell, John B.

    2018-01-01

    We present a numerical approach for low Mach number combustion that conserves both mass and energy while remaining on the equation of state to a desired tolerance. We present both unconfined and confined cases, where in the latter the ambient pressure changes over time. Our overall scheme is a projection method for the velocity coupled to a multi-implicit spectral deferred corrections (SDC) approach to integrate the mass and energy equations. The iterative nature of SDC methods allows us to incorporate a series of pressure discrepancy corrections naturally that lead to additional mass and energy influx/outflux in each finite volume cell in order to satisfy the equation of state. The method is second order, and satisfies the equation of state to a desired tolerance with increasing iterations. Motivated by experimental results, we test our algorithm on hydrogen flames with detailed kinetics. We examine the morphology of thermodiffusively unstable cylindrical premixed flames in high-pressure environments for confined and unconfined cases. We also demonstrate that our algorithm maintains the equation of state for premixed methane flames and non-premixed dimethyl ether jet flames.
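The iterative character of SDC, in which each sweep corrects a provisional solution using a quadrature of the previous iterate, can be shown on a scalar ODE. This is a minimal sketch with three equispaced nodes and explicit Euler sweeps, far simpler than the paper's multi-implicit scheme; the step size and coefficient are hypothetical.

```python
import numpy as np

# SDC sketch for y' = lam*y over one step h, nodes t = 0, h/2, h.
lam, h = -1.0, 0.1
s = h / 2
f = lambda y: lam * y

# Integrals of the quadratic interpolant over the two subintervals
# (standard closed Newton-Cotes-type weights for three equispaced nodes)
S = np.array([[ 5.0, 8.0, -1.0],
              [-1.0, 8.0,  5.0]]) * s / 12.0

y0 = 1.0
y = np.full(3, y0)                 # provisional solution: constant copy of y0
for _ in range(4):                 # each sweep raises the formal order by one
    F = f(y)
    y_new = np.empty(3)
    y_new[0] = y0
    for m in range(2):
        quad = S[m] @ F            # quadrature of the previous iterate
        y_new[m + 1] = y_new[m] + s * (f(y_new[m]) - F[m]) + quad
    y = y_new

exact = y0 * np.exp(lam * h)       # after 4 sweeps y[2] is near the exact value
```

The converged sweeps approach the underlying collocation solution, which is the property the paper exploits to drive the state back onto the equation of state with increasing iterations.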

  15. The Decision Making Trial and Evaluation Laboratory (Dematel) and Analytic Network Process (ANP) for Safety Management System Evaluation Performance

    NASA Astrophysics Data System (ADS)

    Rolita, Lisa; Surarso, Bayu; Gernowo, Rahmat

    2018-02-01

    In order to improve airport safety management system (SMS) performance, an evaluation system is required to remedy current shortcomings and maximize safety. This study proposes integrating the DEMATEL and ANP methods in the decision-making process, analyzing the causal relations between the relevant criteria to support effective, analysis-based decisions. The DEMATEL method complements the ANP method by identifying the interdependencies between criteria. The input consists of questionnaire data collected online and stored in an online database; these data are then processed with the DEMATEL and ANP methods to determine the relationships between criteria and the criteria that need to be evaluated. The case studies for this evaluation system were Adi Sutjipto International Airport, Yogyakarta (JOG); Ahmad Yani International Airport, Semarang (SRG); and Adi Sumarmo International Airport, Surakarta (SOC). The integration grades the SMS performance criterion weights in descending order as follows: safety and destination policy, safety risk management, healthcare, and safety awareness. Sturges' formula classified the results into nine grades: JOG and SRG airports were in grade 8, while SOC airport was in grade 7.
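The DEMATEL step can be sketched numerically: a direct-influence matrix is normalized, the total-relation matrix is computed as T = D(I - D)^(-1), and row/column sums separate "cause" criteria from "effect" criteria. The 4×4 influence scores below are hypothetical stand-ins for the questionnaire data.

```python
import numpy as np

# Hypothetical expert scores of direct influence among 4 SMS criteria
A = np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [1, 1, 2, 0]], dtype=float)

D = A / A.sum(axis=1).max()            # normalize by the largest row sum
T = D @ np.linalg.inv(np.eye(4) - D)   # total-relation matrix T = D(I-D)^-1

R = T.sum(axis=1)                      # influence each criterion gives
C = T.sum(axis=0)                      # influence each criterion receives
prominence = R + C                     # overall importance of the criterion
relation = R - C                       # > 0: net cause, < 0: net effect
```

The prominence values are what an ANP-style weighting can then refine; the relation values give the cause/effect split DEMATEL is used for.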

  16. Extracting DNA words based on the sequence features: non-uniform distribution and integrity.

    PubMed

    Li, Zhi; Cao, Hongyan; Cui, Yuehua; Zhang, Yanbo

    2016-01-25

    A DNA sequence can be viewed as an unknown language with words as its functional units. Given that most sequence alignment algorithms, such as motif discovery algorithms, depend on the quality of background information about sequences, it is necessary to develop an ab initio algorithm for extracting the "words" based only on the DNA sequences. We considered non-uniform distribution and integrity to be two important features of a word, based on which we developed an ab initio algorithm to extract "DNA words" that have potential functional meaning. A Kolmogorov-Smirnov test was used to assess whether word occurrences were consistent with a uniform distribution along the DNA sequences, and integrity was judged by sequence and position alignment. Two random base sequences were adopted as a negative control, and an English book was used as a positive control to verify our algorithm. We applied our algorithm to the genomes of Saccharomyces cerevisiae and 10 strains of Escherichia coli to show the utility of the methods. The results provide strong evidence that the algorithm is a promising tool for ab initio construction of a DNA dictionary. Our method provides a fast way to screen important DNA elements at large scale and offers potential insights into the understanding of a genome.
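The uniformity check can be sketched with a one-sample Kolmogorov-Smirnov statistic computed directly from word occurrence positions. The sequences below are random stand-ins, not the genomes analyzed in the paper; a strongly clustered "word" yields a large KS distance, a scattered one does not.

```python
import numpy as np

def ks_uniform(positions, length):
    """One-sample KS distance between word positions and the uniform CDF."""
    x = np.sort(np.asarray(positions, dtype=float)) / length  # rescale to [0, 1]
    n = len(x)
    cdf_hi = np.arange(1, n + 1) / n         # empirical CDF just after each point
    cdf_lo = np.arange(0, n) / n             # empirical CDF just before each point
    return max(np.max(cdf_hi - x), np.max(x - cdf_lo))

rng = np.random.default_rng(1)
seq_len = 10_000
uniform_hits = rng.integers(0, seq_len, 200)     # word scattered over the sequence
clustered_hits = rng.integers(0, 1_000, 200)     # word confined to the first 10%

d_uni = ks_uniform(uniform_hits, seq_len)
d_clu = ks_uniform(clustered_hits, seq_len)
# Approximate 5% critical value for n = 200 is 1.36 / sqrt(200) ~ 0.096:
# the clustered word far exceeds it, the scattered word typically does not.
```

Words whose KS distance exceeds the critical value are the non-uniformly distributed candidates the algorithm would retain for the integrity check.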

  17. A Fast Solver for Implicit Integration of the Vlasov--Poisson System in the Eulerian Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garrett, C. Kristopher; Hauck, Cory D.

    In this paper, we present a domain decomposition algorithm to accelerate the solution of Eulerian-type discretizations of the linear, steady-state Vlasov equation. The steady-state solver then forms a key component in the implementation of fully implicit or nearly fully implicit temporal integrators for the nonlinear Vlasov--Poisson system. The solver relies on a particular decomposition of phase space that enables the use of sweeping techniques commonly used in radiation transport applications. The original linear system for the phase space unknowns is then replaced by a smaller linear system involving only unknowns on the boundary between subdomains, which can then be solved efficiently with Krylov methods such as GMRES. Steady-state solves are combined to form an implicit Runge--Kutta time integrator, and the Vlasov equation is coupled self-consistently to the Poisson equation via a linearized procedure or a nonlinear fixed-point method for the electric field. Finally, numerical results for standard test problems demonstrate the efficiency of the domain decomposition approach when compared to the direct application of an iterative solver to the original linear system.

  18. Delimiting species of Protaphorura (Collembola: Onychiuridae): integrative evidence based on morphology, DNA sequences and geography.

    PubMed

    Sun, Xin; Zhang, Feng; Ding, Yinhuan; Davies, Thomas W; Li, Yu; Wu, Donghui

    2017-08-15

    Species delimitation remains a significant challenge when diagnostic morphological characters are limited. Integrative taxonomy was applied to the genus Protaphorura (Collembola: Onychiuridae), one of the most taxonomically difficult groups of soil animals. Three delimitation approaches (morphology, molecular markers and geography) were applied, providing rigorous species validation criteria with an acceptably low error rate. Multiple molecular approaches, including distance- and evolutionary model-based methods, were used to determine species boundaries based on 144 standard barcode sequences. Twenty-two putative molecular species were consistently recovered across the molecular and geographical analyses. Geographic criteria proved to be an efficient delimitation method for onychiurids. Further morphological examination, based on the combination of the number of pseudocelli, parapseudocelli and ventral mesothoracic chaetae, confirmed 18 of the 22 molecular units as taxa, with six of them described as new species. These characters were found to be of high taxonomic value. This study highlights the potential benefits of integrative taxonomy, particularly the simultaneous use of molecular and geographical tools, as a powerful way of ascertaining the true diversity of the Onychiuridae. Our study also highlights that discovering new morphological characters remains central to achieving a full understanding of collembolan taxonomy.

  19. Strategy for signaling molecule detection by using an integrated microfluidic device coupled with mass spectrometry to study cell-to-cell communication.

    PubMed

    Mao, Sifeng; Zhang, Jie; Li, Haifang; Lin, Jin-Ming

    2013-01-15

    Cell-to-cell communication is a very important physiological behavior in living organisms, and many physiological processes depend on it. Although cell-to-cell communication attracts much attention and financial support, few methods have been successfully developed for studying it in vitro. In this work, we developed a novel method for studying cell-to-cell communication on an integrated microdevice, in which signaling molecules and metabolites were detected online by an electrospray ionization-quadrupole-time-of-flight mass spectrometer (ESI-Q-TOF-MS) after on-chip solid-phase extraction. Moreover, we introduced a "surface tension plug" on the microchip to control cell-to-cell communication. The microdevice consists of three functional sections: a cell coculture channel, a target pretreatment section, and a target detection section. To verify the feasibility of cell-to-cell communication studies on the integrated microdevice, we studied the communication between 293 and L-02 cells. Epinephrine and glucose were successfully detected using the ESI-Q-TOF-MS with a short analysis time (<10 min). The results demonstrate that the developed microfluidic device is a potentially useful tool for high-throughput cell-to-cell communication studies.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
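The constraint-based modeling underlying such workflows can be sketched as a small flux balance analysis (FBA) problem: maximize an exchange flux subject to the steady-state constraint S·v = 0 and capacity bounds. The toy three-metabolite network below is hypothetical and uses SciPy's linear programming routine rather than the Matlab COBRA/MetaboTools code.

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix: rows = metabolites A, B, C;
# columns = reactions uptake_A, A->B, B->C, secrete_C (hypothetical chain).
S = np.array([[ 1, -1,  0,  0],
              [ 0,  1, -1,  0],
              [ 0,  0,  1, -1]], dtype=float)

# Capacity bounds: uptake limited to 10, the A->B enzyme capped at 5
bounds = [(0, 10), (0, 5), (0, None), (0, None)]

# Maximize the secretion flux v[3]: linprog minimizes, so negate it
res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=np.zeros(3), bounds=bounds)
v_opt = res.x   # the internal cap of 5 becomes the bottleneck for the chain
```

Measured extracellular metabolite changes enter such a model exactly through the bounds on the exchange (uptake/secretion) reactions, which is the integration step the protocol formalizes.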

  1. Exact solutions for the static bending of Euler-Bernoulli beams using Eringen's two-phase local/nonlocal model

    NASA Astrophysics Data System (ADS)

    Wang, Y. B.; Zhu, X. W.; Dai, H. H.

    2016-08-01

    Though widely used in modelling nano- and micro-structures, Eringen's differential model shows some inconsistencies, and recent studies have demonstrated its differences from the integral model, which implies the necessity of using the latter. In this paper, an analytical study of the static bending of nonlocal Euler-Bernoulli beams is undertaken using Eringen's two-phase local/nonlocal model. Firstly, a reduction method is proved rigorously, with which the integral equation in consideration can be reduced to a differential equation with mixed boundary value conditions. Then, the static bending problem is formulated and four types of boundary conditions with various loadings are considered. By solving the corresponding differential equations, exact solutions are obtained explicitly in all of the cases, especially for the paradoxical cantilever beam problem. Finally, asymptotic analysis of the exact solutions reveals clearly that, unlike the differential model, the integral model adopted herein has a consistent softening effect. Comparisons are also made with existing analytical and numerical results, which further show the advantages of the analytical results obtained. Additionally, it seems that the once controversial nonlocal bar problem in the literature is well resolved by the reduction method.
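For reference, Eringen's two-phase local/nonlocal constitutive relation, as it is commonly written for a bar of length L (the beam version replaces stress and strain by bending moment and curvature), is:

```latex
\sigma(x) = E\left[\xi_1\,\varepsilon(x)
  + \xi_2 \int_0^L K(|x-s|)\,\varepsilon(s)\,\mathrm{d}s\right],
\qquad
K(|x|) = \frac{1}{2\kappa}\,e^{-|x|/\kappa},
\qquad \xi_1 + \xi_2 = 1,
```

where ξ1 is the local volume fraction and κ the nonlocal length scale; the limit ξ1 → 0 recovers the purely integral model, and it is the exponential kernel that permits the reduction of the integral equation to a differential one with mixed boundary conditions.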

  2. INTERPRETING FLUX FROM BROADBAND PHOTOMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Peter J.; Breeveld, Alice; Roming, Peter W. A.

    2016-10-01

    We discuss the transformation of observed photometry into flux for the creation of spectral energy distributions (SEDs) and the computation of bolometric luminosities. We do this in the context of supernova studies, particularly as observed with the Swift spacecraft, but the concepts and techniques should be applicable to many other types of sources and wavelength regimes. Traditional methods of converting observed magnitudes to flux densities are not very accurate when applied to UV photometry. Common methods for extinction correction and the integration of pseudo-bolometric fluxes can also lead to inaccurate results. The sources of inaccuracy, though, also apply to other wavelengths. Because of the complicated nature of translating broadband photometry into monochromatic flux densities, comparison between observed photometry and a spectroscopic model is best done by forward modeling the spectrum into the count rates or magnitudes of the observations. We recommend that integrated flux measurements be made using a spectrum or SED which is consistent with the multi-band photometry rather than converting individual photometric measurements to flux densities, linearly interpolating between the points, and integrating. We also highlight some specific areas where the UV flux can be mischaracterized.
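The recommendation to forward-model rather than invert photometry can be illustrated with a toy bandpass: two different spectra can produce an identical band-integrated flux, so a single flux-density point at an effective wavelength is ambiguous. The Gaussian throughput and both spectra below are hypothetical.

```python
import numpy as np

# Hypothetical Gaussian bandpass centred at 2500 A with sigma = 250 A
wl = np.linspace(1500.0, 3500.0, 2001)              # wavelength grid [Angstrom]
band = np.exp(-0.5 * ((wl - 2500.0) / 250.0) ** 2)  # throughput (arbitrary units)

flat = np.ones_like(wl)                 # flat F_lambda spectrum
sloped = 2.0 - wl / 2500.0              # declining F_lambda spectrum

def band_avg(f_lambda):
    """Bandpass-weighted mean flux density (discrete sum on a uniform grid)."""
    return float((f_lambda * band).sum() / band.sum())

flat_val = band_avg(flat)
sloped_val = band_avg(sloped)
# The two spectra differ everywhere except at 2500 A, yet (by symmetry)
# give the same band-averaged flux: one photometric point cannot tell
# them apart, so the photometry must be compared to a forward-modeled SED.
```

This degeneracy is exactly why integrating flux densities interpolated between bands can mischaracterize a UV-steep supernova spectrum.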

  3. A Fast Solver for Implicit Integration of the Vlasov--Poisson System in the Eulerian Framework

    DOE PAGES

    Garrett, C. Kristopher; Hauck, Cory D.

    2018-04-05

    In this paper, we present a domain decomposition algorithm to accelerate the solution of Eulerian-type discretizations of the linear, steady-state Vlasov equation. The steady-state solver then forms a key component in the implementation of fully implicit or nearly fully implicit temporal integrators for the nonlinear Vlasov--Poisson system. The solver relies on a particular decomposition of phase space that enables the use of sweeping techniques commonly used in radiation transport applications. The original linear system for the phase space unknowns is then replaced by a smaller linear system involving only unknowns on the boundary between subdomains, which can then be solved efficiently with Krylov methods such as GMRES. Steady-state solves are combined to form an implicit Runge--Kutta time integrator, and the Vlasov equation is coupled self-consistently to the Poisson equation via a linearized procedure or a nonlinear fixed-point method for the electric field. Finally, numerical results for standard test problems demonstrate the efficiency of the domain decomposition approach when compared to the direct application of an iterative solver to the original linear system.

  4. Dietary intake assessment using integrated sensors and software

    NASA Astrophysics Data System (ADS)

    Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander

    2012-02-01

    The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations of bias, user burden and cost, and improved methods are therefore needed to address important hypotheses relating diet and health. In this paper, we describe the progress of our mobile Diet Data Recorder System (DDRS), in which an electronic device is used for objective measurement of dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) a database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food descriptions and more accurate food portion sizes. In this paper, we describe the system design of DDRS and initial results of dietary assessment.

  5. A regulation probability model-based meta-analysis of multiple transcriptomics data sets for cancer biomarker identification.

    PubMed

    Xie, Xin-Ping; Xie, Yu-Feng; Wang, Hong-Qiang

    2017-08-23

    Large-scale accumulation of omics data poses a pressing challenge for integrative analysis of multiple data sets in bioinformatics. An open question in such integrative analysis is how to pinpoint consistent but subtle gene activity patterns across studies; study heterogeneity needs to be addressed carefully for this goal. This paper proposes a regulation probability model-based meta-analysis, jGRP, for identifying differentially expressed genes (DEGs). The method integrates multiple transcriptomics data sets in a gene regulatory space instead of a gene expression space, which makes it easy to capture and manage data heterogeneity across studies from different laboratories or platforms. Specifically, we transform gene expression profiles into a unified gene regulation profile across studies by mathematically defining two gene regulation events between two conditions and estimating their occurrence probabilities in a sample. Finally, a novel differential expression statistic is established based on the gene regulation profiles, realizing accurate and flexible identification of DEGs in gene regulation space. We evaluated the proposed method on simulation data and real-world cancer data sets and showed the effectiveness and efficiency of jGRP in DEG identification in the context of meta-analysis. Data heterogeneity largely influences the performance of meta-analysis for DEG identification, and existing meta-analysis methods were revealed to exhibit very different degrees of sensitivity to study heterogeneity. The proposed method, jGRP, can be a standalone tool due to its unified framework and controllable way of dealing with study heterogeneity.
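As a baseline for heterogeneity-sensitive meta-analysis, a classical combination rule such as Stouffer's weighted z-score method (not the paper's jGRP statistic, which works in regulation space) can be sketched in a few lines; the per-study z-scores below are hypothetical.

```python
import math

def stouffer(z_scores, weights=None):
    """Combine per-study z-scores: Z = sum(w_i z_i) / sqrt(sum(w_i^2))."""
    if weights is None:
        weights = [1.0] * len(z_scores)
    num = sum(w * z for w, z in zip(weights, z_scores))
    den = math.sqrt(sum(w * w for w in weights))
    return num / den

def z_to_p(z):
    """Two-sided p-value for a combined z-score, via the complementary erf."""
    return math.erfc(abs(z) / math.sqrt(2))

# One gene measured in three hypothetical studies
z_comb = stouffer([2.1, 1.8, 2.5])
p_comb = z_to_p(z_comb)
```

Because Stouffer's rule pools evidence in expression space, a study from a divergent platform can dominate the sum, which is the heterogeneity problem jGRP's regulation-space transformation is designed to sidestep.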

  6. Protocol for a randomized comparison of integrated versus consecutive dual task practice in Parkinson’s disease: the DUALITY trial

    PubMed Central

    2014-01-01

    Background Multiple tasking is an integral part of daily mobility. Patients with Parkinson’s disease have dual tasking difficulties due to their combined motor and cognitive deficits. Two contrasting physiotherapy interventions have been proposed to alleviate dual tasking difficulties: either to discourage simultaneous execution of dual tasks (consecutive training); or to practice their concurrent use (integrated training). It is currently unclear which of these training methods should be adopted to achieve safe and consolidated dual task performance in daily life. Therefore, the proposed randomized controlled trial will compare the effects of integrated versus consecutive training of dual tasking (tested by combining walking with cognitive exercises). Methods and design One hundred and twenty patients with Parkinson’s disease will be recruited to participate in this multi-centered, single blind, randomized controlled trial. Patients in Hoehn & Yahr stage II-III, with or without freezing of gait, and who report dual task difficulties will be included. All patients will undergo a six-week control period without intervention, after which they will be randomized to integrated or consecutive task practice. Training will consist of standardized walking and cognitive exercises delivered at home four times a week during six weeks. Treatment is guided by a physiotherapist twice a week and consists of two sessions of self-practice using an MP3 player. Blinded testers will assess patients before and after the control period, after the intervention period and after a 12-week follow-up period. The primary outcome measure is dual task gait velocity, i.e. walking combined with a novel untrained cognitive task to evaluate the consolidation of learning. Secondary outcomes include several single and dual task gait and cognitive measures, functional outcomes and a quality of life scale.
Falling will be recorded as a possible adverse event using a weekly phone call for the entire study period. Discussion This randomized study will evaluate the effectiveness and safety of integrated versus consecutive task training in patients with Parkinson’s disease. The study will also highlight whether dual task gait training leads to robust motor learning effects, and whether these can be retained and carried-over to untrained dual tasks and functional mobility. Trial registration Clinicaltrials.gov NCT01375413. PMID:24674594

  7. Data Mining for ISHM of Liquid Rocket Propulsion Status Update

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok; Schwabacher, Mark; Oza, Nijunj; Martin, Rodney; Watson, Richard; Matthews, Bryan

    2006-01-01

    This document consists of presentation slides reviewing the current status of data mining in support of Integrated Systems Health Management (ISHM) for liquid rocket propulsion systems. The aim of this project is to use test stand data from Rocketdyne to design algorithms that will aid in the early detection of impending failures during operation. These methods will be extended and improved for future platforms (i.e., CEV/CLV).

  8. Monitoring and control requirement definition study for Dispersed Storage and Generation (DSG). Volume 2, appendix A: Selected DSG technologies and their general control requirements

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A consistent approach was sought for both hardware and software which will handle the monitoring and control necessary to integrate a number of different DSG technologies into a common distribution dispatch network. It appears that the control of each of the DSG technologies is compatible with a supervisory control method of operation that lends itself to remote control from a distribution dispatch center.

  9. A nanometre-scale electronic switch consisting of a metal cluster and redox-addressable groups.

    PubMed

    Gittins, D I; Bethell, D; Schiffrin, D J; Nichols, R J

    2000-11-02

    So-called bottom-up fabrication methods aim to assemble and integrate molecular components exhibiting specific functions into electronic devices that are orders of magnitude smaller than can be fabricated by lithographic techniques. Fundamental to the success of the bottom-up approach is the ability to control electron transport across molecular components. Organic molecules containing redox centres (chemical species whose oxidation number, and hence electronic structure, can be changed reversibly) support resonant tunnelling and display promising functional behaviour when sandwiched as molecular layers between electrical contacts, but their integration into more complex assemblies remains challenging. For this reason, functionalized metal nanoparticles have attracted much interest: they exhibit single-electron characteristics (such as quantized capacitance charging) and can be organized through simple self-assembly methods into well ordered structures, with the nanoparticles at controlled locations. Here we report scanning tunnelling microscopy measurements showing that organic molecules containing redox centres can be used to attach metal nanoparticles to electrode surfaces and so control the electron transport between them. Our system consists of gold nanoclusters a few nanometres across and functionalized with polymethylene chains that carry a central, reversibly reducible bipyridinium moiety. We expect that the ability to electronically contact metal nanoparticles via redox-active molecules, and to alter profoundly their tunnelling properties by charge injection into these molecules, can form the basis for a range of nanoscale electronic switches.

  10. A Ratiometric Wavelength Measurement Based on a Silicon-on-Insulator Directional Coupler Integrated Device

    PubMed Central

    Wang, Pengfei; Hatta, Agus Muhamad; Zhao, Haoyu; Zheng, Jie; Farrell, Gerald; Brambilla, Gilberto

    2015-01-01

    A ratiometric wavelength measurement based on a Silicon-on-Insulator (SOI) integrated device is proposed and designed, which consists of directional couplers acting as two edge filters with opposite spectral responses. The optimal separation distance between the two parallel silicon waveguides and the interaction length of the directional coupler are designed to meet the desired spectral response by using local supermodes. The wavelength discrimination ability of the designed ratiometric structure is demonstrated numerically using the beam propagation method and then verified experimentally. The experimental results show general agreement with the theoretical models. The ratiometric wavelength system demonstrates a resolution of better than 50 pm at a wavelength around 1550 nm with ease of assembly and calibration. PMID:26343668
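
    The ratiometric principle can be illustrated with an idealized model: two edge filters with opposite linear spectral slopes give a power ratio that is monotonic in wavelength and therefore invertible. The slope and center wavelength below are invented for illustration and are not the device's measured response:

```python
def wavelength_from_ratio(p1, p2, slope=0.02, lam0=1550.0):
    """Invert the power ratio of two idealized edge filters with opposite
    linear spectral responses (slope and lam0 are illustrative values):
        T1 = 0.5 + slope * (lam - lam0),  T2 = 0.5 - slope * (lam - lam0)
    The ratio R = T1 / T2 is monotonic in lam, so
        lam = lam0 + (R - 1) / (2 * slope * (R + 1))."""
    r = p1 / p2
    return lam0 + (r - 1.0) / (2.0 * slope * (r + 1.0))

# Forward model at 1552 nm, then recover the wavelength from the ratio.
lam = 1552.0
t1 = 0.5 + 0.02 * (lam - 1550.0)
t2 = 0.5 - 0.02 * (lam - 1550.0)
print(wavelength_from_ratio(t1, t2))  # → ~1552.0
```

    Because only the ratio of the two detector powers enters, fluctuations in the input optical power cancel out, which is the practical appeal of the ratiometric scheme.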

  11. Committed to kids: an integrated, 4-level team approach to weight management in adolescents.

    PubMed

    Sothern, Melinda S; Schumacher, Heidi; von Almen, T Kristian; Carlisle, Lauren Keely; Udall, John N

    2002-03-01

    The integrated, 4-level approach of Committed to Kids is successful because of several factors: the sessions are designed to entertain the adolescents and promote initial success; the program features parent-training methods in short, interactive, educational sessions; in severely obese adolescents, the diet intervention results in noticeable weight loss that motivates the patient to continue, and the improved exercise tolerance resulting from the weight loss promotes increased physical activity; and the program team provides consistent feedback: patients and their families receive results and updates every 3 months. Most importantly, the program is conducted in groups of families. The adolescent group dynamics and peer modeling are primary components of the successful management of obesity in youth.

  12. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  13. The condition-dependent transcriptional network in Escherichia coli.

    PubMed

    Lemmens, Karen; De Bie, Tijl; Dhollander, Thomas; Monsieurs, Pieter; De Moor, Bart; Collado-Vides, Julio; Engelen, Kristof; Marchal, Kathleen

    2009-03-01

    Thanks to the availability of high-throughput omics data, bioinformatics approaches are able to hypothesize thus-far undocumented genetic interactions. However, due to the amount of noise in these data, inferences based on a single data source are often unreliable. A popular approach to overcome this problem is to integrate different data sources. In this study, we describe DISTILLER, a novel framework for data integration that simultaneously analyzes microarray and motif information to find modules that consist of genes that are co-expressed in a subset of conditions, and their corresponding regulators. By applying our method on publicly available data, we evaluated the condition-specific transcriptional network of Escherichia coli. DISTILLER confirmed 62% of 736 interactions described in RegulonDB, and 278 novel interactions were predicted.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, E.; Floether, F. F.; Cavendish Laboratory, University of Cambridge, J.J. Thomson Avenue, Cambridge CB3 0HE

    Fundamental to integrated photonic quantum computing is an on-chip method for routing and modulating quantum light emission. We demonstrate a hybrid integration platform consisting of arbitrarily designed waveguide circuits and single-photon sources. InAs quantum dots (QD) embedded in GaAs are bonded to a SiON waveguide chip such that the QD emission is coupled to the waveguide mode. The waveguides are SiON core embedded in a SiO{sub 2} cladding. A tuneable Mach Zehnder interferometer (MZI) modulates the emission between two output ports and can act as a path-encoded qubit preparation device. The single-photon nature of the emission was verified using the on-chip MZI as a beamsplitter in a Hanbury Brown and Twiss measurement.

  15. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
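
    The double-counting problem the package addresses can be shown with a toy example: if overlapped panels each carry a fractional weight so that overlapping contributions sum to one, the integral counts the overlap region exactly once. (This weighting scheme is only an illustration of the problem; FOMOCO's MIXSUR module instead builds a non-overlapping hybrid surface grid before integration.)

```python
def integrate_force(panels):
    """Sum pressure * area * weight over surface panels. `weight` is 1.0
    for unique panels and a fraction for panels that overlap another
    grid, so overlapped regions are counted once (illustrative only)."""
    return sum(p["pressure"] * p["area"] * p["weight"] for p in panels)

# Two grids overlap on one panel; each grid contributes half of it.
panels = [
    {"pressure": 2.0, "area": 1.0, "weight": 1.0},   # unique to grid A
    {"pressure": 2.0, "area": 1.0, "weight": 0.5},   # overlap, grid A's copy
    {"pressure": 2.0, "area": 1.0, "weight": 0.5},   # overlap, grid B's copy
    {"pressure": 2.0, "area": 1.0, "weight": 1.0},   # unique to grid B
]
print(integrate_force(panels))  # → 6.0 (a naive sum with all weights 1 gives 8.0)
```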

  16. Mixed-methods designs in mental health services research: a review.

    PubMed

    Palinkas, Lawrence A; Horwitz, Sarah M; Chamberlain, Patricia; Hurlburt, Michael S; Landsverk, John

    2011-03-01

    Despite increased calls for use of mixed-methods designs in mental health services research, how and why such methods are being used and whether there are any consistent patterns that might indicate a consensus about how such methods can and should be used are unclear. Use of mixed methods was examined in 50 peer-reviewed journal articles found by searching PubMed Central and 60 National Institutes of Health (NIH)-funded projects found by searching the CRISP database over five years (2005-2009). Studies were coded for aims and the rationale, structure, function, and process for using mixed methods. A notable increase was observed in articles published and grants funded over the study period. However, most did not provide an explicit rationale for using mixed methods, and 74% gave priority to use of quantitative methods. Mixed methods were used to accomplish five distinct types of study aims (assess needs for services, examine existing services, develop new or adapt existing services, evaluate services in randomized controlled trials, and examine service implementation), with three categories of rationale, seven structural arrangements based on timing and weighting of methods, five functions of mixed methods, and three ways of linking quantitative and qualitative data. Each study aim was associated with a specific pattern of use of mixed methods, and four common patterns were identified. These studies offer guidance for continued progress in integrating qualitative and quantitative methods in mental health services research consistent with efforts by NIH and other funding agencies to promote their use.

  17. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore, the consistency and integrity of the various specification documents are ensured. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how needs are being addressed by international standards writing teams.

  18. Real-space finite-difference approach for multi-body systems: path-integral renormalization group method and direct energy minimization method.

    PubMed

    Sasaki, Akira; Kojo, Masashi; Hirose, Kikuji; Goto, Hidekazu

    2011-11-02

    The path-integral renormalization group and direct energy minimization method of practical first-principles electronic structure calculations for multi-body systems within the framework of the real-space finite-difference scheme are introduced. These two methods can handle higher dimensional systems with consideration of the correlation effect. Furthermore, they can be easily extended to the multicomponent quantum systems which contain more than two kinds of quantum particles. The key to the present methods is employing linear combinations of nonorthogonal Slater determinants (SDs) as multi-body wavefunctions. As one of the noticeable results, the same accuracy as the variational Monte Carlo method is achieved with a few SDs. This enables us to study the entire ground state consisting of electrons and nuclei without the need to use the Born-Oppenheimer approximation. Recent activities on methodological developments aiming towards practical calculations such as the implementation of auxiliary field for Coulombic interaction, the treatment of the kinetic operator in imaginary-time evolutions, the time-saving double-grid technique for bare-Coulomb atomic potentials and the optimization scheme for minimizing the total-energy functional are also introduced. As test examples, the total energy of the hydrogen molecule, the atomic configuration of the methylene and the electronic structures of two-dimensional quantum dots are calculated, and the accuracy, availability and possibility of the present methods are demonstrated.

  19. Model Multi Criteria Decision Making with Fuzzy ANP Method for Performance Measurement Small Medium Enterprise (SME)

    NASA Astrophysics Data System (ADS)

    Rahmanita, E.; Widyaningrum, V. T.; Kustiyahningsih, Y.; Purnama, J.

    2018-04-01

    SMEs have a very important role in the development of the economy in Indonesia. SMEs assist the government in creating new jobs and can support household income. The large number of SMEs in Madura and the many measurement indicators involved in SME mapping call for a systematic method. This research uses the Fuzzy Analytic Network Process (FANP) method for SME performance measurement. The FANP method can handle data that contain uncertainty, and a consistency index supports decision making. Performance measurement in this study is based on the perspectives of the Balanced Scorecard. The research approach integrates the internal business perspective, the learning and growth perspective, and the fuzzy Analytic Network Process (FANP). The result of this research is a framework of priority weightings for SME assessment indicators.
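
    The consistency index mentioned in the abstract is, in classic AHP/ANP practice, Saaty's CI = (λmax − n) / (n − 1) computed from a pairwise comparison matrix. The sketch below estimates λmax from geometric-mean priority weights; it illustrates the standard index, not necessarily the paper's exact FANP procedure:

```python
from math import prod

def consistency_index(matrix):
    """Saaty's CI = (lambda_max - n) / (n - 1) for a pairwise comparison
    matrix, with lambda_max estimated from geometric-mean priority weights."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]     # row geometric means
    total = sum(geo)
    w = [g / total for g in geo]                          # normalized priorities
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(aw[i] / w[i] for i in range(n)) / n
    return (lam_max - n) / (n - 1)

# A perfectly consistent 3x3 comparison matrix yields CI ≈ 0; larger CI
# values signal inconsistent pairwise judgments.
consistent = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
print(consistency_index(consistent))  # ≈ 0.0
```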

  20. Prototyping and implementing flight qualifiable semicustom CMOS P-well bulk integrated circuits in the JPL environment

    NASA Technical Reports Server (NTRS)

    Olson, E. M.

    1986-01-01

    Presently, there are many difficulties associated with implementing application-specific custom or semi-custom (standard cell based) integrated circuits (ICs) into JPL flight projects. One of the primary difficulties is developing prototype semi-custom integrated circuits for use and evaluation in engineering prototype flight hardware. The prototype semi-custom ICs must be extremely cost-effective and yet still representative of flight qualifiable versions of the design. A second difficulty is encountered in the transport of the design from engineering prototype quality to flight quality. Normally, flight quality integrated circuits have stringent quality standards, must be radiation resistant and should consume minimal power. It is often not necessary or cost-effective, however, to impose such stringent quality standards on engineering models developed for systems analysis in controlled lab environments. This article presents work originally initiated for ground-based applications that also addresses these two problems. Furthermore, this article suggests a method that has been shown successful in prototyping flight quality semi-custom ICs through the Metal Oxide Semiconductor Implementation Service (MOSIS) program run by the University of Southern California's Information Sciences Institute. The method has been used successfully to design and fabricate through the MOSIS three different semi-custom prototype CMOS p-well chips. The three designs make use of the work presented and were designed consistent with design techniques and structures that are flight qualifiable, allowing one-hour transfer of the design from engineering model status to flight qualifiable foundry-ready status through methods outlined in this article.

  1. Development of and Clinical Experience with a Simple Device for Performing Intraoperative Fluorescein Fluorescence Cerebral Angiography: Technical Notes.

    PubMed

    Ichikawa, Tsuyoshi; Suzuki, Kyouichi; Watanabe, Yoichi; Sato, Taku; Sakuma, Jun; Saito, Kiyoshi

    2016-01-01

    To perform intraoperative fluorescence angiography (FAG) under a microscope without an integrated FAG function, with reasonable cost and sufficient quality for evaluation, we made a small and easy-to-use device for fluorescein FAG (FAG filter). We investigated the practical use of this FAG filter during aneurysm surgery, revascularization surgery, and brain tumor surgery. The FAG filter consists of two types of filters: an excitatory filter and a barrier filter. The excitatory filter excludes all wavelengths except for blue light, and the barrier filter passes long wavelengths except for blue light. By adding this FAG filter to a microscope without an integrated FAG function, light from the microscope illuminating the surgical field becomes blue, which is blocked by the barrier filter. We positioned the FAG filter correctly on the objective lens of the operating microscope and injected fluorescein sodium intravenously or intra-arterially. Fluorescence (green light) from vessels in the surgical field and the dyed tumor was clearly observed through the microscope and recorded by a memory device. This method was easy and could be performed in a short time (about 10 seconds). Blood flow of small vessels deep in the surgical field could be observed, and blood flow stagnation could be evaluated. However, images from this method were inferior to those obtained by currently commercially available microscopes with an integrated FAG function. In brain tumor surgery, a stained tumor on the brain surface could be observed using this method. FAG could be performed easily with a microscope without an integrated FAG function using only this FAG filter.

  2. Development of and Clinical Experience with a Simple Device for Performing Intraoperative Fluorescein Fluorescence Cerebral Angiography: Technical Notes

    PubMed Central

    ICHIKAWA, Tsuyoshi; SUZUKI, Kyouichi; WATANABE, Yoichi; SATO, Taku; SAKUMA, Jun; SAITO, Kiyoshi

    2016-01-01

    To perform intraoperative fluorescence angiography (FAG) under a microscope without an integrated FAG function, with reasonable cost and sufficient quality for evaluation, we made a small and easy-to-use device for fluorescein FAG (FAG filter). We investigated the practical use of this FAG filter during aneurysm surgery, revascularization surgery, and brain tumor surgery. The FAG filter consists of two types of filters: an excitatory filter and a barrier filter. The excitatory filter excludes all wavelengths except for blue light, and the barrier filter passes long wavelengths except for blue light. By adding this FAG filter to a microscope without an integrated FAG function, light from the microscope illuminating the surgical field becomes blue, which is blocked by the barrier filter. We positioned the FAG filter correctly on the objective lens of the operating microscope and injected fluorescein sodium intravenously or intra-arterially. Fluorescence (green light) from vessels in the surgical field and the dyed tumor was clearly observed through the microscope and recorded by a memory device. This method was easy and could be performed in a short time (about 10 seconds). Blood flow of small vessels deep in the surgical field could be observed, and blood flow stagnation could be evaluated. However, images from this method were inferior to those obtained by currently commercially available microscopes with an integrated FAG function. In brain tumor surgery, a stained tumor on the brain surface could be observed using this method. FAG could be performed easily with a microscope without an integrated FAG function using only this FAG filter. PMID:26597335

  3. A Method of Signal Scrambling to Secure Data Storage for Healthcare Applications.

    PubMed

    Bao, Shu-Di; Chen, Meng; Yang, Guang-Zhong

    2017-11-01

    A body sensor network that consists of wearable and/or implantable biosensors has been an important front-end for collecting personal health records. It is expected that the full integration of outside-hospital personal health information and hospital electronic health records will further promote preventative health services as well as global health. However, the integration and sharing of health information is bound to bring with it security and privacy issues. With extensive development of healthcare applications, security and privacy issues are becoming increasingly important. This paper addresses the potential security risks of healthcare data in Internet-based applications and proposes a method of signal scrambling as an add-on security mechanism in the application layer for a variety of healthcare information, where a piece of tiny data is used to scramble healthcare records. The former is kept locally and the latter, along with security protection, is sent for cloud storage. The tiny data can be derived from a random number generator or even a piece of healthcare data, which makes the method more flexible. The computational complexity and security performance in terms of theoretical and experimental analysis has been investigated to demonstrate the efficiency and effectiveness of the proposed method. The proposed method is applicable to all kinds of data that require extra security protection within complex networks.
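
    The abstract does not specify the scrambling algorithm; as a hypothetical simplification, the locally kept "tiny data" can be treated as the seed of a reversible permutation of the record, so that only the holder of the tiny data can restore the original before it is sent, with additional protection, to cloud storage:

```python
import random

def scramble(record, tiny_key):
    """Permute the record with a permutation seeded by the locally kept
    tiny data (hypothetical simplification of the paper's scheme)."""
    order = list(range(len(record)))
    random.Random(tiny_key).shuffle(order)
    return [record[i] for i in order]

def unscramble(scrambled, tiny_key):
    """Rebuild the same seeded permutation from the tiny data and invert it."""
    order = list(range(len(scrambled)))
    random.Random(tiny_key).shuffle(order)
    out = [None] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out

# Hypothetical heart-rate samples; the scrambled copy goes to the cloud,
# while the tiny key (12345 here) stays local.
ecg = [72, 75, 71, 90, 88, 73, 70, 74]
cloud_copy = scramble(ecg, tiny_key=12345)
print(unscramble(cloud_copy, 12345) == ecg)  # → True
```

    A permutation alone is weak protection; the paper pairs the scrambled data with conventional security mechanisms for cloud storage, which this sketch omits.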

  4. Multiconstrained gene clustering based on generalized projections

    PubMed Central

    2010-01-01

    Background Gene clustering for annotating gene functions is one of the fundamental issues in bioinformatics. The best clustering solution is often regularized by multiple constraints such as gene expressions, Gene Ontology (GO) annotations and gene network structures. How to integrate multiple constraints into an optimal clustering solution remains an unsolved problem. Results We propose a novel multiconstrained gene clustering (MGC) method within the generalized projection onto convex sets (POCS) framework used widely in image reconstruction. Each constraint is formulated as a corresponding set. The generalized projector iteratively projects the clustering solution onto these sets in order to find a consistent solution included in the intersection set that satisfies all constraints. Compared with previous MGC methods, POCS can integrate multiple constraints of a different nature without distorting the original constraints. To evaluate the clustering solution, we also propose a new performance measure referred to as Gene Log Likelihood (GLL) that considers genes having more than one function and hence in more than one cluster. Comparative experimental results show that our POCS-based gene clustering method outperforms current state-of-the-art MGC methods. Conclusions The POCS-based MGC method can successfully combine multiple constraints of a different nature for gene clustering. Also, the proposed GLL is an effective performance measure for the soft clustering solutions. PMID:20356386
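
    The POCS iteration itself is simple to sketch: alternately project the current solution onto each constraint set until it lands in their intersection. A minimal 2-D toy with two affine constraint sets (illustrative only, not the paper's gene-clustering constraints):

```python
def project_onto_xaxis(p):
    """Projection onto the convex set {(x, y) : y = 0}."""
    return (p[0], 0.0)

def project_onto_line_x2(p):
    """Projection onto the convex set {(x, y) : x = 2}."""
    return (2.0, p[1])

def pocs(start, projections, iterations=20):
    """Alternating projections converge to a point in the intersection
    of the constraint sets when that intersection is non-empty."""
    p = start
    for _ in range(iterations):
        for proj in projections:
            p = proj(p)
    return p

print(pocs((0.0, 5.0), [project_onto_xaxis, project_onto_line_x2]))  # → (2.0, 0.0)
```

    The same loop structure carries over to the paper's setting, where each projection enforces one source of evidence (expression, GO annotations, network structure) on the clustering solution.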

  5. Nonlinear integrable model of Frenkel-like excitations on a ribbon of triangular lattice

    NASA Astrophysics Data System (ADS)

    Vakhnenko, Oleksiy O.

    2015-03-01

    Following the considerable progress in nanoribbon technology, we propose to model the nonlinear Frenkel-like excitations on a triangular-lattice ribbon by the integrable nonlinear ladder system with the background-controlled intersite resonant coupling. The system of interest arises as a proper reduction of the first general semidiscrete integrable system from an infinite hierarchy. The most significant local conservation laws related to the first general integrable system are found explicitly in the framework of a generalized recursive approach. The obtained general local densities are equally applicable to any general semidiscrete integrable system from the respective infinite hierarchy. Using the recovered second densities, the Hamiltonian formulation of the integrable nonlinear ladder system with background-controlled intersite resonant coupling is presented. In doing so, the relevant Poisson structure turns out to be essentially nontrivial. The Darboux transformation scheme as applied to the first general semidiscrete system is developed, and the key role of the Bäcklund transformation in justifying its self-consistency is pointed out. The spectral properties of the Darboux matrix allow the whole Darboux matrix to be restored, thus ensuring the generation of one more soliton compared with the a priori known seed solution of the integrable nonlinear system. The power of the Darboux-dressing method is explicitly demonstrated in generating the multicomponent one-soliton solution to the integrable nonlinear ladder system with background-controlled intersite resonant coupling.

  6. Implementation green and low cost on landscape design of Manggarai Integrated Station, Jakarta

    NASA Astrophysics Data System (ADS)

    Suryanti, T.; Meilianti, H.

    2018-01-01

    The Manggarai Integrated Station is a transit hub for various transportation modes. It is located in Manggarai, Jakarta, and managed by PT. KAI. The station functions as a terminal-type transit area where passengers switch modes of transportation. Several conditions of the site, such as its urban setting, topography, soil, vegetation, space, visual character, and users, informed the design concepts. The data were analyzed using a quantitative descriptive method. The purpose of this research is to design a station environment that not only supports the activities of station users but also accommodates the needs of the community, applying the “Green, Low Cost” approach at the Manggarai integrated transit station in Jakarta. The area's potential lies in its integration of various transportation modes, which allows users to transit easily from one mode to another. The basic design concept refers to “Green, Low Cost” combined with the “user friendly” theme, using the land on the site more efficiently and effectively. The result of this research is a landscape design development of the Manggarai integrated station. It consists of landscape designs for the west and east areas, a transition area, a parking area, a solar panel area, and a social interaction area.

  7. A Dictionary Learning Method with Total Generalized Variation for MRI Reconstruction

    PubMed Central

    Lu, Hongyang; Wei, Jingbo; Wang, Yuhao; Deng, Xiaohua

    2016-01-01

    Reconstructing images from their noisy and incomplete measurements is always a challenge, especially for medical MR images with important details and features. This work proposes a novel dictionary learning model that integrates two sparse regularization methods: the total generalized variation (TGV) approach and adaptive dictionary learning (DL). In the proposed method, the TGV selectively regularizes different image regions at different levels to largely avoid oil-painting artifacts. At the same time, the dictionary learning adaptively represents the image features sparsely and effectively recovers details of images. The proposed model is solved by a variable splitting technique and the alternating direction method of multipliers. Extensive simulation experimental results demonstrate that the proposed method consistently recovers MR images efficiently and outperforms the current state-of-the-art approaches in terms of higher PSNR and lower HFEN values. PMID:27110235

  8. A Dictionary Learning Method with Total Generalized Variation for MRI Reconstruction.

    PubMed

    Lu, Hongyang; Wei, Jingbo; Liu, Qiegen; Wang, Yuhao; Deng, Xiaohua

    2016-01-01

    Reconstructing images from their noisy and incomplete measurements is always a challenge, especially for medical MR images with important details and features. This work proposes a novel dictionary learning model that integrates two sparse regularization methods: the total generalized variation (TGV) approach and adaptive dictionary learning (DL). In the proposed method, the TGV selectively regularizes different image regions at different levels to largely avoid oil-painting artifacts. At the same time, the dictionary learning adaptively represents the image features sparsely and effectively recovers details of images. The proposed model is solved by a variable splitting technique and the alternating direction method of multipliers. Extensive simulation experimental results demonstrate that the proposed method consistently recovers MR images efficiently and outperforms the current state-of-the-art approaches in terms of higher PSNR and lower HFEN values.
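
    Solvers built on variable splitting and the alternating direction method of multipliers typically reduce the sparse-coding subproblem to elementwise soft-thresholding, the proximal operator of the L1 norm. A minimal sketch of that operator (illustrative of the general technique, not the paper's full reconstruction algorithm):

```python
def soft_threshold(x, lam):
    """Proximal operator of the L1 norm: shrink x toward zero by lam.
    This elementwise update is what enforces sparsity inside ADMM-style
    dictionary-learning and TGV solvers."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

# Small coefficients are zeroed out; large ones are shrunk by lam.
coeffs = [-2.5, -0.3, 0.0, 0.4, 3.0]
print([soft_threshold(v, 1.0) for v in coeffs])  # → [-1.5, 0.0, 0.0, 0.0, 2.0]
```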

  9. Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms

    PubMed Central

    Li, Le; Yip, Kevin Y.

    2016-01-01

    Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means 1) they cannot use the information contained in existing GO, 2) the way they integrate biological networks may not optimize the accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as a training set to learn parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a new version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well-supported by the literature. Availability: Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/. PMID:27976738

  10. A new approach for electrical properties estimation using a global integral equation and improvements using high permittivity materials.

    PubMed

    Schmidt, Rita; Webb, Andrew

    2016-01-01

    Electrical Properties Tomography (EPT) using MRI is a technique that has been developed to provide a new contrast mechanism for in vivo imaging. Currently the most common method relies on the solution of the homogeneous Helmholtz equation, which has limitations in accurate estimation at tissue interfaces. A new method proposed in this work combines a Maxwell's integral equation representation of the problem and the use of high permittivity materials (HPM) to control the RF field in order to reconstruct the electrical properties image. The magnetic field is represented by an integral equation considering each point as a contrast source. This equation can be solved by an inverse method. In this study we use a reference simulation or scout scan of a uniform phantom to provide an initial estimate for the inverse solution, which allows the estimation of the complex permittivity within a single iteration. Incorporating two setups, with and without the HPM, improves the reconstructed result, especially with respect to the very low electric field in the center of the sample. Electromagnetic simulations of the brain were performed at 3T to generate the B1(+) field maps and reconstruct the electric properties images. The standard deviations of the relative permittivity and conductivity were within 14% and 18%, respectively, for a volume consisting of white matter, gray matter and cerebellum.

  11. Meta-analysis of studies on chemical, physical and biological agents in the control of Aedes aegypti.

    PubMed

    Lima, Estelita Pereira; Goulart, Marília Oliveira Fonseca; Rolim Neto, Modesto Leite

    2015-09-04

    Aedes aegypti is a vector of international concern because it can transmit three important arboviral diseases to humans: yellow fever, dengue, and chikungunya. Epidemics that recur year after year in a variety of urban centers indicate control failures that allow the vector to continue expanding. To identify the most effective vector control strategies and the factors that contributed to the success or failure of each strategy, we carried out a systematic review with meta-analysis of articles indexed in 12 databases from 1974 through December 2013. We evaluated the association between the use of any chemical, mechanical, biological, or integrated intervention against A. aegypti and the control of the vector, as measured by 10 indicators. We found 2,791 articles, but after careful selection only 26 studies remained for analysis, covering control interventions implemented in 15 countries: 5 biological, 5 chemical, 3 mechanical, and 13 integrated strategies. Comparison among all of them indicated that control of A. aegypti is significantly associated with the type of strategy used, and that integrated interventions are the most effective. The most effective integrated approaches accounted for the influence of eco-bio-social determinants in the virus-vector-man epidemiological chain and for community involvement, starting with community empowerment as active agents of vector control.
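
    The pooling step of such a meta-analysis can be illustrated with a fixed-effect inverse-variance combination of per-study effect sizes. The study labels, log odds ratios, and variances below are hypothetical placeholders, not data from the review:

```python
import math

# Hypothetical per-study effects: log odds ratios of vector-index reduction
# and their variances (illustrative numbers only).
studies = [
    ("biological", math.log(0.62), 0.05),
    ("chemical",   math.log(0.78), 0.04),
    ("integrated", math.log(0.41), 0.03),
]

def pooled_log_or(studies):
    """Fixed-effect inverse-variance pooling of log odds ratios."""
    weights = [1.0 / var for _, _, var in studies]
    pooled = sum(w * lor for (_, lor, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

lor, se = pooled_log_or(studies)
lo, hi = math.exp(lor - 1.96 * se), math.exp(lor + 1.96 * se)
print(f"pooled OR = {math.exp(lor):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    An odds ratio below 1 with a confidence interval excluding 1 would indicate a significant association between the intervention and vector control.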

  12. Multi-View Interaction Modelling of human collaboration processes: a business process study of head and neck cancer care in a Dutch academic hospital.

    PubMed

    Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan

    2011-12-01

    In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Relationship between Organizational Culture and the Use of Psychotropic Medicines in Nursing Homes: A Systematic Integrative Review.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-03-01

    Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to constant comparison analysis, was used for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. The included studies addressed only two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid, taken-for-granted beliefs that explain consistencies and inconsistencies between the ideal and the actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known describes culture only at the surface level, that is, the artifacts and espoused values. Hence, future research explaining the impact of the basic assumptions of culture on the use of psychotropic medicines is important.

  14. Integrated carbon and chlorine isotope modeling: applications to chlorinated aliphatic hydrocarbons dechlorination.

    PubMed

    Jin, Biao; Haderlein, Stefan B; Rolle, Massimo

    2013-02-05

    We propose a self-consistent method to predict the evolution of carbon and chlorine isotope ratios during degradation of chlorinated hydrocarbons. The method treats explicitly the cleavage of isotopically different C-Cl bonds and thus considers, simultaneously, combined carbon-chlorine isotopologues. To illustrate the proposed modeling approach we focus on the reductive dehalogenation of chlorinated ethenes. We compare our method with the currently available approach, in which carbon and chlorine isotopologues are treated separately. The new approach provides an accurate description of dual-isotope effects regardless of the extent of the isotope fractionation and physical characteristics of the experimental system. We successfully applied the new approach to published experimental results on dehalogenation of chlorinated ethenes both in well-mixed systems and in situations where mass-transfer limitations control the overall rate of biodegradation. The advantages of our self-consistent dual isotope modeling approach proved to be most evident when isotope fractionation factors of carbon and chlorine differed significantly and for systems with mass-transfer limitations, where both physical and (bio)chemical transformation processes affect the observed isotopic values.
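
    For contrast with the paper's combined-isotopologue treatment, the conventional approach that tracks carbon and chlorine separately reduces to a classical Rayleigh model per element. The enrichment factors and initial delta values below are assumed, illustrative numbers, not parameters from the paper:

```python
def rayleigh_delta(delta0_permil, f_remaining, epsilon_permil):
    """Classical Rayleigh model: isotope signature of the remaining substrate
    when a fraction f of it is left undegraded.
    delta = (delta0 + 1000) * f**(epsilon/1000) - 1000
    """
    return (delta0_permil + 1000.0) * f_remaining ** (epsilon_permil / 1000.0) - 1000.0

# Assumed enrichment factors for a chlorinated ethene: carbon fractionates
# more strongly than chlorine, so the dual-isotope slope is steep in d13C.
eps_C, eps_Cl = -12.0, -3.0
for f in (1.0, 0.5, 0.1):
    d13C = rayleigh_delta(-25.0, f, eps_C)
    d37Cl = rayleigh_delta(2.0, f, eps_Cl)
    print(f"f={f:>4}: d13C={d13C:7.2f} per mil, d37Cl={d37Cl:6.2f} per mil")
```

    The abstract's point is that this element-by-element treatment breaks down when the two enrichment factors differ strongly or when mass transfer limits the reaction, which is where the combined C-Cl isotopologue bookkeeping matters.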

  15. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)]

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined that can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the RAND Corporation for systematically eliciting and evaluating group judgments in an objective manner. Use of the Delphi technique allows expert opinion to be integrated into the cost-estimating process in a consistent and rigorous fashion. The approach can also signal potential cost-problem areas, a result useful in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
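
    A Monte Carlo cost roll-up of this kind can be sketched by drawing each work package's cost from a triangular distribution built from expert low/most-likely/high estimates. The work packages and dollar figures below are purely illustrative:

```python
import random

# Hypothetical expert estimates (low, most likely, high) in $k for three
# work packages of a research project; names and numbers are illustrative.
work_packages = {
    "instrument design": (120, 150, 220),
    "fabrication": (300, 380, 520),
    "integration and test": (80, 100, 160),
}

def simulate_total_cost(packages, n_trials=100_000, seed=42):
    """Draw each package cost from a triangular distribution and sum them."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in packages.values()))
    totals.sort()
    return totals

totals = simulate_total_cost(work_packages)
median = totals[len(totals) // 2]
p90 = totals[int(0.9 * len(totals))]
print(f"median total: {median:.0f} $k, 90th percentile: {p90:.0f} $k")
```

    The gap between the median and an upper percentile of the simulated distribution is one way to size the contingency fund the abstract mentions.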

  16. Measurement of the top quark mass with the template method in the [Formula: see text] channel using ATLAS data.

    PubMed

    Aad, G; Abbott, B; Abdallah, J; Abdelalim, A A; Abdesselam, A; Abdinov, O; Abi, B; Abolins, M; AbouZeid, O S; Abramowicz, H; Abreu, H; Acerbi, E; Acharya, B S; Adamczyk, L; Adams, D L; Addy, T N; Adelman, J; Aderholz, M; Adomeit, S; Adragna, P; Adye, T; Aefsky, S; Aguilar-Saavedra, J A; Aharrouche, M; Ahlen, S P; Ahles, F; Ahmad, A; Ahsan, M; Aielli, G; Akdogan, T; Åkesson, T P A; Akimoto, G; Akimov, A V; Akiyama, A; Alam, M S; Alam, M A; Albert, J; Albrand, S; Aleksa, M; Aleksandrov, I N; Alessandria, F; Alexa, C; Alexander, G; Alexandre, G; Alexopoulos, T; Alhroob, M; Aliev, M; Alimonti, G; Alison, J; Aliyev, M; Allbrooke, B M M; Allport, P P; Allwood-Spiers, S E; Almond, J; Aloisio, A; Alon, R; Alonso, A; Alvarez Gonzalez, B; Alviggi, M G; Amako, K; Amaral, P; Amelung, C; Ammosov, V V; Amorim, A; Amorós, G; Amram, N; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anderson, K J; Andreazza, A; Andrei, V; Andrieux, M-L; Anduaga, X S; Angerami, A; Anghinolfi, F; Anisenkov, A; Anjos, N; Annovi, A; Antonaki, A; Antonelli, M; Antonov, A; Antos, J; Anulli, F; Aoun, S; Aperio Bella, L; Apolle, R; Arabidze, G; Aracena, I; Arai, Y; Arce, A T H; Arfaoui, S; Arguin, J-F; Arik, E; Arik, M; Armbruster, A J; Arnaez, O; Arnault, C; Artamonov, A; Artoni, G; Arutinov, D; Asai, S; Asfandiyarov, R; Ask, S; Åsman, B; Asquith, L; Assamagan, K; Astbury, A; Astvatsatourov, A; Aubert, B; Auge, E; Augsten, K; Aurousseau, M; Avolio, G; Avramidou, R; Axen, D; Ay, C; Azuelos, G; Azuma, Y; Baak, M A; Baccaglioni, G; Bacci, C; Bach, A M; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Badescu, E; Bagnaia, P; Bahinipati, S; Bai, Y; Bailey, D C; Bain, T; Baines, J T; Baker, O K; Baker, M D; Baker, S; Banas, E; Banerjee, P; Banerjee, Sw; Banfi, D; Bangert, A; Bansal, V; Bansil, H S; Barak, L; Baranov, S P; Barashkou, A; Barbaro Galtieri, A; Barber, T; Barberio, E L; Barberis, D; Barbero, M; Bardin, D Y; Barillari, T; Barisonzi, M; Barklow, T; Barlow, N; Barnett, B 
M; Barnett, R M; Baroncelli, A; Barone, G; Barr, A J; Barreiro, F; Barreiro Guimarães da Costa, J; Barrillon, P; Bartoldus, R; Barton, A E; Bartsch, V; Bates, R L; Batkova, L; Batley, J R; Battaglia, A; Battistin, M; Bauer, F; Bawa, H S; Beale, S; Beare, B; Beau, T; Beauchemin, P H; Beccherle, R; Bechtle, P; Beck, H P; Becker, S; Beckingham, M; Becks, K H; Beddall, A J; Beddall, A; Bedikian, S; Bednyakov, V A; Bee, C P; Begel, M; Behar Harpaz, S; Behera, P K; Beimforde, M; Belanger-Champagne, C; Bell, P J; Bell, W H; Bella, G; Bellagamba, L; Bellina, F; Bellomo, M; Belloni, A; Beloborodova, O; Belotskiy, K; Beltramello, O; Ben Ami, S; Benary, O; Benchekroun, D; Benchouk, C; Bendel, M; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez Garcia, J A; Benjamin, D P; Benoit, M; Bensinger, J R; Benslama, K; Bentvelsen, S; Berge, D; Bergeaas Kuutmann, E; Berger, N; Berghaus, F; Berglund, E; Beringer, J; Bernat, P; Bernhard, R; Bernius, C; Berry, T; Bertella, C; Bertin, A; Bertinelli, F; Bertolucci, F; Besana, M I; Besson, N; Bethke, S; Bhimji, W; Bianchi, R M; Bianco, M; Biebel, O; Bieniek, S P; Bierwagen, K; Biesiada, J; Biglietti, M; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Biscarat, C; Bitenc, U; Black, K M; Blair, R E; Blanchard, J-B; Blanchot, G; Blazek, T; Blocker, C; Blocki, J; Blondel, A; Blum, W; Blumenschein, U; Bobbink, G J; Bobrovnikov, V B; Bocchetta, S S; Bocci, A; Boddy, C R; Boehler, M; Boek, J; Boelaert, N; Bogaerts, J A; Bogdanchikov, A; Bogouch, A; Bohm, C; Boisvert, V; Bold, T; Boldea, V; Bolnet, N M; Bona, M; Bondarenko, V G; Bondioli, M; Boonekamp, M; Booth, C N; Bordoni, S; Borer, C; Borisov, A; Borissov, G; Borjanovic, I; Borri, M; Borroni, S; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Boterenbrood, H; Botterill, D; Bouchami, J; Boudreau, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Bousson, N; Boveia, A; Boyd, J; Boyko, I R; Bozhko, N I; Bozovic-Jelisavcic, I; Bracinik, J; Braem, A; Branchini, P; Brandenburg, G W; 
Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Brelier, B; Bremer, J; Brenner, R; Bressler, S; Breton, D; Britton, D; Brochu, F M; Brock, I; Brock, R; Brodbeck, T J; Brodet, E; Broggi, F; Bromberg, C; Bronner, J; Brooijmans, G; Brooks, W K; Brown, G; Brown, H; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Brunet, S; Bruni, A; Bruni, G; Bruschi, M; Buanes, T; Buat, Q; Bucci, F; Buchanan, J; Buchanan, N J; Buchholz, P; Buckingham, R M; Buckley, A G; Buda, S I; Budagov, I A; Budick, B; Büscher, V; Bugge, L; Bulekov, O; Bunse, M; Buran, T; Burckhart, H; Burdin, S; Burgess, T; Burke, S; Busato, E; Bussey, P; Buszello, C P; Butin, F; Butler, B; Butler, J M; Buttar, C M; Butterworth, J M; Buttinger, W; Cabrera Urbán, S; Caforio, D; Cakir, O; Calafiura, P; Calderini, G; Calfayan, P; Calkins, R; Caloba, L P; Caloi, R; Calvet, D; Calvet, S; Camacho Toro, R; Camarri, P; Cambiaghi, M; Cameron, D; Caminada, L M; Campana, S; Campanelli, M; Canale, V; Canelli, F; Canepa, A; Cantero, J; Capasso, L; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capriotti, D; Capua, M; Caputo, R; Caramarcu, C; Cardarelli, R; Carli, T; Carlino, G; Carminati, L; Caron, B; Caron, S; Carrillo Montoya, G D; Carter, A A; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Cascella, M; Caso, C; Castaneda Hernandez, A M; Castaneda-Miranda, E; Castillo Gimenez, V; Castro, N F; Cataldi, G; Cataneo, F; Catinaccio, A; Catmore, J R; Cattai, A; Cattani, G; Caughron, S; Cauz, D; Cavalleri, P; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cetin, S A; Cevenini, F; Chafaq, A; Chakraborty, D; Chan, K; Chapleau, B; Chapman, J D; Chapman, J W; Chareyre, E; Charlton, D G; Chavda, V; Chavez Barajas, C A; Cheatham, S; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, S; Chen, T; Chen, X; Cheng, S; Cheplakov, A; Chepurnov, V F; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Cheung, S L; 
Chevalier, L; Chiefari, G; Chikovani, L; Childers, J T; Chilingarov, A; Chiodini, G; Chisholm, A S; Chizhov, M V; Choudalakis, G; Chouridou, S; Christidi, I A; Christov, A; Chromek-Burckhart, D; Chu, M L; Chudoba, J; Ciapetti, G; Ciba, K; Ciftci, A K; Ciftci, R; Cinca, D; Cindro, V; Ciobotaru, M D; Ciocca, C; Ciocio, A; Cirilli, M; Citterio, M; Ciubancan, M; Clark, A; Clark, P J; Cleland, W; Clemens, J C; Clement, B; Clement, C; Clifft, R W; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coe, P; Cogan, J G; Coggeshall, J; Cogneras, E; Colas, J; Colijn, A P; Collins, N J; Collins-Tooth, C; Collot, J; Colon, G; Conde Muiño, P; Coniavitis, E; Conidi, M C; Consonni, M; Consorti, V; Constantinescu, S; Conta, C; Conventi, F; Cook, J; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Copic, K; Cornelissen, T; Corradi, M; Corriveau, F; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Costin, T; Côté, D; Coura Torres, R; Courneyea, L; Cowan, G; Cowden, C; Cox, B E; Cranmer, K; Crescioli, F; Cristinziani, M; Crosetti, G; Crupi, R; Crépé-Renaudin, S; Cuciuc, C-M; Cuenca Almenar, C; Cuhadar Donszelmann, T; Curatolo, M; Curtis, C J; Cuthbert, C; Cwetanski, P; Czirr, H; Czodrowski, P; Czyczula, Z; D'Auria, S; D'Onofrio, M; D'Orazio, A; Da Silva, P V M; Da Via, C; Dabrowski, W; Dai, T; Dallapiccola, C; Dam, M; Dameri, M; Damiani, D S; Danielsson, H O; Dannheim, D; Dao, V; Darbo, G; Darlea, G L; Davey, W; Davidek, T; Davidson, N; Davidson, R; Davies, E; Davies, M; Davison, A R; Davygora, Y; Dawe, E; Dawson, I; Dawson, J W; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Castro, S; De Castro Faria Salgado, P E; De Cecco, S; de Graat, J; De Groot, N; de Jong, P; De La Taille, C; De la Torre, H; De Lotto, B; de Mora, L; De Nooij, L; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dean, S; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Degenhardt, J; Dehchar, M; Del Papa, C; Del Peso, J; Del Prete, T; Delemontex, T; Deliyergiyev, 
M; Dell'Acqua, A; Dell'Asta, L; Della Pietra, M; Della Volpe, D; Delmastro, M; Delruelle, N; Delsart, P A; Deluca, C; Demers, S; Demichev, M; Demirkoz, B; Deng, J; Denisov, S P; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Devetak, E; Deviveiros, P O; Dewhurst, A; DeWilde, B; Dhaliwal, S; Dhullipudi, R; Di Ciaccio, A; Di Ciaccio, L; Di Girolamo, A; Di Girolamo, B; Di Luise, S; Di Mattia, A; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Diaz, M A; Diblen, F; Diehl, E B; Dietrich, J; Dietzsch, T A; Diglio, S; Dindar Yagci, K; Dingfelder, J; Dionisi, C; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; do Vale, M A B; Do Valle Wemans, A; Doan, T K O; Dobbs, M; Dobinson, R; Dobos, D; Dobson, E; Dodd, J; Doglioni, C; Doherty, T; Doi, Y; Dolejsi, J; Dolenc, I; Dolezal, Z; Dolgoshein, B A; Dohmae, T; Donadelli, M; Donega, M; Donini, J; Dopke, J; Doria, A; Dos Anjos, A; Dosil, M; Dotti, A; Dova, M T; Dowell, J D; Doxiadis, A D; Doyle, A T; Drasal, Z; Drees, J; Dressnandt, N; Drevermann, H; Driouichi, C; Dris, M; Dubbert, J; Dube, S; Duchovni, E; Duckeck, G; Dudarev, A; Dudziak, F; Dührssen, M; Duerdoth, I P; Duflot, L; Dufour, M-A; Dunford, M; Duran Yildiz, H; Duxfield, R; Dwuznik, M; Dydak, F; Düren, M; Ebenstein, W L; Ebke, J; Eckweiler, S; Edmonds, K; Edwards, C A; Edwards, N C; Ehrenfeld, W; Ehrich, T; Eifert, T; Eigen, G; Einsweiler, K; Eisenhandler, E; Ekelof, T; El Kacimi, M; Ellert, M; Elles, S; Ellinghaus, F; Ellis, K; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Engelmann, R; Engl, A; Epp, B; Eppig, A; Erdmann, J; Ereditato, A; Eriksson, D; Ernst, J; Ernst, M; Ernwein, J; Errede, D; Errede, S; Ertel, E; Escalier, M; Escobar, C; Espinal Curull, X; Esposito, B; Etienne, F; Etienvre, A I; Etzion, E; Evangelakou, D; Evans, H; Fabbri, L; Fabre, C; Fakhrutdinov, R M; Falciano, S; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farley, J; Farooque, T; Farrington, S M; Farthouat, P; Fassnacht, P; Fassouliotis, D; Fatholahzadeh, B; Favareto, A; 
Fayard, L; Fazio, S; Febbraro, R; Federic, P; Fedin, O L; Fedorko, W; Fehling-Kaschek, M; Feligioni, L; Fellmann, D; Feng, C; Feng, E J; Fenyuk, A B; Ferencei, J; Ferland, J; Fernando, W; Ferrag, S; Ferrando, J; Ferrara, V; Ferrari, A; Ferrari, P; Ferrari, R; Ferrer, A; Ferrer, M L; Ferrere, D; Ferretti, C; Ferretto Parodi, A; Fiascaris, M; Fiedler, F; Filipčič, A; Filippas, A; Filthaut, F; Fincke-Keeler, M; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, G; Fischer, P; Fisher, M J; Flechl, M; Fleck, I; Fleckner, J; Fleischmann, P; Fleischmann, S; Flick, T; Flores Castillo, L R; Flowerdew, M J; Fokitis, M; Fonseca Martin, T; Forbush, D A; Formica, A; Forti, A; Fortin, D; Foster, J M; Fournier, D; Foussat, A; Fowler, A J; Fowler, K; Fox, H; Francavilla, P; Franchino, S; Francis, D; Frank, T; Franklin, M; Franz, S; Fraternali, M; Fratina, S; French, S T; Friedrich, F; Froeschl, R; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fuster, J; Gabaldon, C; Gabizon, O; Gadfort, T; Gadomski, S; Gagliardi, G; Gagnon, P; Galea, C; Gallas, E J; Gallo, V; Gallop, B J; Gallus, P; Gan, K K; Gao, Y S; Gapienko, V A; Gaponenko, A; Garberson, F; Garcia-Sciveres, M; García, C; García Navarro, J E; Gardner, R W; Garelli, N; Garitaonandia, H; Garonne, V; Garvey, J; Gatti, C; Gaudio, G; Gaur, B; Gauthier, L; Gavrilenko, I L; Gay, C; Gaycken, G; Gayde, J-C; Gazis, E N; Ge, P; Gee, C N P; Geerts, D A A; Geich-Gimbel, Ch; Gellerstedt, K; Gemme, C; Gemmell, A; Genest, M H; Gentile, S; George, M; George, S; Gerlach, P; Gershon, A; Geweniger, C; Ghazlane, H; Ghodbane, N; Giacobbe, B; Giagu, S; Giakoumopoulou, V; Giangiobbe, V; Gianotti, F; Gibbard, B; Gibson, A; Gibson, S M; Gilbert, L M; Gilewsky, V; Gillberg, D; Gillman, A R; Gingrich, D M; Ginzburg, J; Giokaris, N; Giordani, M P; Giordano, R; Giorgi, F M; Giovannini, P; Giraud, P F; Giugni, D; Giunta, M; Giusti, P; Gjelsten, B K; Gladilin, L K; Glasman, C; Glatzer, J; Glazov, A; Glitza, K W; Glonti, G L; Goddard, J R; 
Godfrey, J; Godlewski, J; Goebel, M; Göpfert, T; Goeringer, C; Gössling, C; Göttfert, T; Goldfarb, S; Golling, T; Gomes, A; Gomez Fajardo, L S; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, L; Gonidec, A; Gonzalez, S; González de la Hoz, S; Gonzalez Parra, G; Gonzalez Silva, M L; Gonzalez-Sevilla, S; Goodson, J J; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorfine, G; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Gorokhov, S A; Goryachev, V N; Gosdzik, B; Gosselink, M; Gostkin, M I; Gough Eschrich, I; Gouighri, M; Goujdami, D; Goulette, M P; Goussiou, A G; Goy, C; Gozpinar, S; Grabowska-Bold, I; Grafström, P; Grahn, K-J; Grancagnolo, F; Grancagnolo, S; Grassi, V; Gratchev, V; Grau, N; Gray, H M; Gray, J A; Graziani, E; Grebenyuk, O G; Greenshaw, T; Greenwood, Z D; Gregersen, K; Gregor, I M; Grenier, P; Griffiths, J; Grigalashvili, N; Grillo, A A; Grinstein, S; Grishkevich, Y V; Grivaz, J-F; Groh, M; Gross, E; Grosse-Knetter, J; Groth-Jensen, J; Grybel, K; Guarino, V J; Guest, D; Guicheney, C; Guida, A; Guindon, S; Guler, H; Gunther, J; Guo, B; Guo, J; Gupta, A; Gusakov, Y; Gushchin, V N; Gutierrez, P; Guttman, N; Gutzwiller, O; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haas, S; Haber, C; Hadavand, H K; Hadley, D R; Haefner, P; Hahn, F; Haider, S; Hajduk, Z; Hakobyan, H; Hall, D; Haller, J; Hamacher, K; Hamal, P; Hamer, M; Hamilton, A; Hamilton, S; Han, H; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Handel, C; Hanke, P; Hansen, J R; Hansen, J B; Hansen, J D; Hansen, P H; Hansson, P; Hara, K; Hare, G A; Harenberg, T; Harkusha, S; Harper, D; Harrington, R D; Harris, O M; Harrison, K; Hartert, J; Hartjes, F; Haruyama, T; Harvey, A; Hasegawa, S; Hasegawa, Y; Hassani, S; Hatch, M; Hauff, D; Haug, S; Hauschild, M; Hauser, R; Havranek, M; Hawes, B M; Hawkes, C M; Hawkings, R J; Hawkins, A D; Hawkins, D; Hayakawa, T; Hayashi, T; Hayden, D; Hayward, H S; Haywood, S J; Hazen, E; He, M; Head, S J; Hedberg, V; Heelan, L; Heim, S; Heinemann, B; 
Heisterkamp, S; Helary, L; Heller, C; Heller, M; Hellman, S; Hellmich, D; Helsens, C; Henderson, R C W; Henke, M; Henrichs, A; Henriques Correia, A M; Henrot-Versille, S; Henry-Couannier, F; Hensel, C; Henß, T; Hernandez, C M; Hernández Jiménez, Y; Herrberg, R; Hershenhorn, A D; Herten, G; Hertenberger, R; Hervas, L; Hessey, N P; Higón-Rodriguez, E; Hill, D; Hill, J C; Hill, N; Hiller, K H; Hillert, S; Hillier, S J; Hinchliffe, I; Hines, E; Hirose, M; Hirsch, F; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoffman, J; Hoffmann, D; Hohlfeld, M; Holder, M; Holmgren, S O; Holy, T; Holzbauer, J L; Homma, Y; Hong, T M; Hooft van Huysduynen, L; Horazdovsky, T; Horn, C; Horner, S; Hostachy, J-Y; Hou, S; Houlden, M A; Hoummada, A; Howarth, J; Howell, D F; Hristova, I; Hrivnac, J; Hruska, I; Hryn'ova, T; Hsu, P J; Hsu, S-C; Huang, G S; Hubacek, Z; Hubaut, F; Huegging, F; Huettmann, A; Huffman, T B; Hughes, E W; Hughes, G; Hughes-Jones, R E; Huhtinen, M; Hurst, P; Hurwitz, M; Husemann, U; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibbotson, M; Ibragimov, I; Ichimiya, R; Iconomidou-Fayard, L; Idarraga, J; Iengo, P; Igonkina, O; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Imori, M; Ince, T; Inigo-Golfin, J; Ioannou, P; Iodice, M; Ippolito, V; Irles Quiles, A; Isaksson, C; Ishikawa, A; Ishino, M; Ishmukhametov, R; Issever, C; Istin, S; Ivashin, A V; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jackson, B; Jackson, J N; Jackson, P; Jaekel, M R; Jain, V; Jakobs, K; Jakobsen, S; Jakubek, J; Jana, D K; Jankowski, E; Jansen, E; Jansen, H; Jantsch, A; Janus, M; Jarlskog, G; Jeanty, L; Jelen, K; Jen-La Plante, I; Jenni, P; Jeremie, A; Jež, P; Jézéquel, S; Jha, M K; Ji, H; Ji, W; Jia, J; Jiang, Y; Jimenez Belenguer, M; Jin, G; Jin, S; Jinnouchi, O; Joergensen, M D; Joffe, D; Johansen, L G; Johansen, M; Johansson, K E; Johansson, P; Johnert, S; Johns, K A; Jon-And, K; Jones, G; Jones, R W L; Jones, T W; Jones, T 
J; Jonsson, O; Joram, C; Jorge, P M; Joseph, J; Jovin, T; Ju, X; Jung, C A; Jungst, R M; Juranek, V; Jussel, P; Juste Rozas, A; Kabachenko, V V; Kabana, S; Kaci, M; Kaczmarska, A; Kadlecik, P; Kado, M; Kagan, H; Kagan, M; Kaiser, S; Kajomovitz, E; Kalinin, S; Kalinovskaya, L V; Kama, S; Kanaya, N; Kaneda, M; Kaneti, S; Kanno, T; Kantserov, V A; Kanzaki, J; Kaplan, B; Kapliy, A; Kaplon, J; Kar, D; Karagounis, M; Karagoz, M; Karnevskiy, M; Karr, K; Kartvelishvili, V; Karyukhin, A N; Kashif, L; Kasieczka, G; Kass, R D; Kastanas, A; Kataoka, M; Kataoka, Y; Katsoufis, E; Katzy, J; Kaushik, V; Kawagoe, K; Kawamoto, T; Kawamura, G; Kayl, M S; Kazanin, V A; Kazarinov, M Y; Keeler, R; Kehoe, R; Keil, M; Kekelidze, G D; Kennedy, J; Kenney, C J; Kenyon, M; Kepka, O; Kerschen, N; Kerševan, B P; Kersten, S; Kessoku, K; Keung, J; Khalil-Zada, F; Khandanyan, H; Khanov, A; Kharchenko, D; Khodinov, A; Kholodenko, A G; Khomich, A; Khoo, T J; Khoriauli, G; Khoroshilov, A; Khovanskiy, N; Khovanskiy, V; Khramov, E; Khubua, J; Kim, H; Kim, M S; Kim, P C; Kim, S H; Kimura, N; Kind, O; King, B T; King, M; King, R S B; Kirk, J; Kirsch, L E; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kittelmann, T; Kiver, A M; Kladiva, E; Klaiber-Lodewigs, J; Klein, M; Klein, U; Kleinknecht, K; Klemetti, M; Klier, A; Klimek, P; Klimentov, A; Klingenberg, R; Klinger, J A; Klinkby, E B; Klioutchnikova, T; Klok, P F; Klous, S; Kluge, E-E; Kluge, T; Kluit, P; Kluth, S; Knecht, N S; Kneringer, E; Knobloch, J; Knoops, E B F G; Knue, A; Ko, B R; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Köneke, K; König, A C; Koenig, S; Köpke, L; Koetsveld, F; Koevesarki, P; Koffas, T; Koffeman, E; Kogan, L A; Kohn, F; Kohout, Z; Kohriki, T; Koi, T; Kokott, T; Kolachev, G M; Kolanoski, H; Kolesnikov, V; Koletsou, I; Koll, J; Kollar, D; Kollefrath, M; Kolya, S D; Komar, A A; Komori, Y; Kondo, T; Kono, T; Kononov, A I; Konoplich, R; Konstantinidis, N; Kootz, A; Koperny, S; Korcyl, K; Kordas, K; Koreshev, V; Korn, A; Korol, A; 
Korolkov, I; Korolkova, E V; Korotkov, V A; Kortner, O; Kortner, S; Kostyukhin, V V; Kotamäki, M J; Kotov, S; Kotov, V M; Kotwal, A; Kourkoumelis, C; Kouskoura, V; Koutsman, A; Kowalewski, R; Kowalski, T Z; Kozanecki, W; Kozhin, A S; Kral, V; Kramarenko, V A; Kramberger, G; Krasny, M W; Krasznahorkay, A; Kraus, J; Kraus, J K; Kreisel, A; Krejci, F; Kretzschmar, J; Krieger, N; Krieger, P; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Kruker, T; Krumnack, N; Krumshteyn, Z V; Kruth, A; Kubota, T; Kuday, S; Kuehn, S; Kugel, A; Kuhl, T; Kuhn, D; Kukhtin, V; Kulchitsky, Y; Kuleshov, S; Kummer, C; Kuna, M; Kundu, N; Kunkle, J; Kupco, A; Kurashige, H; Kurata, M; Kurochkin, Y A; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwee, R; La Rosa, A; La Rotonda, L; Labarga, L; Labbe, J; Lablak, S; Lacasta, C; Lacava, F; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Laisne, E; Lamanna, M; Lampen, C L; Lampl, W; Lancon, E; Landgraf, U; Landon, M P J; Lane, J L; Lange, C; Lankford, A J; Lanni, F; Lantzsch, K; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Larionov, A V; Larner, A; Lasseur, C; Lassnig, M; Laurelli, P; Lavrijsen, W; Laycock, P; Lazarev, A B; Le Dortz, O; Le Guirriec, E; Le Maner, C; Le Menedeu, E; Lebel, C; LeCompte, T; Ledroit-Guillon, F; Lee, H; Lee, J S H; Lee, S C; Lee, L; Lefebvre, M; Legendre, M; Leger, A; LeGeyt, B C; Legger, F; Leggett, C; Lehmacher, M; Lehmann Miotto, G; Lei, X; Leite, M A L; Leitner, R; Lellouch, D; Leltchouk, M; Lemmer, B; Lendermann, V; Leney, K J C; Lenz, T; Lenzen, G; Lenzi, B; Leonhardt, K; Leontsinis, S; Leroy, C; Lessard, J-R; Lesser, J; Lester, C G; Leung Fook Cheong, A; Levêque, J; Levin, D; Levinson, L J; Levitski, M S; Lewis, A; Lewis, G H; Leyko, A M; Leyton, M; Li, B; Li, H; Li, S; Li, X; Liang, Z; Liao, H; Liberti, B; Lichard, P; Lichtnecker, M; Lie, K; Liebig, W; Lifshitz, R; Limbach, C; Limosani, A; Limper, M; Lin, S C; Linde, F; Linnemann, J T; 
Lipeles, E; Lipinsky, L; Lipniacka, A; Liss, T M; Lissauer, D; Lister, A; Litke, A M; Liu, C; Liu, D; Liu, H; Liu, J B; Liu, M; Liu, Y; Livan, M; Livermore, S S A; Lleres, A; Llorente Merino, J; Lloyd, S L; Lobodzinska, E; Loch, P; Lockman, W S; Loddenkoetter, T; Loebinger, F K; Loginov, A; Loh, C W; Lohse, T; Lohwasser, K; Lokajicek, M; Loken, J; Lombardo, V P; Long, R E; Lopes, L; Lopez Mateos, D; Lorenz, J; Lorenzo Martinez, N; Losada, M; Loscutoff, P; Lo Sterzo, F; Losty, M J; Lou, X; Lounis, A; Loureiro, K F; Love, J; Love, P A; Lowe, A J; Lu, F; Lubatti, H J; Luci, C; Lucotte, A; Ludwig, A; Ludwig, D; Ludwig, I; Ludwig, J; Luehring, F; Luijckx, G; Lumb, D; Luminari, L; Lund, E; Lund-Jensen, B; Lundberg, B; Lundberg, J; Lundquist, J; Lungwitz, M; Lutz, G; Lynn, D; Lys, J; Lytken, E; Ma, H; Ma, L L; Macana Goia, J A; Maccarrone, G; Macchiolo, A; Maček, B; Machado Miguens, J; Mackeprang, R; Madaras, R J; Mader, W F; Maenner, R; Maeno, T; Mättig, P; Mättig, S; Magnoni, L; Magradze, E; Mahalalel, Y; Mahboubi, K; Mahout, G; Maiani, C; Maidantchik, C; Maio, A; Majewski, S; Makida, Y; Makovec, N; Mal, P; Malaescu, B; Malecki, Pa; Malecki, P; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyshev, V; Malyukov, S; Mameghani, R; Mamuzic, J; Manabe, A; Mandelli, L; Mandić, I; Mandrysch, R; Maneira, J; Mangeard, P S; Manhaes de Andrade Filho, L; Manjavidze, I D; Mann, A; Manning, P M; Manousakis-Katsikakis, A; Mansoulie, B; Manz, A; Mapelli, A; Mapelli, L; March, L; Marchand, J F; Marchese, F; Marchiori, G; Marcisovsky, M; Marin, A; Marino, C P; Marroquim, F; Marshall, R; Marshall, Z; Martens, F K; Marti-Garcia, S; Martin, A J; Martin, B; Martin, B; Martin, F F; Martin, J P; Martin, Ph; Martin, T A; Martin, V J; Martin Dit Latour, B; Martin-Haugh, S; Martinez, M; Martinez Outschoorn, V; Martyniuk, A C; Marx, M; Marzano, F; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massaro, G; Massol, N; Mastrandrea, P; 
Mastroberardino, A; Masubuchi, T; Mathes, M; Matricon, P; Matsumoto, H; Matsunaga, H; Matsushita, T; Mattravers, C; Maugain, J M; Maurer, J; Maxfield, S J; Maximov, D A; May, E N; Mayne, A; Mazini, R; Mazur, M; Mazzanti, M; Mazzoni, E; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McCubbin, N A; McFarlane, K W; Mcfayden, J A; McGlone, H; Mchedlidze, G; McLaren, R A; Mclaughlan, T; McMahon, S J; McPherson, R A; Meade, A; Mechnich, J; Mechtel, M; Medinnis, M; Meera-Lebbai, R; Meguro, T; Mehdiyev, R; Mehlhase, S; Mehta, A; Meier, K; Meirose, B; Melachrinos, C; Mellado Garcia, B R; Mendoza Navas, L; Meng, Z; Mengarelli, A; Menke, S; Menot, C; Meoni, E; Mercurio, K M; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Merritt, H; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer, J; Meyer, T C; Meyer, W T; Miao, J; Michal, S; Micu, L; Middleton, R P; Migas, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Miller, D W; Miller, R J; Mills, W J; Mills, C; Milov, A; Milstead, D A; Milstein, D; Minaenko, A A; Miñano Moya, M; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mirabelli, G; Miralles Verge, L; Misiejuk, A; Mitrevski, J; Mitrofanov, G Y; Mitsou, V A; Mitsui, S; Miyagawa, P S; Miyazaki, K; Mjörnmark, J U; Moa, T; Mockett, P; Moed, S; Moeller, V; Mönig, K; Möser, N; Mohapatra, S; Mohr, W; Mohrdieck-Möck, S; Moisseev, A M; Moles-Valls, R; Molina-Perez, J; Monk, J; Monnier, E; Montesano, S; Monticelli, F; Monzani, S; Moore, R W; Moorhead, G F; Mora Herrera, C; Moraes, A; Morange, N; Morel, J; Morello, G; Moreno, D; Moreno Llácer, M; Morettini, P; Morii, M; Morin, J; Morley, A K; Mornacchi, G; Morozov, S V; Morris, J D; Morvaj, L; Moser, H G; Mosidze, M; Moss, J; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Mudrinic, M; Mueller, F; Mueller, J; Mueller, K; Müller, T A; Mueller, T; Muenstermann, D; Muir, A; Munwes, Y; Murray, W J; Mussche, I; Musto, E; Myagkov, A G; Myska, M; Nadal, J; Nagai, K; 
Nagano, K; Nagarkar, A; Nagasaka, Y; Nagel, M; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Nanava, G; Napier, A; Narayan, R; Nash, M; Nation, N R; Nattermann, T; Naumann, T; Navarro, G; Neal, H A; Nebot, E; Nechaeva, P Yu; Neep, T J; Negri, A; Negri, G; Nektarijevic, S; Nelson, A; Nelson, S; Nelson, T K; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neusiedl, A; Neves, R M; Nevski, P; Newman, P R; Nguyen Thi Hong, V; Nickerson, R B; Nicolaidou, R; Nicolas, L; Nicquevert, B; Niedercorn, F; Nielsen, J; Niinikoski, T; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolaev, K; Nikolic-Audit, I; Nikolics, K; Nikolopoulos, K; Nilsen, H; Nilsson, P; Ninomiya, Y; Nisati, A; Nishiyama, T; Nisius, R; Nodulman, L; Nomachi, M; Nomidis, I; Nordberg, M; Nordkvist, B; Norton, P R; Novakova, J; Nozaki, M; Nozka, L; Nugent, I M; Nuncio-Quiroz, A-E; Nunes Hanninger, G; Nunnemann, T; Nurse, E; O'Brien, B J; O'Neale, S W; O'Neil, D C; O'Shea, V; Oakes, L B; Oakham, F G; Oberlack, H; Ocariz, J; Ochi, A; Oda, S; Odaka, S; Odier, J; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohshima, T; Ohshita, H; Ohsugi, T; Okada, S; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olcese, M; Olchevski, A G; Oliveira, M; Oliveira Damazio, D; Oliver Garcia, E; Olivito, D; Olszewski, A; Olszowska, J; Omachi, C; Onofre, A; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlov, I; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Osuna, C; Otero Y Garzon, G; Ottersbach, J P; Ouchrif, M; Ouellette, E A; Ould-Saada, F; Ouraou, A; Ouyang, Q; Ovcharova, A; Owen, M; Owen, S; Ozcan, V E; Ozturk, N; Pacheco Pages, A; Padilla Aranda, C; Pagan Griso, S; Paganis, E; Paige, F; Pais, P; Pajchel, K; Palacino, G; Paleari, C P; Palestini, S; Pallin, D; Palma, A; Palmer, J D; Pan, Y B; Panagiotopoulou, E; Panes, B; Panikashvili, N; Panitkin, S; Pantea, D; Panuskova, M; Paolone, V; Papadelis, A; Papadopoulou, Th D; Paramonov, A; Park, W; Parker, M A; Parodi, F; Parsons, J A; 
Parzefall, U; Pasqualucci, E; Passaggio, S; Passeri, A; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N; Pater, J R; Patricelli, S; Pauly, T; Pecsy, M; Pedraza Morales, M I; Peleganchuk, S V; Peng, H; Pengo, R; Penson, A; Penwell, J; Perantoni, M; Perez, K; Perez Cavalcanti, T; Perez Codina, E; Pérez García-Estañ, M T; Perez Reale, V; Perini, L; Pernegger, H; Perrino, R; Perrodo, P; Persembe, S; Perus, A; Peshekhonov, V D; Peters, K; Petersen, B A; Petersen, J; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petrolo, E; Petrucci, F; Petschull, D; Petteni, M; Pezoa, R; Phan, A; Phillips, P W; Piacquadio, G; Piccaro, E; Piccinini, M; Piec, S M; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pina, J; Pinamonti, M; Pinder, A; Pinfold, J L; Ping, J; Pinto, B; Pirotte, O; Pizio, C; Plamondon, M; Pleier, M-A; Pleskach, A V; Poblaguev, A; Poddar, S; Podlyski, F; Poggioli, L; Poghosyan, T; Pohl, M; Polci, F; Polesello, G; Policicchio, A; Polini, A; Poll, J; Polychronakos, V; Pomarede, D M; Pomeroy, D; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Portell Bueso, X; Posch, C; Pospelov, G E; Pospisil, S; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Prabhu, R; Pralavorio, P; Pranko, A; Prasad, S; Pravahan, R; Prell, S; Pretzl, K; Pribyl, L; Price, D; Price, J; Price, L E; Price, M J; Prieur, D; Primavera, M; Prokofiev, K; Prokoshin, F; Protopopescu, S; Proudfoot, J; Prudent, X; Przybycien, M; Przysiezniak, H; Psoroulas, S; Ptacek, E; Pueschel, E; Purdham, J; Purohit, M; Puzo, P; Pylypchenko, Y; Qian, J; Qian, Z; Qin, Z; Quadt, A; Quarrie, D R; Quayle, W B; Quinonez, F; Raas, M; Radescu, V; Radics, B; Radloff, P; Rador, T; Ragusa, F; Rahal, G; Rahimi, A M; Rahm, D; Rajagopalan, S; Rammensee, M; Rammes, M; Randle-Conde, A S; Randrianarivony, K; Ratoff, P N; Rauscher, F; Raymond, M; Read, A L; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Reichold, A; Reinherz-Aronis, E; Reinsch, A; 
Reisinger, I; Reljic, D; Rembser, C; Ren, Z L; Renaud, A; Renkel, P; Rescigno, M; Resconi, S; Resende, B; Reznicek, P; Rezvani, R; Richards, A; Richter, R; Richter-Was, E; Ridel, M; Rijpstra, M; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Rios, R R; Riu, I; Rivoltella, G; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robinson, M; Robson, A; Rocha de Lima, J G; Roda, C; Roda Dos Santos, D; Rodriguez, D; Roe, A; Roe, S; Røhne, O; Rojo, V; Rolli, S; Romaniouk, A; Romano, M; Romanov, V M; Romeo, G; Romero Adam, E; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, A; Rose, M; Rosenbaum, G A; Rosenberg, E I; Rosendahl, P L; Rosenthal, O; Rosselet, L; Rossetti, V; Rossi, E; Rossi, L P; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubinskiy, I; Ruckert, B; Ruckstuhl, N; Rud, V I; Rudolph, C; Rudolph, G; Rühr, F; Ruggieri, F; Ruiz-Martinez, A; Rumiantsev, V; Rumyantsev, L; Runge, K; Rurikova, Z; Rusakovich, N A; Rust, D R; Rutherfoord, J P; Ruwiedel, C; Ruzicka, P; Ryabov, Y F; Ryadovikov, V; Ryan, P; Rybar, M; Rybkin, G; Ryder, N C; Rzaeva, S; Saavedra, A F; Sadeh, I; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Sakamoto, H; Salamanna, G; Salamon, A; Saleem, M; Salihagic, D; Salnikov, A; Salt, J; Salvachua Ferrando, B M; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sampsonidis, D; Samset, B H; Sanchez, A; Sanchez Martinez, V; Sandaker, H; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, T; Sandoval, C; Sandstroem, R; Sandvoss, S; Sankey, D P C; Sansoni, A; Santamarina Rios, C; Santoni, C; Santonico, R; Santos, H; Saraiva, J G; Sarangi, T; Sarkisyan-Grinbaum, E; Sarri, F; Sartisohn, G; Sasaki, O; Sasao, N; Satsounkevitch, I; Sauvage, G; Sauvan, E; Sauvan, J B; Savard, P; Savinov, V; Savu, D O; Sawyer, L; Saxon, D H; Says, L P; Sbarra, C; Sbrizzi, A; Scallon, O; Scannicchio, D A; Scarcella, M; Schaarschmidt, J; Schacht, P; Schäfer, U; Schaepe, S; Schaetzel, S; Schaffer, A 
C; Schaile, D; Schamberger, R D; Schamov, A G; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Scherzer, M I; Schiavi, C; Schieck, J; Schioppa, M; Schlenker, S; Schlereth, J L; Schmidt, E; Schmieden, K; Schmitt, C; Schmitt, S; Schmitz, M; Schöning, A; Schott, M; Schouten, D; Schovancova, J; Schram, M; Schroeder, C; Schroer, N; Schuh, S; Schuler, G; Schultens, M J; Schultes, J; Schultz-Coulon, H-C; Schulz, H; Schumacher, J W; Schumacher, M; Schumm, B A; Schune, Ph; Schwanenberger, C; Schwartzman, A; Schwemling, Ph; Schwienhorst, R; Schwierz, R; Schwindling, J; Schwindt, T; Schwoerer, M; Scott, W G; Searcy, J; Sedov, G; Sedykh, E; Segura, E; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Selbach, K E; Seliverstov, D M; Sellden, B; Sellers, G; Seman, M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Seuster, R; Severini, H; Sevior, M E; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; Shank, J T; Shao, Q T; Shapiro, M; Shatalov, P B; Shaver, L; Shaw, K; Sherman, D; Sherwood, P; Shibata, A; Shichi, H; Shimizu, S; Shimojima, M; Shin, T; Shiyakova, M; Shmeleva, A; Shochet, M J; Short, D; Shrestha, S; Shulga, E; Shupe, M A; Sicho, P; Sidoti, A; Siegert, F; Sijacki, Dj; Silbert, O; Silva, J; Silver, Y; Silverstein, D; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simmons, B; Simonyan, M; Sinervo, P; Sinev, N B; Sipica, V; Siragusa, G; Sircar, A; Sisakyan, A N; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinnari, L A; Skottowe, H P; Skovpen, K; Skubic, P; Skvorodnev, N; Slater, M; Slavicek, T; Sliwa, K; Sloper, J; Smakhtin, V; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, B C; Smith, D; Smith, K M; Smizanska, M; Smolek, K; Snesarev, A A; Snow, S W; Snow, J; Snuverink, J; Snyder, S; Soares, M; Sobie, R; Sodomka, J; Soffer, A; Solans, C A; Solar, M; Solc, J; Soldatov, E; Soldevila, U; Solfaroli Camillocci, E; Solodkov, A A; Solovyanov, O V; Soni, N; Sopko, V; Sopko, B; Sosebee, M; Soualah, R; Soukharev, A; Spagnolo, 
S; Spanò, F; Spighi, R; Spigo, G; Spila, F; Spiwoks, R; Spousta, M; Spreitzer, T; Spurlock, B; St Denis, R D; Stahlman, J; Stamen, R; Stanecka, E; Stanek, R W; Stanescu, C; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staude, A; Stavina, P; Stavropoulos, G; Steele, G; Steinbach, P; Steinberg, P; Stekl, I; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stern, S; Stevenson, K; Stewart, G A; Stillings, J A; Stockton, M C; Stoerig, K; Stoicea, G; Stonjek, S; Strachota, P; Stradling, A R; Straessner, A; Strandberg, J; Strandberg, S; Strandlie, A; Strang, M; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Strong, J A; Stroynowski, R; Strube, J; Stugu, B; Stumer, I; Stupak, J; Sturm, P; Styles, N A; Soh, D A; Su, D; Subramania, Hs; Succurro, A; Sugaya, Y; Sugimoto, T; Suhr, C; Suita, K; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, X; Sundermann, J E; Suruliz, K; Sushkov, S; Susinno, G; Sutton, M R; Suzuki, Y; Suzuki, Y; Svatos, M; Sviridov, Yu M; Swedish, S; Sykora, I; Sykora, T; Szeless, B; Sánchez, J; Ta, D; Tackmann, K; Taffard, A; Tafirout, R; Taiblum, N; Takahashi, Y; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A; Tamsett, M C; Tanaka, J; Tanaka, R; Tanaka, S; Tanaka, S; Tanaka, Y; Tanasijczuk, A J; Tani, K; Tannoury, N; Tappern, G P; Tapprogge, S; Tardif, D; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tassi, E; Tatarkhanov, M; Tayalati, Y; Taylor, C; Taylor, F E; Taylor, G N; Taylor, W; Teinturier, M; Teixeira Dias Castanheira, M; Teixeira-Dias, P; Temming, K K; Ten Kate, H; Teng, P K; Terada, S; Terashi, K; Terron, J; Testa, M; Teuscher, R J; Thadome, J; Therhaag, J; Theveneaux-Pelzer, T; Thioye, M; Thoma, S; Thomas, J P; Thompson, E N; Thompson, P D; Thompson, P D; Thompson, A S; Thomson, E; Thomson, M; Thun, R P; Tian, F; Tibbetts, M J; Tic, T; Tikhomirov, V O; Tikhonov, Y A; Timoshenko, S; Tipton, P; Tique Aires Viegas, F J; Tisserant, S; Toczek, B; Todorov, 
T; Todorova-Nova, S; Toggerson, B; Tojo, J; Tokár, S; Tokunaga, K; Tokushuku, K; Tollefson, K; Tomoto, M; Tompkins, L; Toms, K; Tong, G; Tonoyan, A; Topfel, C; Topilin, N D; Torchiani, I; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Trinh, T N; Tripiana, M F; Trischuk, W; Trivedi, A; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiakiris, M; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsiskaridze, V; Tskhadadze, E G; Tsukerman, I I; Tsulaia, V; Tsung, J-W; Tsuno, S; Tsybychev, D; Tua, A; Tudorache, A; Tudorache, V; Tuggle, J M; Turala, M; Turecek, D; Turk Cakir, I; Turlay, E; Turra, R; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Tzanakos, G; Uchida, K; Ueda, I; Ueno, R; Ugland, M; Uhlenbrock, M; Uhrmacher, M; Ukegawa, F; Unal, G; Underwood, D G; Undrus, A; Unel, G; Unno, Y; Urbaniec, D; Usai, G; Uslenghi, M; Vacavant, L; Vacek, V; Vachon, B; Vahsen, S; Valenta, J; Valente, P; Valentinetti, S; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; van der Graaf, H; van der Kraaij, E; Van Der Leeuw, R; van der Poel, E; van der Ster, D; van Eldik, N; van Gemmeren, P; van Kesteren, Z; van Vulpen, I; Vanadia, M; Vandelli, W; Vandoni, G; Vaniachine, A; Vankov, P; Vannucci, F; Varela Rodriguez, F; Vari, R; Varnes, E W; Varouchas, D; Vartapetian, A; Varvell, K E; Vassilakopoulos, V I; Vazeille, F; Vegni, G; Veillet, J J; Vellidis, C; Veloso, F; Veness, R; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinek, E; Vinogradov, V B; Virchaux, M; Virzi, J; Vitells, O; Viti, M; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vlasov, N; Vogel, A; Vokac, P; Volpi, G; Volpi, M; 
Volpini, G; von der Schmitt, H; von Loeben, J; von Radziewski, H; von Toerne, E; Vorobel, V; Vorobiev, A P; Vorwerk, V; Vos, M; Voss, R; Voss, T T; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Wagner, W; Wagner, P; Wahlen, H; Wakabayashi, J; Walbersloh, J; Walch, S; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Wang, C; Wang, H; Wang, H; Wang, J; Wang, J; Wang, J C; Wang, R; Wang, S M; Warburton, A; Ward, C P; Warsinsky, M; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, A T; Waugh, B M; Weber, M; Weber, M S; Weber, P; Weidberg, A R; Weigell, P; Weingarten, J; Weiser, C; Wellenstein, H; Wells, P S; Wen, M; Wenaus, T; Wendler, S; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Werth, M; Wessels, M; Weydert, C; Whalen, K; Wheeler-Ellis, S J; Whitaker, S P; White, A; White, M J; Whitehead, S R; Whiteson, D; Whittington, D; Wicek, F; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilhelm, I; Wilkens, H G; Will, J Z; Williams, E; Williams, H H; Willis, W; Willocq, S; Wilson, J A; Wilson, M G; Wilson, A; Wingerter-Seez, I; Winkelmann, S; Winklmeier, F; Wittgen, M; Wolter, M W; Wolters, H; Wong, W C; Wooden, G; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wraight, K; Wright, C; Wright, M; Wrona, B; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wunstorf, R; Wynne, B M; Xella, S; Xiao, M; Xie, S; Xie, Y; Xu, C; Xu, D; Xu, G; Yabsley, B; Yacoob, S; Yamada, M; Yamaguchi, H; Yamamoto, A; Yamamoto, K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamaoka, J; Yamazaki, T; Yamazaki, Y; Yan, Z; Yang, H; Yang, U K; Yang, Y; Yang, Y; Yang, Z; Yanush, S; Yao, Y; Yasu, Y; Ybeles Smit, G V; Ye, J; Ye, S; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Young, C; Youssef, S; Yu, D; Yu, J; Yu, J; Yuan, L; Yurkewicz, A; Zabinski, B; Zaets, V G; Zaidan, R; Zaitsev, A M; Zajacova, 
Z; Zanello, L; Zarzhitsky, P; Zaytsev, A; Zeitnitz, C; Zeller, M; Zeman, M; Zemla, A; Zendler, C; Zenin, O; Ženiš, T; Zinonos, Z; Zenz, S; Zerwas, D; Zevi Della Porta, G; Zhan, Z; Zhang, D; Zhang, H; Zhang, J; Zhang, X; Zhang, Z; Zhao, L; Zhao, T; Zhao, Z; Zhemchugov, A; Zheng, S; Zhong, J; Zhou, B; Zhou, N; Zhou, Y; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhuravlov, V; Zieminska, D; Zimmermann, R; Zimmermann, S; Zimmermann, S; Ziolkowski, M; Zitoun, R; Živković, L; Zmouchko, V V; Zobernig, G; Zoccoli, A; Zolnierowski, Y; Zsenei, A; Zur Nedden, M; Zutshi, V; Zwalinski, L

    The top quark mass has been measured using the template method in the [Formula: see text] channel based on data recorded in 2011 with the ATLAS detector at the LHC. The data were taken at a proton-proton centre-of-mass energy of [Formula: see text] and correspond to an integrated luminosity of 1.04 fb^-1. The analyses in the e+jets and μ+jets decay channels yield consistent results. The top quark mass is measured to be m_top = 174.5 ± 0.6 (stat.) ± 2.3 (syst.) GeV.

  17. Occupational Component. 36-Level Courses. Teacher Resource Manual. Integrated Occupational Program.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Curriculum Branch.

    This 36-level occupational component of the Integrated Occupational Program (IOP) consists of 8 occupational clusters comprising 20 occupational courses. Each course contains learning activities through which students in Alberta (Canada) may develop occupational concepts, skills, and attitudes. This teacher's manual consists of the following sections:…

  18. TBIEM3D: A Computer Program for Predicting Ducted Fan Engine Noise. Version 1.1

    NASA Technical Reports Server (NTRS)

    Dunn, M. H.

    1997-01-01

    This document describes the usage of the ducted fan noise prediction program TBIEM3D (Thin duct - Boundary Integral Equation Method - 3 Dimensional). A scattering approach is adopted in which the acoustic pressure field is split into known incident and unknown scattered parts. The scattering of fan-generated noise by a finite length circular cylinder in a uniform flow field is considered. The fan noise is modeled by a collection of spinning point thrust dipoles. The program, based on a Boundary Integral Equation Method (BIEM), calculates circumferential modal coefficients of the acoustic pressure at user-specified field locations. The duct interior can be of the hard wall type or lined. The duct liner is axisymmetric, locally reactive, and can be uniform or axially segmented. TBIEM3D is written in the FORTRAN programming language. Input to TBIEM3D is minimal and consists of geometric and kinematic parameters. Discretization and numerical parameters are determined automatically by the code. Several examples are presented to demonstrate TBIEM3D capabilities.

  19. Edgewise Compression Testing of STIPS-0 (Structurally Integrated Thermal Protection System)

    NASA Technical Reports Server (NTRS)

    Brewer, Amy R.

    2011-01-01

    The Structurally Integrated Thermal Protection System (SITPS) task was initiated by the NASA Hypersonics Project under the Fundamental Aeronautics Program to develop a structural load-carrying thermal protection system for use in aerospace applications. The initial NASA concept for SITPS consists of high-temperature composite facesheets (outer and inner mold lines) with a lightweight insulated structural core. An edgewise compression test was performed on the SITPS-0 test article at room temperature using conventional instrumentation and methods in order to obtain panel-level mechanical properties and behavior. Three compression loadings (10, 20, and 37 kips) were applied to the SITPS-0 panel. The panel behavior was monitored using standard techniques and non-destructive evaluation methods such as photogrammetry and acoustic emission. The elastic modulus of the SITPS-0 panel was determined to be 1.146 x 10^6 psi, with a proportional limit at 1039 psi. Barrel-shaped bending of the panel and partial delamination of the inner mold line (IML) occurred under the final loading.

  20. Imaging Volcanoes with Gravity and Muon Tomography Measurements

    NASA Astrophysics Data System (ADS)

    Jourde, Kevin; Gibert, Dominique; Marteau, Jacques; Deroussi, Sébastien; Dufour, Fabrice; de Bremond d'Ars, Jean; Ianigro, Jean-Christophe; Gardien, Serge; Girerd, Claude

    2015-04-01

    Both muon tomography and gravimetry are geophysical methods that provide information on the density structure of the Earth's subsurface. Muon tomography measures the natural flux of cosmic muons and the attenuation produced by the screening effect of the rock mass being imaged. Gravimetry generally consists of measuring the vertical component of the local gravity field. Both methods are linearly related to density, but their spatial sensitivities are very different. Muon tomography works essentially like a medical X-ray scan, integrating density information along elongated narrow conical volumes, while gravimetry measurements are linked to density by a three-dimensional integral encompassing the whole studied domain. We show that gravity data are almost useless for constraining the density structure in regions sampled by more than two muon tomography acquisitions. Interestingly, the resolution in deeper regions not sampled by muon tomography is significantly improved by joining the two techniques. Examples taken from field experiments performed on La Soufrière of Guadeloupe volcano are discussed.
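
    The contrasting spatial sensitivities can be illustrated with a toy forward model (a hypothetical sketch, not the authors' inversion code): a muon datum integrates density along a single ray through the edifice, while a gravity datum sums Newtonian contributions from every cell of the model.

```python
import numpy as np

# Toy 2-D density model: uniform rock containing a dense anomaly (arbitrary units).
rho = np.ones((50, 50))
rho[20:30, 20:30] = 2.5  # dense block

# Muon-like datum: line integral of density along one vertical ray (column 25).
muon_opacity = rho[:, 25].sum()

# Gravity-like datum: vertical attraction at a station below the grid, summing
# rho * dz / r^3 over ALL cells (G and the cell volume are set to 1 here).
station_row, station_col = 55.0, 25.0  # hypothetical station position
rows, cols = np.indices(rho.shape)
dz = station_row - rows   # vertical distance to each cell
dx = station_col - cols   # horizontal distance
r2 = dz**2 + dx**2
gz = np.sum(rho * dz / r2**1.5)  # every cell contributes

print(muon_opacity, gz)
```

    The muon datum changes only if the anomaly intersects the ray, whereas gz responds (weakly) to density anywhere in the grid; this is why combining the two techniques constrains regions the muon rays never sample.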

  1. Modelling malaria control by introduction of larvivorous fish.

    PubMed

    Lou, Yijun; Zhao, Xiao-Qiang

    2011-10-01

    Malaria creates serious health and economic problems which call for integrated management strategies to disrupt interactions among mosquitoes, the parasite and humans. In order to reduce the intensity of malaria transmission, malaria vector control may be implemented to protect individuals against infective mosquito bites. As a sustainable larval control method, the use of larvivorous fish is promoted in some circumstances. To evaluate the potential impacts of this biological control measure on malaria transmission, we propose and investigate a mathematical model describing the linked dynamics between the host-vector interaction and the predator-prey interaction. The model, which consists of five ordinary differential equations, is rigorously analysed via theories and methods of dynamical systems. We derive four biologically plausible and insightful quantities (reproduction numbers) that completely determine the community composition. Our results suggest that the introduction of larvivorous fish can, in principle, have important consequences for malaria dynamics, but also indicate that this would require strong predators on larval mosquitoes. Integrated strategies of malaria control are analysed to demonstrate the biological application of our developed theory.
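
    The structure of such a model can be sketched as follows (a hypothetical five-state system in the spirit of the paper; the abstract does not give the actual equations, so every functional form and parameter value below is illustrative only):

```python
# Hypothetical 5-state sketch (NOT the paper's equations): larval mosquitoes L,
# adult mosquitoes A, larvivorous fish F, infectious human fraction Ih,
# infectious mosquitoes Im. All parameters are invented for illustration.

b, K, mL, e, p, c, mF = 10.0, 1000.0, 0.2, 0.1, 0.01, 0.2, 0.05
mA, bite, beta, gamma = 0.1, 0.3, 0.05, 0.1

def rhs(y):
    L, A, F, Ih, Im = y
    dL = b * A * (1 - L / K) - (mL + e) * L - p * L * F  # birth, loss, predation
    dA = e * L - mA * A                                  # maturation vs. death
    dF = c * p * L * F - mF * F                          # fish fed by larvae
    dIh = bite * beta * Im * (1 - Ih) - gamma * Ih       # human infection
    dIm = bite * beta * Ih * (A - Im) - mA * Im          # mosquito infection
    return [dL, dA, dF, dIh, dIm]

def euler(y0, T=50.0, dt=0.01):
    y = list(y0)
    for _ in range(int(T / dt)):
        y = [yi + dt * di for yi, di in zip(y, rhs(y))]
    return y

# Fish-free larval equilibrium and a reproduction-number-style invasion threshold.
L_star = K * (1 - (mL + e) * mA / (b * e))
R_fish = c * p * L_star / mF   # fish establish when this exceeds 1

with_fish = euler([50.0, 10.0, 5.0, 0.1, 1.0])
no_fish = euler([50.0, 10.0, 0.0, 0.1, 1.0])
print(R_fish, no_fish[1], with_fish[1])
```

    The quantity R_fish mirrors the paper's threshold analysis: in this toy system, fish establish and suppress the mosquito population only when their per-capita food intake at the fish-free larval equilibrium exceeds their mortality.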

  2. A PDA-based electrocardiogram/blood pressure telemonitor for telemedicine.

    PubMed

    Bolanos, Marcos; Nazeran, Homayoun; Gonzalez, Izzac; Parra, Ricardo; Martinez, Christopher

    2004-01-01

    An electrocardiogram (ECG)/blood pressure (BP) telemonitor integrating a broad range of electrical engineering concepts, devices, and methods was developed. This personal digital assistant based (PDA-based) system combined biopotential amplifiers, photoplethysmographic measurement of blood pressure, microcontroller devices, programming methods, wireless transmission, signal filtering and analysis, interfacing, and long-term (24-hour) memory devices to realize a state-of-the-art ECG/BP telemonitor. These instrumentation modules were developed and tested to produce a complete and compact system that could be deployed to assist in telemedicine applications and heart rate variability studies. The specific objective of the device was to facilitate long-term monitoring and recording of ECG and blood pressure signals. The device was able to acquire ECG/BP waveforms, transmit them wirelessly to a PDA, save them onto compact flash memory, and display them on the PDA's LCD screen. It was also capable of calculating the heart rate (HR) in beats per minute and providing systolic and diastolic blood pressure values.
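
    The heart-rate computation the device performs can be sketched as follows (an illustrative reconstruction, not the telemonitor's firmware; the threshold-based peak detector is a deliberately naive placeholder):

```python
# Illustrative heart-rate computation from ECG R-peaks.

def detect_r_peaks(ecg, threshold=0.5):
    """Return indices of local maxima exceeding the threshold."""
    return [i for i in range(1, len(ecg) - 1)
            if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]]

def heart_rate_bpm(ecg, fs, threshold=0.5):
    peaks = detect_r_peaks(ecg, threshold)
    if len(peaks) < 2:
        return None
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]  # R-R intervals (s)
    return 60.0 / (sum(rr) / len(rr))

# Synthetic ECG: a unit spike every 200 samples at fs = 250 Hz,
# i.e. an R-R interval of 0.8 s, which should give 75 bpm.
fs = 250
ecg = [0.0] * 2000
for k in range(200, 2000, 200):
    ecg[k] = 1.0
hr = heart_rate_bpm(ecg, fs)
print(hr)
```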

  3. Face-Referenced Measurement of Perioral Stiffness and Speech Kinematics in Parkinson's Disease

    PubMed Central

    Barlow, Steven M.; Lee, Jaehoon

    2015-01-01

    Purpose Perioral biomechanics, labial kinematics, and associated electromyographic signals were sampled and characterized in individuals with Parkinson's disease (PD) as a function of medication state. Method Passive perioral stiffness was sampled using the OroSTIFF system in 10 individuals with PD in a medication ON and a medication OFF state and compared to 10 matched controls. Perioral stiffness, derived as the quotient of resultant force and interoral angle span, was modeled with regression techniques. Labial movement amplitudes and integrated electromyograms from select lip muscles were evaluated during syllable production using a 4-D computerized motion capture system. Results Multilevel regression modeling showed greater perioral stiffness in patients with PD, consistent with the clinical correlate of rigidity. In the medication-OFF state, individuals with PD manifested greater integrated electromyogram levels for the orbicularis oris inferior compared to controls, which increased further after consumption of levodopa. Conclusions This study illustrates the application of biomechanical, electrophysiological, and kinematic methods to better understand the pathophysiology of speech motor control in PD. PMID:25629806
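
    The stiffness quotient described in the Method can be illustrated numerically (the values below are made up for illustration, not study data):

```python
# Illustrative perioral stiffness quotient: resultant force divided by
# interoral angle span, per trial, then averaged. Invented values only.
forces = [0.11, 0.19, 0.31, 0.42]    # resultant force (N)
angles = [2.0, 4.0, 6.0, 8.0]        # interoral angle span (degrees)

stiffness = [f / a for f, a in zip(forces, angles)]
mean_stiffness = sum(stiffness) / len(stiffness)
print(mean_stiffness)   # N/degree
```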

  4. Fully automated adipose tissue measurement on abdominal CT

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Sussman, Daniel L.; Summers, Ronald M.

    2011-03-01

    Obesity has become widespread in America and has been associated as a risk factor for many illnesses. Adipose tissue (AT) content, especially visceral AT (VAT), is an important indicator of risk for many disorders, including heart disease and diabetes. Measuring AT by traditional means is often unreliable and inaccurate. CT provides a means to measure AT accurately and consistently. We present a fully automated method to segment and measure abdominal AT in CT. Our method integrates image preprocessing that attempts to correct for image artifacts and inhomogeneities. We use fuzzy c-means to cluster AT regions and active contour models to separate subcutaneous and visceral AT. We tested our method on 50 abdominal CT scans and evaluated the correlations between several measurements.
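
    The fuzzy c-means step can be sketched in a few lines (the standard Bezdek update rules with fuzzifier m = 2, applied to 1-D intensities; a generic illustration, not the authors' pipeline, and the intensity values are invented):

```python
import numpy as np

# Minimal fuzzy c-means sketch (standard Bezdek updates, fuzzifier m = 2).

def fuzzy_cmeans(x, c=2, m=2.0, iters=50):
    centers = np.linspace(x.min(), x.max(), c)   # simple deterministic init
    u = np.full((c, len(x)), 1.0 / c)
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))         # u_ik proportional to d^(-2/(m-1))
        u /= u.sum(axis=0)                       # memberships sum to 1 per pixel
        um = u ** m
        centers = um @ x / um.sum(axis=1)        # membership-weighted means
    return centers, u

# Two invented intensity populations, e.g. fat-like (~ -100) and soft tissue (~ 40).
x = np.array([-105.0, -100.0, -95.0, 35.0, 40.0, 45.0])
centers, u = fuzzy_cmeans(x)
print(np.sort(centers))
```

    Unlike hard k-means, each pixel keeps a graded membership in every cluster, which is useful for the partial-volume intensities typical of CT.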

  5. Improved Hierarchical Optimization-Based Classification of Hyperspectral Images Using Shape Analysis

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2012-01-01

    A new spectral-spatial method for classification of hyperspectral images is proposed. The HSegClas method is based on the integration of probabilistic classification and shape analysis within the hierarchical step-wise optimization algorithm. First, probabilistic support vector machines classification is applied. Then, at each iteration the two neighboring regions with the smallest Dissimilarity Criterion (DC) are merged, and classification probabilities are recomputed. An important contribution of this work is the estimation of the DC between regions as a function of statistical, classification, and geometrical (area and rectangularity) features. Experimental results are presented on a 102-band ROSIS image of the Center of Pavia, Italy. The developed approach yields more accurate classification results compared with previously proposed methods.
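
    The hierarchical step-wise merging loop can be sketched as follows (a deliberately simplified stand-in: here the DC is just the absolute difference of region means over a 1-D strip of pixels, whereas the paper combines statistical, classification, and geometrical features):

```python
# Sketch of hierarchical step-wise region merging: repeatedly merge the pair of
# neighboring regions with the smallest dissimilarity criterion (DC).

def merge_regions(values, target):
    """Each region is (sum, count); merge adjacent regions until `target` remain."""
    regions = [(v, 1) for v in values]
    while len(regions) > target:
        means = [s / n for s, n in regions]
        # adjacent pair with the smallest DC (here: difference of means)
        i = min(range(len(regions) - 1), key=lambda k: abs(means[k] - means[k + 1]))
        s1, n1 = regions[i]
        s2, n2 = regions[i + 1]
        regions[i:i + 2] = [(s1 + s2, n1 + n2)]
    return [s / n for s, n in regions]

print(merge_regions([1.0, 1.1, 0.9, 5.0, 5.2, 4.9], target=2))
```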

  6. Enhanced Conformational Sampling Using Replica Exchange with Collective-Variable Tempering.

    PubMed

    Gil-Ley, Alejandro; Bussi, Giovanni

    2015-03-10

    The computational study of conformational transitions in RNA and proteins with atomistic molecular dynamics often requires suitable enhanced sampling techniques. Here we introduce a novel method in which concurrent metadynamics is integrated into a Hamiltonian replica-exchange scheme. The ladder of replicas is built with different strengths of the bias potential, exploiting the tunability of well-tempered metadynamics. Using this method, free-energy barriers of individual collective variables are significantly reduced compared with simple force-field scaling. The introduced methodology is flexible and allows adaptive bias potentials to be self-consistently constructed for a large number of simple collective variables, such as distances and dihedral angles. The method is tested on alanine dipeptide and applied to the difficult problem of conformational sampling in a tetranucleotide.
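
    The exchange move between two rungs of the ladder can be sketched with the generic Metropolis criterion for Hamiltonian replica exchange (the quadratic bias potentials below are invented stand-ins for the per-replica well-tempered bias; this is not the authors' implementation):

```python
import math

# Generic Metropolis criterion for swapping configurations between two replicas
# that share a temperature but carry different bias potentials V_i of a
# collective variable s.

def swap_probability(beta, Vi, Vj, si, sj):
    """Acceptance probability for exchanging configurations si <-> sj."""
    delta = beta * ((Vi(sj) + Vj(si)) - (Vi(si) + Vj(sj)))
    return min(1.0, math.exp(-delta))

# Replica i: full-strength bias; replica j: half-strength bias (toy "ladder").
Vi = lambda s: 2.0 * (s - 1.0) ** 2
Vj = lambda s: 1.0 * (s - 1.0) ** 2

p = swap_probability(beta=1.0, Vi=Vi, Vj=Vj, si=0.0, sj=1.0)
print(p)
```

    Because all replicas share the physical potential, it cancels in delta and only the bias terms enter the acceptance test.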

  7. Sampling the isothermal-isobaric ensemble by Langevin dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Xingyu (Institute of Applied Physics and Computational Mathematics, Fenghao East Road 2, Beijing 100094; CAEP Software Center for High Performance Numerical Simulation, Huayuan Road 6, Beijing 100088)

    2016-03-28

    We present a new method of conducting fully flexible-cell molecular dynamics simulation in the isothermal-isobaric ensemble based on Langevin equations of motion. The stochastic coupling to all particle and cell degrees of freedom is introduced in a correct way, in the sense that the stationary configurational distribution is proved to be consistent with that of the isothermal-isobaric ensemble. In order to apply the proposed method in computer simulations, a second-order symmetric numerical integration scheme is developed by Trotter's splitting of the single-step propagator. Moreover, a practical guide for choosing working parameters is suggested for user-specified thermo- and baro-coupling time scales. The method and software implementation are carefully validated by a numerical example.
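
    The idea of a second-order symmetric (Trotter-split) integration step can be illustrated with the common BAOAB scheme for a single 1-D Langevin particle (illustrative only; the integrator in the paper additionally propagates the simulation cell):

```python
import math, random

# BAOAB step for 1-D Langevin dynamics: a symmetric Trotter splitting of the
# propagator into B (half kick), A (half drift), O (exact Ornstein-Uhlenbeck
# friction/noise), A, B.

def baoab_step(x, v, force, dt, gamma, kT, mass=1.0, rng=random):
    v += 0.5 * dt * force(x) / mass                 # B: half kick
    x += 0.5 * dt * v                               # A: half drift
    c1 = math.exp(-gamma * dt)                      # O: exact OU update
    c2 = math.sqrt((1.0 - c1 * c1) * kT / mass)
    v = c1 * v + c2 * rng.gauss(0.0, 1.0)
    x += 0.5 * dt * v                               # A: half drift
    v += 0.5 * dt * force(x) / mass                 # B: half kick
    return x, v

# Harmonic oscillator; with gamma = 0 the O substep is the identity and the
# scheme reduces to velocity Verlet, so energy should be (nearly) conserved.
force = lambda x: -x
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = baoab_step(x, v, force, dt=0.01, gamma=0.0, kT=1.0)
energy = 0.5 * v * v + 0.5 * x * x
print(energy)
```

    The symmetric ordering of the substeps is what makes the scheme second order, mirroring the role of the symmetric Trotter splitting in the paper.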

  8. Augmented reality & gesture-based architecture in games for the elderly.

    PubMed

    McCallum, Simon; Boletsis, Costas

    2013-01-01

    Serious games for health and, more specifically, for elderly people have developed rapidly in recent years. The recent popularization of novel interaction methods of consoles, such as the Nintendo Wii and Microsoft Kinect, has provided an opportunity for the elderly to engage in computer and video games. These interaction methods, however, still present various challenges for elderly users. To address these challenges, we propose an architecture consisting of Augmented Reality (as an output mechanism) combined with gesture-based devices (as an input method). The intention of this work is to provide a theoretical justification for using these technologies and to integrate them into an architecture, acting as a basis for potentially creating suitable interaction techniques for elderly players.

  9. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. 
The third element is the description of four integration strategies, and the last element consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.

  10. An integrated method for atherosclerotic carotid plaque segmentation in ultrasound image.

    PubMed

    Qian, Chunjun; Yang, Xiaoping

    2018-01-01

    Carotid artery atherosclerosis is an important cause of stroke, and ultrasound imaging is widely used in the diagnosis of atherosclerosis. Segmenting atherosclerotic carotid plaque in ultrasound images is therefore an important task, and accurate plaque segmentation is helpful for measuring carotid plaque burden. In this paper, we propose and evaluate a novel learning-based integrated framework for plaque segmentation. In our study, four different classification algorithms, together with the auto-context iterative algorithm, were employed to integrate features from ultrasound images with iteratively estimated and refined probability maps for pixel-wise classification. The four classification algorithms were support vector machine with linear kernel, support vector machine with radial basis function kernel, AdaBoost, and random forest. The plaque segmentation was carried out on the generated probability map. The performance of the four learning-based plaque segmentation methods was tested on 29 B-mode ultrasound images. The evaluation indices for our proposed methods consisted of sensitivity, specificity, Dice similarity coefficient, overlap index, error of area, absolute error of area, point-to-point distance, and Hausdorff point-to-point distance, along with the area under the ROC curve. The segmentation method integrating random forest and an auto-context model obtained the best results (sensitivity 80.4 ± 8.4%, specificity 96.5 ± 2.0%, Dice similarity coefficient 81.0 ± 4.1%, overlap index 68.3 ± 5.8%, error of area -1.02 ± 18.3%, absolute error of area 14.7 ± 10.9%, point-to-point distance 0.34 ± 0.10 mm, Hausdorff point-to-point distance 1.75 ± 1.02 mm, and area under the ROC curve 0.897), results that are nearly the best among existing methods. 
Our proposed learning-based integrated framework could be useful for atherosclerotic carotid plaque segmentation and hence for the measurement of carotid plaque burden. Copyright © 2017 Elsevier B.V. All rights reserved.
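    The auto-context loop described above, pairing a pixel-wise classifier with an iteratively refined probability map, can be sketched as follows. This is not the authors' implementation; it is a minimal illustration on synthetic per-pixel feature vectors, with scikit-learn's RandomForestClassifier standing in for the four classifiers compared in the paper.

```python
# Minimal auto-context sketch (synthetic data; illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in for per-pixel image features and ground-truth plaque labels.
n_pixels, n_features = 1000, 5
X_img = rng.normal(size=(n_pixels, n_features))
y = (X_img[:, 0] + 0.3 * rng.normal(size=n_pixels) > 0).astype(int)

prob_map = np.full(n_pixels, 0.5)           # uninformative initial probability map
for iteration in range(3):                  # auto-context iterations
    X_aug = np.column_stack([X_img, prob_map])   # image features + context feature
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X_aug, y)
    prob_map = clf.predict_proba(X_aug)[:, 1]    # refined probability map

segmentation = (prob_map > 0.5).astype(int)      # threshold the final map
accuracy = (segmentation == y).mean()
print(round(accuracy, 2))
```

    At each iteration the previous probability map is appended as an extra feature, so later classifiers can exploit the context of earlier predictions; the final segmentation is a threshold of the last probability map.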

  11. Development of an integrated staircase lift for home access.

    PubMed

    Mattie, Johanne L; Borisoff, Jaimie F; Leland, Danny; Miller, William C

    2015-12-01

    Stairways into buildings present a significant environmental barrier for those with mobility impairments, including older adults. A number of home access solutions exist that allow users to safely enter and exit the home; however, these all have limitations. The purpose of this work was to develop a novel, inclusive home access solution that integrates a staircase and a lift into one device. The development of an integrated staircase lift followed a structured protocol, with stakeholders providing feedback at various stages in the design process, consistent with rehabilitation engineering design methods. A novel home access device was developed. The integrated staircase-lift has the following features: inclusivity, through a universal design that offers the choice of using either the stairs or a lift; constant availability, with a lift platform always ready for use on either level; and potential aesthetic advantages when the device is integrated into an existing home. The design also allows for emergency descent during a power outage and for self-powered versions. By engaging stakeholders in a user-centred design process, insight into the limitations of existing home access solutions and specific feedback on our design guided the development of a novel home access device.

  12. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.

  13. Earth Systems Science in an Integrated Science Content and Methods Course for Elementary Education Majors

    NASA Astrophysics Data System (ADS)

    Madsen, J. A.; Allen, D. E.; Donham, R. S.; Fifield, S. J.; Shipman, H. L.; Ford, D. J.; Dagher, Z. R.

    2004-12-01

    With funding from the National Science Foundation, we have designed an integrated science content and methods course for sophomore-level elementary teacher education (ETE) majors. This course, the Science Semester, is a 15-credit sequence that consists of three science content courses (Earth, Life, and Physical Science) and a science teaching methods course. The goal of this integrated science and education methods curriculum is to foster holistic understandings of science and pedagogy that future elementary teachers need to effectively use inquiry-based approaches in teaching science in their classrooms. During the Science Semester, traditional subject matter boundaries are crossed to stress shared themes that teachers must understand to teach standards-based elementary science. Exemplary approaches that support both learning science and learning how to teach science are used. In the science courses, students work collaboratively on multidisciplinary problem-based learning (PBL) activities that place science concepts in authentic contexts and build learning skills. In the methods course, students critically explore the theory and practice of elementary science teaching, drawing on their shared experiences of inquiry learning in the science courses. An earth system science approach is ideally adapted for the integrated, inquiry-based learning that takes place during the Science Semester. The PBL investigations that are the hallmark of the Science Semester provide the backdrop through which fundamental earth system interactions can be studied. For example in the PBL investigation that focuses on energy, the carbon cycle is examined as it relates to fossil fuels. In another PBL investigation centered on kids, cancer, and the environment, the hydrologic cycle with emphasis on surface runoff and ground water contamination is studied. 
In a PBL investigation in which students learn about the Delaware Bay ecosystem through the story of the horseshoe crab and the biome that swirls around this remarkable arthropod, students are exposed to interactions among the hydrosphere, atmosphere, and geosphere, and examine ways in which climate change can affect this ecosystem.

  14. Two-loop renormalization of the quark propagator in the light-cone gauge

    NASA Astrophysics Data System (ADS)

    Williams, James Daniel

    The divergent parts of the five two-loop quark self-energy diagrams of quantum chromodynamics are evaluated in the noncovariant light-cone gauge. Most of the Feynman integrals are computed by means of the powerful matrix integration method, originally developed for the author's Master's thesis. From the results of the integrations, it is shown how to renormalize the quark mass and wave function in such a way that the effective quark propagator is rendered finite at two-loop order. The required counterterms turn out to be local functions of the quark momentum, owing to cancellation of the nonlocal divergent parts of the two-loop integrals against equal and opposite contributions from one-loop counterterm subtraction diagrams. The final form of the counterterms is consistent with the renormalization framework proposed by Bassetto, Dalbosco, and Soldati, in which all noncovariant divergences are absorbed into the wave function normalizations. It also turns out that the mass renormalization δm is the same in the light-cone gauge as in a general covariant gauge, at least up to two-loop order.

  15. Atomistic potentials based energy flux integral criterion for dynamic adiabatic shear banding

    NASA Astrophysics Data System (ADS)

    Xu, Yun; Chen, Jun

    2015-02-01

    The energy flux integral criterion based on atomistic potentials within the framework of hyperelasticity-plasticity is proposed for dynamic adiabatic shear banding (ASB). Decomposition of the system Helmholtz energy reveals that the dynamic influence on the integral path dependence originates from the volumetric strain energy and part of the deviatoric strain energy, while the plastic influence originates only from the remaining deviatoric strain energy. The concept of critical shear banding energy is suggested for describing the initiation of ASB; it consists of the dynamic recrystallization (DRX) threshold energy and the thermal softening energy. The criterion directly relates energy flux to the basic physical processes that induce shear instability, such as dislocation nucleation and multiplication, without introducing ad hoc parameters in empirical constitutive models. It reduces to the classical path-independent J-integral for quasi-static loading and elastic solids. The atomistic-to-continuum multiscale coupling method is used to simulate the initiation of ASB. Atomic configurations indicate that DRX-induced microstructural softening may be essential to dynamic shear localization and hence to the initiation of ASB.

  16. Improvement of Students’ Environmental Literacy by Using Integrated Science Teaching Materials

    NASA Astrophysics Data System (ADS)

    Suryanti, D.; Sinaga, P.; Surakusumah, W.

    2018-02-01

    This study aims to determine the improvement of students' environmental literacy through the use of integrated science teaching materials on pollution topics. The research used a weak-experiment method with a one-group pre-test/post-test design. The sample consisted of 32 seventh-grade junior high school students in Bandung. Data were collected with an environmental literacy test instrument covering four components of environmental literacy: (1) knowledge, (2) competencies (cognitive skills), (3) affect, and (4) environmentally responsible behaviour. The results show that the students' environmental literacy improved after using the integrated science teaching materials. Medium-category gains occurred in knowledge (N-gain = 46%) and cognitive skills (N-gain = 31%), while low-category gains occurred in the affective component (N-gain = 25%) and behaviour (N-gain = 24%). Overall, the improvement of students' environmental literacy using the integrated science teaching materials is in the medium category (N-gain = 34%).
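    The N-gain figures quoted above follow Hake's normalized gain, the fraction of the possible improvement actually achieved between pre-test and post-test. A one-line sketch (the pre/post scores below are hypothetical, chosen only to reproduce a medium-category gain):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

# Illustrative (hypothetical) pre/post mean scores reproducing a medium gain.
g = normalized_gain(pre=40.0, post=60.4)
print(round(g, 2))  # 0.34, i.e. N-gain = 34%; medium category is 0.3 <= g < 0.7
```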

  17. Validation of two innovative methods to measure contaminant mass flux in groundwater

    NASA Astrophysics Data System (ADS)

    Goltz, Mark N.; Close, Murray E.; Yoon, Hyouk; Huang, Junqi; Flintoft, Mark J.; Kim, Sehjong; Enfield, Carl

    2009-04-01

    The ability to quantify the mass flux of a groundwater contaminant that is leaching from a source area is critical to enable us to: (1) evaluate the risk posed by the contamination source and prioritize cleanup, (2) evaluate the effectiveness of source remediation technologies or natural attenuation processes, and (3) quantify a source term for use in models that may be applied to predict maximum contaminant concentrations in downstream wells. Recently, a number of new methods have been developed and subsequently applied to measure contaminant mass flux in groundwater in the field. However, none of these methods has been validated at larger than the laboratory scale through a comparison of the measured mass flux with a known flux introduced into flowing groundwater. Two innovative flux measurement methods, the tandem circulation well (TCW) and modified integral pumping test (MIPT) methods, have recently been proposed. The TCW method can measure mass flux integrated over a large subsurface volume without extracting water, and may be implemented using two different techniques. One technique, the multi-dipole technique, is relatively simple and inexpensive, requiring only measurement of heads, while the second technique requires conducting a tracer test. The MIPT method is an easily implemented method of obtaining volume-integrated flux measurements. In the current study, flux measurements obtained using these two methods are compared with known mass fluxes in a three-dimensional artificial aquifer. Experiments in the artificial aquifer show that the TCW multi-dipole and tracer test techniques accurately estimated flux, to within 2% and 16%, respectively, although the good results obtained using the multi-dipole technique may be fortuitous. The MIPT method was not as accurate as the TCW method, underestimating flux by as much as 70%. 
MIPT method inaccuracies may be due to the fact that the method's assumptions (two-dimensional steady groundwater flow to fully screened wells) were not well approximated. While fluxes measured using the MIPT method were consistently underestimated, the method's simplicity and field applicability may compensate for the inaccuracies observed in this artificial aquifer test.

  18. Decentralized adaptive control

    NASA Technical Reports Server (NTRS)

    Oh, B. J.; Jamshidi, M.; Seraji, H.

    1988-01-01

    A decentralized adaptive control is proposed to stabilize and track nonlinear, interconnected subsystems with unknown parameters. The adaptation of the controller gain is derived using model reference adaptive control theory based on Lyapunov's direct method. The adaptive gains consist of sigma, proportional, and integral combinations of the measured and reference values of the corresponding subsystem. The proposed control is applied to the joint control of a two-link robot manipulator, and the performance observed in computer simulation agrees with the theoretical development.
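    The gain structure described, a sigma-modified integral adaptation law plus a proportional term driven by tracking error, can be illustrated with a scalar sketch. This is not the paper's two-link manipulator controller; all plant, reference-model, and adaptation parameters below are assumed values for a toy first-order system.

```python
# Toy scalar sketch of a sigma/proportional/integral adaptive gain
# (assumed parameters throughout; not the paper's controller).
dt = 0.001
a, b = -1.0, 1.0            # "unknown" first-order plant: x' = a*x + b*u
am, bm = -4.0, 4.0          # stable reference model:      xm' = am*xm + bm*r
gamma, sigma = 50.0, 0.1    # integral adaptation rate and sigma-leakage
kp = 5.0                    # proportional adaptation weight

x = xm = 0.0
theta_i = 0.0               # integral part of the adaptive gain
for _ in range(20000):      # 20 s of simulated time, forward Euler
    r = 1.0                                             # step reference
    e = x - xm                                          # tracking error
    theta_i += dt * (-gamma * e * r - sigma * theta_i)  # sigma-modified integral law
    theta = theta_i - kp * e * r                        # plus proportional term
    u = theta * r                                       # adaptive control input
    x += dt * (a * x + b * u)
    xm += dt * (am * xm + bm * r)

print(round(abs(x - xm), 3))  # small residual tracking error
```

    The sigma-leakage term keeps the integral gain bounded at the cost of a small steady-state tracking error, which is the usual robustness trade-off in sigma-modified adaptive laws.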

  19. Methodology for testing and validating knowledge bases

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.
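    Premise (3), that consistency conditions become structural properties checkable by graph algorithms, can be illustrated with a toy example: if knowledge-base rules are read as a dependency graph, a circular definition shows up as a directed cycle. The rule names below are hypothetical.

```python
# Hypothetical sketch: casting a consistency condition as a graph property.
# Rules "A depends on B" form a directed graph; circular definitions are cycles.

def has_cycle(graph):
    """Detect a cycle in a directed graph given as {node: [successors]}."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}

    def dfs(n):
        color[n] = GRAY
        for m in graph.get(n, []):
            if color.get(m, WHITE) == GRAY:       # back edge -> cycle found
                return True
            if color.get(m, WHITE) == WHITE and dfs(m):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in graph)

acyclic_kb = {"diagnosis": ["symptom"], "symptom": ["sensor"], "sensor": []}
circular_kb = {"a": ["b"], "b": ["c"], "c": ["a"]}
print(has_cycle(acyclic_kb), has_cycle(circular_kb))  # False True
```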

  20. All-optical Integrated Switches Based on Azo-benzene Liquid Crystals on Silicon

    DTIC Science & Technology

    2011-11-01

    [Figure/table residue; recoverable content: refractive indices at 1.55 µm of the materials used, E7 nematic liquid crystal n∥ = 1.689, n⊥ = 1.502; SU8 polymer n = 1.575; D263 glass n = 1.516. The nonlinear liquid-crystal waveguide consists of a rectangular hollow realized in SU8 photoresist between two glass substrates. Report sections cover all-optical polymeric waveguides: methods, assumptions, procedure, results, and discussion.]

  1. Integrated Real Time Contamination Monitor IRTCM

    NASA Technical Reports Server (NTRS)

    Luttges, W. E.

    1976-01-01

    Engineering and design work was performed on a monitoring device for particulate and gas contamination to be used in the space shuttle cargo area during launch, at altitudes up to 50 km, and during the return phases of the flight. The gas sampling device consists of ampules filled with specific absorber materials, which are opened and/or sealed at preprogrammed intervals. The design eliminates the use of valves, which, according to experiments, never seal properly under hard vacuum. Methods of analysis, including in-flight measurement possibilities, are discussed.

  2. Impurity self-energy in the strongly-correlated Bose systems

    NASA Astrophysics Data System (ADS)

    Panochko, Galyna; Pastukhov, Volodymyr; Vakarchuk, Ivan

    2018-02-01

    We propose a nonperturbative scheme for calculating the impurity spectrum in a Bose system at zero temperature. The method is based on the path-integral formulation and describes the impurity as a zero-density ideal Fermi gas interacting with a Bose system whose action is written in terms of density fluctuations. For the example of a 3He atom immersed in liquid helium-4, good consistency with experimental data and with the results of Monte Carlo simulations is shown.

  3. Effectiveness of Teamwork in an Integrated Care Setting for Patients with COPD: Development and Testing of a Self-Evaluation Instrument for Interprofessional Teams

    PubMed Central

    Van Dijk-de Vries, Anneke N.; Duimel-Peeters, Inge G. P.; Muris, Jean W.; Wesseling, Geertjan J.; Beusmans, George H. M. I.

    2016-01-01

    Introduction: Teamwork between healthcare providers is conditional for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing of the Integrated Team Effectiveness Instrument. Theory and methods: Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach’s alpha. Results: The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach’s alpha between 0.76 and 0.81). Conclusions and discussion: The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument. PMID:27616953

  4. Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme

    NASA Astrophysics Data System (ADS)

    Liu, Xianglin; Wang, Yang; Eisenbach, Markus; Stocks, G. Malcolm

    2018-03-01

    The Green function plays an essential role in the Korringa-Kohn-Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn-Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behaviors of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method that performs ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. By using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrate that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the pathology problem of the charge density. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.

  5. Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xianglin; Wang, Yang; Eisenbach, Markus

    The Green function plays an essential role in the Korringa–Kohn–Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn–Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behaviors of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method to perform ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. Here, by using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrated that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the pathology problem of the charge density. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.

  6. Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme

    DOE PAGES

    Liu, Xianglin; Wang, Yang; Eisenbach, Markus; ...

    2017-10-28

    The Green function plays an essential role in the Korringa–Kohn–Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn–Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behaviors of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method to perform ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. Here, by using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrated that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the pathology problem of the charge density. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.

  7. Automated segmentation and geometrical modeling of the tricuspid aortic valve in 3D echocardiographic images.

    PubMed

    Pouch, Alison M; Wang, Hongzhi; Takabe, Manabu; Jackson, Benjamin M; Sehgal, Chandra M; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A

    2013-01-01

    The aortic valve has been described with variable anatomical definitions, and the consistency of 2D manual measurement of valve dimensions in medical image data has been questionable. Given the importance of image-based morphological assessment in the diagnosis and surgical treatment of aortic valve disease, there is considerable need to develop a standardized framework for 3D valve segmentation and shape representation. Towards this goal, this work integrates template-based medial modeling and multi-atlas label fusion techniques to automatically delineate and quantitatively describe aortic leaflet geometry in 3D echocardiographic (3DE) images, a challenging task that has been explored only to a limited extent. The method makes use of expert knowledge of aortic leaflet image appearance, generates segmentations with consistent topology, and establishes a shape-based coordinate system on the aortic leaflets that enables standardized automated measurements. In this study, the algorithm is evaluated on 11 3DE images of normal human aortic leaflets acquired at mid systole. The clinical relevance of the method is its ability to capture leaflet geometry in 3DE image data with minimal user interaction while producing consistent measurements of 3D aortic leaflet geometry.
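    The multi-atlas label fusion step mentioned above can be illustrated in miniature: candidate segmentations transferred from several registered atlases are combined by per-pixel weighted voting. This sketch uses plain (optionally weighted) majority voting on tiny binary masks and is not the authors' fusion algorithm.

```python
# Hypothetical sketch of the label-fusion idea: combine candidate segmentations
# from several registered atlases by per-pixel (weighted) voting.
import numpy as np

def fuse_labels(atlas_labels, weights=None):
    """Majority-vote fusion of binary atlas segmentations (n_atlases, n_pixels)."""
    atlas_labels = np.asarray(atlas_labels, dtype=float)
    if weights is None:
        weights = np.ones(len(atlas_labels))
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    consensus = weights @ atlas_labels          # weighted average of the votes
    return (consensus >= 0.5).astype(int)

atlases = [[1, 1, 0, 0],
           [1, 0, 0, 1],
           [1, 1, 0, 0]]
print(fuse_labels(atlases))  # [1 1 0 0]
```

    In practice the weights would come from local image similarity between each atlas and the target image, so more trustworthy atlases dominate the vote.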

  8. Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.

    PubMed

    Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P

    2015-10-01

    The human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling these dynamics is numerically complicated by the lack of information on lung elastic behavior and on fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with a spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from a four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and the 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data, which collectively provide a consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
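    The Tikhonov-regularized coupling of the two displacement fields can be caricatured per voxel as a weighted least-squares compromise between the DIR and CFD estimates. The closed form below is an illustrative simplification, not the paper's actual TR formulation; `lam` is a hypothetical regularization weight.

```python
# Hypothetical sketch of Tikhonov-style fusion of two displacement estimates:
# minimize ||u - u_dir||^2 + lam * ||u - u_cfd||^2 per voxel, whose closed-form
# solution is u = (u_dir + lam * u_cfd) / (1 + lam).
import numpy as np

def fuse_displacements(u_dir, u_cfd, lam):
    u_dir, u_cfd = np.asarray(u_dir, float), np.asarray(u_cfd, float)
    return (u_dir + lam * u_cfd) / (1.0 + lam)

u_dir = np.array([1.0, 2.0, 3.0])   # registration-derived displacement (mm)
u_cfd = np.array([1.2, 1.8, 3.4])   # physics-based (CFD) displacement (mm)
print(fuse_displacements(u_dir, u_cfd, lam=1.0))  # midpoint for lam = 1
```

    Small `lam` trusts the image registration; large `lam` pulls the solution toward the physics-based model.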

  9. The multiscale expansions of difference equations in the small lattice spacing regime, and a vicinity and integrability test: I

    NASA Astrophysics Data System (ADS)

    Santini, Paolo Maria

    2010-01-01

    We propose an algorithmic procedure (i) to study the 'distance' between an integrable PDE and any discretization of it, in the small lattice spacing epsilon regime, and, at the same time, (ii) to test the (asymptotic) integrability properties of such a discretization. This method should provide, in particular, useful and concrete information on how good any numerical scheme used to integrate a given integrable PDE is. The procedure, illustrated on a fairly general ten-parameter family of discretizations of the nonlinear Schrödinger equation, consists of the following three steps: (i) the construction of the continuous multiscale expansion of a generic solution of the discrete system at all orders in epsilon, following Degasperis et al (1997 Physica D 100 187-211); (ii) the application, to such an expansion, of the Degasperis-Procesi (DP) integrability test (Degasperis A and Procesi M 1999 Asymptotic integrability Symmetry and Perturbation Theory, SPT98, ed A Degasperis and G Gaeta (Singapore: World Scientific) pp 23-37; Degasperis A 2001 Multiscale expansion and integrability of dispersive wave equations Lectures given at the Euro Summer School: 'What is integrability?' (Isaac Newton Institute, Cambridge, UK, 13-24 August); Integrability (Lecture Notes in Physics vol 767) ed A Mikhailov (Berlin: Springer)), to test the asymptotic integrability properties of the discrete system and its 'distance' from its continuous limit; and (iii) the use of the main output of the DP test to construct infinitely many approximate symmetries and constants of motion of the discrete system, through novel and simple formulas.

  10. Comparing floral and isotopic paleoelevation estimates: Examples from the western United States

    NASA Astrophysics Data System (ADS)

    Hyland, E. G.; Huntington, K. W.; Sheldon, N. D.; Smith, S. Y.; Strömberg, C. A. E.

    2016-12-01

    Describing paleoelevations is crucial to understanding tectonic processes and to deconvolving the effects of uplift and climate on environmental change in the past. Decades of work have gone into estimating past elevation from various proxy archives, particularly using modern relationships between elevation and temperature, floral assemblage compositions, or oxygen isotope values. While these methods have been widely used and refined over time, they are rarely applied in tandem; here we provide two examples from the western United States using new multiproxy methods: (1) combining clumped isotopes and macrofloral assemblages to estimate paleoelevations along the Colorado Plateau, and (2) combining oxygen isotope and phytolith methods to estimate paleoelevations within the greater Yellowstone region. Clumped isotope measurements and refined floral coexistence methods from sites on the northern Colorado Plateau such as Florissant and Creede (CO) consistently estimate low (< 2 km) elevations through the Eocene/Oligocene, suggesting slower uplift and a south-to-north propagation of the plateau. Oxygen isotope measurements and C4 phytolith estimates from sites surrounding the Yellowstone hotspot consistently estimate moderate uplift (0.2-0.7 km) propagating along the hotspot track, suggesting migrating dynamic topography associated with the region. These examples support the emerging practice of using multiproxy methods to estimate paleoelevations for important time periods and can help integrate environmental and tectonic records of the past.

  11. A computational method for sharp interface advection

    PubMed Central

    Bredmose, Henrik; Jasak, Hrvoje

    2016-01-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face–interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM® extension and is published as open source. PMID:28018619
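    The core isoAdvector bookkeeping, integrating the submerged face area over the time step to obtain the transported volume, can be sketched in one dimension. The linear area history below is a contrived example, not output of the actual geometric reconstruction step.

```python
# Hypothetical 1D sketch of the isoAdvector bookkeeping: the fluid volume
# carried across a face during a time step is the time integral of the
# submerged face area times the face-normal velocity.
import numpy as np

def transported_volume(area_sub, u_face, dt):
    """Integrate submerged-area * velocity over the step (trapezoidal rule)."""
    flux = np.asarray(area_sub, dtype=float) * u_face
    dt_sub = dt / (len(flux) - 1)                      # uniform sub-intervals
    return dt_sub * (flux[0] / 2 + flux[1:-1].sum() + flux[-1] / 2)

# Interface sweeps across the face: submerged area grows linearly 0 -> 1 m^2.
area_sub = np.linspace(0.0, 1.0, 11)
vol = transported_volume(area_sub, u_face=2.0, dt=0.1)  # u = 2 m/s, dt = 0.1 s
print(round(vol, 3))  # 0.1 m^3: mean area 0.5 m^2 * 2 m/s * 0.1 s
```

    The accuracy of the real method comes from modelling the sub-time-step motion of the face-interface intersection line geometrically rather than assuming a prescribed area history, as this toy example does.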

  12. Delineation of karst terranes in complex environments: Application of modern developments in the wavelet theory and data mining

    NASA Astrophysics Data System (ADS)

    Alperovich, Leonid; Averbuch, Amir; Eppelbaum, Lev; Zheludev, Valery

    2013-04-01

    Karst areas occupy about 14% of the world's land surface. Karst terranes of different origin create difficult conditions for construction, industrial activity and tourism, and are a source of heightened danger to the environment. Mapping of karst (sinkhole) hazards will clearly be one of the most significant problems of engineering geophysics in the XXI century. Given the complexity of geological media, unfavourable environments and the well-known ambiguity of geophysical data analysis, examination by a single geophysical method may be insufficient. Wavelet methodology as a whole bears on cardinal problems of geophysical signal processing: denoising, enhancement, discrimination of signals with closely related characteristics, and the integrated analysis of different geophysical fields (satellite, airborne, surface or underground observations). We developed a three-phase approach to the integrated geophysical localization of subsurface karst (the same approach could be used for subsequent monitoring of karst dynamics). The first phase consists of modeling to compute the various geophysical effects that characterize karst phenomena. The second phase develops signal-processing approaches for analyzing profile or areal geophysical observations. The third phase integrates these methods into a new method for the combined interpretation of different geophysical data. Our combined geophysical analysis is built on modern developments in wavelet techniques for signal and image processing. This integrated methodology will enable recognition of karst terranes even at a small signal-to-noise ratio in complex geological environments. For analyzing the geophysical data, we used a technique based on an algorithm that characterizes a geophysical image by a limited number of parameters. This set of parameters serves as a signature of the image and is used to discriminate images containing a karst cavity (K) from images without karst (N). The algorithm consists of the following main phases: (a) collection of the database, (b) characterization of the geophysical images, and (c) dimensionality reduction. Each image is then characterized by the histogram of its coherency directions. As a result of these steps we obtain two sets, K and N, of signature vectors for images from sections containing a karst cavity and from non-karst subsurface, respectively.
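
    A minimal sketch of such an image signature, using local gradient directions as a stand-in for the coherency directions described above (the bin count and the simple finite-difference gradient are illustrative assumptions):

```python
import math

def direction_histogram(image, n_bins=8):
    """Characterize an image (list of rows of floats) by the normalized
    histogram of its local gradient directions, folded modulo 180 degrees.
    The resulting fixed-length vector can serve as a compact signature for
    discriminating one class of sections from another."""
    hist = [0.0] * n_bins
    for i in range(len(image) - 1):
        for j in range(len(image[0]) - 1):
            gx = image[i][j + 1] - image[i][j]    # horizontal difference
            gy = image[i + 1][j] - image[i][j]    # vertical difference
            if gx == 0 and gy == 0:
                continue                          # flat region: no direction
            theta = math.atan2(gy, gx) % math.pi  # direction mod 180 degrees
            hist[min(int(theta / math.pi * n_bins), n_bins - 1)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

# A vertically striped image has purely horizontal gradients (first bin).
signature = direction_histogram([[0.0, 1.0], [0.0, 1.0]])
```

    Signatures of labelled training images could then be compared (for example by distance to the K and N class centroids) to classify a new section; that discrimination step is not shown here.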

  13. Analysis of photonic spot profile converter and bridge structure on SOI platform for horizontal and vertical integration

    NASA Astrophysics Data System (ADS)

    Majumder, Saikat; Jha, Amit Kr.; Biswas, Aishik; Banerjee, Debasmita; Ganguly, Dipankar; Chakraborty, Rajib

    2017-08-01

    A horizontal spot size converter, required for horizontal light coupling, and a vertical bridge structure, required for vertical integration, are designed on a high-index-contrast SOI platform in order to form more compact integrated photonic circuits. Both structures are based on the concept of multimode interference. The spot size converter can be realized by successive integration of multimode interference structures of decreasing dimension in the horizontal plane, whereas the optical bridge structure consists of a number of vertical multimode interference structures connected by single-mode sections. The spot size converter can be modified into a spot profile converter when the final single-mode waveguide is replaced by a slot waveguide. Analyses have shown that by using three multimode sections in a spot size converter, a Gaussian input with a spot diameter of 2.51 μm can be converted to a spot diameter of 0.25 μm. If the output single-mode section is replaced by a slot waveguide, this input profile can be converted to a flat-top profile of width 50 nm. Similarly, a vertical displacement of 8 μm is possible by using a combination of two multimode sections and three single-mode sections in the vertical bridge structure. The analyses of these two structures are carried out for both TE and TM modes at 1550 nm wavelength using the semi-analytical matrix method, which is simple and fast in computation time and memory. This work shows that the matrix method is equally applicable to the analysis of horizontally as well as vertically integrated photonic circuits.

  14. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a large impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. In particular, the work of demonstrating the adequacy of previously developed DMSW on the basis of existing documents and operational records is called RCSV. When modifying DMSW that had already undergone RCSV, it was difficult to maintain consistency between the documents and test results for the modified parts and the existing documents and operational records for the unmodified parts, which made conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) definitions of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we were able to conduct an adequate RCSV while maintaining consistency.

  15. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations

    PubMed Central

    Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.

    2015-01-01

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076

  16. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations.

    PubMed

    Mohanasubha, R; Chandrasekar, V K; Senthilvelan, M; Lakshmanan, M

    2015-04-08

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle-Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples.

  17. Tunable optical analog to electromagnetically induced transparency in graphene-ring resonators system.

    PubMed

    Wang, Yonghua; Xue, Chenyang; Zhang, Zengxing; Zheng, Hua; Zhang, Wendong; Yan, Shubin

    2016-12-12

    The analogue of electromagnetically induced transparency realized by optical means has shown great potential for optical delay and quantum-information technology due to its flexible design and easy implementation. The chief drawback of such devices is their poor tunability. Here we demonstrate a tunable optical transparency system formed by graphene-silicon microrings in which the transparent window can be controlled by electro-optical means. The device consists of cascaded coupled ring resonators and a graphene/graphene capacitor integrated on one of the rings. By tuning the Fermi level of the graphene sheets, we can modulate the round-trip ring loss so that the transparency window can be dynamically tuned. The results provide a new method for the manipulation and transmission of light in highly integrated optical circuits and quantum information storage devices.

  18. FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli; Jetter, Robert I.; Sham, T. -L.

    2016-08-08

    The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests have been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting test results on Alloy 617 with a new thermal loading profile.

  19. Enabling the use of hereditary information from pedigree tools in medical knowledge-based systems.

    PubMed

    Gay, Pablo; López, Beatriz; Plà, Albert; Saperas, Jordi; Pous, Carles

    2013-08-01

    The use of family information is a key issue in dealing with hereditary illnesses. This kind of information usually comes in the form of pedigree files, which contain structured information, as trees or graphs, describing the family relationships. Knowledge-based systems should incorporate the information gathered by pedigree tools to support medical decision making. In this paper, we propose a method to achieve this goal, which consists of the definition of new indicators, together with methods and rules to compute them from family trees. The method is illustrated with several case studies. We provide information about its implementation and integration in a case-based reasoning tool. The method has been experimentally tested with breast cancer diagnosis data. The results show the feasibility of our methodology. Copyright © 2013 Elsevier Inc. All rights reserved.
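
    As a minimal sketch of deriving such an indicator from a family tree, the following counts affected first-degree relatives in a toy pedigree structure; the dictionary layout and the indicator itself are illustrative assumptions, not the authors' rule set or a real pedigree file format.

```python
def affected_first_degree(pedigree, person):
    """Family-history indicator: number of affected first-degree relatives
    (parents, full siblings and children) of `person`.

    `pedigree` maps a person id to {"parents": [ids], "affected": bool}."""
    parents = set(pedigree[person]["parents"])
    count = 0
    for pid, rec in pedigree.items():
        if pid == person:
            continue
        is_parent = pid in parents
        is_sibling = bool(parents) and parents == set(rec["parents"])
        is_child = person in rec["parents"]
        if (is_parent or is_sibling or is_child) and rec["affected"]:
            count += 1
    return count

# Toy pedigree: an affected mother and an affected sister give indicator 2.
family = {
    "mom": {"parents": [], "affected": True},
    "dad": {"parents": [], "affected": False},
    "me":  {"parents": ["mom", "dad"], "affected": False},
    "sis": {"parents": ["mom", "dad"], "affected": True},
}
indicator = affected_first_degree(family, "me")  # -> 2
```

    Indicators of this kind, computed per case, are the sort of derived feature a case-based reasoning tool can then use alongside the patient's own attributes.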

  20. A third-generation density-functional-theory-based method for calculating canonical molecular orbitals of large molecules.

    PubMed

    Hirano, Toshiyuki; Sato, Fumitoshi

    2014-07-28

    We used grid-free modified Cholesky decomposition (CD) to develop a density-functional-theory (DFT)-based method for calculating the canonical molecular orbitals (CMOs) of large molecules. Our method can be used to calculate standard CMOs, analytically compute exchange-correlation terms, and maximise the capacity of next-generation supercomputers. Cholesky vectors were first analytically downscaled using low-rank pivoted CD and CD with adaptive metric (CDAM). The obtained Cholesky vectors were distributed and stored on each computer node in a parallel computer, and the Coulomb, Fock exchange, and pure exchange-correlation terms were calculated by multiplying the Cholesky vectors without evaluating molecular integrals in self-consistent field iterations. Our method enables DFT and massively distributed memory parallel computers to be used in order to very efficiently calculate the CMOs of large molecules.
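
    The low-rank pivoted Cholesky step can be sketched as follows for a generic symmetric positive semidefinite matrix; this is a textbook version of pivoted CD, not the authors' grid-free, massively parallel implementation.

```python
import numpy as np

def pivoted_cholesky(A, tol=1e-10):
    """Low-rank pivoted Cholesky factorization of a symmetric positive
    semidefinite matrix A: returns L with A ~= L @ L.T, stopping once the
    largest remaining diagonal (an error bound on the approximation) falls
    below tol. Only the diagonal and pivoted columns of A are touched."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    d = np.diag(A).copy()              # remaining diagonal of the residual
    L = np.zeros((n, 0))
    while d.max() > tol and L.shape[1] < n:
        p = int(np.argmax(d))          # pivot: largest residual diagonal
        col = A[:, p] - L @ L[p]       # residual column at the pivot
        l = col / np.sqrt(d[p])
        L = np.column_stack([L, l])
        d = np.maximum(d - l ** 2, 0.0)
    return L
```

    When A is numerically low rank, L has far fewer columns than A, which is the property that lets Cholesky vectors replace explicit molecular-integral evaluation in the self-consistent field iterations.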

  1. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  2. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    PubMed

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Network has been a general tool for studying the complex interactions between different genes, proteins, and other small molecules. Module as a fundamental property of many biological networks has been widely studied and many computational methods have been proposed to identify the modules in an individual network. However, in many cases, a single network is insufficient for module analysis due to the noise in the data or the tuning of parameters when building the biological network. The availability of a large amount of biological networks makes network integration study possible. By integrating such networks, more informative modules for some specific disease can be derived from the networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines the module identification in each individual network and alignment of the modules from different networks together. An approximation algorithm based on eigenvector computation is proposed. Our method outperforms the existing methods, especially when the underlying modules in multiple networks are different in simulation studies. We also applied our method to two groups of gene coexpression networks for humans, which include one for three different cancers, and one for three tissues from the morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.
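
    The eigenvector computation at the core of such an approach can be illustrated with a minimal sketch: sum the adjacency matrices of the individual networks and bisect the shared nodes by the sign of the leading eigenvector of the combined network's modularity matrix. Newman's spectral bisection stands in here for the authors' optimization model, and the two-module split is a simplifying assumption.

```python
import numpy as np

def combined_modules(adjacencies):
    """Two-module split of nodes shared by several networks: sum the
    adjacency matrices and take the sign of the leading eigenvector of the
    combined network's modularity matrix (Newman's spectral bisection)."""
    A = sum(np.asarray(a, dtype=float) for a in adjacencies)
    k = A.sum(axis=1)                     # combined node degrees
    B = A - np.outer(k, k) / k.sum()      # modularity matrix of summed net
    w, v = np.linalg.eigh(B)              # eigenvalues in ascending order
    lead = v[:, np.argmax(w)]             # leading eigenvector
    return [1 if x >= 0 else 0 for x in lead]
```

    Modules that persist in the summed matrix are exactly those supported consistently across conditions, which is why integration can recover structure that is noisy in any single network.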

  3. Evolving the Principles and Practice of Validation for New Alternative Approaches to Toxicity Testing.

    PubMed

    Whelan, Maurice; Eskes, Chantra

    Validation is essential for the translation of newly developed alternative approaches to animal testing into tools and solutions suitable for regulatory applications. Formal approaches to validation have emerged over the past 20 years or so and although they have helped greatly to progress the field, it is essential that the principles and practice underpinning validation continue to evolve to keep pace with scientific progress. The modular approach to validation should be exploited to encourage more innovation and flexibility in study design and to increase efficiency in filling data gaps. With the focus now on integrated approaches to testing and assessment that are based on toxicological knowledge captured as adverse outcome pathways, and which incorporate the latest in vitro and computational methods, validation needs to adapt to ensure it adds value rather than hinders progress. Validation needs to be pursued both at the method level, to characterise the performance of in vitro methods in relation to their ability to detect any association of a chemical with a particular pathway or key toxicological event, and at the methodological level, to assess how integrated approaches can predict toxicological endpoints relevant for regulatory decision making. To facilitate this, more emphasis needs to be given to the development of performance standards that can be applied to classes of methods and integrated approaches that provide similar information. Moreover, the challenge of selecting the right reference chemicals to support validation needs to be addressed more systematically, consistently and in a manner that better reflects the state of the science. Above all however, validation requires true partnership between the development and user communities of alternative methods and the appropriate investment of resources.

  4. New Computational Methods for the Prediction and Analysis of Helicopter Noise

    NASA Technical Reports Server (NTRS)

    Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak

    1996-01-01

    This paper describes several new methods to predict and analyze rotorcraft noise. These methods are: 1) a combined computational fluid dynamics and Kirchhoff scheme for far-field noise predictions, 2) parallel computer implementation of the Kirchhoff integrations, 3) audio and visual rendering of the computed acoustic predictions over large far-field regions, and 4) acoustic tracebacks to the Kirchhoff surface to pinpoint the sources of the rotor noise. The paper describes each method and presents sample results for three test cases. The first case consists of in-plane high-speed impulsive noise and the other two cases show idealized parallel and oblique blade-vortex interactions. The computed results show good agreement with available experimental data but convey much more information about the far-field noise propagation. When taken together, these new analysis methods exploit the power of new computer technologies and offer the potential to significantly improve our prediction and understanding of rotorcraft noise.

  5. SCA with rotation to distinguish common and distinctive information in linked data.

    PubMed

    Schouteden, Martijn; Van Deun, Katrijn; Pattyn, Sven; Van Mechelen, Iven

    2013-09-01

    Often data are collected that consist of different blocks that all contain information about the same entities (e.g., items, persons, or situations). In order to unveil both information that is common to all data blocks and information that is distinctive for one or a few of them, an integrated analysis of the whole of all data blocks may be most useful. Interesting classes of methods for such an approach are simultaneous-component and multigroup factor analysis methods. These methods yield dimensions underlying the data at hand. Unfortunately, however, in the results from such analyses, common and distinctive types of information are mixed up. This article proposes a novel method to disentangle the two kinds of information, by making use of the rotational freedom of component and factor models. We illustrate this method with data from a cross-cultural study of emotions.

  6. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
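
    A toy version of this pipeline, using Gaussian kernels, a single feature (signed distance to the target boundary) and a coarse dose grid, might look as follows; the bandwidths and data layout are assumptions for illustration, not the authors' implementation.

```python
import math

def gauss(u, h):
    """Unnormalized Gaussian kernel with bandwidth h."""
    return math.exp(-0.5 * (u / h) ** 2)

def dose_given_feature(train, x, d, hx=0.5, hd=0.5):
    """Kernel estimate of p(dose = d | feature = x), up to a constant,
    from training pairs (feature, dose)."""
    num = sum(gauss(x - xi, hx) * gauss(d - di, hd) for xi, di in train)
    den = sum(gauss(x - xi, hx) for xi, _ in train)
    return num / den if den else 0.0

def predicted_dvh(train, features, dose_grid):
    """Cumulative DVH on dose_grid: fraction of the new patient's volume
    (represented by its voxel feature values) receiving at least each dose.
    Marginalizes the conditional dose density over the feature values, then
    integrates (sums) the tail of the resulting dose distribution."""
    pdf = [sum(dose_given_feature(train, x, d) for x in features)
           for d in dose_grid]
    total = sum(pdf) or 1.0
    return [sum(pdf[i:]) / total for i in range(len(dose_grid))]
```

    Voxels whose feature values resemble high-dose training voxels inherit a dose distribution concentrated at high dose, so their predicted DVH stays near 1 up to higher dose levels than voxels far from the target.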

  7. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.

  8. Vulnerability assessment of atmospheric environment driven by human impacts.

    PubMed

    Zhang, Yang; Shen, Jing; Ding, Feng; Li, Yu; He, Li

    2016-11-15

    Worsening atmospheric environment quality is a substantial threat to public health worldwide, and in many places air pollution is increasing dramatically owing to intensified human activity. However, no studies have investigated the integration of vulnerability assessment with the atmospheric environment driven by human impacts. The objective of this study was to identify and prioritize undesirable environmental changes as an early-warning system for environment managers and decision makers in terms of human, atmospheric environment, and socio-economic elements. We present a vulnerability assessment method for the atmospheric environment associated with human impact; this method integrates the spatial context of a Geographic Information System (GIS) tool, a multi-criteria decision analysis (MCDA) method, and ordered weighted averaging (OWA) operators under the Exposure-Sensitivity-Adaptive Capacity (ESA) framework. Decision makers can obtain vulnerability assessment results corresponding to different attitudes toward vulnerability. We applied the developed method in the Beijing-Tianjin-Hebei (BTH) region, China, and found it to be reliable and consistent with the China Environmental Status Bulletin. Results indicate that the vulnerability of the atmospheric environment in the BTH region is not optimistic, and that environment managers should do more about air pollution. The most appropriate strategic decisions and development programs for a city or region can thus be selected with the aid of the vulnerability results. Copyright © 2016 Elsevier B.V. All rights reserved.
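
    The OWA ingredient is easy to make concrete: the weights are applied to the criterion scores after sorting, so the weight vector encodes the decision maker's attitude, from optimistic (weight on the best scores) to pessimistic (weight on the worst). A minimal sketch, with illustrative scores:

```python
def owa(values, weights):
    """Ordered weighted averaging: the weights apply to the values after
    sorting them in descending order, so they encode the decision maker's
    attitude rather than the importance of any particular criterion."""
    if len(values) != len(weights) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must match values and sum to 1")
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Three vulnerability criterion scores in [0, 1] (illustrative values):
scores = [0.9, 0.4, 0.2]
owa(scores, [1.0, 0.0, 0.0])      # optimistic attitude: the maximum, 0.9
owa(scores, [0.0, 0.0, 1.0])      # pessimistic attitude: the minimum, 0.2
owa(scores, [1/3, 1/3, 1/3])      # neutral attitude: the arithmetic mean
```

    Sweeping the weight vector between these extremes is what lets a GIS-based assessment report a family of vulnerability maps, one per decision-making attitude.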

  9. The stability of perfect elliptic disks. 1: The maximum streaming case

    NASA Technical Reports Server (NTRS)

    Levine, Stephen E.; Sparke, Linda S.

    1994-01-01

    Self-consistent distribution functions are constructed for two-dimensional perfect elliptic disks (for which the potential is exactly integrable) in the limit of maximum streaming; these are tested for stability by N-body integration. To obtain a discrete representation for each model, simulated annealing is used to choose a set of orbits which sample the distribution function and reproduce the required density profile while carrying the greatest possible amount of angular momentum. A quiet start technique is developed to place particles on these orbits uniformly in action-angle space, making the initial conditions as smooth as possible. The roundest models exhibit spiral instabilities similar to those of cold axisymmetric disks; the most elongated models show bending instabilities like those seen in prolate systems. Between these extremes, there is a range of axial ratios 0.25 approximately less than b/a approximately less than 0.6 within which these models appear to be stable. All the methods developed in this investigation can easily be extended to integrable potentials in three dimensions.

  10. Integration process of fermentation and liquid biphasic flotation for lipase separation from Burkholderia cepacia.

    PubMed

    Sankaran, Revathy; Show, Pau Loke; Lee, Sze Ying; Yap, Yee Jiun; Ling, Tau Chuan

    2018-02-01

    Liquid Biphasic Flotation (LBF) is an advanced recovery method that has been effectively applied to biomolecule extraction. The objective of this investigation was to integrate the fermentation and extraction of lipase from Burkholderia cepacia in a flotation system. An initial study was conducted to compare bacterial growth and lipase production in the flotation system and in a shaker. The bacteria grew faster and gave a higher lipase yield in the flotation system. The integrated process for lipase separation was then investigated and showed a high separation efficiency of 92.29% and a yield of 95.73%. Upscaling of the flotation system gave results consistent with the lab scale: 89.53% efficiency and 93.82% yield. Combining the upstream and downstream processes in a single system accelerates product formation, improves product yield and facilitates downstream processing. This integrated system demonstrated its potential for biomolecule fermentation and separation, possibly opening new opportunities for industrial production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    PubMed

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

    The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools such as CcpNmr FormatConverter. The open architecture allows the integration of external software to extend the CCPN framework with additional calculation methods. Recently, we carried out the first steps of integrating our software, Computer Simulation of Molecular Structures (COSMOS), into the CCPN framework. The COSMOS-NMR force field unites quantum-chemical routines for calculating molecular properties with a molecular-mechanics force field that yields relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular-mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.

  12. Sol-Gel Zinc Oxide Humidity Sensors Integrated with a Ring Oscillator Circuit On-a-Chip

    PubMed Central

    Yang, Ming-Zhi; Dai, Ching-Liang; Wu, Chyan-Chyi

    2014-01-01

    The study develops an integrated humidity microsensor fabricated using the commercial 0.18 μm complementary metal oxide semiconductor (CMOS) process. The integrated humidity sensor consists of a humidity sensor and a ring oscillator circuit on-a-chip. The humidity sensor is composed of a sensitive film and branch interdigitated electrodes. The sensitive film is zinc oxide prepared by the sol-gel method. After completion of the CMOS process, the sensor requires a post-process to remove the sacrificial oxide layer and to coat the zinc oxide film on the interdigitated electrodes. The capacitance of the sensor changes when the sensitive film adsorbs water vapor. The circuit is used to convert the capacitance of the humidity sensor into an oscillation frequency output. Experimental results show that the output frequency of the sensor changes from 84.3 to 73.4 MHz at 30 °C as the humidity increases from 40 to 90 %RH. PMID:25353984
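
    The capacitance-to-frequency conversion can be illustrated with a first-order model: the delay of each inverter stage grows with the RC product of its effective output resistance and its load capacitance, so the oscillation frequency falls as the humidity-dependent capacitance rises. The stage count and effective resistance below are illustrative assumptions, not the chip's actual parameters.

```python
def ring_oscillator_frequency(c_load, n_stages=3, r_eff=100.0):
    """First-order model of a ring oscillator used as a capacitance readout:
    each of the n_stages inverters contributes a delay of roughly
    r_eff * c_load, and one period is two traversals of the ring, giving
    f = 1 / (2 * n_stages * r_eff * c_load)."""
    return 1.0 / (2.0 * n_stages * r_eff * c_load)

# More adsorbed water vapor -> larger sensor capacitance -> lower frequency,
# matching the direction of the reported 84.3 -> 73.4 MHz shift.
f_dry = ring_oscillator_frequency(1e-12)   # smaller capacitance
f_wet = ring_oscillator_frequency(2e-12)   # larger capacitance
```

    The model captures only the monotonic trend; the real circuit's frequency depends on transistor sizing and supply voltage as well.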

  13. Interoperability between biomedical ontologies through relation expansion, upper-level ontologies and automatic reasoning.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Rebholz-Schuhmann, Dietrich; Schofield, Paul N; Gkoutos, Georgios V

    2011-01-01

    Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large-scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.

  14. TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.

    PubMed

    Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas

    2017-01-01

    Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
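
    The topic-modeling engine itself is described only at a high level; as a hedged sketch of the underlying technique, here is a minimal nonnegative matrix factorization using the classical multiplicative updates (toy data, not the authors' optimized on-the-fly method):

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Minimal nonnegative matrix factorization via multiplicative updates.
    V (terms x docs) is approximated by W (terms x topics) @ H (topics x docs)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    eps = 1e-9  # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy rank-2 term-document matrix with two obvious "topics".
V = np.array([[3., 2., 0., 0.],
              [3., 2., 0., 0.],
              [0., 0., 4., 3.],
              [0., 0., 4., 3.]])
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H)
print(err < 1.0)
```

    The columns of W can then be read as topics (term weights) and the columns of H as per-document topic mixtures; the paper's contribution is making this kind of computation fast enough for interactive lens-based exploration.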

  15. The Development, Description and Appraisal of an Emergent Multimethod Research Design to Study Workforce Changes in Integrated Care Interventions

    PubMed Central

    Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G.

    2017-01-01

    Introduction: In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. Theory and methods: The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. Results: The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. Conclusion and discussion: We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here. PMID:29042843

  16. Content-based image retrieval in medical applications for picture archiving and communication systems

    NASA Astrophysics Data System (ADS)

    Lehmann, Thomas M.; Guld, Mark O.; Thies, Christian; Fischer, Benedikt; Keysers, Daniel; Kohnen, Michael; Schubert, Henning; Wein, Berthold B.

    2003-05-01

    Picture archiving and communication systems (PACS) aim to efficiently provide the radiologists with all images in a suitable quality for diagnosis. Modern standards for digital imaging and communication in medicine (DICOM) comprise alphanumerical descriptions of study, patient, and technical parameters. Currently, this is the only information used to select relevant images within PACS. Since textual descriptions insufficiently describe the great variety of details in medical images, content-based image retrieval (CBIR) is expected to have a strong impact when integrated into PACS. However, existing CBIR approaches usually are limited to a distinct modality, organ, or diagnostic study. In this state-of-the-art report, we present first results from implementing a general approach to content-based image retrieval in medical applications (IRMA) and discuss its integration into PACS environments. Usually, a PACS consists of a DICOM image server and several DICOM-compliant workstations, which are used by radiologists for reading the images and reporting the findings. Basic IRMA components are the relational database, the scheduler, and the web server, which all may be installed on the DICOM image server, and the IRMA daemons running on distributed machines, e.g., the radiologists' workstations. These workstations can also host the web-based front-ends of IRMA applications. Integrating CBIR and PACS, a special focus is put on (a) location and access transparency for data, methods, and experiments, (b) replication transparency for methods in development, (c) concurrency transparency for job processing and feature extraction, (d) system transparency at method implementation time, and (e) job distribution transparency when issuing a query. Transparent integration will have a certain impact on diagnostic quality supporting both evidence-based medicine and case-based reasoning.

  17. Integrated approach to e-learning enhanced both subjective and objective knowledge of aEEG in a neonatal intensive care unit

    PubMed Central

    Poon, Woei Bing; Tagamolila, Vina; Toh, Ying Pin Anne; Cheng, Zai Ru

    2015-01-01

    INTRODUCTION Various meta-analyses have shown that e-learning is as effective as traditional methods of continuing professional education. However, there are some disadvantages to e-learning, such as possible technical problems, the need for greater self-discipline, cost involved in developing programmes and limited direct interaction. Currently, most strategies for teaching amplitude-integrated electroencephalography (aEEG) in neonatal intensive care units (NICUs) worldwide depend on traditional teaching methods. METHODS We implemented a programme that utilised an integrated approach to e-learning. The programme consisted of three sessions of supervised protected time e-learning in an NICU. The objective and subjective effectiveness of the approach was assessed through surveys administered to participants before and after the programme. RESULTS A total of 37 NICU staff (32 nurses and 5 doctors) participated in the study. 93.1% of the participants appreciated the need to acquire knowledge of aEEG. We also saw a statistically significant improvement in the subjective knowledge score (p = 0.041) of the participants. The passing rates for identifying abnormal aEEG tracings (defined as ≥ 3 correct answers out of 5) also showed a statistically significant improvement (from 13.6% to 81.8%, p < 0.001). Among the participants who completed the survey, 96.0% felt the teaching was well structured, 77.8% felt the duration was optimal, 80.0% felt that they had learnt how to systematically interpret aEEGs, and 70.4% felt that they could interpret normal aEEG with confidence. CONCLUSION An integrated approach to e-learning can help improve subjective and objective knowledge of aEEG. PMID:25820847
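
    The reported jump in passing rates can be checked with a simple significance test. A sketch, assuming 22 respondents per survey (13.6% ≈ 3/22, 81.8% = 18/22; the abstract does not give the raw counts) and using an unpaired two-proportion z-test as an approximation (the study's pre/post design would strictly call for a paired test such as McNemar's):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (unpaired normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2)) # pooled standard error
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Assumed counts: 3/22 passed before vs 18/22 after.
z, p = two_proportion_z(3, 22, 18, 22)
print(p < 0.001)  # → True
```

    Even under this rough approximation the improvement is far beyond the p < 0.001 threshold reported in the abstract.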

  18. A concept for holistic whole body MRI data analysis, Imiomics

    PubMed Central

    Malmberg, Filip; Johansson, Lars; Lind, Lars; Sundbom, Magnus; Ahlström, Håkan; Kullberg, Joel

    2017-01-01

    Purpose To present and evaluate a whole-body image analysis concept, Imiomics (imaging–omics) and an image registration method that enables Imiomics analyses by deforming all image data to a common coordinate system, so that the information in each voxel can be compared between persons or within a person over time and integrated with non-imaging data. Methods The presented image registration method utilizes relative elasticity constraints of different tissue obtained from whole-body water-fat MRI. The registration method is evaluated by inverse consistency and Dice coefficients and the Imiomics concept is evaluated by example analyses of importance for metabolic research using non-imaging parameters where we know what to expect. The example analyses include whole body imaging atlas creation, anomaly detection, and cross-sectional and longitudinal analysis. Results The image registration method evaluation on 128 subjects shows low inverse consistency errors and high Dice coefficients. Also, the statistical atlas with fat content intensity values shows low standard deviation values, indicating successful deformations to the common coordinate system. The example analyses show expected associations and correlations which agree with explicit measurements, and thereby illustrate the usefulness of the proposed Imiomics concept. Conclusions The registration method is well-suited for Imiomics analyses, which enable analyses of relationships to non-imaging data, e.g. clinical data, in new types of holistic targeted and untargeted big-data analysis. PMID:28241015
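
    The Dice coefficient used to evaluate the registration has a simple definition, 2|A∩B|/(|A|+|B|); a minimal sketch on toy voxel sets:

```python
def dice(mask_a, mask_b):
    """Dice coefficient between two binary masks given as sets of voxel indices."""
    overlap = len(mask_a & mask_b)
    return 2.0 * overlap / (len(mask_a) + len(mask_b))

# Toy 1-D "masks": voxels covered by a structure before and after registration.
fixed = set(range(10, 30))   # 20 voxels
warped = set(range(15, 32))  # 17 voxels, 15 of them shared
print(round(dice(fixed, warped), 3))  # → 0.811
```

    A Dice value of 1 means perfect overlap; high values across subjects indicate the deformations to the common coordinate system were successful.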

  19. Developing Learning Tool of Control System Engineering Using Matrix Laboratory Software Oriented on Industrial Needs

    NASA Astrophysics Data System (ADS)

    Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi

    2018-04-01

    The purpose of this research is to develop learning media for control systems engineering using Matrix Laboratory software with an industry-needs approach. Learning media serve as tools for creating a better and more effective teaching and learning situation, because they can accelerate the learning process and thereby enhance the quality of learning. Teaching control techniques with Matrix Laboratory software can increase students' interest and attention, provides hands-on experience, and can foster an independent attitude. The research design follows the research and development (R & D) method, modified by a multi-disciplinary team of researchers. The study used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with physical props. Matrix Laboratory can visualize the theory and analysis of control systems, integrating computation, visualization, and programming in an environment that is easy to use. The resulting instructional media apply the mathematical equations in Matrix Laboratory software to a control application with a DC motor plant and a PID (proportional-integral-derivative) controller. This focus reflects industrial practice, where PID control is widely used in production processes built on distributed control systems (DCSs), programmable logic controllers (PLCs), and microcontrollers (MCUs).
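
    The PID loop described above can be sketched in a few lines; the first-order motor model and the gains below are illustrative assumptions, not values from the paper:

```python
# Minimal discrete PID loop driving a first-order DC-motor speed model.
dt, t_end = 0.01, 5.0
tau, k_motor = 0.5, 1.0            # plant: tau * dv/dt = -v + k_motor * u
kp, ki, kd = 2.0, 5.0, 0.02        # PID gains, hand-tuned for this toy plant

setpoint, v = 1.0, 0.0             # target speed and initial speed
integral, prev_err = 0.0, setpoint
t = 0.0
while t < t_end:
    err = setpoint - v
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv
    v += dt * (-v + k_motor * u) / tau   # Euler step of the motor model
    prev_err = err
    t += dt

print(abs(setpoint - v) < 0.02)  # → True: speed has settled at the setpoint
```

    The integral term removes the steady-state error, which is why pure proportional control would fall short on the same plant.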

  20. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  1. The integrated motion measurement simulation for SOFIA

    NASA Astrophysics Data System (ADS)

    Kaswekar, Prashant; Greiner, Benjamin; Wagner, Jörg

    2014-07-01

    The Stratospheric Observatory for Infrared Astronomy SOFIA consists of a B747-SP aircraft, which carries aloft a 2.7-meter reflecting telescope. The image stability goal for SOFIA is 0.2 arc-seconds rms. The performance of the telescope structure is affected by elastic vibrations induced by aeroacoustic and suspension disturbances. Active compensation of such disturbances requires a fast way of estimating the structural motion. Integrated navigation systems are examples of such estimation systems; however, they employ a rigid-body assumption. A possible extension of these systems to an elastic structure has been shown by different authors for one-dimensional beam structures, taking into account the eigenmodes of the structural system. The rigid-body motion as well as the flexible modes of the telescope assembly, however, are coupled among the three axes. Extending a special mathematical approach to three-dimensional structures, the concept of a modal observer based on integrated motion measurement is simulated for SOFIA. It is, in general, a fusion of different measurement methods, exploiting their respective benefits while suppressing their disadvantages. No mass and stiffness properties are needed directly in this approach; however, knowledge of the modal properties of the structure is necessary for its implementation. A finite-element model is chosen as a basis to extract the modal properties of the structure.
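
    The modal-observer idea, recovering a few modal coordinates q from sensor readings y ≈ Φq, reduces in its simplest noise-free form to a least-squares fit; the mode shapes and amplitudes below are invented for illustration:

```python
import numpy as np

# Mode-shape matrix Phi: rows = sensor locations, columns = retained modes.
Phi = np.array([[1.0,  1.0],
                [2.0, -1.0],
                [3.0,  1.0]])
q_true = np.array([0.5, -0.2])        # "true" modal amplitudes
y = Phi @ q_true                       # noise-free sensor measurements

# Least-squares estimate of the modal coordinates from the measurements.
q_est, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.allclose(q_est, q_true))  # → True
```

    In the real system the modal coordinates would be tracked dynamically by an observer fusing several measurement types, but the static fit shows why the modal properties (Φ) must be known from a finite-element model.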

  2. An Integrated Strategy for Global Qualitative and Quantitative Profiling of Traditional Chinese Medicine Formulas: Baoyuan Decoction as a Case

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2016-12-01

    Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof-of-concept, the chemical profiling of Baoyuan decoction (BYD), which is an ancient TCMF that is clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex, was performed. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.

  3. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to constructing complex gene regulatory models and to ultimately facilitating an integrative systems biology approach to quantitatively understanding the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied to modeling of gene circuits, our results suggest that more tailored approaches that exploit domain-specific information may be key to reverse engineering complex biological systems. The software is available at http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
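
    The decomposition idea, integrating each rate equation separately while feeding in the trajectory of its inputs, can be sketched on a toy two-stage gene expression model (the rate constants are illustrative, not from the paper):

```python
# Two-stage gene expression model integrated stage by stage:
#   dm/dt = k1 - d1*m        (mRNA)
#   dp/dt = k2*m - d2*p      (protein, driven by the mRNA trajectory)
k1, d1, k2, d2 = 2.0, 0.5, 1.0, 0.2
dt, steps = 0.01, 5000       # 50 time units, enough to reach steady state

m, p = 0.0, 0.0
for _ in range(steps):
    m += dt * (k1 - d1 * m)          # integrate the mRNA equation first
    p += dt * (k2 * m - d2 * p)      # then feed m into the protein equation

# Analytic steady states for comparison.
m_ss = k1 / d1              # 4.0
p_ss = k2 * m_ss / d2       # 20.0
print(abs(m - m_ss) < 0.01, abs(p - p_ss) < 0.1)  # → True True
```

    Within a fitting loop, each such one-dimensional integration is cheap, which is what makes the decomposed approach scale to larger circuits.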

  4. Why we do what we do: a theoretical evaluation of the integrated practice model for forensic nursing science.

    PubMed

    Valentine, Julie L

    2014-01-01

    An evaluation of the Integrated Practice Model for Forensic Nursing Science is presented, utilizing the evaluation methods outlined by Meleis. A brief review of nursing theory basics and evaluation methods by Meleis is provided to enhance understanding of the ensuing theoretical evaluation and critique. The Integrated Practice Model for Forensic Nursing Science, created by forensic nursing pioneer Virginia Lynch, captures the theories, assumptions, concepts, and propositions inherent in forensic nursing practice and science. The historical background of the theory is explored, as Lynch's model launched the role development of forensic nursing practice as both a nursing and forensic science specialty. It is derived from a combination of nursing, sociological, and philosophical theories to reflect the grounding of forensic nursing in the nursing, legal, psychological, and scientific communities. As Lynch's model is the first inception of forensic nursing theory, it is representative of a conceptual framework, although the title implies a practice theory. The clarity and consistency displayed in the theory's structural components of assumptions, concepts, and propositions are analyzed. The model is described and evaluated. A summary of the strengths and limitations of the model is compiled, followed by application to practice, education, and research with suggestions for ongoing theory development.

  5. Determinants of sustainability in solid waste management--the Gianyar Waste Recovery Project in Indonesia.

    PubMed

    Zurbrügg, Christian; Gfrerer, Margareth; Ashadi, Henki; Brenner, Werner; Küper, David

    2012-11-01

    According to most experts, integrated and sustainable solid waste management should not only be given top priority, but must go beyond technical aspects to include various key elements of sustainability to ensure the success of any solid waste project. Aside from a project's sustainable impacts, the overall enabling environment is the key feature determining the performance and success of an integrated and affordable solid waste system. This paper describes a project-specific approach to assess typical success or failure factors. A questionnaire-based assessment method covers issues of: (i) social mobilisation and acceptance (social element); (ii) stakeholder, legal and institutional arrangements comprising roles, responsibilities and management functions (institutional element); (iii) financial and operational requirements, as well as cost recovery mechanisms (economic element). The Gianyar Waste Recovery Project in Bali, Indonesia was analysed using this integrated assessment method. The results clearly identified chief characteristics and key factors to consider when planning country-wide replication, but also major barriers and obstacles which must be overcome to ensure project sustainability. The Gianyar project consists of a composting unit processing 60 tons of municipal waste per day from 500,000 inhabitants, including manual waste segregation and subsequent composting of the biodegradable organic fraction. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Simultaneous integrated vs. sequential boost in VMAT radiotherapy of high-grade gliomas.

    PubMed

    Farzin, Mostafa; Molls, Michael; Astner, Sabrina; Rondak, Ina-Christine; Oechsner, Markus

    2015-12-01

    In 20 patients with high-grade gliomas, we compared two methods of planning for volumetric-modulated arc therapy (VMAT): simultaneous integrated boost (SIB) vs. sequential boost (SEB). The investigation focused on the analysis of dose distributions in the target volumes and the organs at risk (OARs). After contouring the target volumes [planning target volumes (PTVs) and boost volumes (BVs)] and OARs, SIB planning and SEB planning were performed. The SEB method consisted of two plans: in the first plan the PTV received 50 Gy in 25 fractions with a 2-Gy dose per fraction. In the second plan the BV received 10 Gy in 5 fractions with a dose per fraction of 2 Gy. The doses of both plans were summed up to show the total doses delivered. In the SIB method the PTV received 54 Gy in 30 fractions with a dose per fraction of 1.8 Gy, while the BV received 60 Gy in the same fraction number but with a dose per fraction of 2 Gy. All of the OARs showed higher doses (Dmax and Dmean) in the SEB method when compared with the SIB technique. The differences between the two methods were statistically significant in almost all of the OARs. Analysing the total doses of the target volumes we found dose distributions with similar homogeneities and comparable total doses. Our analysis shows that the SIB method offers advantages over the SEB method in terms of sparing OARs.
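
    Both schedules deliver the boost volume 60 Gy at 2 Gy per fraction, which a biologically-effective-dose calculation makes explicit; the α/β = 10 Gy used below is a conventional assumption for tumor tissue, not a value from the paper:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    """Biologically effective dose: BED = n * d * (1 + d / (alpha/beta))."""
    d = dose_per_fraction
    return n_fractions * d * (1 + d / alpha_beta)

# Boost volume, SEB: 25 x 2 Gy (first plan) then 5 x 2 Gy (boost plan).
seb_bv = bed(25, 2.0) + bed(5, 2.0)
# Boost volume, SIB: 30 x 2 Gy delivered in a single plan.
sib_bv = bed(30, 2.0)
print(seb_bv, sib_bv)
```

    The boost-volume BEDs are identical, so the clinically relevant difference between the techniques lies in the dose to the organs at risk, which is exactly what the study compares.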

  7. A novel methodology for querying web images

    NASA Astrophysics Data System (ADS)

    Prabhakara, Rashmi; Lee, Ching Cheng

    2005-01-01

    Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. Given this magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It employs a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope within which the user wants to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match, retrieving a smaller set of results that more closely match the example image. Information related to image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.
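
    A color-feature match of the kind described can be sketched with coarse histograms and histogram intersection; the pixel data and bin count are invented for illustration:

```python
def color_histogram(pixels, bins=4):
    """Coarse normalized RGB histogram: each channel quantized into `bins` levels."""
    hist = [0] * (bins ** 3)
    for r, g, b in pixels:
        idx = (r * bins // 256) * bins * bins + (g * bins // 256) * bins + (b * bins // 256)
        hist[idx] += 1
    total = float(len(pixels))
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

query = [(250, 10, 10), (240, 20, 15), (10, 10, 240)]    # mostly red, some blue
candidate = [(245, 5, 20), (20, 240, 10), (20, 5, 250)]  # red, green, blue
sim = histogram_intersection(color_histogram(query), color_histogram(candidate))
print(round(sim, 3))  # → 0.667
```

    Histogram intersection is cheap to compute, which matches the paper's stated preference for matching that is not computationally intensive.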

  8. A novel methodology for querying web images

    NASA Astrophysics Data System (ADS)

    Prabhakara, Rashmi; Lee, Ching Cheng

    2004-12-01

    Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. Given this magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It employs a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope within which the user wants to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match, retrieving a smaller set of results that more closely match the example image. Information related to image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.

  9. Objective Integrated Assessment of Functional Outcomes in Reduction Mammaplasty

    PubMed Central

    Passaro, Ilaria; Malovini, Alberto; Faga, Angela; Toffola, Elena Dalla

    2013-01-01

    Background: The aim of our study was an objective integrated assessment of the functional outcomes of reduction mammaplasty. Methods: The study involved 17 women undergoing reduction mammaplasty from March 2009 to June 2011. Each patient was assessed before surgery and 2 months postoperatively with an original combination of 4 subjective and objective assessment methods: a physiatric clinical examination, the Roland Morris Disability Questionnaire, the Berg Balance Scale, and a static force platform analysis. Results: All of the tests showed multiple statistically significant associations, demonstrating a significant improvement in functional status following reduction mammaplasty. Surgical correction of breast hypertrophy could achieve both spinal pain relief and recovery of performance status in everyday life tasks, owing to a functional rearrangement of muscular posture that consistently spared antigravity muscle activity. Pain reduction in turn could reduce antalgic stiffness and improve the spinal range of motion. In our sample, the improvement of the spinal range of motion in flexion matched a similar improvement in extension. Recovery of a more favorable postural pattern with reduction of the anterior imbalance was demonstrated by the static force stabilometry. Postoperatively, all of our patients narrowed the gap between the actual body barycenter and the ideal one. The static force platform assessment also consistently confirmed the effectiveness of an accurate clinical examination of functional impairment from breast hypertrophy. Conclusions: The static force platform assessment might help the clinician to support the diagnosis of functional impairment from breast hypertrophy with objectively based data. PMID:25289256

  10. Diagrammatic expansion for positive spectral functions beyond GW: Application to vertex corrections in the electron gas

    NASA Astrophysics Data System (ADS)

    Stefanucci, G.; Pavlyukh, Y.; Uimonen, A.-M.; van Leeuwen, R.

    2014-09-01

    We present a diagrammatic approach to construct self-energy approximations within many-body perturbation theory with positive spectral properties. The method cures the problem of negative spectral functions which arises from a straightforward inclusion of vertex diagrams beyond the GW approximation. Our approach consists of a two-step procedure: We first express the approximate many-body self-energy as a product of half-diagrams and then identify the minimal number of half-diagrams to add in order to form a perfect square. The resulting self-energy is an unconventional sum of self-energy diagrams in which the internal lines of half a diagram are time-ordered Green's functions, whereas those of the other half are anti-time-ordered Green's functions, and the lines joining the two halves are either lesser or greater Green's functions. The theory is developed using noninteracting Green's functions and subsequently extended to self-consistent Green's functions. Issues related to the conserving properties of diagrammatic approximations with positive spectral functions are also addressed. As a major application of the formalism we derive the minimal set of additional diagrams to make positive the spectral function of the GW approximation with lowest-order vertex corrections and screened interactions. The method is then applied to vertex corrections in the three-dimensional homogeneous electron gas by using a combination of analytical frequency integrations and numerical Monte Carlo momentum integrations to evaluate the diagrams.

  11. Applied Ecosystem Analysis - Background : EDT the Ecosystem Diagnosis and Treatment Method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mobrand, Lars E.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published; others are currently being prepared for publication. In the early to mid-1980s, concern about the failure of both natural and hatchery production of Columbia River salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called the Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood, it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born.

  12. Integrated control of peridomestic larval habitats of Aedes and Culex mosquitoes (Diptera: Culicidae) in atoll villages of French Polynesia.

    PubMed

    Lardeux, Frederic; Sechan, Yves; Loncke, Stepiiane; Deparis, Xavier; Cheffort, Jules; Faaruia, Marc

    2002-05-01

    An integrated larval mosquito control program was carried out in Tiputa village on Rangiroa atoll of French Polynesia. Mosquito abundance before and after treatment was compared with the abundance in an untreated village. Mosquito larval habitats consisted of large concrete or polyurethane cisterns, wells, and 200-liter drums. Depending on the target species and the larval habitat's category, configuration, and purpose (used for drinking water or not), abatement methods consisted of sealing the larval habitats with mosquito gauze, treating them with 1% temephos, covering the water with a 10-cm thick layer of polystyrene beads, or introducing fish (Poecilia reticulata Rosen & Bailey). All premises of the chosen village were treated, and a health education program explained basic mosquito ecology and the methods of control. A community health agent was trained to continue the control program at the end of the experiment. Entomological indices from human bait collections and larval surveys indicated that mosquito populations were reduced significantly, compared with concurrent samples from the untreated control village, and that mosquito control remained effective for 6 mo after treatment. Effects of the treatment were noticed by the inhabitants in terms of a reduction in the number of mosquito bites. In the Polynesian context, such control programs may succeed in the long term only if strong political decisions are taken at the village level, if a community member is designated as being responsible for maintaining the program, and if the inhabitants are motivated sufficiently by the mosquito nuisance to intervene.

  13. Quasi-Static Probabilistic Structural Analyses Process and Criteria

    NASA Technical Reports Server (NTRS)

    Goldberg, B.; Verderaime, V.

    1999-01-01

    Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insight and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws, which can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout the prevailing deterministic stress processes leads to a safety factor in statistical format which, integrated into the safety index, provides a safety-factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with traditional safety factor methods and NASA Std. 5001 criteria.
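
    The safety-factor/safety-index relationship described above can be sketched for normally distributed load and strength. The two-variable form and all numbers below are illustrative assumptions, not the report's actual data:

```python
import math

def safety_index(mu_s, sig_s, mu_l, sig_l):
    # First-order safety index for normally distributed strength S and load L:
    # beta = (mu_S - mu_L) / sqrt(sigma_S^2 + sigma_L^2)
    return (mu_s - mu_l) / math.sqrt(sig_s**2 + sig_l**2)

def central_safety_factor(mu_s, mu_l):
    # Traditional deterministic safety factor expressed on the mean values
    return mu_s / mu_l

# Illustrative numbers (assumed): strength 100 +/- 8, load 60 +/- 6
beta = safety_index(100.0, 8.0, 60.0, 6.0)
sf = central_safety_factor(100.0, 60.0)
```

Here the deterministic safety factor on the means and the probabilistic safety index are two views of the same statistical data, which is the relationship the report exploits.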

  14. The estimation of convective rainfall by area integrals. I - The theoretical and empirical basis. II - The height-area rainfall threshold (HART) method

    NASA Technical Reports Server (NTRS)

    Rosenfeld, Daniel; Short, David A.; Atlas, David

    1990-01-01

    A theory is developed which establishes the basis for the use of rainfall areas within preset thresholds as a measure of either the instantaneous areawide rain rate of convective storms or the total volume of rain from an individual storm over its lifetime. The method is based upon the existence of a well-behaved probability density function (pdf) of rain rate, either from the many storms at one instant or from a single storm during its life. The generality of the instantaneous areawide method was examined by applying it to quantitative radar data sets from the GARP Atlantic Tropical Experiment (GATE), South Africa, Texas, and Darwin (Australia). It is shown that the pdfs developed for each of these areas are consistent with the theory.
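
    The threshold-area idea can be sketched numerically: when storms share one well-behaved rain-rate pdf, the fractional area above a threshold alone predicts the areawide mean rate via a single calibrated coefficient. The lognormal fields, threshold value, and calibration below are illustrative assumptions standing in for the paper's climatological data:

```python
import numpy as np

rng = np.random.default_rng(0)
tau = 5.0  # rain-rate threshold in mm/h (assumed value)

# Synthetic storms drawn from one well-behaved rain-rate pdf
storms = [rng.lognormal(mean=0.5, sigma=1.2, size=10_000) for _ in range(20)]

# Calibrate S(tau): areawide mean rate ~= S(tau) * fractional area above tau
frac = np.array([(s > tau).mean() for s in storms])
mean_rate = np.array([s.mean() for s in storms])
S = (mean_rate / frac).mean()

# Estimate a new storm's areawide rain rate from its threshold area alone
new = rng.lognormal(mean=0.5, sigma=1.2, size=10_000)
estimate = S * (new > tau).mean()
relative_error = abs(estimate - new.mean()) / new.mean()
```

Because every storm is drawn from the same pdf, the single coefficient S(tau) recovers the mean rate to within sampling noise, which is the crux of the area-integral argument.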

  15. Enhanced Conformational Sampling Using Replica Exchange with Collective-Variable Tempering

    PubMed Central

    2015-01-01

    The computational study of conformational transitions in RNA and proteins with atomistic molecular dynamics often requires suitable enhanced sampling techniques. Here we introduce a novel method in which concurrent metadynamics is integrated into a Hamiltonian replica-exchange scheme. The ladder of replicas is built with different strengths of the bias potential, exploiting the tunability of well-tempered metadynamics. Using this method, free-energy barriers of individual collective variables are significantly reduced compared with simple force-field scaling. The introduced methodology is flexible and allows adaptive bias potentials to be self-consistently constructed for a large number of simple collective variables, such as distances and dihedral angles. The method is tested on alanine dipeptide and applied to the difficult problem of conformational sampling in a tetranucleotide. PMID:25838811

  16. Developing and utilizing an Euler computational method for predicting the airframe/propulsion effects for an aft-mounted turboprop transport. Volume 2: User guide

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Neback, H. E.; Kao, T. J.; Yu, N. Y.; Kusunose, K.

    1991-01-01

    This manual explains how to use an Euler based computational method for predicting the airframe/propulsion integration effects for an aft-mounted turboprop transport. The propeller power effects are simulated by the actuator disk concept. This method consists of global flow field analysis and the embedded flow solution for predicting the detailed flow characteristics in the local vicinity of an aft-mounted propfan engine. The computational procedure includes the use of several computer programs performing four main functions: grid generation, Euler solution, grid embedding, and streamline tracing. This user's guide provides information for these programs, including input data preparations with sample input decks, output descriptions, and sample Unix scripts for program execution in the UNICOS environment.

  17. Computational Methodologies for Real-Space Structural Refinement of Large Macromolecular Complexes

    PubMed Central

    Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus

    2017-01-01

    The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875

  18. Development of a unified guidance system for geocentric transfer. [for solar electric propulsion spacecraft

    NASA Technical Reports Server (NTRS)

    Cake, J. E.; Regetz, J. D., Jr.

    1975-01-01

    A method is presented for open loop guidance of a solar electric propulsion spacecraft to geosynchronous orbit. The method consists of determining the thrust vector profiles on the ground with an optimization computer program, and performing updates based on the difference between the actual trajectory and that predicted with a precision simulation computer program. The motivation for performing the guidance analysis during the mission planning phase is discussed, and a spacecraft design option that employs attitude orientation constraints is presented. The improvements required in both the optimization program and simulation program are set forth, together with the efforts to integrate the programs into the ground support software for the guidance system.

  20. Applications of algebraic topology to compatible spatial discretizations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bochev, Pavel Blagoveston; Hyman, James M.

    We provide a common framework for compatible discretizations using algebraic topology to guide our analysis. The main concept is the natural inner product on cochains, which induces a combinatorial Hodge theory. The framework comprises mutually consistent operations of differentiation and integration, has a discrete Stokes theorem, and preserves the invariants of the DeRham cohomology groups. The latter allows for an elementary calculation of the kernel of the discrete Laplacian. Our framework provides an abstraction that includes examples of compatible finite element, finite volume, and finite difference methods. We describe how these methods result from the choice of a reconstruction operator and when they are equivalent.
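
    As a minimal illustration of the kind of structure such a framework guarantees (not the authors' implementation), the discrete exterior derivative on a single quadrilateral cell can be written as incidence matrices, for which d∘d = 0 and the discrete Stokes theorem hold by construction:

```python
import numpy as np

# One square cell: 4 nodes, 4 oriented edges, 1 face (illustrative example)
# d0 maps node values (0-cochains) to oriented edge differences (1-cochains)
d0 = np.array([[-1,  1,  0,  0],
               [ 0, -1,  1,  0],
               [ 0,  0, -1,  1],
               [ 1,  0,  0, -1]])
# d1 sums the oriented edges around the face (1-cochains -> 2-cochains)
d1 = np.array([[1, 1, 1, 1]])

f = np.array([3.0, 1.0, 4.0, 1.5])  # arbitrary node values (a 0-cochain)

# d∘d = 0: the discrete analogue of curl(grad) = 0
assert np.all(d1 @ d0 == 0)

# Discrete Stokes: integrating df over the face equals summing f's oriented
# differences around the closed boundary, which telescopes to zero exactly
boundary_sum = float(d1 @ (d0 @ f))
```

The identities hold exactly, not approximately, because they are combinatorial facts about the incidence matrices rather than properties of any particular interpolation.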

  1. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed are: (1) capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) investigation and simulation of various control methods including manual force/torque and active compliance control; (5) evaluation and implementation of three obstacle avoidance methods; (6) video simulation and edge detection; and (7) software simulation validation.

  2. Measurement of the top quark mass with the template method in the $t\bar{t} \to \mathrm{lepton}+\mathrm{jets}$ channel using ATLAS data

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2012-06-21

    The top quark mass has been measured using the template method in the $t\bar{t} \to \mathrm{lepton}+\mathrm{jets}$ channel based on data recorded in 2011 with the ATLAS detector at the LHC. The data were taken at a proton-proton centre-of-mass energy of $\sqrt{s} = 7$ TeV and correspond to an integrated luminosity of 1.04 fb$^{-1}$. The analyses in the $e$ + jets and $\mu$ + jets decay channels yield consistent results. Here the top quark mass is measured to be $m_{\mathrm{top}} = 174.5 \pm 0.6\,(\mathrm{stat.}) \pm 2.3\,(\mathrm{syst.})$ GeV.

  3. Integration Processes Compared: Cortical Differences for Consistency Evaluation and Passive Comprehension in Local and Global Coherence.

    PubMed

    Egidi, Giovanna; Caramazza, Alfonso

    2016-10-01

    This research studies the neural systems underlying two integration processes that take place during natural discourse comprehension: consistency evaluation and passive comprehension. Evaluation was operationalized with a consistency judgment task and passive comprehension with a passive listening task. Using fMRI, the experiment examined the integration of incoming sentences with more recent, local context and with more distal, global context in these two tasks. The stimuli were stories in which we manipulated the consistency of the endings with the local context and the relevance of the global context for the integration of the endings. A whole-brain analysis revealed several differences between the two tasks. Two networks previously associated with semantic processing and attention orienting showed more activation during the judgment than the passive listening task. A network previously associated with episodic memory retrieval and construction of mental scenes showed greater activity when global context was relevant, but only during the judgment task. This suggests that evaluation, more than passive listening, triggers the reinstantiation of global context and the construction of a rich mental model for the story. Finally, a network previously linked to fluent updating of a knowledge base showed greater activity for locally consistent endings than inconsistent ones, but only during passive listening, suggesting a mode of comprehension that relies on a local scope approach to language processing. Taken together, these results show that consistency evaluation and passive comprehension weigh differently on distal and local information and are implemented, in part, by different brain networks.

  4. The Sustainable Development Goals - conceptual approaches for science and research projects

    NASA Astrophysics Data System (ADS)

    Schmalzbauer, Bettina; Visbeck, Martin

    2017-04-01

    Challenged to provide answers to some of the world's biggest societal and environmental problems, the scientific community has consistently delivered exciting and solid information that is often used to assess the situation in many different parts of the globe, to document the anthropogenic causes of environmental changes, and to provide perspectives on possible development scenarios. With the adoption of the Paris climate agreement and the 2030 Agenda for Sustainable Development (and its 17 Sustainable Development Goals (SDGs)), the major issues for society now lie in the complexity of implementation. These are: consistency with other political processes (e.g. UNFCCC, IPBES), implementability (e.g. interactions between SDGs, pathways), and measurability (e.g. indicators). We argue that science can contribute to all these aspects by providing the fundamental knowledge necessary for decision-making and practical implementation of the SDGs. Cooperation beyond disciplines and national borders is essential, as is the integration of concepts and methods from the natural and social sciences. The outcome of two international conferences has called out four specific areas where science can make significant contributions towards SDG implementation: First, deep and integrated scientific knowledge is needed for better understanding key interactions, synergies, and trade-offs embedded in the SDGs. Second, sound scientific input is needed for co-designing and executing scientific assessments in the context of the SDG process (going beyond the good examples set by IPCC and IPBES). Third, science can support the establishment of evidence-based procedures for the development of scenarios and identify possible pathways for the world in 2030 or beyond. Fourth, progress on SDG implementation needs to be supported by a meaningful indicator framework, and this framework needs scientific input to refine indicators and further develop and standardise methods.
The main conclusion is that a comprehensive approach is needed that combines basic science and solution-oriented science, and integrates knowledge from the natural sciences, social sciences, engineering, and humanities (but also from other knowledge domains) to meet the overall objective of the 2030 Agenda. Foresight, integrated assessment, and integrated modelling are promising approaches for knowledge exchange, learning, and identifying coherent development pathways towards global sustainability. To ensure rapid and effective uptake of new research results, the concepts of co-design of research projects and co-production of knowledge show promise.

  5. Determining polarizable force fields with electrostatic potentials from quantum mechanical linear response theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hao; Yang, Weitao, E-mail: weitao.yang@duke.edu; Department of Physics, Duke University, Durham, North Carolina 27708

    We developed a new method to calculate the atomic polarizabilities by fitting to the electrostatic potentials (ESPs) obtained from quantum mechanical (QM) calculations within linear response theory. This parallels the conventional approach of fitting atomic charges based on electrostatic potentials from the electron density. Our ESP fitting is combined with the induced dipole model under the perturbation of uniform external electric fields of all orientations. QM calculations for the linear response to the external electric fields are used as input, fully consistent with the induced dipole model, which is itself a linear response model. The orientation of the uniform external electric fields is integrated over all directions. The integration over orientations and the QM linear response calculations together make the fitting results independent of the orientations and magnitudes of the uniform external electric fields applied. Another advantage of our method is that the QM calculation is only needed once, in contrast to the conventional approach, where many QM calculations are needed for many different applied electric fields. The molecular polarizabilities obtained from our method show comparable accuracy with those from fitting directly to the experimental or theoretical molecular polarizabilities. Since the ESP is directly fitted, atomic polarizabilities obtained from our method are expected to reproduce the electrostatic interactions better. Our method was used to calculate both transferable atomic polarizabilities for polarizable molecular mechanics force fields and nontransferable molecule-specific atomic polarizabilities.
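
    The core fitting idea, a linear-response model fit by least squares to responses under uniform fields of many orientations, can be sketched for a single isotropic polarizability. The synthetic "QM" responses and all numbers below are assumptions for illustration, not the paper's actual workflow:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha_true = 1.8  # assumed isotropic polarizability (arbitrary units)

# Uniform external fields of many orientations, and the induced dipoles a
# QM linear-response calculation would supply (here: synthetic, small noise)
E = rng.normal(size=(50, 3))
mu = alpha_true * E + rng.normal(0.0, 0.01, size=(50, 3))

# Least-squares fit of the induced-dipole model mu = alpha * E over all
# field orientations, making the result orientation-independent
alpha_fit, *_ = np.linalg.lstsq(E.reshape(-1, 1), mu.reshape(-1), rcond=None)
alpha_fit = float(alpha_fit[0])
```

Because the model is linear in the field, one set of linear-response data determines the fit for every orientation and magnitude at once, mirroring the paper's point that a single QM calculation suffices.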

  6. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method possesses the advantages of speed, efficiency, and accuracy, and allows the multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed based on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed based on the base peak chromatograms, obtained by negative-ion electrospray ionization QTOF-MS. The method validation results demonstrated that our developed method possesses desirable specificity, linearity, precision, and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. The quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy integrating multi-ingredient determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
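
    The extracted-ion-chromatogram step, summing intensity within a ±0.01 Da window of a target pseudo-molecular ion in each scan, can be sketched with synthetic scans. The data and function name below are illustrative, not the paper's software:

```python
import numpy as np

def extracted_ion_chromatogram(scans, target_mz, window=0.01):
    """Sum, per scan, the intensities whose m/z lies within +/- window Da
    of the target pseudo-molecular ion (one EIC point per scan)."""
    return np.array([ints[np.abs(mzs - target_mz) <= window].sum()
                     for mzs, ints in scans])

# Two synthetic scans, each a (m/z array, intensity array) pair
scans = [
    (np.array([449.107, 449.117, 500.200]), np.array([120.0, 30.0, 999.0])),
    (np.array([449.109, 610.000]),          np.array([80.0, 5.0])),
]
eic = extracted_ion_chromatogram(scans, target_mz=449.108)
```

Only peaks within the narrow mass window contribute, which is what lets the determination run alongside the base-peak fingerprint in a single chromatographic pass.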

  7. A new interpolation method for gridded extensive variables with application in Lagrangian transport and dispersion models

    NASA Astrophysics Data System (ADS)

    Hittmeir, Sabine; Philipp, Anne; Seibert, Petra

    2017-04-01

    In discretised form, an extensive variable usually represents an integral over a 3-dimensional (x,y,z) grid cell. In the case of vertical fluxes, gridded values represent integrals over a horizontal (x,y) grid face. In meteorological models, fluxes (precipitation, turbulent fluxes, etc.) are usually written out as temporally integrated values, thus effectively forming 3D (x,y,t) integrals. Lagrangian transport models require interpolation of all relevant variables towards the location in 4D space of each of the computational particles. Trivial interpolation algorithms usually implicitly assume the integral value to be a point value valid at the grid centre. If the integral value were reconstructed from the interpolated point values, it would in general not be correct. If nonlinear interpolation methods are used, non-negativity cannot easily be ensured. This problem became obvious with respect to the interpolation of precipitation for the calculation of wet deposition in FLEXPART (http://flexpart.eu), which uses ECMWF model output or other gridded input data. The presently implemented method consists of a special preprocessing in the input preparation software and subsequent linear interpolation in the model. The interpolated values are positive but the criterion of cell-wise conservation of the integral property is violated; it is also not very accurate as it smooths the field. A new interpolation algorithm was developed which introduces additional supporting grid points in each time interval, with linear interpolation to be applied in FLEXPART later between them. It preserves the integral precipitation in each time interval, guarantees the continuity of the time series, and maintains non-negativity. The function values of the remapping algorithm at these subgrid points constitute the degrees of freedom which can be prescribed in various ways. Combining the advantages of different approaches leads to a final algorithm respecting all the required conditions.
To improve the monotonicity behaviour we additionally derived a filter to restrict over- or undershooting. At the current stage, the algorithm is meant primarily for the temporal dimension. It can also be applied with operator-splitting to include the two horizontal dimensions. An extension to 2D appears feasible, while a fully 3D version would most likely not justify the effort compared to the operator-splitting approach.
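
    A toy version of the remapping idea (one added midpoint per interval, with continuity supplied by averaged neighbours) illustrates how linear interpolation between subgrid points can preserve each interval's integral. This is a deliberately simplified sketch, not FLEXPART's actual algorithm, and as written it does not yet enforce non-negativity:

```python
import numpy as np

def conservative_midpoints(cell_means):
    """Given interval-average values on a unit grid, choose shared endpoint
    values (continuous across intervals) and one midpoint per interval so
    that piecewise-linear interpolation reproduces every interval's
    integral exactly."""
    a = np.asarray(cell_means, dtype=float)
    v = np.empty(len(a) + 1)              # endpoint values, shared by cells
    v[0], v[-1] = a[0], a[-1]
    v[1:-1] = 0.5 * (a[:-1] + a[1:])      # continuity at interior edges
    # The two linear pieces integrate to (v0 + 2m + v1)/4; match it to a
    m = (4.0 * a - v[:-1] - v[1:]) / 2.0
    return v, m

v, m = conservative_midpoints([2.0, 1.5, 3.0])
```

The midpoint values are the free degrees of freedom; the full algorithm constrains them further so that the reconstruction is also non-negative and monotonicity-limited.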

  8. A generic multibody simulation

    NASA Technical Reports Server (NTRS)

    Hopping, K. A.; Kohn, W.

    1986-01-01

    Described is a dynamic simulation package which can be configured for orbital test scenarios involving multiple bodies. The rotational and translational state integration methods are selectable for each individual body and may be changed during a run if necessary. Characteristics of the bodies are determined by assigning components consisting of mass properties, forces, and moments, which are the outputs of user-defined environmental models. Generic model implementation is facilitated by a transformation processor which performs coordinate frame inversions. Transformations are defined in the initialization file as part of the simulation configuration. The simulation package includes an initialization processor, which consists of a command line preprocessor, a general purpose grammar, and a syntax scanner. These permit specifications of the bodies, their interrelationships, and their initial states in a format that is not dependent on a particular test scenario.
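
    The idea of a per-body, selectable (and run-time switchable) state integrator can be sketched as follows. The body structure and integrator names here are invented for illustration and are not the package's actual interface:

```python
def euler_step(f, x, t, dt):
    # First-order explicit Euler step for dx/dt = f(t, x)
    return x + dt * f(t, x)

def rk4_step(f, x, t, dt):
    # Classical fourth-order Runge-Kutta step
    k1 = f(t, x)
    k2 = f(t + dt / 2, x + dt / 2 * k1)
    k3 = f(t + dt / 2, x + dt / 2 * k2)
    k4 = f(t + dt, x + dt * k3)
    return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

INTEGRATORS = {"euler": euler_step, "rk4": rk4_step}

def propagate(body, f, t0, t1, steps):
    """Advance one body's state with its own selected integrator; the
    selection may be changed between calls, as in a run-time switch."""
    step = INTEGRATORS[body["integrator"]]
    x, t, dt = body["state"], t0, (t1 - t0) / steps
    for _ in range(steps):
        x = step(f, x, t, dt)
        t += dt
    body["state"] = x
    return x

body = {"state": 1.0, "integrator": "rk4"}
x1 = propagate(body, lambda t, x: x, 0.0, 1.0, 100)  # dx/dt = x, so x(1) = e
```

Keeping the integrator as per-body data is what lets each body trade accuracy for speed independently, and switch methods mid-run if necessary.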

  9. Optimised cross-layer synchronisation schemes for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Nasri, Nejah; Ben Fradj, Awatef; Kachouri, Abdennaceur

    2017-07-01

    This paper aims at synchronisation between sensor nodes. In the context of wireless sensor networks, it is necessary to account for the energy cost induced by synchronisation, which can represent the majority of the energy consumed. On the communication side, a known hard point is designing a fine-grained synchronisation protocol that is sufficiently robust to the intermittent energy available in the sensors. Hence, this paper addresses performance and energy saving, in particular the optimisation of the synchronisation protocol using a cross-layer design method (synchronisation between layers). Our approach consists in balancing the energy consumption between the sensors and choosing as cluster head the node with the highest residual energy, in order to guarantee the reliability, integrity, and continuity of communication (i.e. maximising the network lifetime).
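
    Electing the cluster head with the highest residual energy, as described, reduces to a simple selection over node state; the node dictionaries below are an illustrative stand-in for real node state, not the paper's protocol messages:

```python
def elect_cluster_head(nodes):
    """Choose as cluster head the node with the highest residual energy;
    re-electing as energies drain balances consumption across the cluster."""
    return max(nodes, key=lambda n: n["residual_energy"])

nodes = [
    {"id": "n1", "residual_energy": 0.42},
    {"id": "n2", "residual_energy": 0.87},
    {"id": "n3", "residual_energy": 0.63},
]
head = elect_cluster_head(nodes)
```

Repeating the election periodically rotates the energy-hungry cluster-head role away from depleted nodes, which is the load-balancing behaviour the abstract describes.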

  10. New getter configuration at wafer level for assuring long term stability of MEMs

    NASA Astrophysics Data System (ADS)

    Moraja, Marco; Amiotti, Marco; Kullberg, Richard C.

    2003-01-01

    The evolution from ceramic packages to wafer-to-wafer hermetic sealing poses tremendous technical challenges to integrating a proper getter inside MEMS devices to assure their long-term stability and reliability. The state-of-the-art solution for integrating a getter inside the latest generation of MEMS consists of patterning the getter material with a specific geometry onto the Si cap wafer. The practical implementation of this solution consists of 4" or 6" Si wafers with grooves or particular incisions, where the getter material is placed in the form of a thick film. The typical thickness of these thick films is in the range of a few microns, depending on the gas load to be handled during the lifetime of the device. The structure of the thick getter film is highly porous in order to improve sorption performance, but at the same time there are no loose particles, thanks to a proprietary manufacturing method. The getter thick film is composed of a special Zr alloy with a composition chosen to optimize sorption performance. The getter thick film can be placed selectively into the grooves without affecting the lateral regions surrounding the grooves, where the hermetic sealing is performed.

  11. An Integrative Object-Based Image Analysis Workflow for Uav Images

    NASA Astrophysics Data System (ADS)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.
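
    The Binary Partition Tree construction can be illustrated in miniature: start from an over-segmentation and repeatedly merge the most similar adjacent regions, recording the hierarchy. This 1-D, mean-intensity version is a deliberately simplified sketch of the BPT idea, not the paper's full spectral/spatial/topological criterion:

```python
def binary_partition_tree(values):
    """Greedy BPT on a 1-D over-segmentation: at each step merge the pair
    of adjacent regions whose mean intensities differ least, and record
    the merge; the sequence of merges encodes the hierarchy."""
    regions = [[v] for v in values]
    merges = []
    while len(regions) > 1:
        diffs = [abs(sum(a) / len(a) - sum(b) / len(b))
                 for a, b in zip(regions, regions[1:])]
        i = diffs.index(min(diffs))
        merges.append((i, i + 1))
        regions[i:i + 2] = [regions[i] + regions[i + 1]]
    return merges

# Two flat zones of intensity 1 and 9: the homogeneous pairs merge first
order = binary_partition_tree([1.0, 1.0, 9.0, 9.0])
```

Cutting the recorded hierarchy at different depths then yields segmentations of varying coarseness, which is what the filtering step selects among.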

  12. From Chaos to Content: An Integrated Approach to Government Web Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demuth, Nora H.; Knudson, Christa K.

    2005-01-03

    The web development team of the Environmental Technology Directorate (ETD) at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) redesigned the ETD website as a database-driven system, powered by the newly designed ETD Common Information System (ETD-CIS). The ETD website was redesigned in response to an analysis that showed the previous ETD websites were inefficient, costly, and lacking in a consistent focus. Redesigned and newly created websites based on a new ETD template provide a consistent image, meet or exceed accessibility standards, and are linked through a common database. The protocols used in developing the ETD website support integration of further organizational sites and facilitate internal use by staff and training on ETD website development and maintenance. Other PNNL organizations have approached the ETD web development team with an interest in applying the methods established by the ETD system. The ETD system protocol could potentially be used by other DOE laboratories to improve their website efficiency and content focus. “The tools by which we share science information must be as extraordinary as the information itself.” – DOE Science Director Raymond Orbach

  13. The Geropathology Research Network: An Interdisciplinary Approach for Integrating Pathology Into Research on Aging

    PubMed Central

    Ikeno, Yuji; Niedernhofer, Laura; McIndoe, Richard A.; Ciol, Marcia A.; Ritchey, Jerry; Liggitt, Denny

    2016-01-01

    Geropathology is the study of aging and age-related lesions and diseases in the form of whole necropsies/autopsies, surgical biopsies, histology, and molecular biomarkers. It encompasses multiple subspecialties of geriatrics, anatomic pathology, molecular pathology, clinical pathology, and gerontology. In order to increase the consistency and scope of communication in the histologic and molecular pathology assessment of tissues from preclinical and clinical aging studies, a Geropathology Research Network has been established consisting of pathologists and scientists with expertise in the comparative pathology of aging, the design of aging research studies, biostatistical methods for analysis of aging data, and bioinformatics for compiling and annotating large sets of data generated from aging studies. The network provides an environment to promote learning and exchange of scientific information and ideas for the aging research community through a series of symposia, the development of uniform ways of integrating pathology into aging studies, and the statistical analysis of pathology data. The efforts of the network are ultimately expected to lead to a refined set of sentinel biomarkers of molecular and anatomic pathology that could be incorporated into preclinical and clinical aging intervention studies to increase the relevance and productivity of these types of investigations. PMID:26243216

  14. Achievements in the development of the Water Cooled Solid Breeder Test Blanket Module of Japan to the milestones for installation in ITER

    NASA Astrophysics Data System (ADS)

    Tsuru, Daigo; Tanigawa, Hisashi; Hirose, Takanori; Mohri, Kensuke; Seki, Yohji; Enoeda, Mikio; Ezato, Koichiro; Suzuki, Satoshi; Nishi, Hiroshi; Akiba, Masato

    2009-06-01

    As the primary candidate ITER Test Blanket Module (TBM) to be tested under the leadership of Japan, a water-cooled solid breeder (WCSB) TBM is being developed. This paper presents recent achievements toward the pre-installation milestones for ITER TBMs, which consist of design integration in ITER, module qualification, and safety assessment. With respect to design integration, targeting the detailed design final report in 2012, the structural designs of the WCSB TBM and its interfacing components (common frame and backside shielding), which are placed in a test port of ITER, and the layout of the cooling system are presented. For module qualification, a real-scale first-wall mock-up fabricated by the hot isostatic pressing method from the reduced-activation martensitic ferritic steel F82H, together with flow and irradiation tests of the mock-up, is presented. For the safety milestones, the contents of the 2008 preliminary safety report, consisting of source-term identification, failure mode and effects analysis (FMEA), identification of postulated initiating events (PIEs), and safety analyses, are presented.

  15. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, J; Fan, J; Hu, W

    Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), which can be employed for radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. Training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimate of the conditional probability of the dose given the values of the predictive features. For a new patient, prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training set over it. Integrating the resulting probability distribution for the dose yields an estimate of the DVH. The 2D KDE is implemented to estimate the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are used as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast, and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: Consistent results were found between the two DVHs for each cancer site, and the average of the relative point-wise differences is about 5%, within the clinically acceptable extent. Conclusion: According to the results of this study, our method can be used to predict a clinically acceptable DVH and can evaluate the quality and consistency of treatment planning.
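    The pipeline the abstract describes (estimate the joint density of dose and predictive features from prior plans, condition on the features, marginalize over the new patient's feature distribution, then integrate to a DVH) can be sketched with SciPy's `gaussian_kde`. The synthetic data, the exponential dose falloff, and the use of a single distance feature below are illustrative assumptions, not the authors' clinical data or exact implementation:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic stand-in for the training plans: for each OAR voxel, one
# predictive feature (signed distance to the target boundary, mm)
# paired with its planned dose (Gy). The abstract uses two features
# (distance and opening angle); one is kept here for brevity.
dist_train = rng.uniform(-5.0, 40.0, 2000)
dose_train = 60.0 * np.exp(-np.clip(dist_train, 0.0, None) / 10.0) \
    + rng.normal(0.0, 2.0, 2000)

# Joint density p(dose, feature) estimated from the training set,
# plus the feature density needed to form p(dose | feature).
joint_kde = gaussian_kde(np.vstack([dose_train, dist_train]))
feat_kde = gaussian_kde(dist_train)

# New patient: only the feature distribution is known from anatomy.
dist_new = rng.uniform(-5.0, 40.0, 300)

# Marginalize p(dose | feature) over the new patient's features
# (Monte Carlo over voxels) to get the predicted dose distribution.
dose_grid = np.linspace(0.0, 70.0, 141)
pdf = np.zeros_like(dose_grid)
for d in dist_new:
    joint = joint_kde(np.vstack([dose_grid, np.full_like(dose_grid, d)]))
    pdf += joint / max(feat_kde(d)[0], 1e-12)
dx = dose_grid[1] - dose_grid[0]
pdf /= pdf.sum() * dx

# Cumulative DVH: fraction of OAR volume receiving at least dose D.
dvh = 1.0 - np.cumsum(pdf) * dx
```

    The predicted curve could then be compared point-wise with a planned DVH, as the authors do to obtain their ~5% average difference.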

  16. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    PubMed

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each school's garden was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building, and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of the other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. 
These patterns are best described by the GREEN Tool, the first framework to identify how to operationalize school gardening components and describe an evidence-based strategy of successful school garden integration. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  17. A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy

    PubMed Central

    Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.

    2000-01-01

    This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps from each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that, by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps with better quality much more quickly than software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and Stanford G3 panels publicly available from the RH database. We compare the map quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy scales not only to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for RH mapping in computation time and in algorithms for considering a large number of markers. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
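    The integration step leans on the longest common subsequence: the largest set of shared markers whose order agrees on both panel maps. A minimal dynamic-programming sketch of that subroutine, with invented marker names standing in for real GB4/G3 data, might look like:

```python
def longest_common_subsequence(a, b):
    """Classic O(len(a)*len(b)) dynamic program returning one LCS.

    Used here the way the map-integration strategy uses it: find a
    largest set of shared markers whose order agrees on both maps.
    """
    n, m = len(a), len(b)
    # dp[i][j] = LCS length of the suffixes a[i:] and b[j:]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        for j in range(m - 1, -1, -1):
            if a[i] == b[j]:
                dp[i][j] = 1 + dp[i + 1][j + 1]
            else:
                dp[i][j] = max(dp[i + 1][j], dp[i][j + 1])
    # Recover one LCS by walking the table greedily.
    out, i, j = [], 0, 0
    while i < n and j < m:
        if a[i] == b[j]:
            out.append(a[i]); i += 1; j += 1
        elif dp[i + 1][j] >= dp[i][j + 1]:
            i += 1
        else:
            j += 1
    return out

# Marker orders from two hypothetical panel maps (names illustrative):
gb4 = ["D1S243", "D1S468", "D1S214", "D1S450", "D1S2845"]
g3 = ["D1S468", "D1S214", "D1S2845", "D1S450"]
consensus = longest_common_subsequence(gb4, g3)
print(consensus)  # ['D1S468', 'D1S214', 'D1S2845']
```

    Markers outside the consensus (here D1S243 and D1S450) would then need to be interleaved into the integrated map by other means, such as the breakage or likelihood criteria the paper optimizes.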

  18. Haplotype Reconstruction in Large Pedigrees with Many Untyped Individuals

    NASA Astrophysics Data System (ADS)

    Li, Xin; Li, Jing

    Haplotypes, as they specify the linkage patterns between dispersed genetic variations, provide important information for understanding the genetics of human traits. However, haplotypes are not directly available from current genotyping platforms, and hence computational methods to recover such information have been extensively investigated. Two major computational challenges arising in current family-based disease studies are large family sizes and many ungenotyped family members. Traditional haplotyping methods can handle neither large families nor families with missing members. In this paper, we propose a method that addresses these issues by integrating multiple novel techniques. The method consists of three major components: pairwise identical-by-descent (IBD) inference, global IBD reconstruction, and haplotype restoring. By reconstructing the global IBD of a family from pairwise IBD and then restoring the haplotypes based on the inferred IBD, this method can scale to large pedigrees and, more importantly, can handle families with missing members. Compared with existing methods, this method demonstrates much higher power to recover haplotype information, especially in families with many untyped individuals.
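    As a toy illustration of the first component, pairwise IBD inference, the sketch below scores how many alleles two unphased genotypes could share at each marker. This gives only an identity-by-state upper bound on IBD sharing; a real pairwise-IBD method (as in the paper) would refine it with population allele frequencies and a model along the chromosome. The genotypes are invented:

```python
def ibs_state(g1, g2):
    """Per-marker identity-by-state sharing from unphased genotypes.

    Each genotype is a pair of alleles. Returns, for each marker, the
    maximum number of alleles (0, 1, or 2) the two individuals could
    share, which upper-bounds their IBD sharing at that marker.
    """
    states = []
    for (a1, a2), (b1, b2) in zip(g1, g2):
        pair, other = sorted((a1, a2)), sorted((b1, b2))
        if pair == other:
            states.append(2)  # genotypes identical: up to 2 shared
        elif set(pair) & set(other):
            states.append(1)  # one allele in common
        else:
            states.append(0)  # incompatible with any IBD sharing
    return states

# Two siblings' genotypes at four markers (illustrative alleles):
sib1 = [(1, 2), (3, 3), (1, 4), (2, 2)]
sib2 = [(1, 2), (3, 4), (2, 4), (1, 3)]
print(ibs_state(sib1, sib2))  # [2, 1, 1, 0]
```

    Runs of high-sharing markers are the raw signal a pairwise-IBD inference smooths into IBD segments before the global reconstruction step.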

  19. WE-DE-BRA-03: Construction of An Ultrasound Guidance Platform for Image-Guided Radiotherapy with the Intent to Treat Transitional Cell Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sick, J; Rancilio, N; Fulkerson, C

    Purpose: Ultrasound (US) is a noninvasive, nonradiographic imaging technique with high spatial and temporal resolution that can be used for localizing soft-tissue structures and tumors in real time during radiotherapy (inter- and intra-fraction). A detailed methodology integrating 3D-US within RT is presented. This method is easier to adopt into current treatment protocols than existing US-based systems and reduces user variability in image acquisition, thus eliminating the transducer-induced changes that limit the CT planning system. Methods: We designed an in-house integrated US manipulator and platform to relate the CT, 3D-US, and linear accelerator coordinate systems. To validate the platform, an agar-based phantom with measured densities and speed of sound consistent with the tissues surrounding the bladder was rotated (0–45°), resulting in translations (up to 55 mm) relative to the CT and US coordinate systems. After acquiring and integrating CT and US images into the treatment planning system, US-to-US and US-to-CT images were co-registered to re-align the phantom relative to the linear accelerator. Errors in the transformation matrix components were calculated to determine the precision of this method under different patient positions. Results: Statistical errors from US-US registrations for different orientations ranged from 0.06–1.66 mm for the x, y, and z translational components and 0.00–1.05° for the rotational components. Statistical errors from US-CT registrations were 0.23–1.18 mm for the translational components and 0.08–2.52° for the rotational components. Conclusion: These results are consistent with currently used techniques for positioning prostate patients when couch re-positioning involves less than a 5° rotation. We are now testing the platform on a canine patient to obtain both inter- and intra-fractional positional errors. 
Additional design considerations include the future use of ultrasound-based functionality (photoacoustics, radioacoustics, Doppler) to monitor blood flow and hypoxia and/or in-vivo dosimetry for applications in other therapeutic techniques, such as hyperthermia, anti-angiogenesis, and particle therapy.
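    The registration errors reported above come from comparing recovered rigid transforms against known phantom motions. A generic way to recover such a transform from matched fiducial points is the Kabsch algorithm; the sketch below uses invented fiducial coordinates and a 10° rotation, and is not the platform's actual registration code:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid (rotation + translation) alignment of two
    3-D point sets via the Kabsch algorithm, the kind of transform a
    US-to-US or US-to-CT co-registration recovers."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Hypothetical fiducial points in the phantom (mm), rotated 10 degrees
# about z and translated, mimicking the reported phantom motions.
rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 55.0, size=(6, 3))
theta = np.radians(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -3.0, 1.5])
moved = pts @ R_true.T + t_true

R_est, t_est = rigid_register(pts, moved)
# Residuals in the transformation components, analogous to the
# translational and rotational errors tabulated in the abstract:
trans_err = np.abs(t_est - t_true)
rot_err = np.degrees(np.arccos(
    np.clip((np.trace(R_est @ R_true.T) - 1.0) / 2.0, -1.0, 1.0)))
```

    With noise-free fiducials both residuals are essentially zero; adding localization noise to `moved` would reproduce the sub-millimeter, sub-degree error ranges the study reports.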

  20. Enterprise Architecture Planning in Developing a Planning Information System: A Case Study of Semarang State University

    NASA Astrophysics Data System (ADS)

    Budiman, Kholiq; Prahasto, Toni; Kusumawardhani, Amie

    2018-02-01

    This research applied an integrated design and development of a planning information system, designed using Enterprise Architecture Planning (EAP). One motivation for the research was the frequent discrepancy between the planned budget and its realization, which results in ineffective planning. Designing with EAP aims to keep development aligned with the strategic direction of the organization. In practice, EAP is carried out in several stages: planning initiation, identification and definition of business functions, followed by architectural design and an EA implementation plan. In addition to the enterprise architecture design, this research carried out the implementation, which was tested with black-box and white-box methods. Black-box testing was used to test the fundamental aspects of the software through two kinds of tests: user acceptance testing and software functionality testing. White-box testing was used to test the effectiveness of the code, using unit testing. Both the white-box and black-box tests of the integrated planning information system were successful. Passing the software tests, however, does not by itself show that the system changed anything relative to the situation before its development. To ensure the success of the implementation, the authors therefore tested the consistency between planning data and budget realization both before and after the information system was in use. This consistency test was done by computing the difference between planning dates and realization dates. 
From the tabulated data, the planning information system reduces the difference between planning time and realization time, which indicates that the system can motivate the planner units to realize the budget as designed. It also shows that the value chain of the planning information system has implications for budget realization.
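    The consistency test described above, comparing planned dates against realization dates before and after the system was introduced, amounts to a simple lag metric. The sketch below uses invented dates purely for illustration:

```python
from datetime import date

# Hypothetical planned vs realized budget-item dates, before and after
# the planning information system was introduced (illustrative data).
before = [(date(2017, 3, 1), date(2017, 5, 20)),
          (date(2017, 6, 1), date(2017, 8, 15))]
after = [(date(2018, 3, 1), date(2018, 3, 18)),
         (date(2018, 6, 1), date(2018, 6, 25))]

def mean_lag_days(pairs):
    """Average gap between planned and realized dates, in days:
    the planning-to-realization difference the study tabulates."""
    return sum((real - plan).days for plan, real in pairs) / len(pairs)

print(mean_lag_days(before), mean_lag_days(after))  # 77.5 20.5
```

    A shrinking mean lag after rollout is the pattern the authors interpret as the system motivating planner units to realize the budget on schedule.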

Top