Sample records for analysis methods finally

  1. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…
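
    As context for the record above: the core computation in a fixed-effect meta-analysis is an inverse-variance weighted mean of per-study effect sizes. The sketch below is a minimal illustration with made-up numbers, not an implementation of Glass's procedures.

```python
# Minimal fixed-effect meta-analysis pooling sketch (illustrative data only):
# each study i contributes an effect size d_i with sampling variance v_i.
import numpy as np

d = np.array([0.30, 0.55, 0.12, 0.40])   # hypothetical study effect sizes
v = np.array([0.02, 0.05, 0.01, 0.03])   # hypothetical sampling variances

w = 1.0 / v                               # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)      # pooled effect size
se_pooled = np.sqrt(1.0 / np.sum(w))      # standard error of the pooled effect

print(f"pooled d = {d_pooled:.3f} +/- {1.96 * se_pooled:.3f} (95% CI half-width)")
```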

  2. Change detection for synthetic aperture radar images based on pattern and intensity distinctiveness analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang

    2018-04-01

    Synthetic aperture radar (SAR) images are independent of atmospheric conditions, making SAR an ideal image source for change detection. Existing methods directly analyze all regions in the speckle-noise-contaminated difference image, so their performance is easily degraded by small noisy regions. In this paper, we propose a novel framework for saliency-guided change detection based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first used to obtain a difference image (DI). Then, saliency detection based on pattern and intensity distinctiveness analysis is used to obtain the changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyze the pixels in the changed-region candidates and classify them as changed or unchanged, yielding the final change map. Experimental results on two real SAR image datasets demonstrate the effectiveness of the proposed method.
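
    To make the pipeline in this abstract concrete, here is a hedged sketch of its three generic stages (log-ratio difference image, PCA feature extraction, k-means classification) on synthetic arrays. The saliency-detection stage is omitted, and nothing here reproduces the authors' exact algorithm.

```python
# Generic sketch of log-ratio DI -> patch features -> PCA -> k-means.
# Image arrays are synthetic stand-ins, not real SAR data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def change_map(img1, img2, patch=5, n_components=3):
    eps = 1e-6
    di = np.abs(np.log((img1 + eps) / (img2 + eps)))   # log-ratio difference image
    h, w = di.shape
    r = patch // 2
    padded = np.pad(di, r, mode="reflect")
    # Collect a patch feature vector around every pixel.
    feats = np.array([padded[i:i + patch, j:j + patch].ravel()
                      for i in range(h) for j in range(w)])
    feats = PCA(n_components=n_components).fit_transform(feats)
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats).reshape(h, w)
    # Call the cluster with the larger mean DI the "changed" class.
    changed = labels == np.argmax([di[labels == k].mean() for k in (0, 1)])
    return changed

rng = np.random.default_rng(0)
a = rng.gamma(4.0, 1.0, (32, 32))          # speckle-like intensities
b = a.copy()
b[8:16, 8:16] *= 3.0                       # injected change region
print(change_map(a, b).sum(), "pixels flagged as changed")
```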

  3. Slope Stability Analysis of Waste Dump in Sandstone Open Pit Osielec

    NASA Astrophysics Data System (ADS)

    Adamczyk, Justyna; Cała, Marek; Flisiak, Jerzy; Kolano, Malwina; Kowalski, Michał

    2013-03-01

    This paper presents the slope stability analysis for the current as well as the projected (final) geometry of the waste dump at the Sandstone Open Pit "Osielec". Six cross-sections were selected for the stability analysis. The final geometry of the waste dump was then designed and its stability analyzed. On the basis of the analysis results, opportunities to improve the stability of the structure were identified. The next issue addressed in the paper is the proportion of the mixture of mining and processing wastes for which the waste dump remains stable. Stability calculations were carried out using the Janbu method, which belongs to the limit equilibrium methods.

  4. Accounting for dropout in xenografted tumour efficacy studies: integrated endpoint analysis, reduced bias and better use of animals.

    PubMed

    Martin, Emma C; Aarons, Leon; Yates, James W T

    2016-07-01

    Xenograft studies are commonly used to assess the efficacy of new compounds and characterise their dose-response relationship. Analysis often involves comparing the final tumour sizes across dose groups. This can cause bias, as often in xenograft studies a tumour burden limit (TBL) is imposed for ethical reasons, leading to the animals with the largest tumours being excluded from the final analysis. This means the average tumour size, particularly in the control group, is underestimated, leading to an underestimate of the treatment effect. Four methods to account for dropout due to the TBL are proposed, which use all the available data instead of only final observations: modelling, pattern mixture models, treating dropouts as censored using the M3 method and joint modelling of tumour growth and dropout. The methods were applied to both a simulated data set and a real example. All four proposed methods led to an improvement in the estimate of treatment effect in the simulated data. The joint modelling method performed most strongly, with the censoring method also providing a good estimate of the treatment effect, but with higher uncertainty. In the real data example, the dose-response estimated using the censoring and joint modelling methods was higher than the very flat curve estimated from average final measurements. Accounting for dropout using the proposed censoring or joint modelling methods allows the treatment effect to be recovered in studies where it may have been obscured due to dropout caused by the TBL.
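
    The censoring idea in this abstract can be illustrated with a generic Tobit-style likelihood: animals whose tumours reach the burden limit contribute a survival probability rather than a density. This is a simplified sketch on simulated data, not the authors' M3 or joint model; `tbl` is an assumed limit.

```python
# Right-censored maximum likelihood for final tumour sizes (simulated data).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
tbl = 2.0                                        # assumed tumour burden limit
latent = rng.normal(1.8, 0.5, 200)               # latent final tumour sizes
censored = latent >= tbl
y = np.where(censored, tbl, latent)              # observations capped at the TBL

def neg_log_lik(params):
    mu, log_sd = params
    sd = np.exp(log_sd)
    ll = norm.logpdf(y[~censored], mu, sd).sum()    # density for observed animals
    ll += censored.sum() * norm.logsf(tbl, mu, sd)  # survival term for censored ones
    return -ll

fit = minimize(neg_log_lik, x0=[y.mean(), 0.0])
print(f"naive mean {y.mean():.2f} vs censoring-adjusted mean {fit.x[0]:.2f}")
```

    The naive mean is biased low because the largest tumours are capped; the adjusted estimate recovers the latent mean.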

  5. KEY COMPARISON: Final report on CCQM-K57: Chemical composition of clay

    NASA Astrophysics Data System (ADS)

    Salas, Antonio; Ramírez, Estele

    2009-01-01

    After the successful completion of the pilot study CCQM-P65 [1], the Inorganic Analysis Working Group (IAWG) of the CCQM agreed, in Paris in April 2006, to conduct key comparison CCQM-K57, Chemical composition of clay. The natural mass-fraction levels of five elements—Si, Ca, Fe, Al and Mg—were measured and reported as oxides in clay. Six national metrology institutes participated in CCQM-K57, with CENAM (Querétaro, Mexico) coordinating. The methods employed were isotope dilution mass spectrometry (IDMS), inductively coupled plasma-mass spectrometry (ICP-MS), inductively coupled plasma-optical emission spectrometry (ICP-OES) using the dehydration and condensation methods, gravimetric analysis, neutron activation analysis (NAA), prompt gamma activation analysis (PGAA) and x-ray fluorescence spectrometry (XRF) with the reconstitution method and external calibration. This final report presents the capability of the participating institutes, based on the key comparison reference value (KCRV) approved at the IAWG spring meeting in 2008, and the equivalence statements regarding the KCRV approved at its autumn meeting. The text of the final report appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org); it has been peer-reviewed and approved for publication by the CCQM according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  6. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollias, Pavlos

    2016-09-06

    This is the final report for DE-SC0007096 - Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales - PI: Pavlos Kollias. The report outlines the main findings of the research conducted under this award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  7. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html): (a) Moisture content—section 16.233 “Method I (52)—Official Final Action”, under the heading “Moisture”. (b) Milkfat...

  8. Solving of the coefficient inverse problems for a nonlinear singularly perturbed reaction-diffusion-advection equation with the final time data

    NASA Astrophysics Data System (ADS)

    Lukyanenko, D. V.; Shishlenin, M. A.; Volkov, V. T.

    2018-01-01

    We propose a numerical method for solving a coefficient inverse problem for a nonlinear singularly perturbed reaction-diffusion-advection equation with final-time observation data, based on asymptotic analysis and the gradient method. Asymptotic analysis allows us to extract a priori information about the interior layer (moving front) that appears in the direct problem and the boundary layers that appear in the conjugate problem. We describe and implement a method for constructing a dynamically adapted mesh based on this a priori information. The dynamically adapted mesh significantly reduces the complexity of the numerical calculations and improves numerical stability in comparison with the usual approaches. A numerical example shows the effectiveness of the proposed method.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, M.D.; Pierce, B.L.

    This report presents results of tests of different final site selection methods used for siting large-scale facilities such as nuclear power plants. Test data are adapted from a nuclear power plant siting study conducted on Long Island, New York. The purpose of the tests is to determine whether or not different final site selection methods produce different results, and to obtain some understanding of the nature of any differences found. Decision rules and weighting methods are included. Decision rules tested are Weighting Summation, Power Law, Decision Analysis, Goal Programming, and Goal Attainment; weighting methods tested are Categorization, Ranking, Rating, Ratio Estimation, Metfessel Allocation, Indifferent Tradeoff, Decision Analysis lottery, and Global Evaluation. Results show that different methods can, indeed, produce different results, but that the probability that they will do so is controlled by the structure of differences among the sites being evaluated. Differences in weights and suitability scores attributable to methods have reduced significance if the alternatives include one or two sites that are superior to all others in many attributes. The more tradeoffs there are among good and bad levels of different attributes at different sites, the more important are the specifics of the methods to the final decision. 5 refs., 14 figs., 19 tabs.

  10. Spectroscopic analysis and control

    DOEpatents

    Tate; , James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis that includes a tunable diode laser spectrometer having a digital output signal, and a digital computer for receiving the digital output signal from the spectrometer, the computer programmed to process the signal using a multivariate regression algorithm. Also disclosed are a spectroscopic method of analysis using such apparatus and, finally, a method for controlling an ethylene cracker hydrogenator.

  11. On the Analysis of Output Information of S-tree Method

    NASA Astrophysics Data System (ADS)

    Bekaryan, Karen M.; Melkonyan, Anahit A.

    2007-08-01

    One of the most popular and effective methods for analyzing the hierarchical structure of N-body gravitating systems is the method of S-tree diagrams. Despite many interesting features, the method is unfortunately not free of disadvantages, the most important being the extreme complexity of analyzing its output information. A number of methods have been suggested to solve this problem. In our view, the most effective approach is to apply all of these methods simultaneously, which allows one to obtain a more complete and objective «picture» of the final distribution.

  12. A homotopy analysis method for the nonlinear partial differential equations arising in engineering

    NASA Astrophysics Data System (ADS)

    Hariharan, G.

    2017-05-01

    In this article, we apply the homotopy analysis method (HAM) to solve several partial differential equations arising in engineering. The technique provides solutions as rapidly convergent series with computable terms for problems with highly nonlinear terms in the governing differential equations. The convergence analysis of the proposed method is also discussed. Finally, we give some illustrative examples to demonstrate the validity and applicability of the proposed method.
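
    For reference, the standard zeroth-order deformation equation at the heart of HAM (in Liao's usual notation, which may differ from the article's) reads:

```latex
% Zeroth-order deformation equation of the homotopy analysis method:
(1-q)\,\mathcal{L}\bigl[\phi(x;q) - u_0(x)\bigr]
  = q\,\hbar\,\mathcal{N}\bigl[\phi(x;q)\bigr], \qquad q \in [0,1],
```

    where u_0 is the initial guess, L an auxiliary linear operator, N the nonlinear operator, and ħ the convergence-control parameter; expanding φ(x;q) in a Taylor series in q and setting q = 1 yields the series solution.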

  13. An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II. A Posteriori Error Estimates and Adaptivity.

    DTIC Science & Technology

    1983-03-01

    An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II: A Posteriori Error Estimates and Adaptivity, by W. G. Szymczak and I. Babuška; final report covering the life of the contract.

  14. Determining association constants from titration experiments in supramolecular chemistry.

    PubMed

    Thordarson, Pall

    2011-03-01

    The most common approach for quantifying interactions in supramolecular chemistry is a titration of the guest into a solution of the host, noting the changes in some physical property through NMR, UV-Vis, fluorescence or other techniques. Despite the apparent simplicity of this approach, several issues need to be carefully addressed to ensure that the final results are reliable. These include the use of non-linear rather than linear regression methods, careful choice of the stoichiometric binding model, the choice of method (e.g., NMR vs. UV-Vis) and host concentration, the application of advanced data analysis methods such as global analysis and, finally, the estimation of uncertainties and confidence intervals for the results obtained. This tutorial review gives a systematic overview of all these issues, highlighting some of the key messages with simulated data analysis examples.
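
    As a concrete illustration of the non-linear regression the review recommends, the sketch below fits a 1:1 host-guest isotherm (complex concentration from the exact binding quadratic) to simulated titration data. The host concentration `H0`, the guest concentrations and the association constant are all invented for the example.

```python
# Non-linear fit of a 1:1 binding isotherm to a simulated titration.
import numpy as np
from scipy.optimize import curve_fit

H0 = 1e-3                                    # fixed total host concentration (M)

def isotherm(G0, Ka, dmax):
    # Exact 1:1 complex concentration from the binding quadratic:
    # HG^2 - (H0 + G0 + 1/Ka) HG + H0*G0 = 0
    b = G0 + H0 + 1.0 / Ka
    HG = 0.5 * (b - np.sqrt(b**2 - 4.0 * G0 * H0))
    return dmax * HG / H0                    # observed signal change

G0 = np.linspace(0, 5e-3, 15)                # guest titration points (M)
rng = np.random.default_rng(2)
data = isotherm(G0, Ka=5000.0, dmax=1.0) + rng.normal(0, 0.01, G0.size)

(Ka_fit, dmax_fit), cov = curve_fit(isotherm, G0, data, p0=[1000.0, 0.5])
print(f"Ka = {Ka_fit:.0f} M^-1 (true 5000), dmax = {dmax_fit:.2f}")
```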

  15. Analysis of impact melt and vapor production in CTH for planetary applications

    DOE PAGES

    Quintana, S. N.; Crawford, D. A.; Schultz, P. H.

    2015-05-19

    This study explores impact melt and vapor generation for a variety of impact speeds and materials using the shock physics code CTH. The study first compares the results of two common methods of impact melt and vapor generation to demonstrate that both the peak pressure method and the final temperature method are appropriate for high-speed impact models (speeds greater than 10 km/s). However, for low-speed impact models (speeds less than 10 km/s), only the final temperature method is consistent with laboratory analyses to yield melting and vaporization. Finally, a constitutive model for material strength is important for low-speed impacts because strength can cause an increase in melting and vaporization.

  16. Analysis of impact melt and vapor production in CTH for planetary applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quintana, S. N.; Crawford, D. A.; Schultz, P. H.

    This study explores impact melt and vapor generation for a variety of impact speeds and materials using the shock physics code CTH. The study first compares the results of two common methods of impact melt and vapor generation to demonstrate that both the peak pressure method and the final temperature method are appropriate for high-speed impact models (speeds greater than 10 km/s). However, for low-speed impact models (speeds less than 10 km/s), only the final temperature method is consistent with laboratory analyses to yield melting and vaporization. Finally, a constitutive model for material strength is important for low-speed impacts because strength can cause an increase in melting and vaporization.

  17. A Century of Enzyme Kinetic Analysis, 1913 to 2013

    PubMed Central

    Johnson, Kenneth A.

    2013-01-01

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
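
    A minimal sketch of the progress-curve approach the review highlights: integrate the Michaelis-Menten rate equation numerically and fit Vmax and Km to a full time course. The data below are simulated, not the 1913 measurements.

```python
# Progress-curve fitting by numerical integration of dS/dt = -Vmax*S/(Km+S).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def progress(t, Vmax, Km, S0=1.0):
    rate = lambda _t, S: -Vmax * S / (Km + S)
    sol = solve_ivp(rate, (t[0], t[-1]), [S0], t_eval=t, rtol=1e-8)
    return sol.y[0]

t = np.linspace(0, 60, 30)
rng = np.random.default_rng(3)
S_obs = progress(t, 0.05, 0.2) + rng.normal(0, 0.005, t.size)  # noisy time course

(Vmax, Km), _ = curve_fit(progress, t, S_obs, p0=[0.1, 0.5],
                          bounds=(0, np.inf))   # keep parameters physical
print(f"Vmax = {Vmax:.3f}, Km = {Km:.3f}  (true 0.050, 0.200)")
```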

  18. Analysis of the dynamics of movement of the landing vehicle with an inflatable braking device on the final trajectory under the influence of wind load

    NASA Astrophysics Data System (ADS)

    Koryanov, V.; Kazakovtsev, V.; Harri, A.-M.; Heilimo, J.; Haukka, H.; Aleksashkin, S.

    2015-10-01

    This research work is devoted to analysis of angular motion of the landing vehicle (LV) with an inflatable braking device (IBD), taking into account the influence of the wind load on the final stage of the movement. Using methods to perform a calculation of parameters of angular motion of the landing vehicle with an inflatable braking device based on the availability of small asymmetries, which are capable of complex dynamic phenomena, analyzes motion of the landing vehicle at the final stage of motion in the atmosphere.

  19. Structure identification methods for atomistic simulations of crystalline materials

    DOE PAGES

    Stukowski, Alexander

    2012-05-28

    Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.

  20. In silico comparison of the reproducibility of full-arch implant provisional restorations to final restoration between a 3D Scan/CAD/CAM technique and the conventional method.

    PubMed

    Mino, Takuya; Maekawa, Kenji; Ueda, Akihiro; Higuchi, Shizuo; Sejima, Junichi; Takeuchi, Tetsuo; Hara, Emilio Satoshi; Kimura-Ono, Aya; Sonoyama, Wataru; Kuboki, Takuo

    2015-04-01

    The aim of this article was to investigate the accuracy with which full-arch implant provisional restorations are reproduced in final restorations using a 3D Scan/CAD/CAM technique versus the conventional method. We fabricated two final restorations for the rehabilitation of maxillary and mandibular complete edentulous areas and performed a computer-based comparative analysis of the accuracy of reproduction of the provisional restoration in the final restoration between a 3D scanning and CAD/CAM (Scan/CAD/CAM) technique and the conventional silicone-mold transfer technique. Final restorations fabricated by either the conventional or the Scan/CAD/CAM method were successfully installed in the patient. The total concave/convex volume discrepancy observed with the Scan/CAD/CAM technique was 503.50 mm³ and 338.15 mm³ for maxillary and mandibular implant-supported prostheses (ISPs), respectively. In contrast, the total concave/convex volume discrepancy observed with the conventional method was markedly higher (1106.84 mm³ and 771.23 mm³ for maxillary and mandibular ISPs, respectively). These results suggest that the Scan/CAD/CAM method enables a more precise and accurate transfer of provisional restorations to final restorations than the conventional method.

  1. An operational modal analysis method in frequency and spatial domain

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
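
    The core CMIF/FSDD step can be sketched generically: form the output cross-spectral density matrix at each frequency line and track its first singular value, whose peaks indicate modes. The two-channel signals below are simulated stand-ins, and the enhanced-PSD curve-fitting stage of FSDD is not shown.

```python
# SVD of the output cross-spectral density matrix per frequency line.
import numpy as np
from scipy.signal import csd

fs, n = 512, 8192
t = np.arange(n) / fs
rng = np.random.default_rng(4)
# Two channels dominated by a 20 Hz mode plus noise (stand-in for measurements).
mode = np.sin(2 * np.pi * 20 * t)
y = np.vstack([mode + 0.5 * rng.standard_normal(n),
               0.8 * mode + 0.5 * rng.standard_normal(n)])

# Build the 2x2 CSD matrix G(f) and take its first singular value per line.
f, _ = csd(y[0], y[0], fs=fs, nperseg=1024)
G = np.empty((f.size, 2, 2), dtype=complex)
for i in range(2):
    for j in range(2):
        G[:, i, j] = csd(y[i], y[j], fs=fs, nperseg=1024)[1]
s1 = np.linalg.svd(G, compute_uv=False)[:, 0]
print(f"peak of first singular value at {f[np.argmax(s1)]:.1f} Hz")
```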

  2. A consensus reaching model for 2-tuple linguistic multiple attribute group decision making with incomplete weight information

    NASA Astrophysics Data System (ADS)

    Zhang, Wancheng; Xu, Yejun; Wang, Huimin

    2016-01-01

    The aim of this paper is to put forward a consensus-reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of 2-tuple linguistic labels are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus-reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion that occurs regularly in linguistic information processing. Finally, an illustrative example is given to demonstrate the application of the proposed method, and comparative analysis with existing methods shows its advantages.

  3. Study on a Multi-Frequency Homotopy Analysis Method for Period-Doubling Solutions of Nonlinear Systems

    NASA Astrophysics Data System (ADS)

    Fu, H. X.; Qian, Y. H.

    In this paper, a modification of homotopy analysis method (HAM) is applied to study the two-degree-of-freedom coupled Duffing system. Firstly, the process of calculating the two-degree-of-freedom coupled Duffing system is presented. Secondly, the single periodic solutions and double periodic solutions are obtained by solving the constructed nonlinear algebraic equations. Finally, comparing the periodic solutions obtained by the multi-frequency homotopy analysis method (MFHAM) and the fourth-order Runge-Kutta method, it is found that the approximate solution agrees well with the numerical solution.

  4. Analysis of 3D printing parameters of gears for hybrid manufacturing

    NASA Astrophysics Data System (ADS)

    Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz

    2018-05-01

    The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. Based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry, the results show a wide spectrum of applications of RP systems in the manufacturing of machine elements, with considerable growth of these methods over recent years. The characteristic errors of the printed model with respect to the ideal one are pointed out for each technique. Special attention is paid to the method of preparing the numerical data (CAD/STL/RP). Moreover, an analysis of the manufacturing processes of gear-type elements is presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests were performed on prepared specimens and used to compare the real properties of the material with the nominal ones. To improve the surface quality after sintering, the gears were subjected to final machining, and the geometry of the gears produced by the hybrid manufacturing method was then analyzed. The manufacturing process was defined both in a traditional way and with the aid of modern manufacturing techniques. The methodology and results can be applied to machine elements other than gears, and constitute a general theory of production processes for rapid prototyping methods as well as for the design and implementation of production.

  5. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of a permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method, and the machine performance predicted analytically is compared quantitatively with the finite element results; the two agree well. Finally, experimental results are given to further demonstrate the validity of the analysis.

  6. Impact of the codec and various QoS methods on the final quality of the transferred voice in an IP network

    NASA Astrophysics Data System (ADS)

    Slavata, Oldřich; Holub, Jan

    2015-02-01

    This paper deals with an analysis of the relation between the codec that is used, the QoS method, and the final voice transmission quality. The Cisco 2811 router is used for adjusting QoS. VoIP client Linphone is used for adjusting the codec. The criterion for transmission quality is the MOS parameter investigated with the ITU-T P.862 PESQ and P.863 POLQA algorithms.

  7. Synthesis of calculational methods for design and analysis of radiation shields for nuclear rocket systems

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.; Jordan, T. A.; Soltesz, R. G.; Woodsum, H. C.

    1969-01-01

    Eight computer programs make up a nine volume synthesis containing two design methods for nuclear rocket radiation shields. The first design method is appropriate for parametric and preliminary studies, while the second accomplishes the verification of a final nuclear rocket reactor design.

  8. Multibeam antenna study, phase 1

    NASA Technical Reports Server (NTRS)

    Bellamy, J. L.

    1972-01-01

    A multibeam antenna concept was developed for providing spot beam coverage of the contiguous 48 states. The selection of a suitable antenna concept for the multibeam application and an experimental evaluation of the antenna concept selected are described. The final analysis indicates that the preferred concept is a dual-antenna, circular artificial dielectric lens. A description of the analytical methods is provided, as well as a discussion of the absolute requirements placed on the antenna concepts. Finally, a comparative analysis of reflector antenna off-axis beam performance is presented.

  9. African Primary Care Research: Quantitative analysis and presentation of results

    PubMed Central

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435

  10. Response to the Directorate of Health Care Studies and Clinical Investigations Final Report (Revised) Assessing Power Analysis Approaches for the Fort Bragg Evaluation Project

    DTIC Science & Technology

    1993-07-28

    Below is a summary of Vanderbilt's response, requested by Health Services Command (HSC), to the final report prepared by the Directorate of Health Care... Command's (HSC) demands for client data it sought to use somehow in power analysis (see Appendix F). Dr. Kapadia's statement is the standard view held... not involve actual data, the results from this method may be entirely misleading and not accurate." When HSC demanded client data for power analysis in

  11. Graph-based urban scene analysis using symbolic data

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-07-01

    A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. The method is designed to use a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the presented results. This structure and uncertainty management reflect the hierarchy of the available data and the interpretation levels.

  12. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-based design in finite element modeling practice.

  13. A century of enzyme kinetic analysis, 1913 to 2013.

    PubMed

    Johnson, Kenneth A

    2013-09-02

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function.

  14. Review and analysis of ASAP enforcement efforts, volume 4

    DOT National Transportation Integrated Search

    1975-08-01

    This Final Report recapitulates and summarizes the work of a contract on Review and Analysis of ASAP Enforcement Effort. The major sections of the report are contained in four volumes. Volume 1, Methods for Recording the Behavior of Drinking Drivers,...

  15. Review and analysis of ASAP enforcement efforts, volume 3

    DOT National Transportation Integrated Search

    1975-08-01

    This Final Report recapitulates and summarizes the work of a contract on Review and Analysis of ASAP Enforcement Effort. The major sections of the report are contained in four volumes. Volume 1, Methods for Recording the Behavior of Drinking Drivers,...

  16. Review and analysis of ASAP enforcement efforts, volume 1

    DOT National Transportation Integrated Search

    1975-08-01

    This Final Report recapitulates and summarizes the work of a contract on Review and Analysis of ASAP Enforcement Effort. The major sections of the report are contained in four volumes. Volume 1, Methods for Recording the Behavior of Drinking Dr...

  17. Review and analysis of ASAP enforcement efforts, volume 2

    DOT National Transportation Integrated Search

    1975-08-01

    This Final Report recapitulates and summarizes the work of a contract on Review and Analysis of ASAP Enforcement Effort. The major sections of the report are contained in four volumes. Volume 1, Methods for Recording the Behavior of Drinking Drivers,...

  18. RICH detectors: Analysis methods and their impact on physics

    NASA Astrophysics Data System (ADS)

    Križan, Peter

    2017-12-01

    The paper discusses the importance of particle identification in particle physics experiments, and reviews the impact of ring imaging Cherenkov (RICH) counters in experiments that are currently running, or are under construction. Several analysis methods are discussed that are needed to calibrate a RICH counter, and to align its components with the rest of the detector. Finally, methods are reviewed on how to employ the collected data to efficiently separate one particle species from the other.

  19. Error analysis in stereo vision for location measurement of 3D point

    NASA Astrophysics Data System (ADS)

    Li, Yunting; Zhang, Jun; Tian, Jinwen

    2015-12-01

    Location measurement of a 3D point in stereo vision is subject to different sources of uncertainty that propagate to the final result. Most current methods of error analysis are based on an ideal intersection model that computes the uncertainty region of the point location by intersecting the two pixel fields of view, which may produce loose bounds. Besides, only a few sources of error, such as pixel error or camera position, are taken into account in the analysis. In this paper we present a straightforward and practical method for estimating the location error that takes most sources of error into account. We sum up and simplify all the input errors into five parameters by rotation transformation, and then use the fast algorithm of the midpoint method to derive the mathematical relationships between the target point and these parameters. Thus the expectation and covariance matrix of the 3D point location are obtained, which constitute the uncertainty region of the point location. Afterwards, we turn to the propagation of the primitive input errors through the stereo system, covering the whole analysis process from primitive input errors to localization error. Our method has the same level of computational complexity as the state-of-the-art method. Finally, extensive experiments are performed to verify its performance.
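
    A hedged sketch of the same ingredients on a toy stereo rig: midpoint triangulation of two pixel rays, with pixel noise propagated to a 3D covariance by Monte Carlo rather than by the paper's analytical formulas. The geometry (baseline, focal length, pixel noise) is invented.

```python
# Midpoint triangulation with Monte Carlo propagation of pixel noise.
import numpy as np

def midpoint(c1, d1, c2, d2):
    # Midpoint of the shortest segment between two skew rays c + s*d.
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((c1 + s * d1) + (c2 + t * d2))

def ray(f, px):
    # Pinhole ray direction for pixel offset px (focal length f, no rotation).
    d = np.array([px[0] / f, px[1] / f, 1.0])
    return d / np.linalg.norm(d)

c1, c2 = np.zeros(3), np.array([0.2, 0.0, 0.0])   # stereo baseline 0.2 m
f, px1, px2 = 800.0, np.array([40.0, 0.0]), np.array([-40.0, 0.0])

rng = np.random.default_rng(5)
pts = np.array([midpoint(c1, ray(f, px1 + rng.normal(0, 0.5, 2)),
                         c2, ray(f, px2 + rng.normal(0, 0.5, 2)))
                for _ in range(2000)])             # 0.5 px noise per coordinate
print("3D covariance (m^2):\n", np.cov(pts.T).round(6))
```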

  20. Influence of Gender and Other Factors to Final Grade

    ERIC Educational Resources Information Center

    Domeova, Ludmila; Jindrova, Andrea; Fejfar, Jiri

    2015-01-01

    The study focused on the relations between the partial evaluation and the final grade. The investigation has been done on a group of 269 students of the Czech University of Life Sciences in Prague, in the course of Mathematical Methods, who have to go through a strictly defined evaluation scheme. The results of statistical analysis confirmed that…

  1. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  2. Prediction of adult height by Tanner-Whitehouse method in young Caucasian male athletes.

    PubMed

    Ostojic, S M

    2013-04-01

    Although the accuracy of final height prediction from skeletal age has been confirmed in many studies of children treated for congenital primary hypothyroidism, short normal children and constitutionally tall children, no studies have compared the predicted adult height at a young age with final stature in an athletic population. In this study, the intention was to investigate to what extent the Tanner-Whitehouse (TW) method is adequate for predicting final stature in young Caucasian male athletes. Prospective observational study. Plain radiographs of the left hand and wrist were obtained from 477 athletic children (ranging in age from 8.0 to 17.9 years) who came to the outpatient clinic between 2000 and 2011 for adult height estimation, with no orthopedic trauma suspected. Adult height was estimated from bone age according to the TW method. Height was measured both at baseline and at follow-up (at the age of 19 years). No significant difference was found between the estimated adult height (184.9 ± 9.7 cm) and final stature (185.6 ± 9.6 cm) [95% confidence interval (CI) 1.61-3.01, P = 0.55]. The correlation between estimated and final adult height was high (r = 0.96). Bland-Altman analysis confirmed that 95% of the differences between estimated adult height and final stature lie within the limits of agreement (mean ± 2 SD) (-5.84 and 4.52 cm). The TW method is an accurate method for predicting adult height in normally growing athletic boys.
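
    The Bland-Altman check used in the study is easy to reproduce generically: compute the mean difference and the mean ± 2 SD limits of agreement. The heights below are simulated to roughly match the reported summary statistics, not the actual study data.

```python
# Bland-Altman limits of agreement on simulated predicted vs. final heights.
import numpy as np

rng = np.random.default_rng(6)
final = rng.normal(185.6, 9.6, 100)               # simulated final statures (cm)
predicted = final + rng.normal(-0.7, 2.6, 100)    # simulated TW predictions (cm)

diff = predicted - final
bias = diff.mean()
loa = (bias - 2 * diff.std(ddof=1), bias + 2 * diff.std(ddof=1))
inside = np.mean((diff > loa[0]) & (diff < loa[1]))
print(f"bias {bias:.2f} cm, limits of agreement [{loa[0]:.2f}, {loa[1]:.2f}] cm, "
      f"{inside:.0%} of differences inside")
```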

  3. African Primary Care Research: Qualitative data analysis and writing results

    PubMed Central

    Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

    Abstract This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437

  4. African Primary Care Research: qualitative data analysis and writing results.

    PubMed

    Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob

    2014-06-05

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.

  5. Well-tempered metadynamics converges asymptotically.

    PubMed

    Dama, James F; Parrinello, Michele; Voth, Gregory A

    2014-06-20

    Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, over a decade of application and several attempts to give this adaptive umbrella sampling method a firm theoretical grounding prove that a rigorous convergence analysis is elusive. This Letter describes such an analysis, demonstrating that well-tempered metadynamics converges to the final state it was designed to reach and, therefore, that the simple formulas currently used to interpret the final converged state of tempered metadynamics are correct and exact. The results do not rely on any assumption that the collective variable dynamics are effectively Brownian or any idealizations of the hill deposition function; instead, they suggest new, more permissive criteria for the method to be well behaved. The results apply to tempered metadynamics with or without adaptive Gaussians or boundary corrections and whether the bias is stored approximately on a grid or exactly.

  6. Well-Tempered Metadynamics Converges Asymptotically

    NASA Astrophysics Data System (ADS)

    Dama, James F.; Parrinello, Michele; Voth, Gregory A.

    2014-06-01

    Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, over a decade of application and several attempts to give this adaptive umbrella sampling method a firm theoretical grounding prove that a rigorous convergence analysis is elusive. This Letter describes such an analysis, demonstrating that well-tempered metadynamics converges to the final state it was designed to reach and, therefore, that the simple formulas currently used to interpret the final converged state of tempered metadynamics are correct and exact. The results do not rely on any assumption that the collective variable dynamics are effectively Brownian or any idealizations of the hill deposition function; instead, they suggest new, more permissive criteria for the method to be well behaved. The results apply to tempered metadynamics with or without adaptive Gaussians or boundary corrections and whether the bias is stored approximately on a grid or exactly.
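
    For reference, the standard well-tempered relations whose correctness the Letter establishes (in the usual Barducci-Bussi-Parrinello notation) are the tempered hill height and the final free-energy estimator:

```latex
% Tempered hill height and free-energy estimator of well-tempered metadynamics:
\omega(t) = \omega_0\, e^{-V(s,t)/k_B \Delta T}, \qquad
F(s) = -\frac{T + \Delta T}{\Delta T}\, V(s, t \to \infty) + \mathrm{const},
```

    where V(s,t) is the accumulated bias along the collective variable s and ΔT the tempering parameter.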

  7. On accelerated flow of MHD powell-eyring fluid via homotopy analysis method

    NASA Astrophysics Data System (ADS)

    Salah, Faisal; Viswanathan, K. K.; Aziz, Zainal Abdul

    2017-09-01

    The aim of this article is to obtain an approximate analytical solution for incompressible magnetohydrodynamic (MHD) flow of a Powell-Eyring fluid induced by an accelerated plate. Both the constant and variable acceleration cases are investigated, and the approximate analytical solution in each case is obtained using the Homotopy Analysis Method (HAM). The resulting nonlinear analysis is carried out to generate the series solution. Finally, graphical results for the effect of different values of the material parameters on the velocity field are discussed and analyzed.

  8. Microarray technology for major chemical contaminants analysis in food: current status and prospects.

    PubMed

    Zhang, Zhaowei; Li, Peiwu; Hu, Xiaofeng; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen

    2012-01-01

    Chemical contaminants in food have caused serious health issues in both humans and animals. Microarray technology is an advanced technique suitable for the analysis of chemical contaminants. In particular, the immuno-microarray approach is one of the most promising methods for chemical contaminant analysis. The use of microarrays for the analysis of chemical contaminants is the subject of this review. Fabrication strategies and detection methods for chemical contaminants are discussed in detail. Application to the analysis of mycotoxins, biotoxins, pesticide residues, and pharmaceutical residues is also described. Finally, future challenges and opportunities are discussed.

  9. Automated Analysis of Counselor Style and Effects: The Development and Evaluation of Methods and Materials to Assess the Stylistic Accuracy and Outcome Effectiveness of Counselor Verbal Behavior. Final Report.

    ERIC Educational Resources Information Center

    Pepyne, Edward W.

    This project attempts to develop, evaluate and implement methods and materials for the automated analysis of the stylistic characteristics of counselor verbal behavior and its effects on client verbal behavior within the counseling interview. To achieve this purpose, the project designed a system of computer programs, the DISCOURSE ANALYSIS…

  10. Regulatory Impact Analysis: Amendments to the National Emission Standards for Hazardous Air Pollutants (NESHAP) and New Source Perofrmance Standards (NSPS) for the Portland Cement Manufacturing Industry Final Report

    EPA Pesticide Factsheets

    For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small-entity impacts analysis, and documents the RIA methods and results for the 2010 rules.

  11. 77 FR 50153 - Special Purpose Permit Application; Hawaii-Based Shallow-Set Longline Fishery; Final...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-20

    ... a final environmental assessment (FEA) and finding of no significant impact (FONSI) in our analysis... result in significant impacts to the human environment. ADDRESSES: You may download a copy of the FEA and... use one of the methods below to request a hard copy or a CD-ROM. Please specify the "FEA/FONSI for...

  12. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole production process from the receipt of raw materials to the packaging of the final product. Data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial steps and the 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign-material contamination of the product. The control measures taken were keeping the boiling temperature at 100-105°C for 3-5 minutes and training the sorting-process employees.

  13. High frequency, high time resolution time-to-digital converter employing passive resonating circuits.

    PubMed

    Ripamonti, Giancarlo; Abba, Andrea; Geraci, Angelo

    2010-05-01

    A method for measuring time intervals with accuracy in the picosecond range is based on phase measurements of oscillating waveforms synchronous with their beginning and/or end. The oscillation is generated by triggering an LC resonant circuit whose capacitance is precharged. By using high-Q resonators and finally actively quenching the oscillation, it is possible to combine high time resolution with a short measurement time, which allows a high measurement rate. Methods for fast analysis of the data are considered and discussed with reference to computing resource requirements, speed, and accuracy. Experimental tests show the feasibility of the method and a time accuracy better than 4 ps rms. Methods aimed at further reducing hardware resources are finally discussed.

  14. Regression Analysis and Calibration Recommendations for the Characterization of Balance Temperature Effects

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2018-01-01

    Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.

  15. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.

  16. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    NASA Astrophysics Data System (ADS)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
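
    A conventional (non-event-based) baseline for the comparison the paper makes is ordinary magnitude-squared coherence between a resampled heart-rate series and respiration. The sketch below uses simulated signals with respiratory sinus arrhythmia; the paper's event-series estimator differs precisely in how it treats the point-process nature of the heartbeat.

```python
# Magnitude-squared coherence between simulated heart rate and respiration.
import numpy as np
from scipy.signal import coherence

fs, n = 4.0, 4096                       # 4 Hz evenly resampled series
t = np.arange(n) / fs
rng = np.random.default_rng(7)
resp = np.sin(2 * np.pi * 0.25 * t)     # 0.25 Hz respiration
# Heart rate modulated by respiration (respiratory sinus arrhythmia) plus noise.
hr = 1.0 + 0.1 * resp + 0.05 * rng.standard_normal(n)

f, coh = coherence(hr, resp, fs=fs, nperseg=512)
print(f"coherence at 0.25 Hz: {coh[np.argmin(np.abs(f - 0.25))]:.2f}")
```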

  17. Final Report for Dynamic Models for Causal Analysis of Panel Data. Methods for Temporal Analysis. Part I, Chapter 1.

    ERIC Educational Resources Information Center

    Hannan, Michael T.; Tuma, Nancy Brandon

    This document is part of a series of chapters described in SO 011 759. Working from the premise that temporal analysis is indispensable for the study of change, the document examines major alternatives in research design of this nature. Five sections focus on the features, advantages, and limitations of temporal analysis. Four designs which…

  18. 75 FR 53968 - Reverb Communications, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-02

    ... final the agreement's proposed order. This matter involves the public relations, marketing, and sales... Consent Order To Aid Public Comment AGENCY: Federal Trade Commission. ACTION: Proposed Consent Agreement... or deceptive acts or practices or unfair methods of competition. The attached Analysis to Aid Public...

  19. 14 CFR 415.204-415.400 - [Reserved

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Subsystem Design Information 10.4 Flight Safety System Analyses 10.5 Flight Termination System Environmental... Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where...

  20. 14 CFR 415.204-415.400 - [Reserved

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Subsystem Design Information 10.4 Flight Safety System Analyses 10.5 Flight Termination System Environmental... Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where...

  1. 14 CFR 415.204-415.400 - [Reserved

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Subsystem Design Information 10.4 Flight Safety System Analyses 10.5 Flight Termination System Environmental... Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where...

  2. Methods for Human Dehydration Measurement

    NASA Astrophysics Data System (ADS)

    Trenz, Florian; Weigel, Robert; Hagelauer, Amelie

    2018-03-01

    The aim of this article is to give a broad overview of current methods for identifying and quantifying the level of human dehydration. Starting from the most common clinical setups, including vital parameters and general patient appearance, more quantifiable results from chemical laboratory and electromagnetic measurement methods are reviewed. Different analysis methods throughout the electromagnetic spectrum, ranging from direct current (DC) conductivity measurements up to neutron activation analysis (NAA), are discussed on the basis of published results. Finally, promising technologies that would allow a dehydration assessment system to be integrated in a compact and portable way are highlighted.

  3. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.
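
    The equivalent load method mentioned here classically replaces a parabolic tendon of drape e and prestress force P over span L by an upward distributed load on the plate model:

```latex
% Classical equivalent load of a parabolic prestressing tendon:
w_{\mathrm{eq}} = \frac{8\,P\,e}{L^{2}},
```

    applied together with the anchorage forces; this is the textbook relation, and the paper's implementation details may differ.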

  4. Decoding the Principles of Emergence and Resiliency in Biological Collective Systems - A Multi-Scale Approach: Final Report

    DTIC Science & Technology

    2018-02-15

    models and approaches are also valid using other invasive and non-invasive technologies. Finally, we illustrate and experimentally evaluate this... 2017 Project Outline: pattern formation diversity in wild microbial societies; experimental and mathematical analysis methodology; skeleton... chemotaxis, nutrient degradation, and the exchange of amino acids between cells. Using both quantitative experimental methods and several theoretical

  5. A simple method for processing data with least square method

    NASA Astrophysics Data System (ADS)

    Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning

    2017-08-01

    The least square method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis and experimental data fitting, and a standard tool for statistical inference. In measurement data analysis, fitting data with complex distributions is usually based on the least square principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new way of carrying out the least square solution is presented; it is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of the method is demonstrated with a concrete example.
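
    Since the abstract does not reproduce its algebraic procedure, here is a minimal sketch of the matrix form of the least square principle it builds on: solving the normal equations for a straight-line fit. The data and function name are illustrative, not from the paper.

```python
import numpy as np

def least_squares_line(x, y):
    """Fit y = a + b*x by solving the normal equations (A^T A) p = A^T y."""
    A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
    # A 2x2 system, so a direct solve of the normal equations is fine here
    p = np.linalg.solve(A.T @ A, A.T @ y)
    return p  # p[0] = intercept a, p[1] = slope b

# Example: noisy measurements around y = 2 + 3x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.9, 8.2, 10.8, 14.1])
a, b = least_squares_line(x, y)
print(f"intercept={a:.3f}, slope={b:.3f}")
```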

  6. Design and analysis of tubular permanent magnet linear generator for small-scale wave energy converter

    NASA Astrophysics Data System (ADS)

    Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young

    2017-05-01

    This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter (WEC). The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on the analytical solutions, parametric analysis is performed to meet the design specifications of the WEC. Then, 2-D finite element analysis (FEA) is employed to validate the analytical method. Finally, experimental results confirm the predictions of the analytical and FEA methods under regular and irregular wave conditions.

  7. Numerical analysis for the fractional diffusion and fractional Buckmaster equation by the two-step Laplace Adam-Bashforth method

    NASA Astrophysics Data System (ADS)

    Jain, Sonal

    2018-01-01

    In this paper, we aim to use the alternative numerical scheme given by Gnitchogna and Atangana for solving partial differential equations with integer and non-integer differential operators. We apply this method to the fractional diffusion model and the fractional Buckmaster model with non-local fading memory. The method yields a powerful, easy-to-implement numerical algorithm for fractional-order derivatives. We also present in detail the stability analysis of the numerical method for solving the diffusion equation. The proof shows that the method is very stable and converges quickly to the exact solution; finally, some numerical simulations are presented.
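
    For orientation, the sketch below shows the classical integer-order two-step Adams-Bashforth scheme that the paper's Laplace-transform variant generalizes; the fractional, fading-memory machinery of Gnitchogna and Atangana is not reproduced here, and the test problem is an assumption.

```python
import numpy as np

def adams_bashforth2(f, u0, t0, t_end, n):
    """Classical two-step Adams-Bashforth for du/dt = f(t, u).
    The first step is bootstrapped with forward Euler."""
    h = (t_end - t0) / n
    t = t0 + h * np.arange(n + 1)
    u = np.empty(n + 1)
    u[0] = u0
    u[1] = u[0] + h * f(t[0], u[0])  # Euler bootstrap for the first step
    for k in range(1, n):
        u[k + 1] = u[k] + h * (1.5 * f(t[k], u[k]) - 0.5 * f(t[k - 1], u[k - 1]))
    return t, u

# Example: du/dt = -u, exact solution exp(-t)
t, u = adams_bashforth2(lambda t, u: -u, 1.0, 0.0, 5.0, 500)
print(abs(u[-1] - np.exp(-5.0)))  # small discretization error
```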

  8. Wind Plant Performance Prediction (WP3) Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, Anna

    The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias-correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates leads to significant impacts on project financing and therefore the final levelized cost of energy for the plant. In this work, the variation in the evaluation of a wind plant's operational energy production as a result of variations in the processing methods applied to the operational data is examined. Preliminary results indicate that the selection of the filters applied to the data and the filter parameters can have significant impacts on the final computed assessment metrics.

  9. Analysis of fuel using the Direct LSC method determination of bio-originated fuel in the presence of quenching

    DOE PAGES

    Doll, Charles G.; Wright, Cherylyn W.; Morley, Shannon M.; ...

    2017-02-01

    In this paper, a modified version of the Direct LSC method that corrects for quenching effects was investigated for the determination of bio-originated fuel content in fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percentage of bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Finally, analysis of highly quenched samples was possible when diluted, with the exception of one sample with a 100% quenching effect.

  10. Analysis of fuel using the Direct LSC method determination of bio-originated fuel in the presence of quenching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doll, Charles G.; Wright, Cherylyn W.; Morley, Shannon M.

    In this paper, a modified version of the Direct LSC method that corrects for quenching effects was investigated for the determination of bio-originated fuel content in fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percentage of bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Finally, analysis of highly quenched samples was possible when diluted, with the exception of one sample with a 100% quenching effect.

  11. Identification or Development of Chemical Analysis Methods for Plants and Animal Tissues

    DTIC Science & Technology

    1981-01-01

    Report No. DRXTH-TE-CR-80086. Final report: Identification or Development of Chemical Analysis Methods for Plants and Animal Tissues.

  12. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
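
    As a rough illustration of the permutation idea (not the authors' exact weighted meta-regression), the sketch below computes a permutation-based P value for the association between trial effect sizes and one trial-level covariate, using the correlation coefficient as the test statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_pvalue(effects, covariate, n_perm=10_000):
    """Two-sided permutation P value for the association between trial
    effect sizes and a trial-level covariate."""
    observed = np.corrcoef(effects, covariate)[0, 1]
    count = 0
    for _ in range(n_perm):
        shuffled = rng.permutation(covariate)
        if abs(np.corrcoef(effects, shuffled)[0, 1]) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one to avoid a zero P value

# Toy data: 8 trials, effect sizes and Jadad-style quality scores
effects = np.array([0.4, 0.1, 0.6, 0.3, 0.8, 0.2, 0.5, 0.7])
quality = np.array([2, 5, 1, 4, 1, 5, 3, 2])
print(permutation_pvalue(effects, quality))
```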

  13. Tools for T-RFLP data analysis using Excel.

    PubMed

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.

  14. An adhered-particle analysis system based on concave points

    NASA Astrophysics Data System (ADS)

    Wang, Wencheng; Guan, Fengnian; Feng, Lin

    2018-04-01

    Particles that adhere together will affect image analysis in a computer vision system. In this paper, a method based on concave points is designed. First, a corner detection algorithm is adopted to obtain a rough estimate of potential concave points after image segmentation. Then, the method computes the area ratio of the candidates to accurately localize the final separation points. Finally, it uses the separation points of each particle and the neighboring pixels to estimate the original particles before adhesion and provides estimated profile images. The experimental results have shown that this approach can provide good results consistent with human visual perception.
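
    A minimal sketch of the concave-point idea, using OpenCV convexity defects on a synthetic clump of two overlapping discs; the paper's actual pipeline (corner detection followed by an area-ratio test) is only approximated here, and the depth threshold is an assumption.

```python
import numpy as np
import cv2  # OpenCV

# Synthetic binary image: two overlapping discs standing in for adhered particles
img = np.zeros((200, 300), np.uint8)
cv2.circle(img, (110, 100), 60, 255, -1)
cv2.circle(img, (190, 100), 60, 255, -1)

# Outer contour of the clump
contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
cnt = max(contours, key=cv2.contourArea)

# Convexity defects mark candidate concave points where the particles meet
hull = cv2.convexHull(cnt, returnPoints=False)
defects = cv2.convexityDefects(cnt, hull)
if defects is not None:
    for s, e, f, depth in defects[:, 0]:
        if depth > 10 * 256:  # depth is in 1/256-pixel units; keep deep notches
            x, y = cnt[f][0]
            print("concave point near", (x, y))
```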

  15. Methodological issues underlying multiple decrement life table analysis.

    PubMed

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.

  16. Nano Electronics on Atomically Controlled van der Waals Quantum Heterostructures

    DTIC Science & Technology

    2015-03-30

    ...crystals grown by the MBE method. We also use transmission electron microscopy (TEM) analysis to study the structure of the atomically sharp interface between hBN and Bi2Te3. Finally, we have developed unprecedentedly clean graphene superconductor...

  17. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
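
    Of the traditional methods named, logistic regression is the simplest to sketch. The following example fits pass probability against unit age on simulated counts (all numbers hypothetical) and extracts a reliability estimate with uncertainty bounds; the RADAR method itself is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

# Simulated pass/fail counts collected over time (hypothetical data)
age = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
n_tested = np.array([30, 28, 30, 25, 30, 27, 30, 26, 30, 29])
n_passed = np.array([30, 28, 29, 24, 28, 25, 27, 23, 26, 24])

# Binomial GLM (logistic regression) of pass probability on unit age
X = sm.add_constant(age)
model = sm.GLM(np.column_stack([n_passed, n_tested - n_passed]), X,
               family=sm.families.Binomial())
fit = model.fit()

# Reliability estimate with a 95% confidence band at age 10
pred = fit.get_prediction(sm.add_constant(np.array([10.0])))
print(pred.summary_frame(alpha=0.05))
```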

  18. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.

  19. The solution of linear systems of equations with a structural analysis code on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Overman, Andrea L.

    1988-01-01

    Two methods for solving linear systems of equations on the NAS Cray-2 are described. One is a direct method; the other is an iterative method. Both methods exploit the architecture of the Cray-2, particularly the vectorization, and are aimed at structural analysis applications. To demonstrate and evaluate the methods, they were installed in a finite element structural analysis code denoted the Computational Structural Mechanics (CSM) Testbed. A description of the techniques used to integrate the two solvers into the Testbed is given. Storage schemes, memory requirements, operation counts, and reformatting procedures are discussed. Finally, results from the new methods are compared with results from the initial Testbed sparse Choleski equation solver for three structural analysis problems. The new direct solvers achieve the highest computational rates of the methods compared. The new iterative methods are not able to achieve computation rates as high as the vectorized direct solvers, but are best for well-conditioned problems, which require fewer iterations to converge to the solution.
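
    The direct-versus-iterative trade-off generalizes beyond the Cray-2. Below is a small sketch using modern SciPy, with a random SPD "stiffness" matrix standing in for a structural system: Cholesky factorization for the direct solve, conjugate gradient for the iterative one. All sizes and data are illustrative.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import cg

# Symmetric positive definite test system, as arises in structural analysis
rng = np.random.default_rng(1)
M = rng.standard_normal((200, 200))
K = M @ M.T + 200 * np.eye(200)   # stiffness-like SPD matrix
f = rng.standard_normal(200)      # load vector

# Direct method: Cholesky factorization followed by triangular solves
u_direct = cho_solve(cho_factor(K), f)

# Iterative method: conjugate gradient (benefits from good conditioning)
u_iter, info = cg(K, f)

# Both solutions should agree up to the iterative solver's tolerance
print(np.linalg.norm(u_direct - u_iter))
```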

  20. p̄p interactions at 2.32 GeV/c

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, C.K.; Fields, T.; Rhines, D.S.

    1978-01-01

    A bubble-chamber experiment based on 304 000 events of p̄p interactions at 2.32 GeV/c is described. The film was automatically scanned and measured by the POLLY II system. Details of the data-analysis methods are given. We report results on cross sections for constrained final states, tests of C invariance, and inclusive pion and ρ0 multiplicity parameters for annihilation final states.

  1. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    ERIC Educational Resources Information Center

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  2. [Identification of the scope of practice for dental nurses with Delphi method].

    PubMed

    Li, Yu-Hong; Lu, Yue-Cen; Huang, Yao; Ruan, Hong; Wu, Zheng-Yi

    2016-10-01

    To identify the scope of practice of dental nurses under the new situation. The draft scope of practice for dental nurses was based on theoretical analysis, literature review and consultation with an advisory panel, and the final scope of practice was established using the Delphi method. Statistical analysis was performed with the coefficient of variation and Kendall's W using the SPSS 17.0 software package. Thirty experts were consulted twice using the Delphi method. The effective response rates of the two rounds of questionnaires were 100% and 73.3%, respectively. The authority coefficient was 0.837, and the P value of the expert coordination coefficient W was less than 0.05. There were 116 suggestions from the experts in total, and 96 were accepted. The scope of practice for dental nurses was finally established, including 4 primary indexes and 25 secondary indexes. The scope of practice for dental nurses under the new situation has been established in China through scientific methods. It is favorable for the position management of dental nurses and may promote the development of nurse specialists in dental clinics.

  3. Dictionaries and distributions: Combining expert knowledge and large-scale textual data content analysis: Distributed dictionary representation.

    PubMed

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
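
    The core of DDR can be sketched in a few lines: represent both the dictionary and the text as averaged word vectors and compare them with cosine similarity. The toy three-dimensional embeddings and words below are invented for illustration; real applications would use pretrained vectors.

```python
import numpy as np

# Toy embedding table; in practice these would be pretrained word vectors
emb = {
    "harm":   np.array([0.9, 0.1, 0.0]),
    "hurt":   np.array([0.8, 0.2, 0.1]),
    "injure": np.array([0.7, 0.3, 0.0]),
    "kitten": np.array([0.0, 0.1, 0.9]),
    "the":    np.array([0.1, 0.1, 0.1]),
}

def represent(words):
    """Average the vectors of in-vocabulary words, then L2-normalize."""
    v = np.mean([emb[w] for w in words if w in emb], axis=0)
    return v / np.linalg.norm(v)

def ddr_similarity(dictionary_words, text_words):
    """Cosine similarity between a concept dictionary and a span of text."""
    return float(represent(dictionary_words) @ represent(text_words))

harm_dict = ["harm", "hurt", "injure"]   # small, high-validity dictionary
print(ddr_similarity(harm_dict, ["the", "kitten", "hurt"]))
```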

  4. A Systems Analysis of the MDTA Institutional Training Program. Final Report.

    ERIC Educational Resources Information Center

    North American Rockwell Information Systems Co., Anaheim, CA.

    An industrial study group was contracted to perform a systems analysis of institutional training conducted under the Manpower Development and Training Act (MDTA) of 1962, as amended, in order to: (1) illuminate management decisions in the areas of program priorities, alternative methods of administration, and allocation of resources, and (2)…

  5. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.

  6. Optical granulometric analysis of sedimentary deposits by color segmentation-based software: OPTGRAN-CS

    NASA Astrophysics Data System (ADS)

    Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.

    2015-12-01

    The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks down to sand grains) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages with respect to classical ones: speed and the detailed content of the final information (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method of counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually with the same Rosiwal methods applied by experts. The new algorithm has the same accuracy as a classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for analysis of clast deposits after recording field outcrop images can be increased significantly.
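
    The Rosiwal step, counting intersections between clasts and linear transects, is easy to sketch once the image is segmented. Below is an illustration on a toy binary image; the entropy-controlled Markov segmentation that OPTGRAN-CS performs beforehand is not reproduced.

```python
import numpy as np

def transect_intercepts(binary, rows):
    """Collect clast intercept lengths along horizontal transects.
    `binary` is a 2-D array with 1 for clast pixels and 0 for matrix."""
    lengths = []
    for r in rows:
        line = binary[r]
        # Runs of 1s are individual clast intercepts along the transect
        edges = np.diff(np.concatenate([[0], line, [0]]))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)
        lengths.extend(ends - starts)
    return np.asarray(lengths)

# Toy segmented image: 1 = clast, 0 = matrix
img = np.zeros((100, 100), int)
img[20:40, 10:30] = 1
img[60:90, 50:95] = 1
print(transect_intercepts(img, rows=[25, 70]))  # intercept lengths in pixels
```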

  7. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, as these may cause failure mechanisms such as debonding or delamination.

  8. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    NASA Astrophysics Data System (ADS)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
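
    As a concrete companion to the review (and separate from the authors' bmcmc package), here is a minimal random-walk Metropolis sampler for a one-dimensional posterior; the Gaussian-mean example and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def metropolis(log_post, x0, n_steps, step=0.5):
    """Random-walk Metropolis sampler for a 1-D posterior."""
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Example: posterior of a Gaussian mean with known sigma = 1 and a flat prior
data = rng.normal(3.0, 1.0, size=50)
log_post = lambda mu: -0.5 * np.sum((data - mu) ** 2)
chain = metropolis(log_post, x0=0.0, n_steps=20_000)
print(chain[5000:].mean())  # close to the sample mean after burn-in
```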

  9. Multi-level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    NASA Astrophysics Data System (ADS)

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-11-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.

  10. Alcohol-related hot-spot analysis and prediction : final report.

    DOT National Transportation Integrated Search

    2017-05-01

    This project developed methods to more accurately identify alcohol-related crash hot spots, ultimately allowing for more effective and efficient enforcement and safety campaigns. Advancements in accuracy came from improving the calculation of spatial...

  11. In vivo stationary flux analysis by 13C labeling experiments.

    PubMed

    Wiechert, W; de Graaf, A A

    1996-01-01

    Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years, the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented and, finally, this method is illustrated using an example from C. glutamicum. The main emphasis is on the investigation of the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing, and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much higher numerical effort to solve the balance equations.
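
    The metabolite balancing core of stationary flux analysis reduces to linear algebra: stationarity imposes S v = 0, and measured extracellular rates pin down some entries of v. The toy network below is invented to make the sketch self-contained; the isotopomer balances from the 13C labeling are not modeled.

```python
import numpy as np

# Toy stoichiometric matrix S (metabolites x reactions); illustrative only
S = np.array([
    [1, -1, -1,  0],   # metabolite A: produced by v1, consumed by v2 and v3
    [0,  1,  0, -1],   # metabolite B: produced by v2, consumed by v4
])

# Measured extracellular rates constrain v1 and v4 (indices 0 and 3)
v_measured = {0: 10.0, 3: 4.0}

# Solve S v = 0 subject to the measurements, in the least-squares sense
rows, cols = S.shape
A = np.vstack([S] + [np.eye(cols)[i] for i in v_measured])
b = np.concatenate([np.zeros(rows), list(v_measured.values())])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v)  # estimated flux distribution: [10, 4, 6, 4]
```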

  12. Binary CFG Rebuilt of Self-Modifying Codes

    DTIC Science & Technology

    2016-10-03

    ...virus software based on binary signatures. A popular method in industry to analyze malware is dynamic analysis in a sandbox. Alternatively, we apply a hybrid method combining concolic testing (dynamic symbolic...

  13. Stability analysis of piecewise non-linear systems and its application to chaotic synchronisation with intermittent control

    NASA Astrophysics Data System (ADS)

    Wang, Qingzhi; Tan, Guanzheng; He, Yong; Wu, Min

    2017-10-01

    This paper considers a stability analysis issue of piecewise non-linear systems and applies it to intermittent synchronisation of chaotic systems. First, based on piecewise Lyapunov function methods, more general and less conservative stability criteria of piecewise non-linear systems in periodic and aperiodic cases are presented, respectively. Next, intermittent synchronisation conditions of chaotic systems are derived which extend existing results. Finally, Chua's circuit is taken as an example to verify the validity of our methods.

  14. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Lee Kenneth

    2017-03-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.

  15. An evaluation of teaching methods in the introductory physics classroom

    NASA Astrophysics Data System (ADS)

    Savage, Lauren Michelle Williams

    The introductory physics mechanics course at the University of North Carolina at Charlotte has a history of relatively high DFW rates. In 2011, the course was redesigned from the traditional lecture format to the inverted classroom format (flipped). This format inverts the classroom by introducing material in a video assigned as homework while the instructor conducts problem solving activities and guides discussions during the regular meetings. This format focuses on student-centered learning and is more interactive and engaging. To evaluate the effectiveness of the new method, final exam data over the past 10 years was mined and the pass rates examined. A normalization condition was developed to evaluate semesters equally. The two teaching methods were compared using a grade distribution across multiple semesters. Students in the inverted class outperformed those in the traditional class: "A"s increased by 22% and "B"s increased by 38%. The final exam pass rate increased by 12% under the inverted classroom approach. The same analysis was used to compare the written and online final exam formats. Surprisingly, no students scored "A"s on the online final. However, the percent of "B"s increased by 136%. Combining documented best practices from a literature review with personal observations of student performance and attitudes from first hand classroom experience as a teaching assistant in both teaching methods, reasons are given to support the continued use of the inverted classroom approach as well as the online final. Finally, specific recommendations are given to improve the course structure where weaknesses have been identified.

  16. Health care evaluation, utilitarianism and distortionary taxes.

    PubMed

    Calcott, P

    2000-09-01

    Cost Utility Analysis (CUA) and Cost Benefit Analysis (CBA) are methods to evaluate allocations of health care resources. Problems are raised for both methods when income taxes do not meet the first best optimum. This paper explores the implications of three ways that taxes may fall short of this ideal. First, taxes may be distortionary. Second, they may be designed and administered without reference to information that is used by providers of health care. Finally, the share of tax revenue that is devoted to health care may be suboptimal. The two methods are amended to account for these factors.

  17. Comparison of Analysis, Simulation, and Measurement of Wire-to-Wire Crosstalk. Part 2

    NASA Technical Reports Server (NTRS)

    Bradley, Arthur T.; Yavoich, Brian James; Hodson, Shane M.; Godley, Franklin

    2010-01-01

    In this investigation, we compare crosstalk analysis, simulation, and measurement results for electrically short configurations. Methods include hand calculations, PSPICE simulations, Microstripes transient field solver, and empirical measurement. In total, four representative physical configurations are examined, including a single wire over a ground plane, a twisted pair over a ground plane, generator plus receptor wires inside a cylindrical conduit, and a single receptor wire inside a cylindrical conduit. Part 1 addresses the first two cases, and Part 2 addresses the final two. Agreement between the analysis methods and test data is shown to be very good.

  18. Methods for the visualization and analysis of extracellular matrix protein structure and degradation.

    PubMed

    Leonard, Annemarie K; Loughran, Elizabeth A; Klymenko, Yuliya; Liu, Yueying; Kim, Oleg; Asem, Marwa; McAbee, Kevin; Ravosa, Matthew J; Stack, M Sharon

    2018-01-01

    This chapter highlights methods for visualization and analysis of extracellular matrix (ECM) proteins, with particular emphasis on collagen type I, the most abundant protein in mammals. Protocols described range from advanced imaging of complex in vivo matrices to simple biochemical analysis of individual ECM proteins. The first section of this chapter describes common methods to image ECM components and includes protocols for second harmonic generation, scanning electron microscopy, and several histological methods of ECM localization and degradation analysis, including immunohistochemistry, Trichrome staining, and in situ zymography. The second section of this chapter details both a common transwell invasion assay and a novel live imaging method to investigate cellular behavior with respect to collagen and other ECM proteins of interest. The final section consists of common electrophoresis-based biochemical methods that are used in analysis of ECM proteins. Use of the methods described herein will enable researchers to gain a greater understanding of the role of ECM structure and degradation in development and matrix-related diseases such as cancer and connective tissue disorders.

  19. Eddy current loss analysis of open-slot fault-tolerant permanent-magnet machines based on conformal mapping method

    NASA Astrophysics Data System (ADS)

    Ji, Jinghua; Luo, Jianhua; Lei, Qian; Bian, Fangfang

    2017-05-01

    This paper proposes an analytical method, based on the conformal mapping (CM) method, for the accurate evaluation of the magnetic field and eddy current (EC) loss in fault-tolerant permanent-magnet (FTPM) machines. The aim of the modulation function applied in the CM method is to transform the open-slot structure into a fully closed-slot structure, whose air-gap flux density is easy to calculate analytically. Therefore, with the help of the Matlab Schwarz-Christoffel (SC) Toolbox, both the magnetic flux density and the EC density of the FTPM machine are obtained accurately. Finally, time-stepped transient finite-element method (FEM) analysis is used to verify the theoretical analysis, showing that the proposed method is able to predict the magnetic flux density and EC loss precisely.

  20. Major strengths and weaknesses of the lod score method.

    PubMed

    Ott, J

    2001-01-01

    Strengths and weaknesses of the lod score method for human genetic linkage analysis are discussed. The main weakness is its requirement for the specification of a detailed inheritance model for the trait. Various strengths are identified. For example, the lod score (likelihood) method has optimality properties when the trait to be studied is known to follow a Mendelian mode of inheritance. The ELOD is a useful measure for information content of the data. The lod score method can emulate various "nonparametric" methods, and this emulation is equivalent to the nonparametric methods. Finally, the possibility of building errors into the analysis will prove to be essential for the large amount of linkage and disequilibrium data expected in the near future.
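
    For readers unfamiliar with the likelihood machinery, the sketch below computes a lod score in the textbook case of fully informative meioses; real analyses handle missing phase and the inheritance-model parameters that this toy ignores.

```python
import numpy as np

def lod_score(n_recomb, n_nonrecomb, theta):
    """LOD = log10 of the likelihood ratio L(theta) / L(theta = 0.5)
    for fully informative meioses (a textbook simplification)."""
    n = n_recomb + n_nonrecomb
    loglik = n_recomb * np.log10(theta) + n_nonrecomb * np.log10(1 - theta)
    loglik_null = n * np.log10(0.5)
    return loglik - loglik_null

# 2 recombinants out of 20 meioses, evaluated at the MLE theta = 0.1
print(lod_score(2, 18, 0.1))  # about 3.2, above the conventional 3.0 cutoff
```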

  1. Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan

    2016-10-01

    Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.
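
    Data preprocessing for LIBS varies across groups; as a deliberately generic sketch (not any specific method from the review), the following applies a crude constant-baseline correction and area normalization to a synthetic spectrum.

```python
import numpy as np

def preprocess_spectrum(intensity):
    """Generic LIBS-style preprocessing: crude constant-baseline removal
    followed by total-area normalization. Real pipelines are more elaborate."""
    baseline = np.percentile(intensity, 5)        # low percentile as baseline proxy
    corrected = np.clip(intensity - baseline, 0.0, None)
    return corrected / corrected.sum()

# Synthetic spectrum: two emission lines on a flat background with noise
rng = np.random.default_rng(0)
spectrum = 10.0 + rng.normal(0, 0.2, 500)
spectrum[120] += 50.0
spectrum[340] += 80.0
print(preprocess_spectrum(spectrum)[[120, 340]])  # normalized line intensities
```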

  2. Assessing the Toxicity and Bioavailability of 2,4-Dinitroanisole in Acute and Sub-Chronic Exposures Using the Earthworm, Eisenia fetida

    DTIC Science & Technology

    2010-06-01

    ...different methods; the second method was chosen for the final study: coelomocytes were collected in 2 ml guaiacol glyceryl ether (GGE) solution, centrifuged, and decanted... worms were exposed to GGE for t = 2 min to collect the coelomocyte solution (one row per worm/treatment), and measurements were obtained through a spectrophotometer for NRRT analysis.

  3. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020

  4. Comparison of answer-until-correct and full-credit assessments in a team-based learning course.

    PubMed

    Farland, Michelle Z; Barlow, Patrick B; Levi Lancaster, T; Franks, Andrea S

    2015-03-25

    To assess the impact of awarding partial credit to team assessments on team performance and on quality of team interactions using an answer-until-correct method compared to traditional methods of grading (multiple-choice, full-credit). Subjects were students from 3 different offerings of an ambulatory care elective course, taught using team-based learning. The control group (full-credit) consisted of those enrolled in the course when traditional methods of assessment were used (2 course offerings). The intervention group consisted of those enrolled in the course when answer-until-correct method was used for team assessments (1 course offering). Study outcomes included student performance on individual and team readiness assurance tests (iRATs and tRATs), individual and team final examinations, and student assessment of quality of team interactions using the Team Performance Scale. Eighty-four students enrolled in the courses were included in the analysis (full-credit, n=54; answer-until-correct, n=30). Students who used traditional methods of assessment performed better on iRATs (full-credit mean 88.7 (5.9), answer-until-correct mean 82.8 (10.7), p<0.001). Students who used answer-until-correct method of assessment performed better on the team final examination (full-credit mean 45.8 (1.5), answer-until-correct 47.8 (1.4), p<0.001). There was no significant difference in performance on tRATs and the individual final examination. Students who used the answer-until-correct method had higher quality of team interaction ratings (full-credit 97.1 (9.1), answer-until-correct 103.0 (7.8), p=0.004). Answer-until-correct assessment method compared to traditional, full-credit methods resulted in significantly lower scores for iRATs, similar scores on tRATs and individual final examinations, improved scores on team final examinations, and improved perceptions of the quality of team interactions.

  5. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  6. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  7. 14 CFR Appendix B of Part 415 - Safety Review Document Outline

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Performed by Certified Personnel 4.0 Flight Safety (§ 415.115) 4.1 Initial Flight Safety Analysis 4.1.1 Flight Safety Sub-Analyses, Methods, and Assumptions 4.1.2 Sample Calculation and Products 4.1.3 Launch Specific Updates and Final Flight Safety Analysis Data 4.2 Radionuclide Data (where applicable) 4.3 Flight Safety...

  8. Goods Movement: Regional Analysis and Database Final Report

    DOT National Transportation Integrated Search

    1993-03-26

    The project reported here was undertaken to create and test methods for synthesizing truck flow patterns in urban areas from partial and fragmentary observations. More specifically, the project sought to develop a way to estimate origin-destination (...

  9. Simultaneous analysis of aminoglycosides with many other classes of drug residues in bovine tissues by ultrahigh-performance liquid chromatography-tandem mass spectrometry using an ion-pairing reagent added to final extracts.

    PubMed

    Lehotay, Steven J; Lightfield, Alan R

    2018-01-01

    The way to maximize scope of analysis, sample throughput, and laboratory efficiency in the monitoring of veterinary drug residues in food animals is to determine as many analytes as possible as fast as possible in as few methods as possible. Capital and overhead expenses are also reduced by using fewer instruments in the overall monitoring scheme. Traditionally, the highly polar aminoglycoside antibiotics require different chromatographic conditions from other classes of drugs, but in this work, we demonstrate that an ion-pairing reagent (sodium 1-heptanesulfonate) added to the combined final extracts from two sample preparation methods attains good separation of 174 targeted drugs, including 9 aminoglycosides, in the same 10.5-min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis. The full method was validated in bovine kidney, liver, and muscle tissues according to US regulatory protocols, and 137-146 (79-84%) of the drugs gave between 70 and 120% average recoveries with ≤ 25% RSDs in the different types of tissues spiked at 0.5, 1, and 2 times the regulatory levels of interest (10-1000 ng/g depending on the drug). This method increases sample throughput and the possible number of drugs monitored in the US National Residue Program, and requires only one UHPLC-MS/MS method and instrument for analysis rather than two by the previous scheme. Graphical abstract: Outline of the streamlined approach to monitor 174 veterinary drugs, including aminoglycosides, in bovine tissues by combining two extracts of the same sample with an ion-pairing reagent for analysis by UHPLC-MS/MS.

  10. Analysis on the hot spot and trend of the foreign assembly building research

    NASA Astrophysics Data System (ADS)

    Bi, Xiaoqing; Luo, Yanbing

    2017-03-01

    First, the paper analyzes research fronts in assembly (prefabricated) building over the past 15 years. It mainly adopts the co-word analysis method: a co-word matrix and a correlation matrix are constructed and then converted into a dissimilarity matrix, and on this basis factor analysis, cluster analysis and multidimensional scaling are used to reveal the structure of the prefabricated construction research field. Finally, the results of the analysis are discussed, and the current foci of foreign prefabricated construction research are summarized into 7 aspects: embankment construction, wood construction, bridge construction, crane layout, PCM walls and glass systems, neural-network-based testing, and energy saving and recycling. The future development trend of the field is also forecast.

  11. Medical versus surgical abortion methods for pregnancy in China: a cost-minimization analysis.

    PubMed

    Xia, Wei; She, Shouzhang; Lam, Tai Hing

    2011-01-01

    Both medical and surgical abortions are popular in developing countries. However, the monetary costs of these two methods have not been compared. 430 women seeking abortions were recruited in 2008. Either a medical or a surgical method was used for the abortion. We adopted the perspective of a third-party payer. Cost-minimization analysis was used based on all charges for the overall procedures in an out-patient clinic in Guangzhou, China. 219 subjects (51%) chose a medical method (mifepristone and misoprostol), whereas 211 subjects (49%) chose a surgical method. The efficacy in the surgical group was significantly higher than in the medical group (100 vs. 90%, p < 0.001). Surgical abortion incurred much higher costs than medical abortion on average after initial treatment. When the subsequent costs were accumulated over the 2-week follow-up, the mean total cost in the medical group increased significantly due to failure of abortion and persistent bleeding. Patients undergoing medical abortion eventually incurred expenses equivalent to those of patients undergoing surgical abortion (p = 0.42). There was no difference in the mean final costs between the two abortion methods. Complications of persistent bleeding and failure to abort (requiring surgical intervention) in the medical treatment group increased the final mean total cost substantially.
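
    Cost-minimization analysis here amounts to accumulating expected follow-up costs onto the initial charge in each arm. The numbers below are hypothetical (the study's actual charges are not given in the abstract) and only illustrate why the two arms can end up equivalent.

```python
# Cost-minimization arithmetic with hypothetical charges, not the study's data
initial_medical, initial_surgical = 120.0, 150.0
p_failure = 0.10    # medical abortions later completed surgically
p_bleeding = 0.08   # persistent bleeding requiring extra care
cost_surgical_completion, cost_bleeding_care = 150.0, 180.0

expected_medical = (initial_medical
                    + p_failure * cost_surgical_completion
                    + p_bleeding * cost_bleeding_care)
print(expected_medical, initial_surgical)  # 149.4 vs 150.0: roughly equivalent
```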

  12. Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Lynch, Christopher J.

    2012-01-01

    This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.
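
    Since the growth is reported as generally linear, extracting a growth rate from image-derived thickness measurements is a one-line fit. The sketch below uses invented thickness-versus-time data; the Spotlight-based extraction itself is not shown.

```python
import numpy as np

# Hypothetical ice-thickness measurements extracted from image frames
t = np.array([0, 30, 60, 90, 120, 150], dtype=float)       # time, s
h = np.array([0.0, 0.52, 1.08, 1.49, 2.05, 2.61])           # thickness, mm

# Growth is reported as roughly linear, so a first-order fit gives the rate
rate, intercept = np.polyfit(t, h, 1)
print(f"growth rate = {rate * 60:.2f} mm/min")

# Residual scatter gives a crude handle on measurement uncertainty
resid = h - (rate * t + intercept)
print(f"rms residual = {np.sqrt(np.mean(resid**2)):.3f} mm")
```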

  13. Ice Growth Measurements from Image Data to Support Ice-Crystal and Mixed-Phase Accretion Testing

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Lynch, Christopher J.

    2012-01-01

    This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.

  14. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  15. Simultaneous quantitative analysis of olmesartan, amlodipine and hydrochlorothiazide in their combined dosage form utilizing classical and alternating least squares based chemometric methods.

    PubMed

    Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S

    2016-03-01

    Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. Multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS) in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative tools of analysis as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
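
    A bare-bones sketch of the classical least squares (CLS) backbone shared by the NAP/OSC variants: calibrate pure-component spectra from known concentrations, then invert a new mixture spectrum. The signal-correction pretreatments and MCR-ALS are not reproduced; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# CLS model: A = C K, with A the mixture spectra, C the concentrations,
# and K the pure-component spectra (one row per analyte)
K_true = rng.random((3, 50))             # 3 analytes, 50 wavelengths
C_train = rng.random((20, 3))            # 20 calibration mixtures
A_train = C_train @ K_true + 0.01 * rng.standard_normal((20, 50))

# Calibration: estimate K from the known concentrations
K_hat, *_ = np.linalg.lstsq(C_train, A_train, rcond=None)

# Prediction: estimate concentrations of a new mixture from its spectrum
c_true = np.array([0.2, 0.5, 0.3])
a_new = c_true @ K_true
c_hat, *_ = np.linalg.lstsq(K_hat.T, a_new, rcond=None)
print(c_hat)  # close to [0.2, 0.5, 0.3]
```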

  16. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Under the situation that terrorism events occur with increasing frequency throughout the world, improving the response capability to social security incidents has become an important test of a government's governance ability. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using the data provided by the Global Terrorism Database, and the results prove the availability of the methods.

  17. Speech Emotion Feature Selection Method Based on Contribution Analysis Algorithm of Neural Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Xiaojia; Mao Qirong; Zhan Yongzhao

    There are many emotion features. If all these features are employed to recognize emotions, redundant features may exist; furthermore, the recognition result is unsatisfying and the cost of feature extraction is high. In this paper, a method to select speech emotion features based on a contribution analysis algorithm of a neural network (NN) is presented. The emotion features are selected from the 95 extracted features using the contribution analysis algorithm of the NN. Cluster analysis is applied to analyze the effectiveness of the features selected, and the time of feature extraction is evaluated. Finally, the 24 emotion features selected are used to recognize six speech emotions. The experiments show that this method can improve the recognition rate and reduce the time of feature extraction.
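
    The paper's contribution analysis algorithm is not spelled out in the abstract, so the sketch below uses a common proxy: rank inputs by their total absolute first-layer weight in a trained network. The data, network size, and scoring rule are all assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)

# Toy data: 95 candidate emotion features, only a few truly informative
X = rng.standard_normal((600, 95))
y = (X[:, 0] + 0.8 * X[:, 5] - 0.6 * X[:, 12] > 0).astype(int)

net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=800, random_state=0)
net.fit(X, y)

# Crude contribution score: total absolute first-layer weight per input
contribution = np.abs(net.coefs_[0]).sum(axis=1)
top = np.argsort(contribution)[::-1][:10]
print("top features by contribution:", top)
```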

  18. Automatic initial and final segmentation in cleft palate speech of Mandarin speakers

    PubMed Central

    Liu, Yin; Yin, Heng; Zhang, Junpeng; Zhang, Jing; Zhang, Jiang

    2017-01-01

    The speech unit segmentation is an important pre-processing step in the analysis of cleft palate speech. In Mandarin, one syllable is composed of two parts: initial and final. In cleft palate speech, the resonance disorders occur at the finals and the voiced initials, while the articulation disorders occur at the unvoiced initials. Thus, the initials and finals are the minimum speech units that reflect the characteristics of cleft palate speech disorders. In this work, an automatic initial/final segmentation method is proposed. It is an important preprocessing step in cleft palate speech signal processing. The tested cleft palate speech utterances were collected from the Cleft Palate Speech Treatment Center in the Hospital of Stomatology, Sichuan University, which treats the largest number of cleft palate patients in China. The cleft palate speech data include 824 speech segments, and the control samples contain 228 speech segments. First, the syllables are extracted from the speech utterances. The proposed syllable extraction method avoids a training stage and achieves good performance for both voiced and unvoiced speech. Then, the syllables are classified into those with “quasi-unvoiced” initials and those with “quasi-voiced” initials. Respective initial/final segmentation methods are proposed for these two types of syllables. Moreover, a two-step segmentation method is proposed. The rough locations of syllable and initial/final boundaries are refined in the second segmentation step, in order to improve the robustness of the segmentation accuracy. The experiments show that the initial/final segmentation accuracies for syllables with quasi-unvoiced initials are higher than for those with quasi-voiced initials. For the cleft palate speech, the mean time error is 4.4 ms for syllables with quasi-unvoiced initials and 25.7 ms for syllables with quasi-voiced initials, and the correct segmentation accuracy P30 for all syllables is 91.69%. For the control samples, P30 for all syllables is 91.24%. PMID:28926572

  19. Automatic initial and final segmentation in cleft palate speech of Mandarin speakers.

    PubMed

    He, Ling; Liu, Yin; Yin, Heng; Zhang, Junpeng; Zhang, Jing; Zhang, Jiang

    2017-01-01

    Speech unit segmentation is an important pre-processing step in the analysis of cleft palate speech. In Mandarin, a syllable is composed of two parts: initial and final. In cleft palate speech, resonance disorders occur at the finals and the voiced initials, while articulation disorders occur at the unvoiced initials. Thus, the initials and finals are the minimum speech units that can reflect the characteristics of cleft palate speech disorders. In this work, an automatic initial/final segmentation method is proposed as an important preprocessing step in cleft palate speech signal processing. The tested cleft palate speech utterances were collected from the Cleft Palate Speech Treatment Center in the Hospital of Stomatology, Sichuan University, which treats the largest number of cleft palate patients in China. The cleft palate speech data include 824 speech segments, and the control samples contain 228 speech segments. The syllables are first extracted from the speech utterances. The proposed syllable extraction method avoids a training stage and achieves good performance for both voiced and unvoiced speech. Then, the syllables are classified into those with "quasi-unvoiced" and those with "quasi-voiced" initials, and respective initial/final segmentation methods are proposed for these two types of syllables. Moreover, a two-step segmentation method is proposed: the rough locations of the syllable and initial/final boundaries are refined in the second segmentation step in order to improve the robustness of the segmentation accuracy. The experiments show that the initial/final segmentation accuracies for syllables with quasi-unvoiced initials are higher than for those with quasi-voiced initials. For the cleft palate speech, the mean time error is 4.4 ms for syllables with quasi-unvoiced initials and 25.7 ms for syllables with quasi-voiced initials, and the correct segmentation accuracy P30 for all the syllables is 91.69%. For the control samples, P30 for all the syllables is 91.24%.

  20. RO1 Funding for Mixed Methods Research: Lessons learned from the Mixed-Method Analysis of Japanese Depression Project

    PubMed Central

    Arnault, Denise Saint; Fetters, Michael D.

    2013-01-01

    Mixed methods research has made significant inroads in the effort to examine complex health-related phenomena. However, little has been published on the funding of mixed methods research projects. This paper addresses that gap by presenting an example of an NIMH-funded project using a mixed methods QUAL-QUAN triangulation design entitled “The Mixed-Method Analysis of Japanese Depression.” We present the Cultural Determinants of Health Seeking model that framed the study, the specific aims, the quantitative and qualitative data sources informing the study, and an overview of the mixing of the two studies. Finally, we examine reviewers' comments and our insights related to writing a mixed methods proposal capable of achieving R01-level funding. PMID:25419196

  1. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    PubMed Central

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  2. An exploration of function analysis and function allocation in the commercial flight domain

    NASA Technical Reports Server (NTRS)

    Mcguire, James C.; Zich, John A.; Goins, Richard T.; Erickson, Jeffery B.; Dwyer, John P.; Cody, William J.; Rouse, William B.

    1991-01-01

    The applicability of functional analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate a methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of the identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated and compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.

  3. Aerodynamic design optimization using sensitivity analysis and computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.

    1991-01-01

    A new and efficient method is presented for aerodynamic design optimization, based on a computational fluid dynamics (CFD) sensitivity analysis algorithm. The method is applied to design a scramjet-afterbody configuration for optimized axial thrust. The Euler equations are solved for the inviscid analysis of the flow, which in turn provides the objective function and the constraints. The CFD analysis is then coupled with an optimization procedure that uses a constrained minimization method. The sensitivity coefficients, i.e., the gradients of the objective function and the constraints needed for the optimization, are obtained using a quasi-analytical method rather than the traditional brute-force method of finite difference approximations. During the one-dimensional search of the optimization procedure, an approximate flow analysis (predicted flow) based on a first-order Taylor series expansion is used to reduce the computational cost. Finally, the sensitivity of the optimum objective function to various design parameters, which are kept constant during the optimization, is computed to predict new optimum solutions. The flow analyses of the demonstrative example are compared with experimental data. It is shown that the method is more efficient than traditional methods.
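
    The sketch below illustrates the "predicted flow" idea in isolation: during the one-dimensional search, the flow field is extrapolated by a first-order Taylor series instead of being recomputed by CFD. The sensitivity matrix dq_dD is a random placeholder for the coefficients the quasi-analytical method would supply.

```python
import numpy as np

def predicted_flow(q0, dq_dD, dD, alpha):
    """q(D + alpha*dD) ≈ q0 + alpha * (dq/dD) @ dD  (first-order Taylor)."""
    return q0 + alpha * (dq_dD @ dD)

rng = np.random.default_rng(1)
q0 = rng.normal(size=1000)            # converged flow field at current design D
dq_dD = rng.normal(size=(1000, 5))    # quasi-analytical sensitivities (placeholder)
dD = np.array([1.0, 0.0, -0.5, 0.2, 0.0])   # search direction in design space

for alpha in (0.1, 0.2, 0.4):
    q_approx = predicted_flow(q0, dq_dD, dD, alpha)
    # objective and constraints would be evaluated on q_approx here,
    # avoiding a full CFD solve at each trial step length
    print(alpha, float(np.linalg.norm(q_approx - q0)))
```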

  4. Human factors analysis of road weather advisory and control information : final report.

    DOT National Transportation Integrated Search

    2010-03-31

    The amount of available weather information and the methods by which this information can be disseminated to travelers have grown considerably in recent years. This growth includes weather gathering devices (sensors, satellites), models and forecasti...

  5. An application of actuarial methods in psychiatric diagnosis.

    PubMed

    Overall, J E; Higgins, C W

    1977-10-01

    An actuarial program for psychiatric diagnosis is evaluated for agreement with final clinical diagnosis in a series of 288 patients. The actuarial program provides a probability differential diagnosis based on an analysis of history and background data, symptom rating profiles, and MMPI clinical scale profiles. The observed agreement with final clinical diagnosis is approximately 50% higher than previously reported for psychological testing in this same setting. The results emphasize the importance for psychologists of clinical interview and observation skills.

  6. Quantitative Determination of Cannabinoids in Cannabis and Cannabis Products Using Ultra-High-Performance Supercritical Fluid Chromatography and Diode Array/Mass Spectrometric Detection.

    PubMed

    Wang, Mei; Wang, Yan-Hong; Avula, Bharathi; Radwan, Mohamed M; Wanas, Amira S; Mehmedic, Zlatko; van Antwerp, John; ElSohly, Mahmoud A; Khan, Ikhlas A

    2017-05-01

    Ultra-high-performance supercritical fluid chromatography (UHPSFC) is an efficient analytical technique that has not been fully employed for the analysis of cannabis. Here, a novel method was developed for the analysis of 30 cannabis plant extracts and preparations using UHPSFC/PDA-MS. Nine of the most abundant cannabinoids, viz. CBD, Δ8-THC, THCV, Δ9-THC, CBN, CBG, THCA-A, CBDA, and CBGA, were quantitatively determined (RSDs < 6.9%). Unlike GC methods, no derivatization or decarboxylation was required prior to UHPSFC analysis. The UHPSFC chromatographic separation of cannabinoids displayed an inverse elution order compared to UHPLC. Combined with PDA-MS, this orthogonality is valuable for the discrimination of cannabinoids in complex matrices. The developed method was validated, and the quantification results were compared with a standard UHPLC method; the RSDs of the two methods were within ±13.0%. Finally, chemometric analyses, including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA), were used to differentiate between cannabis samples. © 2016 American Academy of Forensic Sciences.

  7. A dimension-wise analysis method for the structural-acoustic system with interval parameters

    NASA Astrophysics Data System (ADS)

    Xu, Menghui; Du, Jianke; Wang, Chong; Li, Yunlong

    2017-04-01

    Interval structural-acoustic analysis is mainly accomplished by interval and subinterval perturbation methods. Potential limitations of these intrusive methods include overestimation or the interval translation effect for the former and prohibitive computational cost for the latter. In this paper, a dimension-wise analysis method is proposed to overcome these limitations. In this method, a sectional curve of the system response surface along each input dimension is first extracted, and its minimal and maximal points are identified based on its Legendre polynomial approximation. Two input vectors, i.e., the minimal and maximal input vectors, are then assembled dimension-wise from the minimal and maximal points of all sectional curves. Finally, the lower and upper bounds of the system response are computed by deterministic finite element analysis at these two input vectors. Two numerical examples are studied to demonstrate the effectiveness of the proposed method and show that, compared to the interval and subinterval perturbation methods, better accuracy is achieved without much compromise on efficiency, especially for nonlinear problems with large interval parameters.
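
    A condensed sketch of the dimension-wise procedure follows; the two-variable response function stands in for the structural-acoustic finite element solve, and the sample counts and polynomial degree are illustrative choices.

```python
import numpy as np

def response(x):                      # placeholder for the deterministic FE solve
    return np.sin(x[0]) + 0.3 * x[0] * x[1] + x[1] ** 2

lo = np.array([-1.0, -0.5])           # interval lower bounds of the inputs
hi = np.array([1.5, 1.0])             # interval upper bounds of the inputs
mid = 0.5 * (lo + hi)

x_min, x_max = mid.copy(), mid.copy()
for d in range(len(lo)):
    ts = np.linspace(lo[d], hi[d], 9)         # samples of the sectional curve
    ys = []
    for t in ts:
        x = mid.copy()
        x[d] = t                              # vary only dimension d
        ys.append(response(x))
    poly = np.polynomial.Legendre.fit(ts, ys, deg=4)   # Legendre approximation
    fine = np.linspace(lo[d], hi[d], 401)
    vals = poly(fine)
    x_min[d], x_max[d] = fine[np.argmin(vals)], fine[np.argmax(vals)]

# Two deterministic evaluations give the response bounds.
print(response(x_min), response(x_max))
```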

  8. Structures in solutions from joint experimental-computational analysis: applications to cyclic molecules and studies of noncovalent interactions.

    PubMed

    Aliev, Abil E; Mia, Zakirin A; Khaneja, Harmeet S; King, Frank D

    2012-01-26

    The potential of an approach combining nuclear magnetic resonance (NMR) spectroscopy, molecular dynamics (MD) simulations, and quantum mechanical (QM) calculations for full structural characterizations in solution is assessed using cyclic organic compounds, namely, benzazocinone derivatives 1-3 with fused five- and eight-membered aliphatic rings, camphoric anhydride 4, and bullvalene 5. Various MD simulations were considered, using force field and semiempirical QM treatments, implicit and explicit solvation, and high-temperature MD calculations for selecting plausible molecular geometries for subsequent QM geometry optimizations using mainly B3LYP, M062X, and MP2 methods. The QM-predicted values of NMR parameters were compared to their experimental values for verification of the final structures derived from the MD/QM analysis. From these comparisons, initial estimates of quality thresholds (calculated as rms deviations) were 0.7-0.9 Hz for ³J(HH) couplings, 0.07-0.11 Å for interproton distances, 0.05-0.08 ppm for ¹H chemical shifts, and 1.0-2.1 ppm for ¹³C chemical shifts. The obtained results suggest that the accuracy of the MD analysis in predicting geometries and relative conformational energies is not critical and that the final geometry refinements of the structures selected from the MD simulations using QM methods are sufficient for correcting for the expected inaccuracy of the MD analysis. A unique example of C(sp³)-H···N(sp³) intramolecular noncovalent interaction is also identified using the NMR/MD/QM and the natural bond orbital analyses. As the NMR/MD/QM approach relies on the final QM geometry optimization, comparisons of geometric characteristics predicted by different QM methods and those from X-ray and neutron diffraction measurements were undertaken using rigid and flexible cyclic systems. The joint analysis shows that intermolecular noncovalent interactions present in the solid state alter molecular geometries significantly compared to the geometries of isolated molecules from QM calculations.

  9. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE PAGES

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim; ...

    2017-04-28

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  10. A spectrum fractal feature classification algorithm for agriculture crops with hyper spectrum image

    NASA Astrophysics Data System (ADS)

    Su, Junying

    2011-11-01

    A fractal dimension feature analysis method in the spectrum domain is proposed for agricultural crop classification with hyperspectral images. Firstly, a fractal dimension calculation algorithm in the spectrum domain is presented, together with a fast fractal dimension calculation algorithm using the step measurement method. Secondly, the hyperspectral image classification algorithm and flowchart based on fractal dimension feature analysis in the spectrum domain are presented. Finally, experimental results of agricultural crop classification on the FCL1 hyperspectral image set are given for the proposed method and SAM (spectral angle mapper). The experimental results show that the proposed method obtains better classification results than traditional SAM feature analysis, since it can fully use the spectral information of hyperspectral images to realize precise agricultural crop classification.
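
    A sketch of the step-measurement (divider) estimate of a spectral curve's fractal dimension is given below: the curve length L(s) measured at step size s scales as L(s) ∝ s^(1-D), so D follows from the slope of log L(s) against log s. The synthetic spectrum is a placeholder for a pixel's reflectance curve.

```python
import numpy as np

def spectral_fractal_dimension(spectrum, steps=(1, 2, 4, 8, 16)):
    lengths = []
    for s in steps:
        pts = spectrum[::s]                        # subsample at step size s
        seg = np.sqrt(s ** 2 + np.diff(pts) ** 2)  # chord lengths at this step
        lengths.append(seg.sum())
    slope = np.polyfit(np.log(steps), np.log(lengths), 1)[0]
    return 1.0 - slope                             # L(s) ∝ s^(1-D)

rng = np.random.default_rng(0)
spectrum = np.cumsum(rng.normal(size=512))   # synthetic spectral curve
print(spectral_fractal_dimension(spectrum))  # ~1.5 for a random-walk-like curve
```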

  11. EnvironmentalWaveletTool: Continuous and discrete wavelet analysis and filtering for environmental time series

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Pla, C.; Fernandez-Cortes, A.; Cuezva, S.; Ortiz, J.; Benavente, D.

    2014-10-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of several environmental time series, particularly focused on the analysis of cave monitoring data. The continuous wavelet transform, the discrete wavelet transform, and the discrete wavelet packet transform have been implemented to provide a fast and precise time-period examination of the time series at different period bands. Moreover, statistical methods to examine the relation between two signals have been included. Finally, entropy-of-curves and spline-based methods have also been developed for segmenting and modeling the analyzed time series. Together, these methods provide a user-friendly and fast program for environmental signal analysis, with useful, practical, and understandable results.
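
    The tool itself is MATLAB-based; the following is a rough Python analogue of its discrete-wavelet filtering step, using the PyWavelets package to decompose a series, zero the unwanted period bands, and reconstruct.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(2048)
series = np.sin(2 * np.pi * t / 128) + 0.5 * rng.normal(size=t.size)

# Multilevel discrete wavelet decomposition: [cA5, cD5, cD4, cD3, cD2, cD1].
coeffs = pywt.wavedec(series, "db4", level=5)

# Keep the approximation and the coarsest detail band (long periods);
# zero the finer bands to suppress high-frequency noise.
filtered = coeffs[:2] + [np.zeros_like(c) for c in coeffs[2:]]
smooth = pywt.waverec(filtered, "db4")[: series.size]
```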

  12. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  13. Recent advances in chemiluminescence detection coupled with capillary electrophoresis and microchip capillary electrophoresis.

    PubMed

    Liu, Yuxuan; Huang, Xiangyi; Ren, Jicun

    2016-01-01

    CE is an ideal analytical method for extremely volume-limited biological microenvironments. However, the small injection volume makes it a challenge to achieve highly sensitive detection. Chemiluminescence (CL) detection is characterized by providing low background with excellent sensitivity because of requiring no light source. The coupling of CL with CE and MCE has become a powerful analytical method. So far, this method has been widely applied to chemical analysis, bioassay, drug analysis, and environment analysis. In this review, we first introduce some developments for CE-CL and MCE-CL systems, and then put the emphasis on the applications in the last 10 years. Finally, we discuss the future prospects. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantitative HPLC Analysis of Rosmarinic Acid in Extracts of "Melissa officinalis" and Spectrophotometric Measurement of Their Antioxidant Activities

    ERIC Educational Resources Information Center

    Canelas, Vera; da Costa, Cristina Teixeira

    2007-01-01

    The students prepare tea samples using different quantities of lemon balm leaves ("Melissa officinalis") and measure the rosmarinic acid contents by an HPLC-DAD method. The antioxidant properties of the tea samples are evaluated by a spectrophotometric method using a radical-scavenging assay with DPPH. (2,2-diphenyl-1-picrylhydrazyl). Finally the…

  15. Forensic Analysis of Digital Image Tampering

    DTIC Science & Technology

    2004-12-01

    [Fragmentary OCR residue from the report's table of figures; recoverable content: the report analyzes when each tamper-detection method fails (Chapter 4) and tests an image carrying an invisible watermark embedded via LSB steganography. Figure captions mention an invisible watermark produced with the steganography software F5, an example of copy-move image forgery, an algorithm for the JPEG block technique, and a "forged" image with results.]

  16. Grid connected integrated community energy system. Phase II: final state 2 report. Cost benefit analysis, operating costs and computer simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  17. Determination of lithium in rocks: Fluorometric method

    USGS Publications Warehouse

    White, C.E.; Fletcher, M.H.; Parks, J.

    1951-01-01

    The gravimetric method in general use for the determination of lithium is tedious, and the final weighed product often contains other alkali metals. A fluorometric method was developed to shorten the time required for the analysis and to assure that the final determination is for lithium alone. This procedure is based on the complex formed between lithium and 8-hydroxyquinoline. The fluorescence is developed in a slightly alkaline solution of 95% alcohol and measurement is made on a photoelectric fluorometer. Separation from the ore is carried out by the wet method or by the distillation procedure. Sodium and potassium are removed by alcohol and ether, but complete separation is not necessary. Comparison of analyzed samples shows excellent agreement with spectrographic and gravimetric methods. The fluorometric method is more rapid than the gravimetric and produces more conclusive results. Another useful application is in the preparation of standard lithium solutions from reagent quality salts when a known standard is available. In this case no separations are necessary.

  18. A method for automatic feature points extraction of human vertebrae three-dimensional model

    NASA Astrophysics Data System (ADS)

    Wu, Zhen; Wu, Junsheng

    2017-05-01

    A method for automatic extraction of the feature points of a human vertebra three-dimensional model is presented. Firstly, a statistical model of vertebra feature points is established based on the results of manual feature point extraction. Then, an anatomical axial analysis of the vertebra model is performed according to the physiological and morphological characteristics of the vertebrae. Using the axial information obtained from the analysis, a projection relationship between the statistical model and the vertebra model to be processed is established. According to this projection relationship, the statistical model is matched with the vertebra model to get the estimated positions of the feature points. Finally, by analyzing the curvature in a spherical neighborhood around each estimated position, the final positions of the feature points are obtained. According to benchmark results on multiple test models, the mean relative errors of the feature point positions are less than 5.98%. At more than half of the positions, the error rate is less than 3%, and the minimum mean relative error is 0.19%, which verifies the effectiveness of the method.

  19. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. The goal of the approximate methods effort is to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions that satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  20. Simulation Research on Vehicle Active Suspension Controller Based on G1 Method

    NASA Astrophysics Data System (ADS)

    Li, Gen; Li, Hang; Zhang, Shuaiyang; Luo, Qiuhui

    2017-09-01

    Based on the order relation analysis method (G1 method), an optimal linear controller for vehicle active suspension is designed. First, the active and passive suspension systems of a single-wheel vehicle are modeled and the system input signal model is determined. Secondly, the system's motion state-space equation is established from kinetic knowledge, and the optimal linear controller design is completed with optimal control theory. The weighting coefficients of the performance index for the suspension are determined by the order relation analysis method. Finally, the model is simulated in Simulink. The simulation results show that, with the optimal weights determined by the order relation analysis method under the given road conditions, the vehicle body acceleration, suspension stroke, and tire displacement are optimized, improving the comprehensive performance of the vehicle while keeping the active control within the requirements.
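
    A minimal sketch of such an optimal linear (LQR) controller on a quarter-car model is shown below; the state weights stand in for those the G1 order-relation analysis would produce, and all numeric parameters are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

ms, mu = 320.0, 40.0            # sprung / unsprung mass (kg), assumed values
ks, kt, cs = 18e3, 200e3, 1e3   # suspension stiffness, tire stiffness, damping

# States: [suspension stroke, sprung velocity, tire deflection, unsprung velocity]
A = np.array([
    [0.0,     1.0,     0.0,     -1.0],
    [-ks/ms, -cs/ms,   0.0,      cs/ms],
    [0.0,     0.0,     0.0,      1.0],
    [ks/mu,   cs/mu,  -kt/mu,   -cs/mu],
])
B = np.array([[0.0], [1.0/ms], [0.0], [-1.0/mu]])  # actuator force input

Q = np.diag([1e4, 1.0, 1e5, 1.0])   # G1-style importance weights (assumed)
R = np.array([[1e-4]])              # control-effort weight

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)     # optimal state feedback u = -K x
print(K)
```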

  1. Optical solitons, complexitons, Gaussian soliton and power series solutions of a generalized Hirota equation

    NASA Astrophysics Data System (ADS)

    Mao, Jin-Jin; Tian, Shou-Fu; Zou, Li; Zhang, Tian-Tian

    2018-05-01

    In this paper, we consider a generalized Hirota equation with a bounded potential, which can be used to describe the propagation properties of optical soliton solutions. By employing the hypothetical method and the sub-equation method, we construct the bright soliton, dark soliton, complexitons and Gaussian soliton solutions of the Hirota equation. Moreover, we explicitly derive the power series solutions with their convergence analysis. Finally, we provide the graphical analysis of such soliton solutions in order to better understand their dynamical behavior.

  2. A combined approach based on MAF analysis and AHP method to fault detection mapping: A case study from a gas field, southwest of Iran

    NASA Astrophysics Data System (ADS)

    Shakiba, Sima; Asghari, Omid; Khah, Nasser Keshavarz Faraj

    2018-01-01

    A combined geostatistical methodology based on Min/Max Auto-correlation Factor (MAF) analysis and the Analytical Hierarchy Process (AHP) is presented to generate a suitable Fault Detection Map (FDM) from seismic attributes. Five seismic attributes derived from a 2D time slice of data from a gas field located in southwest Iran are used, including instantaneous amplitude, similarity, energy, frequency, and the Fault Enhancement Filter (FEF). The MAF analysis is implemented to reduce the dimension of the input variables, and the AHP method is then applied to the three de-correlated MAF factors as evidential layers. Three decision makers (DMs) construct pairwise comparison matrices (PCMs) for determining the weights of the selected evidential layers. Finally, the weights obtained by AHP are multiplied by the normalized values of each alternative (the MAF layers), and the resulting weighted layers are integrated to prepare the final FDM. The results show that the proposed algorithm generates a map more acceptable than each individual attribute, sharpening the subsurface discontinuities and enhancing the continuity of the detected faults.
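
    The AHP weighting step can be summarized in a few lines: each decision-maker's pairwise comparison matrix (PCM) yields layer weights as its principal eigenvector, screened by the consistency ratio. The 3x3 matrix values below are illustrative, not taken from the study.

```python
import numpy as np

def ahp_weights(pcm):
    """Principal-eigenvector AHP weights and consistency ratio (CR < 0.1 is acceptable)."""
    vals, vecs = np.linalg.eig(pcm)
    k = np.argmax(vals.real)                 # index of lambda_max
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalized weights
    n = pcm.shape[0]
    ci = (vals.real[k] - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index (small n only)
    return w, ci / ri

pcm = np.array([[1.0,   3.0, 5.0],
                [1/3.0, 1.0, 2.0],
                [1/5.0, 1/2.0, 1.0]])        # one DM's pairwise comparisons
w, cr = ahp_weights(pcm)
print(w, cr)
```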

  3. Singular-Arc Time-Optimal Trajectory of Aircraft in Two-Dimensional Wind Field

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2006-01-01

    This paper presents a minimum time-to-climb trajectory analysis for aircraft flying in a two-dimensional, altitude-dependent wind field. The time-optimal control problem possesses a singular control structure when the lift coefficient is taken as a control variable. A singular arc analysis is performed to obtain an optimal control solution on the singular arc. Using a time-scale separation with the flight path angle treated as a fast state, the dimensionality of the optimal control solution is reduced by eliminating the lift coefficient control. A further singular arc analysis is used to decompose the original optimal control solution into the flight path angle solution and a trajectory solution as a function of the airspeed and altitude. The optimal control solutions for the initial and final climb segments are computed using a shooting method with known starting values on the singular arc. The numerical results of the shooting method show that the optimal flight path angles on the initial and final climb segments are constant. The analytical approach provides a rapid means for analyzing a time-optimal trajectory for aircraft performance.

  4. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO''s Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  5. Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo

    2017-08-01

    This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods for multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of the key process parameters, including temperature, tension, pressure, and velocity, is calculated, and the single-parameter sensitivity curves are obtained. From the analysis of the sensitivity curves, the stability and instability ranges of each parameter are recognized. Finally, an optimization method for the winding process parameters is developed. The analysis results show that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s], respectively.

  6. A tribute to John Gibbon.

    PubMed

    Church, Russell M.

    2002-04-28

    This article provides an overview of the published research of John Gibbon. It describes his experimental research on scalar timing and his development of scalar timing theory. It also describes his methods of research which included mathematical analysis, conditioning methods, psychophysical methods and secondary data analysis. Finally, it describes his application of scalar timing theory to avoidance and punishment, autoshaping, temporal perception and timed behavior, foraging, circadian rhythms, human timing, and the effect of drugs on timed perception and timed performance of Parkinson's patients. The research of Gibbon has shown the essential role of timing in perception, classical conditioning, instrumental learning, behavior in natural environments and in neuropsychology.

  7. Computational Analysis of the Caenorhabditis elegans Germline to Study the Distribution of Nuclei, Proteins, and the Cytoskeleton.

    PubMed

    Gopal, Sandeep; Pocock, Roger

    2018-04-19

    The Caenorhabditis elegans (C. elegans) germline is used to study several biologically important processes including stem cell development, apoptosis, and chromosome dynamics. While the germline is an excellent model, the analysis is often two dimensional due to the time and labor required for three-dimensional analysis. Major readouts in such studies are the number/position of nuclei and protein distribution within the germline. Here, we present a method to perform automated analysis of the germline using confocal microscopy and computational approaches to determine the number and position of nuclei in each region of the germline. Our method also analyzes germline protein distribution that enables the three-dimensional examination of protein expression in different genetic backgrounds. Further, our study shows variations in cytoskeletal architecture in distinct regions of the germline that may accommodate specific spatial developmental requirements. Finally, our method enables automated counting of the sperm in the spermatheca of each germline. Taken together, our method enables rapid and reproducible phenotypic analysis of the C. elegans germline.

  8. Roughness analysis of grade breaks at intersections : final report.

    DOT National Transportation Integrated Search

    1993-03-01

    A method to analyze the roughness of grade breaks at highway intersections is proposed. Although there are a variety of instruments to physically measure the road roughness, there are no known methodologies to analyze profile roughness during the des...

  9. Evaluation/modification of IDOT foundation piling design and construction policy.

    DOT National Transportation Integrated Search

    2009-03-01

    The Illinois Department of Transportation (IDOT) estimates pile lengths based on a static analysis method; : however, the final length of the pile is determined with a dynamic formula based on the pile driving resistance : exhibited in the field. Bec...

  10. Needs, barriers, and analysis methods for integrated urban freight transportation : final report.

    DOT National Transportation Integrated Search

    2015-08-01

    In this joint project University of Maryland, West Virginia University, and Morgan State University worked together to : solve critical problems associated with urban freight systems. A review of literature and case studies on freight : villages and ...

  11. Flood frequency analysis using optimization techniques : final report.

    DOT National Transportation Integrated Search

    1992-10-01

    this study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...

  12. Application of hierarchical cascading technique to finite element method simulation in bulk acoustic wave devices

    NASA Astrophysics Data System (ADS)

    Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya

    2018-07-01

    In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, an HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated that the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.

  13. Application of an enriched FEM technique in thermo-mechanical contact problems

    NASA Astrophysics Data System (ADS)

    Khoei, A. R.; Bahmani, B.

    2018-02-01

    In this paper, an enriched FEM technique is employed for the thermo-mechanical contact problem based on the extended finite element method. A fully coupled thermo-mechanical contact formulation is presented in the framework of the X-FEM technique that takes into account deformable continuum mechanics and transient heat transfer analysis. The Coulomb frictional law is applied for the mechanical contact problem, and a pressure-dependent thermal contact model is employed through an explicit formulation in the weak form of the X-FEM method. The equilibrium equations are discretized by the Newmark time-splitting method, and the final set of non-linear equations is solved by the Newton-Raphson method using a staggered algorithm. Finally, in order to illustrate the capability of the proposed computational model, several numerical examples are solved and the results are compared with those reported in the literature.

  14. [Applications of meta-analysis in multi-omics].

    PubMed

    Han, Mingfei; Zhu, Yunping

    2014-07-01

    As a statistical method for integrating multiple features and datasets, meta-analysis was introduced to the field of life science in the 1990s. With rapid advances in high-throughput technologies, the life omics, the core of which are genomics, transcriptomics, and proteomics, are becoming the new hot spot of life science. Although the fast output of massive data has promoted the development of omics studies, it results in excessive data that are difficult to integrate systematically. In this case, meta-analysis is frequently applied to analyze different types of data and is being improved continuously. Here, we first summarize the representative meta-analysis methods systematically, then review the current applications of meta-analysis in various omics fields, and finally discuss the still-existing problems and the future development of meta-analysis.
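
    As one concrete meta-analysis primitive frequently applied across omics datasets, the sketch below combines per-study p-values for the same gene with Stouffer's weighted Z method; the p-values and sample-size weights are illustrative.

```python
import numpy as np
from scipy.stats import norm

def stouffer(pvals, weights):
    """Combine one-sided p-values across studies with Stouffer's weighted Z."""
    z = norm.isf(np.asarray(pvals))                  # p -> z via inverse survival
    w = np.asarray(weights, dtype=float)
    z_comb = (w * z).sum() / np.sqrt((w ** 2).sum())
    return norm.sf(z_comb)                           # combined one-sided p-value

# Same gene tested in a genomics, a transcriptomics, and a proteomics study:
print(stouffer([0.01, 0.04, 0.20], weights=[120, 80, 40]))
```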

  15. A Finite Element Analysis of a Class of Problems in Elasto-Plasticity with Hidden Variables.

    DTIC Science & Technology

    1985-09-01

    [Fragmentary OCR residue from the DTIC report documentation page; recoverable content: "A Finite Element Analysis of a Class of Problems in Elasto-Plasticity with Hidden Variables", Texas Institute for Computational Mechanics, Austin (J. T. Oden), Final Report. Keywords: elastoplasticity; finite deformations; non-convex analysis; finite element methods; metal forming.]

  16. The analysis of carbohydrates in milk powder by a new "heart-cutting" two-dimensional liquid chromatography method.

    PubMed

    Ma, Jing; Hou, Xiaofang; Zhang, Bing; Wang, Yunan; He, Langchong

    2014-03-01

    In this study, a new "heart-cutting" two-dimensional liquid chromatography method for the simultaneous determination of carbohydrate contents in milk powder is presented. In this two-dimensional liquid chromatography system, a Venusil XBP-C4 analysis column was used in the first dimension (¹D) as a pre-separation column, and a ZORBAX carbohydrate analysis column was used in the second dimension (²D) as a final-analysis column. The whole process was completed in less than 35 min without a particular sample preparation procedure. The capability of the new two-dimensional HPLC method was demonstrated in the determination of carbohydrates in various brands of milk powder samples. A conventional one-dimensional chromatography method was also proposed. The two proposed methods were both validated in terms of linearity, limits of detection, accuracy, and precision. The comparison between the results obtained with the two methods showed that the new and completely automated two-dimensional liquid chromatography method is more suitable for milk powder samples because of the online cleanup effect involved. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  17. Vectorized Monte Carlo methods for reactor lattice analysis

    NASA Technical Reports Server (NTRS)

    Brown, F. B.

    1984-01-01

    Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.

  18. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.

  19. Cross-country Analysis of ICT and Education Indicators: An Exploratory Study

    NASA Astrophysics Data System (ADS)

    Pratama, Ahmad R.

    2017-03-01

    This paper explores the relationship between world ICT and education indicators using the latest available data from the World Bank and UNESCO for 2011-2014, with the help of exploratory methods such as principal component analysis (PCA), factor analysis (FA), cluster analysis, and ordinary least squares (OLS) regression. After dealing with all missing values, 119 countries were included in the final dataset. The findings show that most ICT and education indicators are highly associated with the income of the respective country and therefore confirm the existence of a digital divide in ICT utilization and a participation gap in education between rich and poor countries. They also indicate that the digital divide and the participation gap are highly associated with each other. Finally, the findings confirm reverse causality between ICT and education: a higher participation rate in education increases technology utilization, which in turn helps promote better educational outcomes.

  20. Final Environmental Impact Statement on Debris Removal from Boston Harbor, Massachusetts. Revision.

    DTIC Science & Technology

    1980-05-01

    [Fragmentary OCR residue; recoverable content: cited references include "Trace Metal Analysis of Boston Harbor Waters and Sediments" (July 1972) and Storey, D. A., "The Massachusetts Marina Boatyard Industry 1972-1973", Mass... Body fragments state that a feasible re-use alternative may be identified during the final design stage of the project, and that coliform counts in the Outer Harbor routinely exceed the SB standard designated for that area; in summary, the Harbor receives a heavy...]

  1. Zircon Carburation Studies as Intermediate Stage in the Zirconium Fabrication; ESTUDIOS ENCAMINADOS A LA CARBURACION DEL CIRCON COMO ETAPA INTERMEDIA EN LA OBTENCION DE CIRCONIO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huertas, V.A.; Gonzalez, L.S.; Lopez, M.

    1963-01-01

    Zirconium carbide and carbonitride mixtures were obtained by Kroll's method. Reaction products were identified by micrography and x-ray diffraction analysis. The optimum graphite content in the initial charge for the carburization reaction was studied. Zirconium, silicon, and carbon content in the final product was controlled as a function of current in the furnace and reaction time. Further chlorination of the final product was performed successfully. (auth)

  2. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. On the basis of a literature review, this paper discusses the uncertainties of geographical information system (GIS) based land suitability assessment in planning. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method that is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In conclusion, the paper indicates that the adoption of machine learning methods should be adapted to the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and raises the scientific level of subsequent planning and decision-making.

  3. Prediction of flood quantiles at ungaged watersheds in Louisiana : final report.

    DOT National Transportation Integrated Search

    1989-12-01

    Four popular regional flood frequency methods were compared using Louisiana stream flow series. The state was divided into four homogeneous regions and all undistorted, long-term stream gages were used in the analysis. The GEV, TCEV, regional LP3 and...

  4. Analysis and evaluation of methods for backcalculation of Mr values : volume 1 : research report : final report.

    DOT National Transportation Integrated Search

    1993-01-01

    Use of the 1986 AASHTO Design Guide requires accurate estimates of the resilient modulus of flexible pavement materials. Traditionally, these properties have been determined from either laboratory testing or by backcalculation from deflection data. S...

  5. 78 FR 69369 - Laminated Woven Sacks From the People's Republic of China: Final Results of the Expedited Sunset...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-19

    ... fabric; laminated by any method either to an exterior ply of plastic film such as biaxially-oriented... (Decision Memorandum). Analysis of Comments Received All issues raised in this sunset review are addressed...

  6. CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS

    EPA Science Inventory

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...

  7. Addressing the issue of insufficient information in data-based bridge health monitoring : final report.

    DOT National Transportation Integrated Search

    2015-11-01

    One of the most efficient ways to solve the damage detection problem using the statistical pattern recognition : approach is that of exploiting the methods of outlier analysis. Cast within the pattern recognition framework, : damage detection assesse...

  8. Application of finite element method in mechanical design of automotive parts

    NASA Astrophysics Data System (ADS)

    Gu, Suohai

    2017-09-01

    As an effective numerical analysis method, the finite element method (FEM) has been widely used in mechanical design and other fields. In this paper, the development of FEM is introduced first, then the specific steps of FEM applications are illustrated and the difficulties of FEM are summarized in detail. Finally, applications of FEM to automobile components such as the automobile wheel, steel plate spring, body frame, and shaft parts are summarized and compared with related experimental research.

  9. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.

  10. Topological Analysis of Wireless Networks (TAWN)

    DTIC Science & Technology

    2016-05-31

    transmissions from any other node. Definition 1. A wireless network vulnerability is its susceptibility to becoming disconnected when a single source of... [SF-298 form residue: Final Report, 12-02-2015 to 31-05-2016; approved for public release, distribution unlimited.] The goal of this project was to develop topological methods to detect and localize vulnerabilities of wireless

  11. Allocating Sample Sizes to Reduce Budget for Fixed-Effect 2×2 Heterogeneous Analysis of Variance

    ERIC Educational Resources Information Center

    Luh, Wei-Ming; Guo, Jiin-Huarng

    2016-01-01

    This article discusses the sample size requirements for the interaction, row, and column effects, respectively, by forming a linear contrast for a 2×2 factorial design for fixed-effects heterogeneous analysis of variance. The proposed method uses the Welch t test and its corresponding degrees of freedom to calculate the final sample size in a…

  12. At Last: "What's Discourse Got to Do with It?" A Meditation on Critical Discourse Analysis in Literacy Research

    ERIC Educational Resources Information Center

    Lewis, Cynthia

    2006-01-01

    Lewis explains why critical discourse analysis (CDA) has become an indispensable method for many researchers trying to understand how ideologies and social structures are reflected in and reified by language. The critical linguistic turn that has occurred in the humanities and social sciences over the last three decades has finally taken hold in the…

  13. [Policy recommendations based on SWOT analysis for agricultural industrialization of traditional Chinese medicinal materials--a case study of uncariae ramulus cum uncis from Jianhe county in Guizhou province].

    PubMed

    Hu, Yong; Huo, Ke-Yi; Xiang, Hua

    2013-09-01

    This thesis reviews the historical background of agricultural industrialization and analyzes the major theories of agricultural industrialization. It also utilizes the SWOT analysis method to discuss the industrialization of traditional Chinese medicinal materials in Jianhe county, and finally puts forward recommendations for its further development.

  14. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  15. Bayesian survival analysis in clinical trials: What methods are used in practice?

    PubMed

    Brard, Caroline; Le Teuff, Gwénaël; Le Deley, Marie-Cécile; Hampson, Lisa V

    2017-02-01

    Background Bayesian statistics are an appealing alternative to the traditional frequentist approach to designing, analysing, and reporting of clinical trials, especially in rare diseases. Time-to-event endpoints are widely used in many medical fields. There are additional complexities to designing Bayesian survival trials which arise from the need to specify a model for the survival distribution. The objective of this article was to critically review the use and reporting of Bayesian methods in survival trials. Methods A systematic review of clinical trials using Bayesian survival analyses was performed through PubMed and Web of Science databases. This was complemented by a full text search of the online repositories of pre-selected journals. Cost-effectiveness, dose-finding studies, meta-analyses, and methodological papers using clinical trials were excluded. Results In total, 28 articles met the inclusion criteria, 25 were original reports of clinical trials and 3 were re-analyses of a clinical trial. Most trials were in oncology (n = 25), were randomised controlled (n = 21) phase III trials (n = 13), and half considered a rare disease (n = 13). Bayesian approaches were used for monitoring in 14 trials and for the final analysis only in 14 trials. In the latter case, Bayesian survival analyses were used for the primary analysis in four cases, for the secondary analysis in seven cases, and for the trial re-analysis in three cases. Overall, 12 articles reported fitting Bayesian regression models (semi-parametric, n = 3; parametric, n = 9). Prior distributions were often incompletely reported: 20 articles did not define the prior distribution used for the parameter of interest. Over half of the trials used only non-informative priors for monitoring and the final analysis (n = 12) when it was specified. Indeed, no articles fitting Bayesian regression models placed informative priors on the parameter of interest. The prior for the treatment effect was based on historical data in only four trials. Decision rules were pre-defined in eight cases when trials used Bayesian monitoring, and in only one case when trials adopted a Bayesian approach to the final analysis. Conclusion Few trials implemented a Bayesian survival analysis and few incorporated external data into priors. There is scope to improve the quality of reporting of Bayesian methods in survival trials. Extension of the Consolidated Standards of Reporting Trials statement for reporting Bayesian clinical trials is recommended.
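
    For concreteness, a minimal, fully specified Bayesian parametric survival analysis of the kind the review tallies is sketched below: an exponential likelihood with a conjugate Gamma prior on the hazard rate, so the posterior is available in closed form. The event count, follow-up time, and prior parameters are all illustrative.

```python
from scipy.stats import gamma

a, b = 0.5, 100.0          # prior: Gamma(shape a, rate b) on the hazard rate (assumed)
events = 14                # observed deaths
exposure = 3500.0          # total follow-up time (e.g., patient-days)

# Conjugate update: posterior is Gamma(a + events, rate b + exposure).
post = gamma(a + events, scale=1.0 / (b + exposure))
print("posterior median hazard:", post.median())
print("95% credible interval:", post.interval(0.95))
```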

  16. Optimization of Interior Permanent Magnet Motor by Quality Engineering and Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Okada, Yukihiro; Kawase, Yoshihiro

    This paper describes an optimization method based on the finite element method (FEM), using quality engineering and multivariate analysis as the optimization techniques. The method consists of two steps: in Step 1, the influence of each design parameter on the output is obtained quantitatively; in Step 2, the number of FEM calculations is reduced. In this way, the optimal combination of design parameters that satisfies the required characteristics can be searched for efficiently. The method is applied to the design of an IPM motor to reduce its torque ripple. The final shape maintains the average torque while cutting the torque ripple by 65%. Furthermore, the amount of permanent magnet material can be reduced.

  17. A comparison between Warner-Bratzler shear force measurement and texture profile analysis of meat and meat products: a review

    NASA Astrophysics Data System (ADS)

    Novaković, S.; Tomašević, I.

    2017-09-01

    Texture is one of the most important characteristics of meat; it can be described as the human physiological-psychological perception of a number of rheological and other properties of a food and their interactions. In this paper, we discuss instrumental measurement of texture by Warner-Bratzler shear force (WBSF) and texture profile analysis (TPA). The conditions for using the measuring device in WBSF tests are detailed, and the influence of different parameters on the execution of the method and on the final results is shown; the main disadvantage of WBSF is that the method is not standardized. We also introduce the basic texture parameters that connect and separate the TPA and WBSF methods, and mention contemporary methods along with their main advantages.

  18. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measurements. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring at a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values for two different intersections are estimated for one day from 8:00 a.m. to 6:00 p.m.
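
    The core quantity here, TTC, is commonly defined as the current separation of two road users divided by their closing speed. A minimal sketch of that textbook definition follows (function and variable names are ours, not the paper's; the paper obtains positions and velocities from its vision-based tracker):

    ```python
    import numpy as np

    def time_to_collision(p1, v1, p2, v2):
        """Estimate TTC for two road users from positions (m) and velocities (m/s).

        Returns np.inf when the two users are not on a closing course.
        """
        rel_p = np.asarray(p2, float) - np.asarray(p1, float)  # relative position
        rel_v = np.asarray(v2, float) - np.asarray(v1, float)  # relative velocity
        # Speed at which the gap shrinks along the line joining the users
        closing_speed = -np.dot(rel_p, rel_v) / np.linalg.norm(rel_p)
        if closing_speed <= 0:  # diverging or parallel paths: no collision course
            return np.inf
        return np.linalg.norm(rel_p) / closing_speed

    # A vehicle 20 m ahead, travelling 5 m/s slower than the follower: TTC = 4 s.
    print(time_to_collision([0, 0], [15, 0], [20, 0], [10, 0]))
    ```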

  19. Pretest online discussion groups to augment teaching and learning.

    PubMed

    Kuhn, Jonathan; Hasbargen, Barbara; Miziniak, Halina

    2010-01-01

    Test and final examination scores from three semesters of control students in a nursing foundation course were compared with test and final examination scores from three semesters of participating students. Participating students were offered access to an asynchronous pretest online discussion activity with a faculty e-moderator. While the simplified Bloom's revised taxonomy assisted in creating appropriate preparatory test and final examination questions for pretest online discussion, Salmon's five-stage online method provided direction to the e-moderator on how to encourage students to achieve Bloom's higher-order thinking skills during the pretest online discussions. Statistical analysis showed the pretest online discussion activity had a generally positive impact on test and final examination scores, when controlling for a number of possible confounding variables, including instructor, cumulative grade point average, age, and credit hours.

  20. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from preliminary design concepts through to the final design. In particular, we discuss progress towards a fully three-dimensional and computationally efficient deterministic code, for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation; it allows field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling the development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the design optimized by the deterministic method.

  1. Reliability analysis method of a solar array by using fault tree analysis and fuzzy reasoning Petri net

    NASA Astrophysics Data System (ADS)

    Wu, Jianing; Yan, Shaoze; Xie, Liyang

    2011-12-01

    To address the impact of solar array anomalies, it is important to analyze solar array reliability. This paper establishes fault tree analysis (FTA) and fuzzy reasoning Petri net (FRPN) models of a solar array mechanical system and analyzes their reliability to find the mechanisms of solar array faults. The indices final truth degree (FTD) and cosine matching function (CMF) are employed to evaluate the importance and influence of different faults, and an improved reliability analysis method is developed based on the sorting of FTD and CMF. An example is analyzed using the proposed method. The analysis results show that the harsh thermal environment and impacts caused by particles in space are the most important causes of solar array faults. Furthermore, other fault modes and the corresponding improvement methods are discussed. The results reported in this paper could be useful for spacecraft designers, particularly in redesigning the solar array and scheduling its reliability growth plan.

  2. Kinematics Control and Analysis of Industrial Robot

    NASA Astrophysics Data System (ADS)

    Zhu, Tongbo; Cai, Fan; Li, Yongmei; Liu, Wei

    2018-03-01

    The development status, basic principles, and control systems of robots are introduced briefly. The research is mainly focused on the robot's kinematics and motion control. A structural analysis of a planar articulated (SCARA) robot is presented; coordinate systems are established to obtain the position and orientation matrix of the end effector, a method of robot kinematics analysis based on homogeneous transformations is proposed, and the kinematic solution of the robot is obtained. The robot's kinematics equations and the forward kinematics formula are established by example. Finally, the kinematic analysis of this robot is verified by examples. This provides a basis for structural design and motion control, and actively promotes the motion control of industrial robots.
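
    For a planar articulated arm of this kind, forward kinematics reduces to chaining one homogeneous transform per joint. The sketch below is a generic two-link planar example under our own assumptions (the link lengths and joint angles are illustrative; the paper's robot parameters are not given):

    ```python
    import numpy as np

    def link_transform(theta, a):
        """2D homogeneous transform: rotate by theta, then translate a along x."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, a * c],
                         [s,  c, a * s],
                         [0,  0, 1.0]])

    def forward_kinematics(theta1, theta2, l1=0.4, l2=0.3):
        """Pose of the end effector of a 2-link planar (SCARA-like) arm."""
        T = link_transform(theta1, l1) @ link_transform(theta2, l2)
        return T  # T[:2, 2] is the (x, y) position, T[:2, :2] the orientation

    T = forward_kinematics(np.pi / 4, np.pi / 4)
    print(T[:2, 2])  # end-effector position, approximately (0.283, 0.583)
    ```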

  3. Fire-protection research for energy technology: Fy 80 year end report

    NASA Astrophysics Data System (ADS)

    Hasegawa, H. K.; Alvares, N. J.; Lipska, A. E.; Ford, H.; Priante, S.; Beason, D. G.

    1981-05-01

    This continuing research program was initiated in order to advance fire protection strategies for Fusion Energy Experiments (FEE). The program expanded to encompass other forms of energy research. Accomplishments for fiscal year 1980 were: finalization of the fault-tree analysis of the Shiva fire management system; development of a second-generation fire-growth analysis using an alternate model and new LLNL combustion dynamics data; improvements of techniques for chemical smoke aerosol analysis; development and testing of a simple method to assess the corrosive potential of smoke aerosols; development of an initial aerosol dilution system; completion of primary small-scale tests for measurements of the dynamics of cable fires; finalization of the primary survey format for non-LLNL energy technology facilities; and studies of fire dynamics and aerosol production from electrical insulation and computer tape cassettes.

  4. D2 Delta Robot Structural Design and Kinematics Analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; wang, Song; Dong, Yu; Yang, Hai

    2017-12-01

    In this paper, a new type of Delta robot with only two degrees of freedom is proposed on the basis of the multi-degree-of-freedom Delta robot. In order to meet our application requirements, we carried out the structural design and analysis of the robot, using SolidWorks modeling combined with 3D printing technology to determine the final robot structure. In order to achieve precise control of the robot, a kinematic analysis was carried out. The SimMechanics toolbox of MATLAB is used to establish the mechanism model, and the kinematics mathematical model is used to simulate the robot's motion control in the MATLAB environment. Finally, according to the designed mechanism, the workspace of the robot is drawn by the graphical method, which lays the foundation for the subsequent motion control of the robot.

  5. [A novel method of multi-channel feature extraction combining multivariate autoregression and multiple-linear principal component analysis].

    PubMed

    Wang, Jinjia; Zhang, Yanna

    2015-02-01

    Brain-computer interface (BCI) systems identify brain signals by extracting features from them. In view of the limitations of the autoregressive-model feature extraction method and of traditional principal component analysis in dealing with multichannel signals, this paper presents a multichannel feature extraction method in which a multivariate autoregressive (MVAR) model is combined with multiple-linear principal component analysis (MPCA), and applies it to magnetoencephalography (MEG) and electroencephalography (EEG) signal recognition. Firstly, we calculated the MVAR model coefficient matrix of the MEG/EEG signals; then we reduced its dimensionality using MPCA. Finally, we recognized the brain signals with a Bayes classifier. The key innovation of our investigation is that we extended the traditional single-channel feature extraction method to the multichannel case. We carried out experiments using the IV-III and IV-I data groups. The experimental results proved that the method proposed in this paper is feasible.
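
    The pipeline described (fit a multichannel autoregressive model per trial, then compress the coefficient array) can be sketched with ordinary least squares and, as a simple stand-in for MPCA, plain PCA. This substitution is an assumption on our part: true MPCA operates on the coefficient tensor directly, and all data below are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def mvar_coefficients(x, order):
        """Least-squares MVAR fit. x: (n_samples, n_channels) multichannel signal.

        Returns an (order, n_channels, n_channels) coefficient array.
        """
        n, ch = x.shape
        # Regressors: the `order` most recent past samples for each time step
        X = np.hstack([x[order - k - 1:n - k - 1] for k in range(order)])
        Y = x[order:]
        A, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return A.reshape(order, ch, ch)

    rng = np.random.default_rng(0)
    # One flattened coefficient vector per simulated 4-channel trial
    trials = np.array([mvar_coefficients(rng.standard_normal((200, 4)), 3).ravel()
                       for _ in range(30)])
    features = PCA(n_components=5).fit_transform(trials)  # reduced features
    print(features.shape)  # (30, 5), ready for a Bayes classifier
    ```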

  6. A novel feature ranking method for prediction of cancer stages using proteomics data

    PubMed Central

    Saghapour, Ehsan; Sehhati, Mohammadreza

    2017-01-01

    Proteomic analysis of cancer stages has provided new opportunities for the development of novel, highly sensitive diagnostic tools to help in the early detection of cancer. This paper introduces a new feature ranking approach called FRMT, based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which selects the most discriminative proteins from proteomics data for cancer staging. In this approach, the outcomes of 10 feature selection techniques were combined by the TOPSIS method to select the final discriminative proteins from seven different proteomic databases of protein expression profiles. In the proposed workflow, feature selection methods and protein expressions are treated as the criteria and alternatives in TOPSIS, respectively. The proposed method was tested with seven different classifier models in a 10-fold cross-validation procedure repeated 30 times on the seven cancer datasets. The obtained results proved the higher stability and superior classification performance of the method in comparison with other methods, as well as its lower sensitivity to the applied classifier. Moreover, the finally selected proteins are informative and have potential for application in real medical practice. PMID:28934234
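
    TOPSIS itself is a short, well-defined procedure: normalise the decision matrix, form the ideal and anti-ideal points, and rank alternatives by relative closeness to the ideal. A generic sketch follows (the toy scores and the benefit-only criteria are our assumptions, not the paper's data):

    ```python
    import numpy as np

    def topsis(scores, weights=None):
        """Closeness of each alternative (row) to the ideal solution.

        Assumes every criterion (column) is a benefit: higher is better.
        """
        m = scores / np.linalg.norm(scores, axis=0)  # vector-normalise criteria
        if weights is not None:
            m = m * weights
        ideal, anti = m.max(axis=0), m.min(axis=0)
        d_pos = np.linalg.norm(m - ideal, axis=1)    # distance to ideal
        d_neg = np.linalg.norm(m - anti, axis=1)     # distance to anti-ideal
        return d_neg / (d_pos + d_neg)               # larger = better

    # Four candidate proteins scored by three feature-selection methods
    scores = np.array([[0.9, 0.7, 0.8],
                       [0.4, 0.9, 0.5],
                       [0.2, 0.3, 0.4],
                       [0.8, 0.8, 0.9]])
    print(np.argsort(-topsis(scores)))  # protein indices, best first
    ```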

  7. A static analysis method for barge-impact design of bridges with consideration of dynamic amplification : final report, November 2009.

    DOT National Transportation Integrated Search

    2009-11-01

    Current practice with regard to designing bridge structures to resist impact loads associated with barge collisions relies upon the : use of the American Association of State Highway and Transportation Officials (AASHTO) bridge design specifications....

  8. Evaluation and analysis of liquid deicers for winter maintenance : final report.

    DOT National Transportation Integrated Search

    2017-09-01

    The Ohio Department of Transportation (ODOT) uses mechanical and chemical methods to keep the roads safe during snow and ice events. Chemical solutions are available on the market to assist ODOT in preventing snow from bonding to the road surface as ...

  9. Rapid Prototyping Technology for Manufacturing GTE Turbine Blades

    NASA Astrophysics Data System (ADS)

    Balyakin, A. V.; Dobryshkina, E. M.; Vdovin, R. A.; Alekseev, V. P.

    2018-03-01

    The conventional approach to manufacturing turbine blades by investment casting is expensive and time-consuming, as it takes a lot of time to make geometrically precise and complex wax patterns. Turbine blade manufacturing in pilot production can be sped up by accelerating the casting process while keeping the geometric precision of the final product. This paper compares the rapid prototyping method (casting the wax pattern composition into elastic silicone molds) to the conventional technology. Analysis of the size precision of blade casts shows that silicone-mold casting features sufficient geometric precision. Thus, this method for making wax patterns can be a cost-efficient solution for small-batch or pilot production of turbine blades for gas-turbine units (GTU) and gas-turbine engines (GTE). The paper demonstrates how additive technology and thermographic analysis can speed up the cooling of wax patterns in silicone molds. This is possible at an optimal temperature and solidification time, which make the process more cost-efficient while keeping the geometric quality of the final product.

  10. Intraoperative consultation of central nervous system lesions. Frozen section, cytology or both?

    PubMed

    Sharifabadi, Ali Haidari; Haeri, Hayedeh; Zeinalizadeh, Mehdi; Zargari, Neda; Razavi, Amirnader Emami; Shahbazi, Nargess; Tahvildari, Malahat; Azmoudeh-Ardalan, Farid

    2016-03-01

    Frozen section is the traditional method of assessing central nervous system (CNS) lesions intraoperatively. Our aim is to determine the diagnostic accuracy of frozen section and/or cytological evaluation of CNS lesions in our center. A total of 157 patients with CNS lesions underwent open surgical biopsy or excision in our center during a period of 2 years (2012-2013). All specimens were studied cytologically; of these specimens, 146 cases were also examined by frozen section. Cytology and frozen section slides were studied separately by two general pathologists who were blind to final diagnoses. The final diagnoses were based on permanent sections and IHC studies. The accuracy rates of frozen section analysis and cytological evaluation were 87% and 86%, respectively. If the two methods were considered together, the accuracy rate improved to about 95%. Cytological evaluation is an acceptable alternative to frozen section analysis and also a great supplement to the diagnosis of CNS lesions. Copyright © 2015 Elsevier GmbH. All rights reserved.

  11. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for evaluating the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.

  12. A Simple Method for Causal Analysis of Return on IT Investment

    PubMed Central

    Alemi, Farrokh; Zargoush, Manaf; Oakes, James L.; Edrees, Hanan

    2011-01-01

    This paper proposes a method for examining the causal relationship between investment in information technology (IT) and an organization's productivity. In this method, first, a strong relationship among (1) investment in IT, (2) use of IT, and (3) the organization's productivity is verified using correlations. Second, the assumption that IT investment preceded improved productivity is tested using partial correlation. Finally, the assumption of what may have happened in the absence of IT investment, the so-called counterfactual, is tested by forecasting productivity at different levels of investment. The paper applies the proposed method to investment in the Veterans Health Information Systems and Technology Architecture (VISTA) system. Results show that the causal analysis can be done even with limited data. Furthermore, because the procedure relies on the organization's overall productivity, it might be more objective than when the analyst picks and chooses which costs and benefits should be included in the analysis. PMID:23019515
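
    The second step, checking whether an association survives when a confounder is held fixed, is a standard partial-correlation computation: regress both variables on the control variable and correlate the residuals. A self-contained illustration on synthetic data (the variable names and the time trend as confounder are our assumptions, not the VISTA data):

    ```python
    import numpy as np

    def partial_corr(x, y, z):
        """Correlation of x and y after removing the linear effect of z."""
        Z = np.column_stack([np.ones_like(z), z])
        rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residual of x given z
        ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residual of y given z
        return np.corrcoef(rx, ry)[0, 1]

    rng = np.random.default_rng(1)
    years = np.arange(40.0)                            # confounder: secular trend
    it_invest = 2.0 * years + rng.normal(0, 4, 40)     # IT spending grows over time
    productivity = 1.5 * years + rng.normal(0, 4, 40)  # so does productivity
    print(np.corrcoef(it_invest, productivity)[0, 1])    # large raw correlation
    print(partial_corr(it_invest, productivity, years))  # near zero given the trend
    ```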

  13. The Delicate Analysis of Short-Term Load Forecasting

    NASA Astrophysics Data System (ADS)

    Song, Changwei; Zheng, Yuan

    2017-05-01

    This paper proposes a new method for short-term load forecasting based on the similar-day method, correlation coefficients, and the Fast Fourier Transform (FFT), to achieve a precise analysis of load variation from three aspects (typical day, correlation coefficient, spectral analysis) and three dimensions (the time dimension, the industry dimension, and the main factors influencing load characteristics, such as national policies, regional economics, holidays, electricity prices, and so on). First, the branch algorithm one-class SVM is adopted to select the typical day. Second, the correlation coefficient method is used to obtain the direction and strength of the linear relationship between two random variables, which can reflect the influence of customer macro policy and production scale on the electricity price. Third, a Fourier-transform residual error correction model is proposed to capture the nature of the load extracted from the residual error. Finally, simulation results indicate the validity and engineering practicability of the proposed method.

  14. Mechanical modeling for magnetorheological elastomer isolators based on constitutive equations and electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping

    2018-06-01

    As constitutive models are too complicated and existing mechanical models lack universality, such models remain unsatisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. Constitutive modeling and electromagnetic analysis are applied in this method to ensure universality, while a series of derivations and simplifications is carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator is introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations, and electromagnetic analysis. An iteration model and a highly efficient differential-equation-editor-based model were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimization algorithm. In the end, a verification test of the isolator proved the accuracy of the derived mechanical model and the modeling method.

  15. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, in which quality requirements considerations are represented numerically. We can thus objectively identify whether the consideration of quality requirements in a requirements document carries over to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.

  16. Similitude design for the vibration problems of plates and shells: A review

    NASA Astrophysics Data System (ADS)

    Zhu, Yunpeng; Wang, You; Luo, Zhong; Han, Qingkai; Wang, Deyou

    2017-06-01

    Similitude design plays a vital role in the analysis of vibration and shock problems encountered in large engineering equipment. Similitude design, comprising dimensional analysis and the governing-equation method, is founded on dynamic similitude theory. This study reviews the application of similitude design methods in engineering practice and summarizes the major achievements of dynamic similitude theory in structural vibration and shock problems in different fields, including marine structures, civil engineering structures, and large power equipment. This study also reviews dynamic similitude design methods for thin-walled and composite-material plates and shells, including the most recent work published by the authors. Structural sensitivity analysis is used to evaluate the scaling factors to attain accurate distorted scaling laws. Finally, this study discusses the existing problems and the potential of dynamic similitude theory for the analysis of vibration and shock problems of structures.

  17. Vibration Signature Analysis of a Faulted Gear Transmission System

    NASA Technical Reports Server (NTRS)

    Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.

    1994-01-01

    A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time-synchronous-averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time-averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques, including time domain analysis methods and frequency analysis methods. Using photographs of the gear teeth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches, as well as on the applicability of the Wigner-Ville method in predicting gear faults.

  18. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly used technique for articulating and evaluating programmatic outcomes. However, research regarding the validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among the solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
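
    The four algorithms named here live in R's cluster package; a rough Python analog still conveys the workflow of generating several candidate solutions and scoring them with a validity index. The sketch below uses scikit-learn, which ships k-means, agglomerative clustering (an AGNES-style method) and the Davies-Bouldin score but not PAM, FANNY or the Dunn index, on synthetic data (all our assumptions):

    ```python
    import numpy as np
    from sklearn.cluster import AgglomerativeClustering, KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import davies_bouldin_score

    X, _ = make_blobs(n_samples=200, centers=4, random_state=0)

    # Score candidate solutions across methods and cluster counts;
    # a lower Davies-Bouldin index indicates compact, well-separated clusters.
    for k in (2, 3, 4, 5):
        for name, model in (("kmeans", KMeans(n_clusters=k, n_init=10,
                                              random_state=0)),
                            ("agglomerative",
                             AgglomerativeClustering(n_clusters=k))):
            labels = model.fit_predict(X)
            print(k, name, round(davies_bouldin_score(X, labels), 3))
    ```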

  19. Determination of U, Th and K in bricks by gamma-ray spectrometry, X-ray fluorescence analysis and neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Bártová, H.; Kučera, J.; Musílek, L.; Trojek, T.; Gregorová, E.

    2017-11-01

    Knowledge of the content of natural radionuclides in bricks can be important in several areas of dosimetry and applications of ionizing radiation: dosimetry of naturally occurring radioactive material (NORM) in general; radon exposure evaluation, which is related to radiation protection; and finally the thermoluminescence (TL) dating method. The internal dose rate inside bricks is caused mostly by contributions from the natural radionuclides 238U and 232Th, the radionuclides of their decay chains, and 40K; the decay chain of 235U is usually much less important. The concentrations of 238U, 232Th and 40K were measured by several methods, namely gamma-ray spectrometry, X-ray fluorescence analysis (XRF), and neutron activation analysis (NAA), the last of which was used as the reference method. These methods were compared from the points of view of accuracy, limit of detection (LOD), amount of sample needed, sample handling, time demands, and instrument availability.

  20. Measuring Road Network Vulnerability with Sensitivity Analysis

    PubMed Central

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining a traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs, and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and makes the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-support tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706

  1. Molecularly imprinted membrane extraction combined with high-performance liquid chromatography for selective analysis of cloxacillin from shrimp samples.

    PubMed

    Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang

    2018-09-01

    Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety. It is therefore imperative to develop a simple and selective method for monitoring the illegal use of antibiotics in aquatic products. In this study, a method combining molecularly imprinted membrane (MIM) extraction and liquid chromatography was developed for the selective analysis of cloxacillin in shrimp samples. The MIMs were synthesized by UV photopolymerization and characterized by scanning electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis, and swelling tests. The results showed that the MIMs exhibited excellent permselectivity, high adsorption capacity, and a fast adsorption rate for cloxacillin. Finally, the method was used to determine cloxacillin in shrimp samples, with good accuracy and acceptable relative standard deviation values for precision. The proposed method is a promising alternative for the selective analysis of cloxacillin in shrimp samples, owing to its easy operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.

  2. Multiscale analysis of the correlation of processing parameters on viscidity of composites fabricated by automated fiber placement

    NASA Astrophysics Data System (ADS)

    Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya

    2017-10-01

    Viscidity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during AFP is rarely studied. In this paper, viscidity at different scales is analyzed based on a multi-scale analysis method. Firstly, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation of the processing parameters with viscosity is revealed using the grey relational analysis method (GRAM). A group of processing parameters is found that achieves stable viscosity and better fluidity of the resin.
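
    Grey relational analysis, the final step named here, is compact enough to sketch: normalise each factor sequence, measure its pointwise deviation from a reference sequence, convert the deviations into grey relational coefficients, and average them into one grade per factor. The toy data below are illustrative only, not the paper's measurements:

    ```python
    import numpy as np

    def grey_relational_grades(factors, reference, rho=0.5):
        """Grey relational grade of each factor sequence against a reference.

        factors: (n_factors, n_points); reference: (n_points,).
        rho is the distinguishing coefficient, conventionally 0.5.
        """
        def norm(s):  # min-max normalisation of a sequence
            return (s - s.min()) / (s.max() - s.min())

        delta = np.abs(np.array([norm(f) for f in factors]) - norm(reference))
        coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        return coeff.mean(axis=1)  # one grade per factor; larger = stronger link

    viscosity = np.array([1.0, 1.4, 2.1, 2.9, 3.8])
    params = np.array([[10.0, 14.0, 20.0, 30.0, 38.0],  # tracks viscosity closely
                       [5.0, 30.0, 7.0, 25.0, 9.0]])    # weakly related
    print(grey_relational_grades(params, viscosity))
    ```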

  3. Nudged elastic band method and density functional theory calculation for finding a local minimum energy pathway of p-benzoquinone and phenol fragmentation in mass spectrometry.

    PubMed

    Sugimura, Natsuhiko; Igarashi, Yoko; Aoyama, Reiko; Shibue, Toshimichi

    2017-02-01

    Analysis of the fragmentation pathways of molecules in mass spectrometry gives fundamental insight into gas-phase ion chemistry. However, the conventional intrinsic reaction coordinate method requires knowledge of the transition-state ion structures in the fragmentation pathways. Herein, we use the nudged elastic band method, which requires only the initial and final ion structures in the fragmentation pathways, and report the advantages and limitations of this approach. We found a minimum energy path of p-benzoquinone ion fragmentation with two saddle points and one intermediate structure. The primary energy barrier, which corresponds to the cleavage of the C-C bond adjacent to the CO group, was calculated to be 1.50 eV. An additional energy barrier, which corresponds to the cleavage of the CO group, was calculated to be 0.68 eV. We also found an energy barrier of 3.00 eV, which is the rate-determining step of the keto-enol tautomerization in CO elimination from the molecular ion of phenol. The nudged elastic band method allowed the determination of a minimum energy path using only the initial and final ion structures in the fragmentation pathways, and it was faster than the conventional intrinsic reaction coordinate method. In addition, this method was found to be effective in the analysis of the charge structures of the molecules during fragmentation in mass spectrometry.

  4. [Survival analysis with competing risks: estimating failure probability].

    PubMed

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
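
    The contrast drawn here, 1 minus Kaplan-Meier (which treats competing deaths as censoring) versus a proper multiple-decrement estimator of cumulative incidence, can be reproduced numerically. A sketch on simulated transplant-like data using the lifelines package (the package choice, event rates, and variable names are our assumptions; the paper's example uses real transplant data):

    ```python
    import numpy as np
    from lifelines import AalenJohansenFitter, KaplanMeierFitter

    rng = np.random.default_rng(2)
    n = 500
    t_reject = rng.exponential(8, n)  # latent time to chronic rejection
    t_death = rng.exponential(6, n)   # latent time to death (competing risk)
    time = np.minimum(t_reject, t_death)
    event = np.where(t_reject <= t_death, 1, 2)  # 1 = rejection, 2 = death first

    # Naive: censor deaths and take 1 - KM; this overestimates rejection risk.
    km = KaplanMeierFitter().fit(time, event_observed=(event == 1))
    # Multiple-decrement view: Aalen-Johansen cumulative incidence of rejection.
    aj = AalenJohansenFitter().fit(time, event, event_of_interest=1)

    print(1 - km.survival_function_.iloc[-1, 0])  # naive, too high
    print(aj.cumulative_density_.iloc[-1, 0])     # accounts for competing death
    ```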

  5. Effective visibility analysis method in virtual geographic environment

    NASA Astrophysics Data System (ADS)

    Li, Yi; Zhu, Qing; Gong, Jianhua

    2008-10-01

    Visibility analysis in virtual geographic environments has broad applications in many aspects of social life. In practical use, however, its efficiency and accuracy need to be improved, and the restrictions of human vision must be considered. This paper first introduces a highly efficient 3D data modeling method, which generates and organizes 3D models using R-tree and LOD techniques. It then presents a new visibility algorithm that realizes real-time viewshed calculation while considering the sheltering effects of the DEM and 3D building models, as well as the restrictions the human eye places on viewshed generation. Finally, an experiment is conducted to show that the visibility analysis is fast and accurate enough to meet the demands of digital city applications.

  6. Development and validation of chromatographic methods (HPLC and GC) for the determination of the active components (benzocaine, tyrothricin and menthol) of a pharmaceutical preparation.

    PubMed

    Ortiz-Boyer, F; Tena, M T; Luque de Castro, M D; Valcárcel, M

    1995-10-01

    Methods are reported for the determination of tyrothricin and benzocaine by HPLC and of menthol by GC in the analysis of throat lozenges (tablets) containing all three compounds. After optimization of the variables involved in both HPLC and GC, the methods were characterized and validated according to the guidelines of the Spanish Pharmacopoeia, and applied to both the monitoring of the manufacturing process and the quality control of the final product.

  7. High-performance liquid chromatographic analysis of dextromethorphan, guaifenesin and benzoate in a cough syrup for stability testing.

    PubMed

    Galli, V; Barbas, C

    2004-09-10

    A method has been developed for the analysis of a cough syrup containing dextromethorphan, guaifenesin, benzoic acid, saccharin and other components. Forced degradation was also studied to demonstrate that the method could be employed during a stability study of the syrup. Final conditions were phosphate buffer (25 mM, pH 2.8) with triethylamine (TEA)-acetonitrile (75:25, v/v). In such conditions, all the actives, excipients and degradation products were baseline resolved in less than 14 min, and different wavelengths were used for the different analytes and related compounds.

  8. Research on power market technical analysis index system employing high-low matching mechanism

    NASA Astrophysics Data System (ADS)

    Li, Tao; Wang, Shengyu

    2018-06-01

    Power market technical analysis refers to a method that takes the bidding behavior of members in the power market as its research object and sums up typical market rules and price trends by applying mathematical and logical methods, so as to effectively assist members in the power market in making more reasonable trading decisions. In this paper, the following four indicators, which form the core of the index system, are proposed: bidding price difference scale, extreme bidding price rate, dispersion of bidding price, and monthly transaction satisfaction of electricity trading.

  9. Joint research effort on vibrations of twisted plates, phase 1: Final results

    NASA Technical Reports Server (NTRS)

    Kielb, R. E.; Leissa, A. W.; Macbain, J. C.; Carney, K. S.

    1985-01-01

    The complete theoretical and experimental results of the first phase of a joint government/industry/university research study on the vibration characteristics of twisted cantilever plates are given. The study is conducted to generate an experimental data base and to compare many different theoretical methods with each other and with the experimental results. Plates with aspect ratios, thickness ratios, and twist angles representative of current gas turbine engine blading are investigated. The theoretical results are generated by numerous finite element, shell, and beam analysis methods. The experimental results are obtained by precision matching a set of twisted plates and testing them at two laboratories. The second and final phase of the study will concern the effects of rotation.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nose, Y.

    Methods were developed for generating an integrated, statistical model of the anatomical structures within the human thorax relevant to radioisotope powered artificial heart implantation. These methods involve measurement and analysis of anatomy in four areas: chest wall, pericardium, vascular connections, and great vessels. A model for the prediction of thorax outline from radiograms was finalized. These models were combined with 100 radiograms to arrive at a size distribution representing the adult male and female populations. (CH)

  11. An analyst's self-analysis.

    PubMed

    Calder, K T

    1980-01-01

    I have told you why I selected the topic of self-analysis, and I have described my method for it: recording primary data such as dreams, daydreams, memories, and symptoms, recording associations to these primary data, and then attempting to analyze this written material. I have described a dream, a memory, and a daydream which is also a symptom, each a piece of primary data that I found useful in understanding myself. Finally, I reached some conclusions regarding the uses of self-analysis, including self-analysis as a research tool.

  12. Predictors of performance of students in biochemistry in a doctor of chiropractic curriculum.

    PubMed

    Shaw, Kathy; Rabatsky, Ali; Dishman, Veronica; Meseke, Christopher

    2014-01-01

    Objective: This study investigated the effect of completion of course prerequisites, undergraduate grade point average (GPA), undergraduate degree, and study habits on the performance of students in the biochemistry course at Palmer College of Chiropractic Florida. Methods: Students self-reported information regarding academic preparation at the beginning of the semester using a questionnaire. Final exam grade and final course grade were noted and used as measures of performance. Multivariate analysis of variance was used to determine whether the number of prerequisites completed, undergraduate GPA, undergraduate degree, hours spent studying in undergraduate study, and hours spent studying in the first quarter of the chiropractic program were significantly associated with the biochemistry final exam grade or the final grade for the biochemistry course. Results: The number of prerequisites completed, undergraduate degree, hours spent studying in undergraduate study, and hours spent studying in the first quarter of the chiropractic program did not significantly affect the biochemistry final exam grade or the final course grade, but undergraduate GPA did. Subsequent univariate analysis and Tukey's post hoc comparisons revealed that students with an undergraduate GPA in the 3.5 to 3.99 range earned significantly higher final course grades than students with an undergraduate GPA in the 2.5 to 2.99 range. Conclusion: No single variable was determined to be a factor that determines student success in biochemistry. The interrelationship between the factors examined warrants further investigation to understand fully how to predict the success of a student in the biochemistry course.

  13. Investigation of the Bailey Method for the design and analysis of dense-graded HMAC using Oregon aggregates : final report.

    DOT National Transportation Integrated Search

    2006-09-01

    Historically Oregon has specified gradations for dense-graded hot mix asphalt concrete (HMAC) using a combination of broadband limits and recommended ideal gradations. The recent adoption of SuperPave and Stone Matrix Asphalt (SMA) technolog...

  14. Correlation of rapid hydrometer analysis for select material to existing procedure LDH-TR-407-66 : final report.

    DOT National Transportation Integrated Search

    1968-05-01

    Conditions arise during construction of bases with Portland cement stabilized soils which require close programming of work. Therefore, time is of significant importance. : That is the objective of this report; to evaluate a method by which considera...

  15. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.

  16. Analysis of wave motion in one-dimensional structures through fast-Fourier-transform-based wavelet finite element method

    NASA Astrophysics Data System (ADS)

    Shen, Wei; Li, Dongsheng; Zhang, Shuaifang; Ou, Jinping

    2017-07-01

    This paper presents a hybrid method that combines the B-spline wavelet on the interval (BSWI) finite element method and spectral analysis based on the fast Fourier transform (FFT) to study wave propagation in one-dimensional (1D) structures. BSWI scaling functions are utilized to approximate the theoretical wave solution in the spatial domain and construct a high-accuracy dynamic stiffness matrix. Dynamic reduction at the element level is applied to eliminate the interior degrees of freedom of BSWI elements and substantially reduce the size of the system matrix. The dynamic equations of the system are then transformed and solved in the frequency domain through FFT-based spectral analysis, which is especially suitable for parallel computation. A comparative analysis of four different finite element methods is conducted to demonstrate the validity and efficiency of the proposed method when utilized in high-frequency wave problems. Other numerical examples are utilized to simulate the influence of cracks and delamination on wave propagation in 1D rods and beams. Finally, the errors caused by the FFT and their corresponding solutions are presented.

  17. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

    Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers in this vector, which belong to chromatographic peaks, and to update the baseline until convergence. The optimized baseline vector was finally expanded onto the original chromatogram, and linear interpolation was employed to estimate the background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight data from a metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving window minimum value strategy, and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
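
    The strategy outlined (seed a baseline from local minima, iteratively reject points that actually sit on peaks, then expand back to the full axis by linear interpolation) can be sketched in a few lines. This is our reading of the abstract, not the authors' code; the polynomial trend model and the rejection threshold are assumptions:

    ```python
    import numpy as np

    def baseline_correct(y, order=3, tol=3.0, max_iter=50):
        """Estimate and subtract background drift from a 1D chromatogram."""
        x = np.arange(len(y), dtype=float)
        # Step 1: local minima (plus endpoints) form the initial baseline vector
        mins = np.where((y[1:-1] <= y[:-2]) & (y[1:-1] <= y[2:]))[0] + 1
        idx = np.concatenate(([0], mins, [len(y) - 1]))
        # Step 2: iteratively drop baseline points lying well above a smooth trend
        for _ in range(max_iter):
            trend = np.polyval(np.polyfit(x[idx], y[idx], order), x[idx])
            resid = y[idx] - trend
            keep = resid <= tol * resid.std()
            if keep.all():
                break
            idx = idx[keep]
        # Step 3: expand the optimized baseline to every point and subtract it
        baseline = np.interp(x, x[idx], y[idx])
        return y - baseline, baseline

    # Synthetic check: two narrow peaks riding on a slow quadratic drift
    t = np.linspace(0.0, 1.0, 500)
    rng = np.random.default_rng(3)
    signal = (2 + 3 * t**2 + np.exp(-((t - 0.3) / 0.01) ** 2)
              + np.exp(-((t - 0.7) / 0.01) ** 2) + rng.normal(0, 0.01, t.size))
    corrected, est = baseline_correct(signal)
    ```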

  18. Sensitivity analysis of a sound absorption model with correlated inputs

    NASA Astrophysics Data System (ADS)

    Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.

    2017-04-01

    Sound absorption in porous media is a complex phenomenon, which is usually addressed with homogenized models depending on macroscopic parameters. Since these parameters emerge from the structure at the microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods for a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard (JCA) model is chosen as the objective model, with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method, the Fourier Amplitude Sensitivity Test with Correlation design (FASTC), based on Iman's transform, is applied. This method requires a priori information such as the variables' marginal distribution functions and their correlation matrix. The results are compared to the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix, are studied. Finally, the test results show that correlation has a very important impact on the results of sensitivity analysis. The influence of the correlation strength among input variables on the sensitivity analysis is also assessed.

  19. Perceived stress at transition to workplace: a qualitative interview study exploring final-year medical students’ needs

    PubMed Central

    Moczko, Tobias R; Bugaj, Till J; Herzog, Wolfgang; Nikendei, Christoph

    2016-01-01

    Objectives: This study was designed to explore final-year medical students' stressors and coping strategies at the transition to the clinical workplace. Methods: In this qualitative study, semi-standardized interviews with eight final-year medical students (five male, three female; aged 25.9±1.4 years) were conducted during their internal medicine rotation. After verbatim transcription, a qualitative content analysis of students' impressions of stress-provoking and stress-easing factors during final-year education was performed. Results: Students' statements regarding burdens and dealing with stress were classified into four main categories: A) perceived stressors and provoking factors, B) stress-induced consequences, C) personal and external resources for preventing and dealing with stress, and D) final-year students' suggestions for workplace improvement. Conclusion: Final-year medical students perceived different types of stress during their transition to medical wards, and reported both negative consequences and coping resources concerning perceived stress. As supervision, feedback, and coping strategies played an important role in the students' perception of stress, final-year medical education curricula development should focus on these specifically. PMID:26834503

  20. pcr: an R package for quality assessment, analysis and testing of qPCR data

    PubMed Central

    Ahmed, Mahmoud

    2018-01-01

    Background: Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods: We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiencies and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results: Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion: The pcr package provides an intuitive and unified interface for its main functions, allowing biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
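
    The Double Delta CT model at the heart of the package is a single line of arithmetic: normalise the target gene to a reference gene in each group, take the difference between groups, and exponentiate. A sketch of the calculation (in Python rather than the package's R, with made-up Ct values):

    ```python
    def double_delta_ct(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
        """Relative expression by the 2^-ddCt model (assumes ~100% efficiency).

        Inputs are mean quantification cycles (Ct) of the target and reference
        (housekeeping) genes in the treated and control groups.
        """
        d_ct_treated = ct_target - ct_ref            # normalise to reference gene
        d_ct_control = ct_target_ctrl - ct_ref_ctrl
        dd_ct = d_ct_treated - d_ct_control
        return 2.0 ** (-dd_ct)

    # Target amplifies two cycles earlier (relative to reference) after
    # treatment, i.e. a 4-fold up-regulation:
    print(double_delta_ct(24.0, 18.0, 26.0, 18.0))  # 4.0
    ```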

  1. EMD-Based Symbolic Dynamic Analysis for the Recognition of Human and Nonhuman Pyroelectric Infrared Signals.

    PubMed

    Zhao, Jiaduo; Gong, Weiguo; Tang, Yuzhen; Li, Weihong

    2016-01-20

    In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using a mathematical model of the PIR detector, we analyze the physical characteristics of human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, we first extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method; we then generate five feature descriptors for each PIR signal by constructing five probabilistic finite state automata from the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify human and nonhuman PIR signals and reduce PIR detector false alarms.

  2. Neutral monosaccharide composition analysis of plant-derived oligo- and polysaccharides by high performance liquid chromatography.

    PubMed

    Yan, Jun; Shi, Songshan; Wang, Hongwei; Liu, Ruimin; Li, Ning; Chen, Yonglin; Wang, Shunchun

    2016-01-20

    A novel analytical method for neutral monosaccharide composition analysis of plant-derived oligo- and polysaccharides was developed using hydrophilic interaction liquid chromatography coupled to a charged aerosol detector. The effects of column type, additives, pH and column temperature on retention and separation were evaluated. Additionally, the method could distinguish potential impurities in samples, including chloride, sulfate and sodium, from sugars. The results of validation demonstrated that this method had good linearity (R² ≥ 0.9981), high precision (relative standard deviation ≤ 4.43%), and adequate accuracy (94.02-103.37% recovery) and sensitivity (detection limit: 15-40 ng). Finally, the monosaccharide compositions of the polysaccharide from Eclipta prostrata L. and stachyose were successfully profiled with this method. This report represents the first time that all of these common monosaccharides could be well-separated and determined simultaneously by high performance liquid chromatography without additional derivatization. This newly developed method is convenient, efficient and reliable for monosaccharide analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    NASA Technical Reports Server (NTRS)

    Gaebler, John A.; Tolson, Robert H.

    2010-01-01

    In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insight are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insight gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input variable's uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
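
    Variance-based sensitivity analysis of the kind described is commonly computed via Sobol indices; the SALib package (our tooling choice, not the paper's) keeps the sample-evaluate-analyze loop short. The algebraic output below is a stand-in for an EDL simulation, and the input names are made up:

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Three uncertain inputs standing in for EDL dispersions
    problem = {"num_vars": 3,
               "names": ["entry_angle", "mass", "drag_coeff"],
               "bounds": [[-1.0, 1.0]] * 3}

    X = saltelli.sample(problem, 1024)               # Saltelli sampling scheme
    Y = 5 * X[:, 0] + X[:, 1] ** 2 + 0.1 * X[:, 2]   # surrogate model output

    Si = sobol.analyze(problem, Y)
    print(Si["S1"])  # first-order indices: entry_angle should dominate
    ```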

  4. A third-order approximation method for three-dimensional wheel-rail contact

    NASA Astrophysics Data System (ADS)

    Negretti, Daniele

    2012-03-01

    Multibody train analysis is used increasingly by railway operators whenever a reliable and time-efficient method to evaluate the contact between wheel and rail is needed; in particular, wheel-rail contact is one of the most important aspects affecting reliable and time-efficient vehicle dynamics computation. The focus of the approach proposed here is to carry out such tasks by means of online wheel-rail elastic contact detection. In order to improve efficiency and save time, a mainly analytical approach is used for the definition of the wheel and rail surfaces as well as for contact detection, and a final numerical evaluation is used to locate the contact. The final numerical procedure consists in finding the zeros of a nonlinear function of a single variable. The overall method is based on an approximation of the wheel surface which, as shown in the paper, does not influence the contact location significantly.
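
    The closing step, finding the zero of a nonlinear function of a single variable, is exactly what bracketing root-finders handle. A sketch with SciPy's brentq and a purely illustrative gap function (the paper's actual function comes from its third-order surface approximation):

    ```python
    from scipy.optimize import brentq

    def gap(s):
        """Illustrative signed wheel-rail gap along a lateral coordinate s (m)."""
        return 0.5 * (s - 0.012) ** 2 - 4e-5  # negative where surfaces overlap

    # gap changes sign on [0.012, 0.05], so a root (the contact location)
    # lies inside the bracket.
    s_contact = brentq(gap, 0.012, 0.05)
    print(s_contact)  # ~0.0209 m
    ```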

  5. [The application of stereology in radiology imaging and cell biology fields].

    PubMed

    Hu, Na; Wang, Yan; Feng, Yuanming; Lin, Wang

    2012-08-01

Stereology is an interdisciplinary method for 3D morphological study developed from mathematics and morphology, and it is widely used in medical image analysis and cell biology studies. Because it is unbiased, simple, fast, reliable and non-invasive, stereology has been widely applied in biomedical areas for quantitative analysis and statistics, such as histology, pathology and medical imaging. Because stereological parameters show distinct differences between pathologies, many scholars have used stereological methods for quantitative analysis in recent years, for example when studying the condition of cancer cells, tumor grade, disease development and patient prognosis. This paper describes the stereological concept and estimation methods, illustrates the applications of stereology in the fields of CT images, MRI images and cell biology, and finally reflects on the universality, superiority and reliability of stereology.

  6. Comparison of sample preparation methods combined with fast gas chromatography-mass spectrometry for ultratrace analysis of pesticide residues in baby food.

    PubMed

    Hercegová, Andrea; Dömötörová, Milena; Kruzlicová, Dása; Matisová, Eva

    2006-05-01

Four sample preparation techniques were compared for the ultratrace analysis of pesticide residues in baby food: (a) modified Schenck's method based on ACN extraction with SPE cleaning; (b) quick, easy, cheap, effective, rugged, and safe (QuEChERS) method based on ACN extraction and dispersive SPE; (c) modified QuEChERS method which utilizes column-based SPE instead of dispersive SPE; and (d) matrix solid phase dispersion (MSPD). The methods were combined with fast gas chromatographic-mass spectrometric analysis. The effectiveness of clean-up of the final extract was determined by comparison of the chromatograms obtained. Time consumption, laboriousness, demands on glassware and working place, and consumption of chemicals, especially solvents, increase in the following order: QuEChERS < modified QuEChERS < MSPD < modified Schenck's method. All methods offer satisfactory analytical characteristics, in terms of recoveries and repeatability, at concentration levels of 5, 10, and 100 microg/kg. Recoveries obtained with the modified QuEChERS method were lower than with the original QuEChERS. In general, the best LOQs were obtained with the modified Schenck's method. The modified QuEChERS method provides 21-72% better LOQs than the original method.

  7. Spectral iterative method and convergence analysis for solving nonlinear fractional differential equation

    NASA Astrophysics Data System (ADS)

    Yarmohammadi, M.; Javadi, S.; Babolian, E.

    2018-04-01

    In this study a new spectral iterative method (SIM) based on fractional interpolation is presented for solving nonlinear fractional differential equations (FDEs) involving Caputo derivative. This method is equipped with a pre-algorithm to find the singularity index of solution of the problem. This pre-algorithm gives us a real parameter as the index of the fractional interpolation basis, for which the SIM achieves the highest order of convergence. In comparison with some recent results about the error estimates for fractional approximations, a more accurate convergence rate has been attained. We have also proposed the order of convergence for fractional interpolation error under the L2-norm. Finally, general error analysis of SIM has been considered. The numerical results clearly demonstrate the capability of the proposed method.
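    For reference, the Caputo derivative the abstract refers to has the standard textbook definition below; this is general background, not a formula specific to the paper.

```latex
% Caputo fractional derivative of order \alpha with n-1 < \alpha < n:
{}^{C}\!D^{\alpha}_{0,t}\, f(t)
  = \frac{1}{\Gamma(n-\alpha)}
    \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha+1-n}}\, d\tau,
\qquad n-1 < \alpha < n.
```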

  8. Development and validation of a professionalism assessment scale for medical students

    PubMed Central

    Klemenc-Ketis, Zalika; Vrecko, Helena

    2014-01-01

Objectives To develop and validate a scale for the assessment of professionalism in medical students based on students' perceptions of and attitudes towards professionalism in medicine. Methods This was a mixed methods study with undergraduate medical students. Two focus groups were carried out with 12 students, followed by a transcript analysis (grounded theory method with open coding). Then, a 3-round Delphi with 20 family medicine experts was carried out. A psychometric assessment of the scale was performed with a group of 449 students. The items of the Professionalism Assessment Scale could be answered on a five-point Likert scale. Results After the focus groups, the first version of the PAS consisted of 56 items and after the Delphi study, 30 items remained. The final sample for quantitative study consisted of 122 students (27.2% response rate). There were 95 (77.9%) female students in the sample. The mean age of the sample was 22.1 ± 2.1 years. After the principal component analysis, we removed 8 items and produced the final version of the PAS (22 items). The Cronbach's alpha of the scale was 0.88. Factor analysis revealed three factors: empathy and humanism, professional relationships and development and responsibility. Conclusions The new Professionalism Assessment Scale proved to be valid and reliable. It can be used for the assessment of professionalism in undergraduate medical students. PMID:25382090
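    The reliability figure quoted above is Cronbach's alpha; a minimal sketch of its computation on a respondents-by-items Likert matrix follows (the scores are invented, not the study's data).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 students answering a 4-item five-point Likert scale.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 4, 3, 3],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```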

  9. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.

  10. An Application of Retroduction to Analyzing and Testing the Backing off of Nuts and Bolts During Dynamic Loading

    NASA Technical Reports Server (NTRS)

    Kerley, James J.

    1987-01-01

The method of retroduction, adapted from the doctoral thesis of Dr. A. Croce, relies on a process of dialectic questioning that begins with the information sought, proceeds to Given items (either in the form of dimensions or limits of research) and to Known mathematical forms of analysis in design or to principles of study in research. Finally, analysis and synthesis are used to abstract the dialectic questions and to arrive at the information desired. This method is used to solve the engineering design problem of a beam and to determine why bolts and nuts vibrate apart. Both mathematical analysis and dialectic logical analysis are utilized. Results are provided of tests conducted to check the retroductive study of why and how nuts back off.

  11. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, experts' experience, theories, typical examples and other related knowledge used in the pre-processing stage of FEA were categorized into analysis-process and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, followed by an integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web based CAE application is illustrated, and the analysis of a machine tool's column is presented to demonstrate the validity of the system.

  12. Various methods of heat supply for a building which is operated periodically during the year

    NASA Astrophysics Data System (ADS)

    Małetka, Marek; Laska, Marta

    2017-11-01

Stand-alone buildings operated periodically require careful analysis of the heat supply for hot water and heating purposes, in terms of technical capabilities and the energy and financial outlays. The paper presents an analysis of the heat supply for hot water and central heating in a stand-alone cloakroom building located in Poland. The analysis is undertaken for different variants of heat delivery, from electric heaters, a gas boiler and district heating to renewable sources, namely solar panels and heat pumps. For each solution, the usable, final and primary energy use was calculated, and a financial analysis of the investment and energy costs was carried out. The analysis was done according to the SPBT (simple payback time) and NPV (net present value) methods for different levels of building use.
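    The two financial indicators named above can be sketched as follows, assuming a constant annual saving; SPBT is read here as simple payback time, and all figures are illustrative.

```python
# Minimal sketch of the two indicators, under a constant-annual-saving assumption.
def spbt(investment: float, annual_saving: float) -> float:
    """Simple payback time in years."""
    return investment / annual_saving

def npv(investment: float, annual_saving: float, years: int, rate: float) -> float:
    """Net present value of the investment at a given discount rate."""
    discounted = sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))
    return discounted - investment

print(spbt(50_000, 6_000))           # ~8.3 years
print(npv(50_000, 6_000, 15, 0.05))  # positive -> the variant pays off
```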

  13. Analysis and application of intelligence network based on FTTH

    NASA Astrophysics Data System (ADS)

    Feng, Xiancheng; Yun, Xiang

    2008-12-01

With the continued rapid growth of the Internet, new network services emerge in an endless stream, especially network gaming, video conferencing and video on demand, and bandwidth requirements increase continuously. Network and optical device technologies are developing rapidly. FTTH supports all present and future services with enormous bandwidth, including traditional telecommunication, data and TV services as well as future digital TV and VOD. With its huge bandwidth, FTTH is regarded as the final solution for broadband networks and has become the final goal of optical access network development. This paper first introduces the main services FTTH supports and analyses key technologies such as FTTH system composition, topological structure, multiplexing, and optical cables and devices, focusing on two realization methods, PON and P2P. It then proposes an FTTH solution supporting comprehensive access (services such as broadband data, voice, video and narrowband private lines). Finally, it presents an engineering application of FTTH in a district and a building, which brings substantial economic and social benefits.

  14. Boosting Higgs pair production in the [Formula: see text] final state with multivariate techniques.

    PubMed

    Behr, J Katharina; Bortoletto, Daniela; Frost, James A; Hartland, Nathan P; Issever, Cigdem; Rojo, Juan

    2016-01-01

    The measurement of Higgs pair production will be a cornerstone of the LHC program in the coming years. Double Higgs production provides a crucial window upon the mechanism of electroweak symmetry breaking and has a unique sensitivity to the Higgs trilinear coupling. We study the feasibility of a measurement of Higgs pair production in the [Formula: see text] final state at the LHC. Our analysis is based on a combination of traditional cut-based methods with state-of-the-art multivariate techniques. We account for all relevant backgrounds, including the contributions from light and charm jet mis-identification, which are ultimately comparable in size to the irreducible 4 b QCD background. We demonstrate the robustness of our analysis strategy in a high pileup environment. For an integrated luminosity of [Formula: see text] ab[Formula: see text], a signal significance of [Formula: see text] is obtained, indicating that the [Formula: see text] final state alone could allow for the observation of double Higgs production at the High Luminosity LHC.

  15. Bending Distortion Analysis of a Steel Shaft Manufacturing Chain from Cold Drawing to Grinding

    NASA Astrophysics Data System (ADS)

    Dias, Vinicius Waechter; da Silva Rocha, Alexandre; Zottis, Juliana; Dong, Juan; Epp, Jérémy; Zoch, Hans Werner

    2017-04-01

Shafts are usually manufactured from bars that are cold drawn, cut, machined, induction hardened, straightened, and finally ground. The main distortion is characterized by bending that appears after induction hardening and is corrected by straightening and/or grinding. In this work, the effect of variations in manufacturing parameters on distortion was analyzed for a complete manufacturing route for the production of induction-hardened shafts made of Grade 1045 steel. A DoE plan was implemented varying the drawing angle, cutting method, induction-hardening layer depth, and grinding penetration depth. The distortion was determined by calculating curvature vectors from dimensional analysis by 3D coordinate measurements. Optical microscopy, microhardness testing, residual stress analysis, and FEM process simulation were used to evaluate and understand the effects of the main carriers of distortion potential. The drawing process was identified as the most significant influence on the final distortion of the shafts.

  16. Remarks on residual stress measurement by hole-drilling and electronic speckle pattern interferometry.

    PubMed

    Barile, Claudia; Casavola, Caterina; Pappalettera, Giovanni; Pappalettere, Carmine

    2014-01-01

    Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was analysed. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value.

  17. Remarks on Residual Stress Measurement by Hole-Drilling and Electronic Speckle Pattern Interferometry

    PubMed Central

    2014-01-01

    Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was analysed. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value. PMID:25276850

  18. Three-Dimensional Viscous Alternating Direction Implicit Algorithm and Strategies for Shape Optimization

    NASA Technical Reports Server (NTRS)

    Pandya, Mohagna J.; Baysal, Oktay

    1997-01-01

A gradient-based shape optimization based on quasi-analytical sensitivities has been extended for practical three-dimensional aerodynamic applications. The flow analysis has been rendered by a fully implicit, finite-volume formulation of the Euler and Thin-Layer Navier-Stokes (TLNS) equations. Initially, the viscous laminar flow analysis for a wing has been compared with an independent computational fluid dynamics (CFD) code which has been extensively validated. The new procedure has been demonstrated in the design of a cranked arrow wing at Mach 2.4 with coarse- and fine-grid based computations performed with Euler and TLNS equations. The influence of the initial constraints on the geometry and aerodynamics of the optimized shape has been explored. Various final shapes generated for an identical initial problem formulation but with different optimization path options (coarse or fine grid, Euler or TLNS), have been aerodynamically evaluated via a common fine-grid TLNS-based analysis. The initial constraint conditions show significant bearing on the optimization results. Also, the results demonstrate that to produce an aerodynamically efficient design, it is imperative to include the viscous physics in the optimization procedure with the proper resolution. Based upon the present results, to better utilize scarce computational resources, it is recommended that a number of viscous coarse-grid cases, using either a preconditioned bi-conjugate gradient (PbCG) or an alternating-direction-implicit (ADI) method, initially be employed to improve the optimization problem definition, the design space and the initial shape. Optimized shapes should subsequently be analyzed using a high-fidelity (viscous with fine-grid resolution) flow analysis to evaluate their true performance potential. Finally, a viscous fine-grid-based shape optimization should be conducted, using an ADI method, to accurately obtain the final optimized shape.

  19. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation. PMID:25867077

  20. Eigenvalue sensitivity analysis of planar frames with variable joint and support locations

    NASA Technical Reports Server (NTRS)

    Chuang, Ching H.; Hou, Gene J. W.

    1991-01-01

Two sensitivity equations are derived in this study based upon the continuum approach for eigenvalue sensitivity analysis of planar frame structures with variable joint and support locations. A variational form of an eigenvalue equation is first derived in which all of the quantities are expressed in the local coordinate system attached to each member. The material derivative of this variational equation is then sought to account for changes in each member's length and orientation resulting from the perturbation of joint and support locations. Finally, eigenvalue sensitivity equations are formulated in either domain quantities (by the domain method) or boundary quantities (by the boundary method). It is concluded that the sensitivity equation derived by the boundary method is more efficient in computation but less accurate than that of the domain method. Nevertheless, both of them are superior in computational efficiency to the conventional direct differentiation method and the finite difference method.
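    The flavor of such a sensitivity equation can be illustrated numerically. For a generalized eigenproblem K(b)φ = λMφ with M-orthonormal eigenvectors, the standard result is dλ/db = φᵀ(∂K/∂b − λ ∂M/∂b)φ. The 2-DOF matrices below are toy stand-ins for a frame model (with ∂M/∂b = 0), not the paper's formulation.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 2-DOF system; design variable b scales the first spring stiffness.
def K(b):
    return np.array([[2.0 + b, -1.0], [-1.0, 2.0]])

M = np.eye(2)
b0, db = 1.0, 1e-6

lam, phi = eigh(K(b0), M)            # eigenvectors come out M-orthonormal
dK = (K(b0 + db) - K(b0 - db)) / (2 * db)

# Analytical sensitivity: dlam_i/db = phi_i^T (dK/db) phi_i  (dM/db = 0 here)
analytic = np.array([phi[:, i] @ dK @ phi[:, i] for i in range(2)])

# Finite-difference check on the eigenvalues themselves
fd = (eigh(K(b0 + db), M, eigvals_only=True)
      - eigh(K(b0 - db), M, eigvals_only=True)) / (2 * db)
print(analytic, fd)                  # the two should agree closely
```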

  1. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.

  2. Environmental friendly method for the extraction of coir fibre and isolation of nanofibre.

    PubMed

    Abraham, Eldho; Deepa, B; Pothen, L A; Cintil, J; Thomas, S; John, M J; Anandjiwala, R; Narine, S S

    2013-02-15

The objective of this work was to develop an environmentally friendly method for the effective utilization of coir fibre by adopting steam pre-treatment. The retting of the coconut bunch creates serious environmental problems, which can be avoided by this method. Chemical characterization of the fibre at each processing stage confirmed an increase in cellulose content from the raw fibre (40%) to the final steam-treated fibre (93%). Morphological and dynamic light scattering analyses of the fibres at different processing stages revealed that the isolation of cellulose nanofibres occurs in the final step of the process, as an aqueous suspension. FT-IR and XRD analysis demonstrated that the treatments lead to the gradual removal of lignin and hemicelluloses from the fibres. The existence of a strong lignin-cellulose complex in the raw coir fibre is evidenced by its enhanced thermal stability. Steam explosion has proved to be a green method to expand the application areas of coir fibre. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. IMMUNE FUNCTION AS A BIOMARKER FOR CONTAMINANT EXPOSURE IN SEABIRDS: DEVELOPMENT OF SAMPLE STORAGE AND ANALYSIS METHODS. (U915730)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Analysis of Costs and Benefits in Rehabilitation. Final Report.

    ERIC Educational Resources Information Center

    Berkowitz, Monroe, Ed.; And Others

    This report suggests feasible alternatives to the present methods of calculating benefits and costs of the joint federal-state vocational rehabilitation program. "Summary and Guide to Reading This Report" (Monroe Berkowitz) appears first. Part I, Background, Theory and Models, includes "The Cost Benefit Tradition in Vocational Rehabilitation"…

  5. Effectiveness of various public private partnership pavement rehabilitation treatments: A big data informatics survival analysis of pavement service life : final report.

    DOT National Transportation Integrated Search

    2017-09-29

    Past research efforts have used a wide variety of methodological approaches to analyze pavement performance indicators, pavement rehabilitation treatments, and pavement service life. Using big data informatics methods, the intent of this study is to ...

  6. 78 FR 76310 - Agency Information Collection Activities: Submission to OMB for Review and Approval; Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ... collecting qualitative and quantitative information. To support the qualitative analysis, HRSA will conduct... sites in order to gain a deeper understanding of the program's implementation. Finally, quantitative... forms; and 3. Client satisfaction surveys. ORHP is seeking approval from OMB for the three methods of...

  7. Unemployment Benefit Exhaustion: Incentive Effects on Job-Finding Rates

    ERIC Educational Resources Information Center

    Filges, Trine; Geerdsen, Lars Pico; Knudsen, Anne-Sofie Due; Jørgensen, Anne-Marie Klint

    2015-01-01

    Purpose: This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12 studies for final analysis and interpretation.…

  8. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce of C_l = -1.377 in comparison to C_l = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag of C_d = 0.115 in comparison to C_d = 0.143 for the original wing.
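    The iterative CFD-FE coupling scheme described above can be caricatured as a fixed-point iteration between a load solver and a deflection solver. The two one-DOF "solvers" below are toy stand-ins with invented constants, not RANS or FE codes.

```python
# Conceptual sketch of a static-aeroelastic coupling loop:
# CFD load -> FE deflection -> repeat until the deflection stops changing.
def aero_load(deflection: float) -> float:
    # Load relaxes as the wing deflects off-design (illustrative linear model).
    return 1000.0 * (1.0 - 0.3 * deflection)

def structural_deflection(load: float) -> float:
    return load / 5000.0  # toy compliance

w = 0.0
for it in range(50):
    w_new = structural_deflection(aero_load(w))
    if abs(w_new - w) < 1e-8:
        break
    w = w_new
print(f"converged deflection {w:.6f} after {it} iterations")
```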

  9. a Cognitive Approach to Teaching a Graduate-Level Geobia Course

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel A.

    2016-06-01

    Remote sensing image analysis training occurs both in the classroom and the research lab. Education in the classroom for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This current paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA, but also the cognitive factors of expertise, that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for development of an open-source repository for course materials.

  10. Evaluating water management strategies in watersheds by new hybrid Fuzzy Analytical Network Process (FANP) methods

    NASA Astrophysics Data System (ADS)

    RazaviToosi, S. L.; Samani, J. M. V.

    2016-03-01

Watersheds are considered hydrological units. Their other important aspects, such as economic, social and environmental functions, play crucial roles in sustainable development. The objective of this work is to develop methodologies to prioritize watersheds by considering different development strategies in the environmental, social and economic sectors. Such a ranking can play a significant role in management by identifying the most critical watersheds, where employing water management strategies is expected to bring about the greatest improvement. Due to the complex relations among different criteria, two new hybrid fuzzy ANP (Analytical Network Process) algorithms, fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) and fuzzy max-min set methods, are used to provide a more flexible and accurate decision model. Five watersheds in Iran, named Oroomeyeh, Atrak, Sefidrood, Namak and Zayandehrood, are considered as alternatives. Based on long-term development goals, 38 water management strategies are defined as subcriteria in 10 clusters. The main advantage of the proposed methods is their ability to handle uncertainty, accomplished by using fuzzy numbers in all steps of the algorithms. To validate the proposed method, the final results were compared with those obtained from the ANP algorithm, and the Spearman rank correlation coefficient was applied to assess the similarity of the different rankings. Finally, a sensitivity analysis was conducted to investigate the influence of cluster weights on the final ranking.
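    The crisp core of a TOPSIS ranking can be sketched in a few lines; the paper's fuzzy variants layer fuzzy numbers on top of this idea. The decision matrix and weights below are invented, and all criteria are assumed benefit-type.

```python
import numpy as np

# Rows = alternatives (e.g., watersheds), columns = criteria; invented values.
X = np.array([
    [7.0, 0.6, 120.0],
    [5.5, 0.8,  90.0],
    [8.2, 0.5, 150.0],
])
w = np.array([0.5, 0.3, 0.2])                  # criterion weights

R = X / np.linalg.norm(X, axis=0)              # vector-normalize each column
V = R * w                                      # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)     # positive/negative ideal solutions

d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)            # higher = closer to the ideal
print(np.argsort(-closeness))                  # ranking of the alternatives
```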

  11. FPGA Implementation of the Coupled Filtering Method and the Affine Warping Method.

    PubMed

    Zhang, Chen; Liang, Tianzhu; Mok, Philip K T; Yu, Weichuan

    2017-07-01

    In ultrasound image analysis, the speckle tracking methods are widely applied to study the elasticity of body tissue. However, "feature-motion decorrelation" still remains as a challenge for the speckle tracking methods. Recently, a coupled filtering method and an affine warping method were proposed to accurately estimate strain values, when the tissue deformation is large. The major drawback of these methods is the high computational complexity. Even the graphics processing unit (GPU)-based program requires a long time to finish the analysis. In this paper, we propose field-programmable gate array (FPGA)-based implementations of both methods for further acceleration. The capability of FPGAs on handling different image processing components in these methods is discussed. A fast and memory-saving image warping approach is proposed. The algorithms are reformulated to build a highly efficient pipeline on FPGA. The final implementations on a Xilinx Virtex-7 FPGA are at least 13 times faster than the GPU implementation on the NVIDIA graphic card (GeForce GTX 580).

  12. Simultaneous analysis of aminoglycosides with many other classes of drug residues in bovine tissues by ultrahigh-performance liquid chromatography-tandem mass spectrometry using an ion-pairing reagent added to final extracts

    USDA-ARS?s Scientific Manuscript database

    The way to maximize scope of analysis, sample throughput, and laboratory efficiency in the monitoring of veterinary drug residues in food animals is to determine as many analytes as possible as fast as possible in as few methods as possible. Capital and overhead expenses are also reduced by using f...

  13. Three-parameter error analysis method based on rotating coordinates in rotating birefringent polarizer system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Junjie; Jia, Hongzhi, E-mail: hzjia@usst.edu.cn

    2015-11-15

We propose an error analysis using a rotating coordinate system with three parameters of linearly polarized light (incidence angle, azimuth angle on the front surface, and angle between the incidence and vibration planes) and demonstrate the method on a rotating birefringent prism system. The transmittance and angles are calculated plane-by-plane using a birefringence ellipsoid model, and the final transmitted intensity equation is deduced. The effects of oblique incidence, light interference, beam convergence, and misalignment of the rotation and prism axes are discussed. We simulate the entire error model using MATLAB and conduct experiments on a polarimeter built for this purpose. The simulation and experimental results are consistent and demonstrate the rationality and validity of this method.

  14. Classical least squares multivariate spectral analysis

    DOEpatents

    Haaland, David M.

    2002-01-01

    An improved classical least squares multivariate spectral analysis method that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions to the CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of PACLS includes the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
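    A minimal CLS sketch, assuming synthetic Gaussian pure-component spectra: the mixture spectrum is modeled as a linear combination of known spectra and concentrations are recovered by least squares. In PACLS terms, the non-calibrated spectral shapes mentioned above would enter as additional columns of the spectral matrix during prediction.

```python
import numpy as np

wavelengths = np.linspace(0, 100, 200)

def band(center, width=8.0):
    # Illustrative Gaussian absorption band.
    return np.exp(-((wavelengths - center) / width) ** 2)

S = np.column_stack([band(30), band(55), band(75)])   # pure-component spectra
c_true = np.array([0.5, 1.2, 0.8])
mixture = S @ c_true + 0.01 * np.random.default_rng(1).normal(size=wavelengths.size)

# Classical least squares: solve S c ~= mixture for the concentrations c.
c_hat, *_ = np.linalg.lstsq(S, mixture, rcond=None)
print(c_hat)   # ~ [0.5, 1.2, 0.8]
```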

  15. A round robin approach to the analysis of bisphenol a (BPA) in human blood samples

    PubMed Central

    2014-01-01

    Background Human exposure to bisphenol A (BPA) is ubiquitous, yet there are concerns about whether BPA can be measured in human blood. This Round Robin was designed to address this concern through three goals: 1) to identify collection materials, reagents and detection apparatuses that do not contribute BPA to serum; 2) to identify sensitive and precise methods to accurately measure unconjugated BPA (uBPA) and BPA-glucuronide (BPA-G), a metabolite, in serum; and 3) to evaluate whether inadvertent hydrolysis of BPA-G occurs during sample handling and processing. Methods Four laboratories participated in this Round Robin. Laboratories screened materials to identify BPA contamination in collection and analysis materials. Serum was spiked with concentrations of uBPA and/or BPA-G ranging from 0.09-19.5 (uBPA) and 0.5-32 (BPA-G) ng/mL. Additional samples were preserved unspiked as ‘environmental’ samples. Blinded samples were provided to laboratories that used LC/MSMS to simultaneously quantify uBPA and BPA-G. To determine whether inadvertent hydrolysis of BPA metabolites occurred, samples spiked with only BPA-G were analyzed for the presence of uBPA. Finally, three laboratories compared direct and indirect methods of quantifying BPA-G. Results We identified collection materials and reagents that did not introduce BPA contamination. In the blinded spiked sample analysis, all laboratories were able to distinguish low from high values of uBPA and BPA-G, for the whole spiked sample range and for those samples spiked with the three lowest concentrations (0.5-3.1 ng/ml). By completion of the Round Robin, three laboratories had verified methods for the analysis of uBPA and two verified for the analysis of BPA-G (verification determined by: 4 of 5 samples within 20% of spiked concentrations). In the analysis of BPA-G only spiked samples, all laboratories reported BPA-G was the majority of BPA detected (92.2 – 100%). Finally, laboratories were more likely to be verified using direct methods than indirect ones using enzymatic hydrolysis. Conclusions Sensitive and accurate methods for the direct quantification of uBPA and BPA-G were developed in multiple laboratories and can be used for the analysis of human serum samples. BPA contamination can be controlled during sample collection and inadvertent hydrolysis of BPA conjugates can be avoided during sample handling. PMID:24690217

  16. Noise Reduction Design of the Volute for a Centrifugal Compressor

    NASA Astrophysics Data System (ADS)

    Song, Zhen; Wen, Huabing; Hong, Liangxing; Jin, Yudong

    2017-08-01

In order to effectively control the aerodynamic noise of a compressor, this paper takes a marine exhaust turbocharger compressor as its research object. Based on different design concepts for the volute section, tongue and exit cone, six volute models were established. The finite volume method is used to calculate the flow field, while the finite element method is used for the acoustic calculation. The different structural designs were compared and analyzed in terms of three aspects: noise level, isentropic efficiency and static pressure recovery coefficient. The results showed that model 1 performed best among the volute section concepts, model 3 best among the tongue concepts, and model 6 best among the exit cone concepts.

  17. Applying reliability analysis to design electric power systems for More-electric aircraft

    NASA Astrophysics Data System (ADS)

    Zhang, Baozhu

The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes significantly challenge aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use the traditional method of reliability block diagrams to analyze the reliability levels of different system topologies. We then propose a new methodology in which system topologies, constrained by a set reliability level, are automatically generated; the path-set method is used for the analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
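    The path-set method named above can be illustrated by brute-force enumeration of component states on a toy topology (practical only for small systems). The components, path sets and reliability figures below are invented, not the thesis's MEA architecture.

```python
from itertools import product

# System reliability from minimal path sets by enumerating component states.
component_rel = {"gen1": 0.99, "gen2": 0.99, "bus": 0.999, "load_conv": 0.98}
# System works if (gen1 or gen2) and bus and load_conv are all up:
path_sets = [{"gen1", "bus", "load_conv"}, {"gen2", "bus", "load_conv"}]

names = list(component_rel)
system_rel = 0.0
for states in product([True, False], repeat=len(names)):
    up = {n for n, s in zip(names, states) if s}
    if any(ps <= up for ps in path_sets):      # some path set fully operational
        p = 1.0
        for n, s in zip(names, states):
            p *= component_rel[n] if s else 1 - component_rel[n]
        system_rel += p
print(f"system reliability ~ {system_rel:.6f}")
```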

  18. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

Progress has been made in the direct-inverse wing design method in curvilinear coordinates, including the remedying of a spanwise oscillation problem and the assessment of the effects of grid skewness, viscous interaction, and the initial airfoil section on the final design. It was found that designing at every other spanwise station remedied the spanwise oscillation problem and produced the best results for the cases presented; that a smoothly varying grid is especially needed for accurate design at the wing tip; that the boundary layer displacement thicknesses must be included in a successful wing design; that the design of high and medium aspect ratio wings is possible with this code; and that the final airfoil section designed is fairly independent of the initial section.

  19. Difficulties Encountered by Final-Year Male Nursing Students in Their Internship Programmes

    PubMed Central

    Al-Momani, Mohammed Mahmoud

    2017-01-01

    Background The cultural norms of the Kingdom of Saudi Arabia do not encourage men to choose nursing as a career. Understanding male nursing students’ experiences of their clinical exposure to the nursing profession throughout their internship might increase their retention. This study explored the experiences of final-year male nursing students as they transitioned to the role of registered nurse. Methods A qualitative descriptive research design with an inductive content-analysis approach was used. The experiences of 22 final-year male nursing students from three public hospitals in a major city of Saudi Arabia were explored. The data were collected using focus-group interviews and documentary analysis in March 2015 and May 2015. Results Content analysis revealed three major themes: the societal and cultural image of male nurses, male students’ engagement in nursing practice, and restructuring the internship programmes’ policies to suit male students’ needs. Conclusion The findings reveal issues that mainly stem from negative social views of nursing as a male profession. Considering the students’ social and cultural needs during their internship programme will facilitate their transition into the role of registered nurse and their retention in the nursing profession. PMID:28951687

  20. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2013-01-01 2013-01-01 false Contents of applications; technical information in final...

  1. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2012-01-01 2012-01-01 false Contents of applications; technical information in final...

  2. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2014-01-01 2014-01-01 false Contents of applications; technical information in final...

  3. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2011-01-01 2011-01-01 false Contents of applications; technical information in final...

  4. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  5. A multi-strategy approach to informative gene identification from gene expression data.

    PubMed

    Liu, Ziying; Phan, Sieu; Famili, Fazel; Pan, Youlian; Lenferink, Anne E G; Cantin, Christiane; Collins, Catherine; O'Connor-McCourt, Maureen D

    2010-02-01

An unsupervised multi-strategy approach has been developed to identify informative genes from high-throughput genomic data. Several statistical methods have been used in the field to identify differentially expressed genes. Since different methods generate different lists of genes, it is very challenging to determine the most reliable gene list and the appropriate method. This paper presents a multi-strategy method in which a combination of several data analysis techniques is applied to a given dataset and a confidence measure is established to select genes from the gene lists generated by these techniques to form the core of our final selection. The remaining genes, which form the peripheral region, are subject to exclusion from or inclusion into the final selection. This paper demonstrates the methodology through its application to an in-house cancer genomics dataset and a public dataset. The results indicate that our method provides a more reliable list of genes, which are validated using biological knowledge, biological experiments, and literature search. We further evaluated our multi-strategy method by consolidating two pairs of independent datasets, each pair for the same disease but generated by different labs using different platforms. The results showed that our method produced far better results.

  6. Identification Method of Mud Shale Fractures Base on Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Xia, Weixu; Lai, Fuqiang; Luo, Han

    2018-01-01

In recent years, inspired by seismic analysis technology, a new method has emerged for analysing fractures in mud shale oil and gas reservoirs from logging attributes. By extracting the high-frequency component of the wavelet transform of the logging signal, formation information hidden in the log is recovered and fractures not recognized by conventional logging are identified; in the identified fracture segments, responses such as "cycle jump", "high value" and "spike" become more obvious. Finally, a complete wavelet denoising method and a wavelet high-frequency fracture identification method were formed.
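    A minimal sketch of the idea using the PyWavelets package: decompose a log curve with the discrete wavelet transform and flag anomalies in the finest-scale detail coefficients, where spike-like fracture responses stand out. The synthetic log and the db4/level-3 choices are illustrative, not the paper's settings.

```python
import numpy as np
import pywt

depth = np.linspace(0, 100, 1024)
log = np.sin(0.2 * depth) + 0.05 * np.random.default_rng(0).normal(size=depth.size)
log[500:505] += 2.0          # injected spike standing in for a fracture response

coeffs = pywt.wavedec(log, "db4", level=3)
detail1 = coeffs[-1]         # finest-scale detail coefficients
# Robust noise estimate (median absolute deviation), then a simple threshold:
threshold = 4.0 * np.median(np.abs(detail1)) / 0.6745
anomalies = np.where(np.abs(detail1) > threshold)[0]
print("anomalous detail indices:", anomalies)
```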

  7. A simplified method in comparison with comprehensive interaction incremental dynamic analysis to assess seismic performance of jacket-type offshore platforms

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Ajamy, A.; Asgarian, B.

    2015-12-01

The primary goal of seismic reassessment procedures in oil platform codes is to determine the reliability of a platform under extreme earthquake loading. In this paper, a simplified method is therefore proposed to assess the seismic performance of existing jacket-type offshore platforms (JTOPs) in regimes ranging from near-elastic response to global collapse. The simplified-method curve exploits the good agreement between the static pushover (SPO) curve and the entire summarized comprehensive interaction incremental dynamic analysis (CI-IDA) curve of the platform. Although the CI-IDA method offers better understanding and better modelling of the phenomenon, it is a time-consuming and challenging task. To overcome these challenges, the simplified procedure, a fast and accurate approach based on SPO analysis, is introduced. An existing JTOP in the Persian Gulf is then presented to illustrate the procedure, and finally a comparison is made between the simplified method and the CI-IDA results. The simplified method is very informative and practical for current engineering purposes: it predicts seismic performance from near-elastic response to global dynamic instability with reasonable accuracy and little computational effort.

  8. Gait Analysis Methods: An Overview of Wearable and Non-Wearable Systems, Highlighting Clinical Applications

    PubMed Central

    Muro-de-la-Herran, Alvaro; Garcia-Zapirain, Begonya; Mendez-Zorrilla, Amaia

    2014-01-01

    This article presents a review of the methods used in recognition and analysis of the human gait from three different approaches: image processing, floor sensors and sensors placed on the body. Progress in new technologies has led the development of a series of devices and techniques which allow for objective evaluation, making measurements more efficient and effective and providing specialists with reliable information. Firstly, an introduction of the key gait parameters and semi-subjective methods is presented. Secondly, technologies and studies on the different objective methods are reviewed. Finally, based on the latest research, the characteristics of each method are discussed. 40% of the reviewed articles published in late 2012 and 2013 were related to non-wearable systems, 37.5% presented inertial sensor-based systems, and the remaining 22.5% corresponded to other wearable systems. An increasing number of research works demonstrate that various parameters such as precision, conformability, usability or transportability have indicated that the portable systems based on body sensors are promising methods for gait analysis. PMID:24556672

  9. The analysis method of the DRAM cell pattern hotspot

    NASA Astrophysics Data System (ADS)

    Lee, Kyusun; Lee, Kweonjae; Chang, Jinman; Kim, Taeheon; Han, Daehan; Hong, Aeran; Kim, Yonghyeon; Kang, Jinyoung; Choi, Bumjin; Lee, Joosung; Lee, Jooyoung; Hong, Hyeongsun; Lee, Kyupil; Jin, Gyoyoung

    2015-03-01

It is increasingly difficult to determine the degree of completion and the distribution of DRAM cell patterns. Research on DRAM device cell patterns currently faces three big problems. First, due to etch loading, it is difficult to predict potential defects. Second, due to under-layer topology, it is impossible to demonstrate the influence of a hotspot. Finally, it is extremely difficult to predict the final ACI pattern by photo simulation, because the current patterning process uses double patterning technology, which means the photo pattern is completely different from the final etch pattern. Therefore, if a hotspot occurs on a wafer, it is very difficult to find. CD-SEM, the most common pattern measurement tool on the semiconductor fabrication site, is primarily used to accurately measure small regions of a wafer pattern, so it has little chance of finding places where unpredictable defects occur. Even though a current defect detector can measure a wide area, when every chip has the same pattern issue the detector cannot detect critical hotspots: because the defect-detection algorithm of a bright-field machine is based on image comparison, if the same problem occurs on both the compared and the comparing chip, the machine cannot identify it. Moreover, this instrument cannot distinguish distribution differences of about 1-3 nm, so a defect detector has difficulty handling data for potential weak points below the target CD. To solve these problems, another method is needed. In this paper, we introduce an analysis method for DRAM cell pattern hotspots.

  10. Robust Bayesian Algorithm for Targeted Compound Screening in Forensic Toxicology.

    PubMed

    Woldegebriel, Michael; Gonsalves, John; van Asten, Arian; Vivó-Truyols, Gabriel

    2016-02-16

    As part of forensic toxicological investigation of cases involving unexpected death of an individual, targeted or untargeted xenobiotic screening of post-mortem samples is normally conducted. To this end, liquid chromatography (LC) coupled to high-resolution mass spectrometry (MS) is typically employed. For data analysis, almost all commonly applied algorithms are threshold-based (frequentist). These algorithms examine the value of a certain measurement (e.g., peak height) to decide whether a certain xenobiotic of interest (XOI) is present/absent, yielding a binary output. Frequentist methods pose a problem when several sources of information [e.g., shape of the chromatographic peak, isotopic distribution, estimated mass-to-charge ratio (m/z), adduct, etc.] need to be combined, requiring the approach to make arbitrary decisions at substep levels of data analysis. We hereby introduce a novel Bayesian probabilistic algorithm for toxicological screening. The method tackles the problem with a different strategy. It is not aimed at reaching a final conclusion regarding the presence of the XOI, but it estimates its probability. The algorithm effectively and efficiently combines all possible pieces of evidence from the chromatogram and calculates the posterior probability of the presence/absence of XOI features. This way, the model can accommodate more information by updating the probability if extra evidence is acquired. The final probabilistic result assists the end user to make a final decision with respect to the presence/absence of the xenobiotic. The Bayesian method was validated and found to perform better (in terms of false positives and false negatives) than the vendor-supplied software package.
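    The evidence-combination step can be caricatured with a naive independent-likelihood Bayesian update; the prior and likelihood values below are placeholders, not the paper's peak-shape or isotope models.

```python
# Combine independent pieces of chromatographic/spectral evidence via Bayes'
# rule into a posterior probability that a xenobiotic of interest (XOI) is present.
prior = 0.05                          # prior probability the XOI is in the sample

# (P(evidence | present), P(evidence | absent)) for each evidence source:
likelihoods = {
    "peak_shape":      (0.90, 0.10),
    "isotope_pattern": (0.85, 0.20),
    "mz_accuracy":     (0.95, 0.05),
}

odds = prior / (1 - prior)
for name, (p_pres, p_abs) in likelihoods.items():
    odds *= p_pres / p_abs            # multiply in each likelihood ratio
posterior = odds / (1 + odds)
print(f"P(XOI present | evidence) = {posterior:.4f}")
```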

  11. Characterization of fiber diameter using image analysis

    NASA Astrophysics Data System (ADS)

    Baheti, S.; Tunak, M.

    2017-10-01

Due to their high surface area and porosity, the applications of nanofibers have increased in recent years. In the production process, determination of the average fiber diameter and fiber orientation is crucial for quality assessment. The objective of the present study was to compare the relative performance of different methods discussed in the literature for estimation of fiber diameter. In this work, the automated fiber diameter analysis approaches available in the literature were implemented and validated on simulated images of known fiber diameter. Finally, all methods were compared for reliable and accurate estimation of fiber diameter in electrospun nanofiber membranes, based on the obtained means and standard deviations.
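    One common automated approach, sketched here under the assumption of a clean binary fiber mask, estimates the local radius from the Euclidean distance transform along the fiber centerline; the synthetic image below is illustrative.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary mask with one horizontal "fiber", 6 px wide.
img = np.zeros((64, 64), dtype=bool)
img[30:36, :] = True

# Distance transform: each foreground pixel gets its distance to background,
# so ridge values approximate the local fiber radius.
dist = ndimage.distance_transform_edt(img)
skeleton = dist >= dist.max() - 0.5       # crude centerline proxy
diameters_px = 2.0 * dist[skeleton]
print(f"estimated diameter ~ {diameters_px.mean():.1f} px (true 6)")
```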

  12. Coding, Constant Comparisons, and Core Categories: A Worked Example for Novice Constructivist Grounded Theorists.

    PubMed

    Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear

    2016-01-01

    Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.

  13. Analysis and optimization of hybrid excitation permanent magnet synchronous generator for stand-alone power system

    NASA Astrophysics Data System (ADS)

    Wang, Huijun; Qu, Zheng; Tang, Shaofei; Pang, Mingqi; Zhang, Mingju

    2017-08-01

    In this paper, electromagnetic design and permanent magnet shape optimization for permanent magnet synchronous generator with hybrid excitation are investigated. Based on generator structure and principle, design outline is presented for obtaining high efficiency and low voltage fluctuation. In order to realize rapid design, equivalent magnetic circuits for permanent magnet and iron poles are developed. At the same time, finite element analysis is employed. Furthermore, by means of design of experiment (DOE) method, permanent magnet is optimized to reduce voltage waveform distortion. Finally, the validity of proposed design methods is validated by the analytical and experimental results.

  14. Genetic Interaction Score (S-Score) Calculation, Clustering, and Visualization of Genetic Interaction Profiles for Yeast.

    PubMed

    Roguev, Assen; Ryan, Colm J; Xu, Jiewei; Colson, Isabelle; Hartsuiker, Edgar; Krogan, Nevan

    2018-02-01

    This protocol describes computational analysis of genetic interaction screens, ranging from data capture (plate imaging) to downstream analyses. Plate imaging approaches using both digital camera and office flatbed scanners are included, along with a protocol for the extraction of colony size measurements from the resulting images. A commonly used genetic interaction scoring method, calculation of the S-score, is discussed. These methods require minimal computer skills, but some familiarity with MATLAB and Linux/Unix is a plus. Finally, an outline for using clustering and visualization software for analysis of resulting data sets is provided. © 2018 Cold Spring Harbor Laboratory Press.
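
    As a rough sketch of the scoring idea (hedged: the published S-score includes further variance corrections), a modified t-statistic compares replicate colony sizes of the double mutant against the control, with a variance floor so that low-variance measurements do not dominate:

        import numpy as np

        def s_score(exp, ctrl, min_sd_frac=0.1):
            """Modified t-statistic over replicate colony sizes."""
            exp, ctrl = np.asarray(exp, float), np.asarray(ctrl, float)
            sd_exp = max(exp.std(ddof=1), min_sd_frac * exp.mean())    # variance floor
            sd_ctrl = max(ctrl.std(ddof=1), min_sd_frac * ctrl.mean())
            sem = np.sqrt(sd_exp**2 / exp.size + sd_ctrl**2 / ctrl.size)
            return (exp.mean() - ctrl.mean()) / sem

        print(s_score([310, 295, 322, 301], [450, 470, 441, 462]))  # negative => aggravating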

  15. Chemical Attribution of Fentanyl Using Multivariate Statistical Analysis of Orthogonal Mass Spectral Data

    DOE PAGES

    Mayer, Brian P.; DeHope, Alan J.; Mew, Daniel A.; ...

    2016-03-24

Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. The results of these studies can yield detailed information on method of manufacture, starting material source, and final product, all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. A total of 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (gas chromatography/mass spectrometry (GC/MS) and liquid chromatography-tandem mass spectrometry-time-of-flight (LC-MS/MS-TOF)) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS). The complexity of the resultant data matrix urged the use of multivariate statistical analysis. Using partial least-squares discriminant analysis (PLS-DA), 87 route-specific CAS were classified and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited on and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. Finally, this work provides the most detailed fentanyl CAS investigation to date by using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.
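
    The PLS-DA classification step can be sketched with scikit-learn by regressing one-hot route labels on the signature matrix (variable names such as cas_matrix are placeholders, not the authors' data):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        def fit_plsda(X, labels, n_components=3):
            """PLS-DA: PLS regression onto a one-hot class matrix."""
            classes, y_idx = np.unique(labels, return_inverse=True)
            Y = np.eye(classes.size)[y_idx]            # one-hot class matrix
            Xs = StandardScaler().fit_transform(X)     # autoscale signatures
            model = PLSRegression(n_components=n_components).fit(Xs, Y)
            pred = classes[model.predict(Xs).argmax(axis=1)]
            return model, pred

        # model, predicted_routes = fit_plsda(cas_matrix, synthesis_routes)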

  16. Chemical Attribution of Fentanyl Using Multivariate Statistical Analysis of Orthogonal Mass Spectral Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Brian P.; DeHope, Alan J.; Mew, Daniel A.

Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. The results of these studies can yield detailed information on method of manufacture, starting material source, and final product, all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. A total of 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (gas chromatography/mass spectrometry (GC/MS) and liquid chromatography-tandem mass spectrometry-time-of-flight (LC-MS/MS-TOF)) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS). The complexity of the resultant data matrix urged the use of multivariate statistical analysis. Using partial least-squares discriminant analysis (PLS-DA), 87 route-specific CAS were classified and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited on and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. Finally, this work provides the most detailed fentanyl CAS investigation to date by using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.

  17. Stability analysis of Caisson Cofferdam Based on Strength Reduction Method

    NASA Astrophysics Data System (ADS)

    Xu, B. B.; Zhang, N. S.

    2018-05-01

The working mechanism of a caisson cofferdam relies on the self-weight of the structure and its internal filling to ensure stability against sliding and overturning. Using the strength reduction method, the safety factor of the caisson cofferdam can be obtained, and the potential slide surface can be searched automatically without constraining the range of the arc center. According to the results, the slip surface passes through the bottom of the caisson. Based on the judgement criterion of the strength reduction method, the final safety factor is about 1.65.
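
    The core of the strength reduction method can be illustrated with a toy example: divide the strength parameters c and tan(φ) by a trial factor F and bisect for the largest F at which the slope still stands. The infinite-slope stability function below is a simple stand-in for the numerical model used in the paper, with made-up soil parameters:

        import math

        def stands(F, c=18e3, phi=32.0, gamma=19e3, depth=4.0, beta=28.0):
            """Does the slope hold when strength is reduced by factor F?"""
            c_r = c / F                                  # reduced cohesion
            tan_r = math.tan(math.radians(phi)) / F      # reduced friction
            b = math.radians(beta)
            fos = (c_r + gamma * depth * math.cos(b) ** 2 * tan_r) / (
                gamma * depth * math.sin(b) * math.cos(b))
            return fos >= 1.0

        lo, hi = 1.0, 5.0
        for _ in range(40):                # bisection on the reduction factor
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if stands(mid) else (lo, mid)
        print(f"safety factor ~ {lo:.2f}")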

  18. Methods and results of peak-flow frequency analyses for streamgages in and bordering Minnesota, through water year 2011

    USGS Publications Warehouse

    Kessler, Erich W.; Lorenz, David L.; Sanocki, Christopher A.

    2013-01-01

Peak-flow frequency analyses were completed for 409 streamgages in and bordering Minnesota having at least 10 systematic peak flows through water year 2011. Selected annual exceedance probabilities were determined by fitting a log-Pearson type III probability distribution to the recorded annual peak flows. A detailed explanation is presented of the methods used to determine the annual exceedance probabilities, the historical period, acceptable low outliers, and the analysis method for each streamgage. The final results of the analyses are presented.
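
    The core fitting step can be sketched by fitting a Pearson type III distribution to the base-10 logarithms of the annual peaks and inverting it at the selected annual exceedance probabilities (the peak flows below are invented; the published procedure additionally weights station skew with regional skew):

        import numpy as np
        from scipy import stats

        peaks = np.array([412., 660., 388., 925., 530., 701., 460., 815., 350., 590.])
        logq = np.log10(peaks)

        skew, loc, scale = stats.pearson3.fit(logq)    # Pearson III on log flows
        for aep in (0.5, 0.1, 0.02, 0.01):             # 2-, 10-, 50-, 100-year floods
            q = 10 ** stats.pearson3.ppf(1.0 - aep, skew, loc=loc, scale=scale)
            print(f"AEP {aep:>5}: {q:8.0f}")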

  19. The development of advanced manufacturing systems

    NASA Astrophysics Data System (ADS)

    Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel

Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition (IDEF) method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.

  20. Iterative Strain-Gage Balance Calibration Data Analysis for Extended Independent Variable Sets

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred

    2011-01-01

A new method was developed that makes it possible to use an extended set of independent calibration variables for an iterative analysis of wind tunnel strain-gage balance calibration data. The new method permits the application of the iterative analysis method whenever the total number of balance loads and other independent calibration variables is greater than the total number of measured strain-gage outputs. The iteration equations used by the iterative analysis method have the limitation that the number of independent and dependent variables must match. The new method circumvents this limitation: it simply adds a missing dependent variable to the original data set by using an additional independent variable also as an additional dependent variable. Then, the desired solution of the regression analysis problem can be obtained that fits each gage output as a function of both the original and additional independent calibration variables. The final regression coefficients can be converted to data reduction matrix coefficients because the missing dependent variables were added to the data set without changing the regression analysis result for each gage output. Therefore, the new method still supports the application of the two load iteration equation choices that the iterative method traditionally uses for the prediction of balance loads during a wind tunnel test. An example discussed in the paper illustrates the application of the new method to a realistic simulation of a temperature-dependent calibration data set for a six-component balance.
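
    A small numerical sketch of the balancing trick (an interpretation of the description above, not the author's code): with three independent variables but only two gage outputs, the extra variable T is also used as a dependent variable, keeping the regression system square without changing the fit of the true outputs:

        import numpy as np

        rng = np.random.default_rng(0)
        N1, N2, T = rng.normal(size=(3, 200))           # two loads + temperature
        R1 = 1.9 * N1 + 0.3 * N2 + 0.05 * T + 0.01 * rng.normal(size=200)
        R2 = 0.2 * N1 + 2.1 * N2 - 0.04 * T + 0.01 * rng.normal(size=200)

        X = np.column_stack([np.ones(200), N1, N2, T])  # independent variables
        Y = np.column_stack([R1, R2, T])                # outputs + added dependent T
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)    # gage-output fits unchanged
        print(np.round(coef, 3))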

  1. Design of Biomedical Robots for Phenotype Prediction Problems

    PubMed Central

    deAndrés-Galiana, Enrique J.; Sonis, Stephen T.

    2016-01-01

Genomics has been used with varying degrees of success in the context of drug discovery and in defining mechanisms of action for diseases like cancer and neurodegenerative and rare diseases in the quest for orphan drugs. To improve its utility, accuracy, and cost-effectiveness optimization of analytical methods, especially those that translate to clinically relevant outcomes, is critical. Here we define a novel tool for genomic analysis termed a biomedical robot in order to improve phenotype prediction, identifying disease pathogenesis and significantly defining therapeutic targets. Biomedical robot analytics differ from historical methods in that they are based on melding feature selection methods and ensemble learning techniques. The biomedical robot mathematically exploits the structure of the uncertainty space of any classification problem conceived as an ill-posed optimization problem. Given a classifier, there exist different equivalent small-scale genetic signatures that provide similar predictive accuracies. We perform the sensitivity analysis to noise of the biomedical robot concept using synthetic microarrays perturbed by different kinds of noises in expression and class assignment. Finally, we show the application of this concept to the analysis of different diseases, inferring the pathways and the correlation networks. The final aim of a biomedical robot is to improve knowledge discovery and provide decision systems to optimize diagnosis, treatment, and prognosis. This analysis shows that the biomedical robots are robust against different kinds of noises and particularly to a wrong class assignment of the samples. Assessing the uncertainty that is inherent to any phenotype prediction problem is the right way to address this kind of problem. PMID:27347715

  2. Analysis techniques for the evaluation of the neutrinoless double-β decay lifetime in 130Te with the CUORE-0 detector

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2016-04-25

Here, we describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high-level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T^0ν_1/2 > 2.7×10^24 yr, and in combination with the Cuoricino limit, T^0ν_1/2 > 4.0×10^24 yr.

  3. Design of Biomedical Robots for Phenotype Prediction Problems.

    PubMed

    deAndrés-Galiana, Enrique J; Fernández-Martínez, Juan Luis; Sonis, Stephen T

    2016-08-01

    Genomics has been used with varying degrees of success in the context of drug discovery and in defining mechanisms of action for diseases like cancer and neurodegenerative and rare diseases in the quest for orphan drugs. To improve its utility, accuracy, and cost-effectiveness optimization of analytical methods, especially those that translate to clinically relevant outcomes, is critical. Here we define a novel tool for genomic analysis termed a biomedical robot in order to improve phenotype prediction, identifying disease pathogenesis and significantly defining therapeutic targets. Biomedical robot analytics differ from historical methods in that they are based on melding feature selection methods and ensemble learning techniques. The biomedical robot mathematically exploits the structure of the uncertainty space of any classification problem conceived as an ill-posed optimization problem. Given a classifier, there exist different equivalent small-scale genetic signatures that provide similar predictive accuracies. We perform the sensitivity analysis to noise of the biomedical robot concept using synthetic microarrays perturbed by different kinds of noises in expression and class assignment. Finally, we show the application of this concept to the analysis of different diseases, inferring the pathways and the correlation networks. The final aim of a biomedical robot is to improve knowledge discovery and provide decision systems to optimize diagnosis, treatment, and prognosis. This analysis shows that the biomedical robots are robust against different kinds of noises and particularly to a wrong class assignment of the samples. Assessing the uncertainty that is inherent to any phenotype prediction problem is the right way to address this kind of problem.

  4. Analysis techniques for the evaluation of the neutrinoless double- β decay lifetime in Te 130 with the CUORE-0 detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    2016-04-25

We describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high-level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T^0ν_1/2 > 2.7×10^24 yr, and in combination with the Cuoricino limit, T^0ν_1/2 > 4.0×10^24 yr.

  5. Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA

    NASA Astrophysics Data System (ADS)

    Bercovici, Sivan; Geiger, Dan

    Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in the case of recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, capturing complex human diseases such as hypertension and end stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA on 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.

  6. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

Assessment of the variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, there are different methods that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging: some require advanced mathematical computations preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach that involves several methods and compares their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.

  7. A Novel Method for Block Size Forensics Based on Morphological Operations

    NASA Astrophysics Data System (ADS)

    Luo, Weiqi; Huang, Jiwu; Qiu, Guoping

Passive forensics analysis aims to find out how multimedia data was acquired and processed without relying on pre-embedded or pre-registered information. Since most existing compression schemes for digital images are based on block processing, one of the fundamental steps for subsequent forensics analysis is to detect the presence of block artifacts and estimate the block size for a given image. In this paper, we propose a novel method for blind block size estimation. A 2×2 cross-differential filter is first applied to detect all possible block artifact boundaries, morphological operations are then used to remove the boundary effects caused by the edges of the actual image contents, and finally maximum-likelihood estimation (MLE) is employed to estimate the block size. Experimental results evaluated on over 1300 natural images show the effectiveness of the proposed method: compared with the existing gradient-based detection method, our method achieves over 39% accuracy improvement on average.
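
    A simplified sketch of the first and last stages (replacing the morphological cleanup and full MLE with a plain periodicity score over candidate block sizes):

        import numpy as np

        def estimate_block_size(img, candidates=range(4, 33)):
            """Score candidate block sizes on the 2x2 cross-difference map."""
            img = img.astype(float)
            d = np.abs(img[1:, 1:] - img[1:, :-1] - img[:-1, 1:] + img[:-1, :-1])
            col = d.sum(axis=0)                    # column artifact profile
            best, best_score = None, -np.inf
            for b in candidates:
                grid = np.arange(b - 1, col.size, b)
                on = col[grid].mean()              # energy on candidate boundaries
                off = np.delete(col, grid).mean()  # energy elsewhere
                if on - off > best_score:
                    best, best_score = b, on - off
            return best

        # print(estimate_block_size(gray_image))   # e.g. 8 for JPEG-compressed images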

  8. Fuzzy decision-making framework for treatment selection based on the combined QUALIFLEX-TODIM method

    NASA Astrophysics Data System (ADS)

    Ji, Pu; Zhang, Hong-yu; Wang, Jian-qiang

    2017-10-01

Treatment selection is a multi-criteria decision-making problem of significant concern in the medical field. In this study, a fuzzy decision-making framework is established for treatment selection. The framework mitigates information loss by introducing single-valued trapezoidal neutrosophic numbers to denote evaluation information. Treatment selection involves multiple criteria that considerably outnumber the alternatives. In consideration of this characteristic, the framework utilises the idea of the qualitative flexible multiple criteria (QUALIFLEX) method. Furthermore, it considers the risk-averse behaviour of a decision maker by employing a concordance index based on the TODIM (an acronym in Portuguese for interactive and multi-criteria decision-making) method. A sensitivity analysis is performed to illustrate the robustness of the framework. Finally, a comparative analysis is conducted to compare the framework with several extant methods. Results indicate the advantages of the framework and its better performance compared with the extant methods.

  9. Problems with multiple use of transfer buffer in protein electrophoretic transfer.

    PubMed

    Dorri, Yaser; Kurien, Biji T; Scofield, R Hal

    2010-04-01

Two-dimensional gel electrophoresis (2DE) and SDS-PAGE are the two most useful methods in protein separation. Proteins separated by 2DE or SDS-PAGE are usually transferred to membranes using a variety of methods, such as electrophoretic transfer, heat-mediated transfer, or nonelectrophoretic transfer, for specific protein detection and/or analysis. In a recent study, Pettegrew et al. claim to reuse transfer buffer containing methanol at least five times for transferring proteins from SDS-PAGE to polyvinylidene difluoride, adding 150-200 ml of fresh transfer solution each time to compensate for the loss of transfer buffer, and testing the efficiency of each protein transfer by chemiluminescence detection. Here, we comment on this report, as we believe this method is neither accurate nor useful for protein analysis: it can cause background binding as well as inaccurate protein analysis.

  10. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

The paper presents a method, based on a data-driven K-means clustering algorithm, to generate the planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and the solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected after detailed analysis according to different planning emphases. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
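
    The TOPSIS ranking stage can be sketched as follows; rows are candidate siting/sizing schemes from the Pareto front, columns are the four objectives, and all numbers are illustrative only:

        import numpy as np

        def topsis(matrix, weights, cost_mask):
            """Closeness of each alternative to the ideal solution (higher is better)."""
            M = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
            M = M * weights
            ideal = np.where(cost_mask, M.min(axis=0), M.max(axis=0))
            worst = np.where(cost_mask, M.max(axis=0), M.min(axis=0))
            d_pos = np.linalg.norm(M - ideal, axis=1)
            d_neg = np.linalg.norm(M - worst, axis=1)
            return d_neg / (d_pos + d_neg)

        schemes = np.array([[120., 8.5, 34., 0.9],    # losses, cost, profit, offset
                            [135., 7.9, 31., 0.7],
                            [110., 9.4, 37., 1.1]])
        score = topsis(schemes, np.array([.3, .3, .25, .15]),
                       np.array([True, True, False, True]))  # profit is a benefit
        print(score.argsort()[::-1])                  # scheme indices, best first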

  11. Research on criticality analysis method of CNC machine tools components under fault rate correlation

    NASA Astrophysics Data System (ADS)

    Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han

    2018-02-01

In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relation is then organized into a hierarchy using the interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rate under time correlation, a comprehensive fault rate can be obtained. Based on the fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified to provide a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
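
    The influence-ranking step can be sketched with a plain power-iteration PageRank over the fault-propagation adjacency matrix (the 4-component matrix below is made up for illustration):

        import numpy as np

        def pagerank(adj, damping=0.85, tol=1e-10):
            """Power iteration on the row-normalized propagation matrix."""
            A = adj / adj.sum(axis=1, keepdims=True)   # row-stochastic matrix
            n = adj.shape[0]
            r = np.full(n, 1.0 / n)
            while True:
                r_new = (1 - damping) / n + damping * A.T @ r
                if np.abs(r_new - r).sum() < tol:
                    return r_new
                r = r_new

        adj = np.array([[0, 1, 1, 0],   # a fault in component i propagates to j
                        [0, 0, 1, 0],
                        [1, 0, 0, 1],
                        [0, 0, 1, 0]], float)
        print(pagerank(adj).round(3))   # relative influence of each component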

  12. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential seismic source zone of greatest contribution to the site is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of that potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum obtained by the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easy to accept and provides a basis for the seismic design of hydraulic engineering.

  13. The anisotropic Hooke's law for cancellous bone and wood.

    PubMed

    Yang, G; Kabel, J; van Rietbergen, B; Odgaard, A; Huiskes, R; Cowin, S C

    A method of data analysis for a set of elastic constant measurements is applied to data bases for wood and cancellous bone. For these materials the identification of the type of elastic symmetry is complicated by the variable composition of the material. The data analysis method permits the identification of the type of elastic symmetry to be accomplished independent of the examination of the variable composition. This method of analysis may be applied to any set of elastic constant measurements, but is illustrated here by application to hardwoods and softwoods, and to an extraordinary data base of cancellous bone elastic constants. The solid volume fraction or bulk density is the compositional variable for the elastic constants of these natural materials. The final results are the solid volume fraction dependent orthotropic Hooke's law for cancellous bone and a bulk density dependent one for hardwoods and softwoods.

  14. High-performance liquid chromatography/electrospray mass spectrometry for the analysis of modified bases in DNA: 7-(2-hydroxyethyl)guanine, the major ethylene oxide-DNA adduct.

    PubMed

    Leclercq, L; Laurent, C; De Pauw, E

    1997-05-15

A method was developed for the analysis of 7-(2-hydroxyethyl)guanine (7HEG), the major DNA adduct formed after exposure to ethylene oxide (EO). The method is based on DNA neutral thermal hydrolysis, adduct micro-concentration, and final characterization and quantification by HPLC coupled to single-ion monitoring electrospray mass spectrometry (HPLC/SIR-ESMS). The method was found to be selective, sensitive, and easy to handle, with no need for enzymatic digestion or prior sample derivatization. The detection limit was found to be close to 1 fmol of adduct injected (10^-10 M), thus allowing the detection of approximately three modified bases per 10^8 intact nucleotides in blood sample analysis. Quantification results are shown for 7HEG after calf thymus DNA and blood exposure to various doses of EO, in both cases obtaining clear dose-response relationships.

  15. Study of a Vocal Feature Selection Method and Vocal Properties for Discriminating Four Constitution Types

    PubMed Central

    Kim, Keun Ho; Ku, Boncho; Kang, Namsik; Kim, Young-Su; Jang, Jun-Su; Kim, Jong Yeol

    2012-01-01

    The voice has been used to classify the four constitution types, and to recognize a subject's health condition by extracting meaningful physical quantities, in traditional Korean medicine. In this paper, we propose a method of selecting the reliable variables from various voice features, such as frequency derivative features, frequency band ratios, and intensity, from vowels and a sentence. Further, we suggest a process to extract independent variables by eliminating explanatory variables and reducing their correlation and remove outlying data to enable reliable discriminant analysis. Moreover, the suitable division of data for analysis, according to the gender and age of subjects, is discussed. Finally, the vocal features are applied to a discriminant analysis to classify each constitution type. This method of voice classification can be widely used in the u-Healthcare system of personalized medicine and for improving diagnostic accuracy. PMID:22529874

  16. Geometrically derived difference formulae for the numerical integration of trajectory problems

    NASA Technical Reports Server (NTRS)

    Mcleod, R. J. Y.; Sanz-Serna, J. M.

    1981-01-01

    The term 'trajectory problem' is taken to include problems that can arise, for instance, in connection with contour plotting, or in the application of continuation methods, or during phase-plane analysis. Geometrical techniques are used to construct difference methods for these problems to produce in turn explicit and implicit circularly exact formulae. Based on these formulae, a predictor-corrector method is derived which, when compared with a closely related standard method, shows improved performance. It is found that this latter method produces spurious limit cycles, and this behavior is partly analyzed. Finally, a simple variable-step algorithm is constructed and tested.

  17. Quality by Design: Multidimensional exploration of the design space in high performance liquid chromatography method development for better robustness before validation.

    PubMed

    Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E

    2012-04-06

    Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Transient analysis using conical shell elements

    NASA Technical Reports Server (NTRS)

    Yang, J. C. S.; Goeller, J. E.; Messick, W. T.

    1973-01-01

    The use of the NASTRAN conical shell element in static, eigenvalue, and direct transient analyses is demonstrated. The results of a NASTRAN static solution of an externally pressurized ring-stiffened cylinder agree well with a theoretical discontinuity analysis. Good agreement is also obtained between the NASTRAN direct transient response of a uniform cylinder to a dynamic end load and one-dimensional solutions obtained using a method of characteristics stress wave code and a standing wave solution. Finally, a NASTRAN eigenvalue analysis is performed on a hydroballistic model idealized with conical shell elements.

  19. Photoacoustic Spectroscopy Analysis of Traditional Chinese Medicine

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Zhao, Bin-xing; Xiao, Hong-tao; Tong, Rong-sheng; Gao, Chun-ming

    2013-09-01

Chinese medicine is a historic cultural legacy of China. It has made a significant contribution to medicine and healthcare for generations, and the development of Chinese herbal medicine analysis is emphasized by the Chinese pharmaceutical industry. This study carried out an experimental analysis of ten kinds of Chinese herbal powders, including Fritillaria powder, based on the photoacoustic spectroscopy (PAS) method. First, a photoacoustic spectroscopy system was designed and constructed; in particular, a highly sensitive solid photoacoustic cell was built. Second, the experimental setup was verified through the characteristic emission spectrum of the light source, obtained by using carbon as a sample in the photoacoustic cell. Finally, with the photoacoustic spectroscopy analysis of Fritillaria and the other herbal powders completed, the specificity of the Chinese herbal medicine analysis was verified. This study shows that PAS can provide a valid, highly sensitive analytical method for the specificity of Chinese herbal medicine without preparing or damaging samples.

  20. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  1. Network Analysis: Applications for the Developing Brain

    PubMed Central

    Chu-Shore, Catherine J.; Kramer, Mark A.; Bianchi, Matt T.; Caviness, Verne S.; Cash, Sydney S.

    2011-01-01

    Development of the human brain follows a complex trajectory of age-specific anatomical and physiological changes. The application of network analysis provides an illuminating perspective on the dynamic interregional and global properties of this intricate and complex system. Here, we provide a critical synopsis of methods of network analysis with a focus on developing brain networks. After discussing basic concepts and approaches to network analysis, we explore the primary events of anatomical cortical development from gestation through adolescence. Upon this framework, we describe early work revealing the evolution of age-specific functional brain networks in normal neurodevelopment. Finally, we review how these relationships can be altered in disease and perhaps even rectified with treatment. While this method of description and inquiry remains in early form, there is already substantial evidence that the application of network models and analysis to understanding normal and abnormal human neural development holds tremendous promise for future discovery. PMID:21303762

  2. Analysing concurrent transcranial magnetic stimulation and electroencephalographic data: A review and introduction to the open-source TESA software.

    PubMed

    Rogasch, Nigel C; Sullivan, Caley; Thomson, Richard H; Rose, Nathan S; Bailey, Neil W; Fitzgerald, Paul B; Farzan, Faranak; Hernandez-Pavon, Julio C

    2017-02-15

    The concurrent use of transcranial magnetic stimulation with electroencephalography (TMS-EEG) is growing in popularity as a method for assessing various cortical properties such as excitability, oscillations and connectivity. However, this combination of methods is technically challenging, resulting in artifacts both during recording and following typical EEG analysis methods, which can distort the underlying neural signal. In this article, we review the causes of artifacts in EEG recordings resulting from TMS, as well as artifacts introduced during analysis (e.g. as the result of filtering over high-frequency, large amplitude artifacts). We then discuss methods for removing artifacts, and ways of designing pipelines to minimise analysis-related artifacts. Finally, we introduce the TMS-EEG signal analyser (TESA), an open-source extension for EEGLAB, which includes functions that are specific for TMS-EEG analysis, such as removing and interpolating the TMS pulse artifact, removing and minimising TMS-evoked muscle activity, and analysing TMS-evoked potentials. The aims of TESA are to provide users with easy access to current TMS-EEG analysis methods and to encourage direct comparisons of these methods and pipelines. It is hoped that providing open-source functions will aid in both improving and standardising analysis across the field of TMS-EEG research. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

Polytrauma is defined as an injury affecting at least two different organ systems or body regions, at least one of which is life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to identify and define the factors that influence the final outcome of treatment and their mutual relationships, with a view to eliminating the flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, building on the calculated parameters, multicollinearity analysis, image analysis, discriminant analysis, and multifactorial analysis were used to achieve the final research objectives. From the universe of variables for this study we selected a sample of n = 25 variables, of which the first two are modular while the others belong to the common measurement space (n = 23), defined in this paper as the system of variables for the methods, procedures, and assessment of polytrauma patients. After the multicollinearity analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information for resolving the problems of the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and better outcome of polytrauma patients. This analysis revealed the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  4. Is verbatim transcription of interview data always necessary?

    PubMed

    Halcomb, Elizabeth J; Davidson, Patricia M

    2006-02-01

Verbatim transcription of interview data has become a common data management strategy in nursing research and is widely considered to be integral to the analysis and interpretation of verbal data. As the benefits of verbal data are becoming more widely embraced in health care research, interviews are being increasingly used to collect information for a wide range of purposes. In addition to purely qualitative investigations, there has been a significant increase in the conduct of mixed-method inquiries. This article examines the issues surrounding the conduct of interviews in mixed-method research, with particular emphasis on the transcription and data analysis phases of data management. It also debates the necessity of transcribing all audiorecorded interview data verbatim, particularly in relation to mixed-method investigations. Finally, it provides an alternative to verbatim transcription for managing audiorecorded interview data.

  5. The Games Universities Play (With Apologies to Dr. Berne). Working Paper.

    ERIC Educational Resources Information Center

    Skelton, John E.

    A preliminary study of alternative methods of organizing, managing, and financing computing at the nation's institutions of higher education is explored in the context of transactional analysis. The purpose and contents of the forthcoming final report (designed for university presidents) is described. The games, as intended and defined by the…

  6. Brief Strategic Family Therapy for Young People in Treatment for Drug Use

    ERIC Educational Resources Information Center

    Lindstrøm, Maia; Filges, Trine; Jørgensen, Anne-Marie Klint

    2015-01-01

    Purpose: This review evaluates the evidence on the effects of brief strategic family therapy (BSFT) on drug use reduction for young people in treatment for nonopioid drug use. Method: We followed Campbell Collaboration guidelines to prepare this review and ultimately located three studies for final analysis and interpretation. Results: The results…

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korte, Andrew R

    This thesis presents efforts to improve the methodology of matrix-assisted laser desorption ionization-mass spectrometry imaging (MALDI-MSI) as a method for analysis of metabolites from plant tissue samples. The first chapter consists of a general introduction to the technique of MALDI-MSI, and the sixth and final chapter provides a brief summary and an outlook on future work.

  8. 40 CFR 799.6786 - TSCA water solubility: Generator column method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...

  9. 40 CFR 799.6786 - TSCA water solubility: Generator column method.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...

  10. 40 CFR 799.6786 - TSCA water solubility: Generator column method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...

  11. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    ERIC Educational Resources Information Center

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  12. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, A Simulation-Based Method, Part - I: Literature Review (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

The study investigates the application of simulation along with field observations for estimation of exclusive left-turn saturation flow rate and capacity. The entire research has covered the following principal subjects: (1) a saturation flow model ...

  13. 75 FR 47592 - Final Test Guideline; Product Performance of Skin-applied Insect Repellents of Insect and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-06

    ... considerations affecting the design and conduct of repellent studies when human subjects are involved. Any... recommendations for the design and execution of studies to evaluate the performance of pesticide products intended... recommends appropriate study designs and methods for selecting subjects, statistical analysis, and reporting...

  14. Definition of Alaskan Aviation Training Requirements. Final Report.

    ERIC Educational Resources Information Center

    Mitchell, M. K.; And Others

    Because of high accident rates and the unique conditions faced in Arctic flying, a project was conducted to develop a training program for airline pilots flying over Alaska. Data were gathered, through the critical incident method in conjunction with traditional job-analysis procedures, about how experienced Alaskan pilots learned to cope with the…

  15. Temperature rise, sea level rise and increased radiative forcing - an application of cointegration methods

    NASA Astrophysics Data System (ADS)

    Schmith, Torben; Thejll, Peter; Johansen, Søren

    2016-04-01

We analyse the statistical relationship between changes in global temperature, global steric sea level, and radiative forcing in order to reveal causal relationships. There are, however, potential pitfalls in this due to the trending nature of the time series. We therefore apply cointegration analysis, a statistical method originating from the field of econometrics that correctly handles the analysis of series with trends and other long-range dependencies. We find a relationship between steric sea level and temperature in which temperature causally depends on the steric sea level, which can be understood as a consequence of the large heat capacity of the ocean. This result is obtained both when analysing observed data and data from a CMIP5 historical model run. We also find that, in the data from the historical run, the steric sea level is in turn driven by the external forcing. Finally, we demonstrate that combining these two results can lead to a novel estimate of radiative forcing back in time based on observations.
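
    An Engle-Granger cointegration check of the kind used here can be sketched with statsmodels on synthetic stand-ins for the temperature and steric sea level series:

        import numpy as np
        from statsmodels.tsa.stattools import coint

        rng = np.random.default_rng(1)
        trend = np.cumsum(rng.normal(size=300))            # shared stochastic trend
        sea_level = trend + rng.normal(scale=0.5, size=300)
        temperature = 0.8 * trend + rng.normal(scale=0.5, size=300)

        t_stat, p_value, _ = coint(temperature, sea_level)
        print(f"Engle-Granger t = {t_stat:.2f}, p = {p_value:.3f}")  # small p => cointegrated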

  16. Parameters of Models of Structural Transformations in Alloy Steel Under Welding Thermal Cycle

    NASA Astrophysics Data System (ADS)

    Kurkin, A. S.; Makarov, E. L.; Kurkin, A. B.; Rubtsov, D. E.; Rubtsov, M. E.

    2017-05-01

    A mathematical model of structural transformations in an alloy steel under the thermal cycle of multipass welding is suggested for computer implementation. The minimum necessary set of parameters for describing the transformations under heating and cooling is determined. Ferritic-pearlitic, bainitic and martensitic transformations under cooling of a steel are considered. A method for deriving the necessary temperature and time parameters of the model from the chemical composition of the steel is described. Published data are used to derive regression models of the temperature ranges and parameters of transformation kinetics in alloy steels. It is shown that the disadvantages of the active visual methods of analysis of the final phase composition of steels are responsible for inaccuracy and mismatch of published data. The hardness of a specimen, which correlates with some other mechanical properties of the material, is chosen as the most objective and reproducible criterion of the final phase composition. The models developed are checked by a comparative analysis of computational results and experimental data on the hardness of 140 alloy steels after cooling at various rates.

  17. Automatic Cell Segmentation in Fluorescence Images of Confluent Cell Monolayers Using Multi-object Geometric Deformable Model.

    PubMed

    Yang, Zhen; Bogovic, John A; Carass, Aaron; Ye, Mao; Searson, Peter C; Prince, Jerry L

    2013-03-13

    With the rapid development of microscopy for cell imaging, there is a strong and growing demand for image analysis software to quantitatively study cell morphology. Automatic cell segmentation is an important step in image analysis. Despite substantial progress, there is still a need to improve the accuracy, efficiency, and adaptability to different cell morphologies. In this paper, we propose a fully automatic method for segmenting cells in fluorescence images of confluent cell monolayers. This method addresses several challenges through a combination of ideas. 1) It realizes a fully automatic segmentation process by first detecting the cell nuclei as initial seeds and then using a multi-object geometric deformable model (MGDM) for final segmentation. 2) To deal with different defects in the fluorescence images, the cell junctions are enhanced by applying an order-statistic filter and principal curvature based image operator. 3) The final segmentation using MGDM promotes robust and accurate segmentation results, and guarantees no overlaps and gaps between neighboring cells. The automatic segmentation results are compared with manually delineated cells, and the average Dice coefficient over all distinguishable cells is 0.88.

  18. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina

Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  19. Fault tree analysis of failure cause of crushing plant and mixing bed hall at Khoy cement factory in Iran☆

    PubMed Central

    Nouri.Gharahasanlou, Ali; Mokhtarei, Ashkan; Khodayarei, Aliasqar; Ataei, Mohammad

    2014-01-01

Evaluating and analyzing risk in the mining industry is a new approach for improving machinery performance. Reliability, safety, and maintenance management based on risk analysis can enhance the overall availability and utilization of mining technological systems. This study investigates the failure occurrence probability of the crushing and mixing bed hall department at the Azarabadegan Khoy cement plant by using the fault tree analysis (FTA) method. The results of the analysis over a 200 h operating interval show that the probability of failure occurrence for the crushing system, the conveyor system, and the crushing and mixing bed hall department is 73, 64, and 95 percent, respectively, with the conveyor belt subsystem found to be the most failure-prone. Finally, maintenance is proposed as a method to control and prevent the occurrence of failures. PMID:26779433
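
    For independent basic events, the top-event calculation behind such an FTA reduces to simple gate algebra: OR gates combine as 1 - prod(1 - p), AND gates as prod(p). The probabilities below are illustrative, not the plant's estimates:

        import math

        def gate_or(*p):                      # any input event causes the output
            return 1.0 - math.prod(1.0 - q for q in p)

        def gate_and(*p):                     # all input events required
            return math.prod(p)

        belt, motor, bearing = 0.40, 0.25, 0.30       # basic-event probabilities
        conveyor = gate_or(belt, motor, bearing)      # any one failure stops it
        crusher = gate_or(0.50, 0.30)
        department = gate_or(conveyor, crusher)       # series system at the top gate
        print(f"conveyor={conveyor:.2f}, department={department:.2f}")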

  20. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  1. Observer-Pattern Modeling and Slow-Scale Bifurcation Analysis of Two-Stage Boost Inverters

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Wan, Xiaojin; Li, Weijie; Ding, Honghui; Yi, Chuanzhi

    2017-06-01

This paper deals with the modeling and bifurcation analysis of two-stage Boost inverters. Since the nonlinear interaction between the source-stage converter and the load-stage inverter gives rise to a “hidden” second-harmonic current at the input of the downstream H-bridge inverter, an observer-pattern modeling method is proposed that removes the time variance originating from both the fundamental frequency and the hidden second harmonics in the derived averaged equations. Based on the proposed observer-pattern model, the underlying mechanism of the slow-scale instability behavior is uncovered with the help of the eigenvalue analysis method. Eigenvalue sensitivity analysis is then used to select key system parameters of the two-stage Boost inverter, and behavior boundaries are given to provide design-oriented information for optimizing the circuit. Finally, these theoretical results are verified by numerical simulations and a circuit experiment.
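
    The eigenvalue test behind the slow-scale stability analysis can be sketched as follows: evaluate the Jacobian of the averaged (observer-pattern) model at the operating point and check that all eigenvalues have negative real parts. The 2x2 matrix is purely illustrative:

        import numpy as np

        def is_slow_scale_stable(jacobian):
            """All eigenvalues of the averaged-model Jacobian in the left half-plane?"""
            eig = np.linalg.eigvals(jacobian)
            return bool(np.all(eig.real < 0)), eig

        J = np.array([[-120.0, 850.0],
                      [-430.0, -35.0]])   # Jacobian at an assumed equilibrium
        stable, eig = is_slow_scale_stable(J)
        print(stable, np.round(eig, 1))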

  2. Fault tree analysis of failure cause of crushing plant and mixing bed hall at Khoy cement factory in Iran.

    PubMed

    Nouri Gharahasanlou, Ali; Mokhtarei, Ashkan; Khodayarei, Aliasqar; Ataei, Mohammad

    2014-04-01

Evaluating and analyzing risk in the mining industry is a new approach for improving machinery performance. Reliability, safety, and maintenance management based on risk analysis can enhance the overall availability and utilization of mining technological systems. This study investigates the failure occurrence probability of the crushing and mixing bed hall department at the Azarabadegan Khoy cement plant by using the fault tree analysis (FTA) method. The results of the analysis over a 200 h operating interval show that the probability of failure occurrence for the crushing system, the conveyor system, and the crushing and mixing bed hall department is 73, 64, and 95 percent, respectively, with the conveyor belt subsystem found to be the most failure-prone. Finally, maintenance is proposed as a method to control and prevent the occurrence of failures.

  3. Creating a spatially-explicit index: a method for assessing the global wildfire-water risk

    NASA Astrophysics Data System (ADS)

    Robinne, François-Nicolas; Parisien, Marc-André; Flannigan, Mike; Miller, Carol; Bladon, Kevin D.

    2017-04-01

    The wildfire-water risk (WWR) has been defined as the potential for wildfires to adversely affect water resources that are important for downstream ecosystems and human water needs for adequate water quantity and quality, therefore compromising the security of their water supply. While tools and methods are numerous for watershed-scale risk analysis, the development of a toolbox for the large-scale evaluation of the wildfire risk to water security has only started recently. In order to provide managers and policy-makers with an adequate tool, we implemented a method for the spatial analysis of the global WWR based on the Driving forces-Pressures-States-Impacts-Responses (DPSIR) framework. This framework relies on the cause-and-effect relationships existing between the five categories of the DPSIR chain. As this approach heavily relies on data, we gathered an extensive set of spatial indicators relevant to fire-induced hydrological hazards and water consumption patterns by human and natural communities. When appropriate, we applied a hydrological routing function to our indicators in order to simulate downstream accumulation of potentially harmful material. Each indicator was then assigned a DPSIR category. We collapsed the information in each category using a principal component analysis in order to extract the most relevant pixel-based information provided by each spatial indicator. Finally, we compiled our five categories using an additive indexation process to produce a spatially-explicit index of the WWR. A thorough sensitivity analysis has been performed in order to understand the relationship between the final risk values and the spatial pattern of each category used during the indexation. For comparison purposes, we aggregated index scores by global hydrological regions, or hydrobelts, to get a sense of regional DPSIR specificities. This rather simple method does not necessitate the use of complex physical models and provides a scalable and efficient tool for the analysis of global water security issues.
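
    The indexation chain (per-category PCA followed by additive aggregation) can be sketched compactly; the random arrays below are placeholders for the real spatial indicator layers, one row per pixel.

    ```python
    # Sketch of the DPSIR indexation: collapse each category's indicators to
    # its first principal component, rescale to 0-1, and sum across categories.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import minmax_scale

    rng = np.random.default_rng(0)
    categories = {c: rng.random((1000, 4)) for c in ("D", "P", "S", "I", "R")}

    scores = []
    for name, indicators in categories.items():
        pc1 = PCA(n_components=1).fit_transform(minmax_scale(indicators))
        scores.append(minmax_scale(pc1.ravel()))  # per-pixel category score

    wwr_index = np.sum(scores, axis=0)  # additive indexation across categories
    ```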

  4. Design rainfall depth estimation through two regional frequency analysis methods in Hanjiang River Basin, China

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Ping; Yu, Chaofeng; Zhang, Xujie; Zhang, Qingqing; Xu, Xiao

    2012-02-01

    Hydrological predictions in ungauged basins are of significant importance for water resources management. In hydrological frequency analysis, regional methods are regarded as useful tools for estimating design rainfall/flood in areas with little data available. The purpose of this paper is to investigate the performance of two regional methods, namely the Hosking approach and the cokriging approach, in hydrological frequency analysis. These two methods are employed to estimate 24-h design rainfall depths in the Hanjiang River Basin, one of the largest tributaries of the Yangtze River, China. Validation is made by comparing the results to those calculated from the provincial handbook approach, which uses hundreds of rainfall gauge stations. Also for validation purposes, five hypothetically ungauged sites from the middle basin are chosen. The final results show that, compared to the provincial handbook approach, the Hosking approach often overestimated the 24-h design rainfall depths while the cokriging approach usually underestimated them. Overall, the Hosking approach produced more accurate results than the cokriging approach.

  5. Transverse heat transfer coefficient in the dual channel ITER TF CICCs Part II. Analysis of transient temperature responses observed during a heat slug propagation experiment

    NASA Astrophysics Data System (ADS)

    Lewandowska, Monika; Herzog, Robert; Malinowski, Leszek

    2015-01-01

    A heat slug propagation experiment in the final design dual channel ITER TF CICC was performed in the SULTAN test facility at EPFL-CRPP in Villigen PSI. We analyzed the data resulting from this experiment to determine the equivalent transverse heat transfer coefficient hBC between the bundle and the central channel of this cable. In the data analysis we used methods based on the analytical solutions of a problem of transient heat transfer in a dual-channel cable, similar to Renard et al. (2006) and Bottura et al. (2006). The observed experimental and other limits related to these methods are identified and possible modifications proposed. One result from our analysis is that the hBC values obtained with different methods differ by up to a factor of 2. We have also observed that the uncertainties of hBC in both methods considered are much larger than those reported earlier.

  6. Comparative analysis of laparoscopic and ultrasound-guided biopsy methods for gene expression analysis in transgenic goats.

    PubMed

    Melo, C H; Sousa, F C; Batista, R I P T; Sanchez, D J D; Souza-Fabjan, J M G; Freitas, V J F; Melo, L M; Teixeira, D I A

    2015-07-31

    The present study aimed to compare laparoscopic (LP) and ultrasound-guided (US) biopsy methods to obtain either liver or splenic tissue samples for ectopic gene expression analysis in transgenic goats. Tissue samples were collected from human granulocyte colony stimulating factor (hG-CSF)-transgenic bucks and submitted to real-time PCR for the endogenous genes (Sp1, Baff, and Gapdh) and the transgene (hG-CSF). Both LP and US biopsy methods were successful in obtaining liver and splenic samples that could be analyzed by PCR (i.e., sufficient sample sizes and RNA yield were obtained). Although the number of attempts made to obtain the tissue samples was similar (P > 0.05), LP procedures took considerably longer than the US method (P = 0.03). Finally, transgene transcripts were not detected in spleen or liver samples. Thus, for the phenotypic characterization of a transgenic goat line, investigation of ectopic gene expression can be made successfully by LP or US biopsy, avoiding the traditional approach of euthanasia.

  7. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
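
    A rough sketch of a PCA ensemble pipeline in this spirit is given below, assuming a bagged set of LDA classifiers votes on PCA-reduced epochs; the paper's exact ensemble design may differ, and the data here are synthetic stand-ins for Emotiv recordings.

    ```python
    # Sketch: PCA feature extraction feeding a bagged LDA ensemble for
    # target / non-target P300 epoch classification.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import BaggingClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.standard_normal((600, 14 * 64))   # hypothetical epochs: 14 ch x 64 samples
    y = rng.integers(0, 2, 600)               # target / non-target labels

    clf = make_pipeline(
        PCA(n_components=30),
        BaggingClassifier(LinearDiscriminantAnalysis(), n_estimators=15),
    )
    print(cross_val_score(clf, X, y, cv=5).mean())
    ```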

  8. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and the nonlinearity between the property and the spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and the unknown samples; then the Euclidean distance between the net analyte signal of each unknown sample and those of the calibration samples was calculated and used as a similarity index. According to the defined similarity index, a local calibration set was selected individually for each unknown sample. Finally, a local PLS regression model was built on the local calibration set of each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
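
    The local-selection step can be sketched as follows; for brevity the similarity is computed on the signal vectors directly, so the NAS projection that precedes it in the paper is assumed to have been applied already.

    ```python
    # Sketch: for each unknown sample, select the k most similar calibration
    # samples by Euclidean distance and fit a local PLS model on them.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def local_pls_predict(X_cal, y_cal, X_new, k=30, n_components=5):
        preds = []
        for x in X_new:
            d = np.linalg.norm(X_cal - x, axis=1)   # similarity index
            idx = np.argsort(d)[:k]                 # local calibration set
            pls = PLSRegression(n_components=n_components)
            pls.fit(X_cal[idx], y_cal[idx])
            preds.append(pls.predict(x.reshape(1, -1)).ravel()[0])
        return np.array(preds)

    # Usage: y_hat = local_pls_predict(X_cal, y_cal, X_unknown)
    ```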

  9. A sensitive continuum analysis method for gamma ray spectra

    NASA Technical Reports Server (NTRS)

    Thakur, Alakh N.; Arnold, James R.

    1993-01-01

    In this work we examine ways to improve the sensitivity of the analysis procedure for gamma ray spectra with respect to small differences in the continuum (Compton) spectra. The method developed is applied to analyze gamma ray spectra obtained from planetary mapping by the Mars Observer spacecraft launched in September 1992. Calculated Mars simulation spectra and actual thick target bombardment spectra have been taken as test cases. The principle of the method rests on the extraction of continuum information from Fourier transforms of the spectra. We study how a better estimate of the spectrum from larger regions of the Mars surface will improve the analysis for smaller regions with poorer statistics. Estimation of signal within the continuum is done in the frequency domain which enables efficient and sensitive discrimination of subtle differences between two spectra. The process is compared to other methods for the extraction of information from the continuum. Finally we explore briefly the possible uses of this technique in other applications of continuum spectra.

  10. Taguchi Method Applied in Optimization of Shipley SJR 5740 Positive Resist Deposition

    NASA Technical Reports Server (NTRS)

    Hui, A.; Blosiu, J. O.; Wiberg, D. V.

    1998-01-01

    Taguchi Methods of Robust Design present a way to optimize output process performance through an organized set of experiments based on orthogonal arrays. Analysis of variance and the signal-to-noise ratio are used to evaluate the contribution of each controllable process parameter to the process optimization. In the photoresist deposition process, there are numerous controllable parameters that can affect the surface quality and thickness of the final photoresist layer.
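
    The analysis step can be illustrated with a small sketch: compute a signal-to-noise ratio per run of the orthogonal array, then average S/N per factor level to rank contributions. The nominal-is-best S/N form and the numbers below are illustrative; the paper's target formulation may differ.

    ```python
    # Sketch of Taguchi S/N analysis for a nominal-is-best response
    # (e.g. resist thickness), on a toy 3-run slice of an orthogonal array.
    import numpy as np

    def sn_nominal_is_best(y):
        """S/N (dB) when a nominal response is best: 10*log10(mean^2/var)."""
        y = np.asarray(y, dtype=float)
        return 10.0 * np.log10(np.mean(y) ** 2 / np.var(y, ddof=1))

    # runs[i] holds repeated thickness measurements for run i of the array.
    runs = [[2.01, 2.03, 1.99], [2.10, 2.12, 2.08], [1.95, 1.97, 1.96]]
    sn = [sn_nominal_is_best(y) for y in runs]

    # Hypothetical level assignment of factor A per run; the level with the
    # highest mean S/N is the most robust setting for that factor.
    levels_A = np.array([0, 1, 2])
    for j in range(3):
        print(j, np.mean([s for s, l in zip(sn, levels_A) if l == j]))
    ```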

  11. A new technique for spectrophotometric determination of pseudoephedrine and guaifenesin in syrup and synthetic mixture.

    PubMed

    Riahi, Siavash; Hadiloo, Farshad; Milani, Seyed Mohammad R; Davarkhah, Nazila; Ganjali, Mohammad R; Norouzi, Parviz; Seyfi, Payam

    2011-05-01

    The predictive accuracy of different chemometric methods was compared when applied to ordinary UV spectra and first-order derivative spectra. Principal component regression (PCR) and partial least squares with one dependent variable (PLS1) and two dependent variables (PLS2) were applied to spectral data of a pharmaceutical formulation containing pseudoephedrine (PDP) and guaifenesin (GFN). The ability of derivative spectra to resolve the overlapping spectra of chlorpheniramine maleate was evaluated when multivariate methods are adopted for the analysis of two-component mixtures without any chemical pretreatment. The chemometric models were tested on an external validation dataset and finally applied to the analysis of pharmaceuticals. Significant advantages were found in the analysis of the real samples when the calibration models from derivative spectra were used. It should also be mentioned that the proposed method is simple and rapid, requires no preliminary separation steps, and can be used easily for the analysis of these compounds, especially in quality control laboratories. Copyright © 2011 John Wiley & Sons, Ltd.

  12. The p-version of the finite element method in incremental elasto-plastic analysis

    NASA Technical Reports Server (NTRS)

    Holzer, Stefan M.; Yosibash, Zohar

    1993-01-01

    Whereas the higher-order versions of the finite element method (the p- and hp-versions) are fairly well established as highly efficient methods for monitoring and controlling the discretization error in linear problems, little has been done to exploit their benefits in elasto-plastic structural analysis. Aspects of incremental elasto-plastic finite element analysis that are particularly amenable to improvement by the p-version are discussed. These theoretical considerations are supported by several numerical experiments. First, an example for which an analytical solution is available is studied. It is demonstrated that the p-version performs very well even in cycles of elasto-plastic loading and unloading, not only compared to the traditional h-version but also with respect to the exact solution. Finally, an example of considerable practical importance - the analysis of a cold-worked lug - is presented, which demonstrates how the modeling tools offered by higher-order finite element techniques can contribute to an improved approximation of practical problems.

  13. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed. The paper puts forward a retinal abnormality grading decision-making method, which was used in the actual analysis and evaluation of multiple OCT images, and shows the detailed analysis process on four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, can realize abnormality judgment of the target image and provide a reference for disease diagnosis.

  14. Quantitative Analysis of Ca, Mg, and K in the Roots of Angelica pubescens f. biserrata by Laser-Induced Breakdown Spectroscopy Combined with Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Wang, J.; Shi, M.; Zheng, P.; Xue, Sh.; Peng, R.

    2018-03-01

    Laser-induced breakdown spectroscopy has been applied to the quantitative analysis of Ca, Mg, and K in the roots of Angelica pubescens Maxim. f. biserrata Shan et Yuan used in traditional Chinese medicine. The Ca II 317.993 nm, Mg I 517.268 nm, and K I 769.896 nm spectral lines were chosen to set up calibration models for the analysis using the external standard and artificial neural network methods. The linear correlation coefficients of the predicted versus standard concentrations of six samples determined by the artificial neural network method are 0.9896, 0.9945, and 0.9911 for Ca, Mg, and K, respectively, which are better than those of the external standard method. The artificial neural network method also performs better than the external standard method with respect to the average and maximum relative errors, the average relative standard deviations, and most maximum relative standard deviations of the predicted concentrations of Ca, Mg, and K in the six samples. Finally, it is shown that the artificial neural network method outperforms the external standard method for the quantitative analysis of Ca, Mg, and K in the roots of Angelica pubescens.
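
    A hedged sketch of the ANN calibration route is shown below: line intensities in, concentrations out. The small MLP and the synthetic data are stand-ins, since the paper's network architecture is not specified here.

    ```python
    # Sketch: multi-output neural-network calibration mapping LIBS line
    # intensities (Ca II, Mg I, K I) to element concentrations.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)
    X = rng.random((40, 3))   # hypothetical intensities of the three lines
    C = X @ np.diag([5.0, 2.0, 8.0]) + 0.05 * rng.random((40, 3))  # toy truth

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
    )
    model.fit(X, C)             # calibration samples -> concentrations
    print(model.predict(X[:2])) # predicted Ca, Mg, K for new spectra
    ```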

  15. Final stress analysis report ultraviolet spectrometer S169

    NASA Technical Reports Server (NTRS)

    Cooper, S.

    1971-01-01

    The stress analysis report verifies the structural integrity of the Apollo S-169 UV-spectrometer experiment. The methods by which the various members were analyzed are described. A detailed summary of results for the individual structural elements appears in the form of a table of minimum margins of safety. No negative margins of safety were experienced. It is concluded that the component structure is more than adequate to withstand the environmental load conditions given in the design criteria.

  16. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils.

    PubMed

    Alam, Md Ferdous; Haque, Asadul

    2017-10-18

    An accurate determination of the particle-level fabric of granular soils from tomography data requires the maximum correct separation of particles. The popular marker-controlled watershed method is widely used to separate particles. However, the watershed method alone cannot produce the maximum separation of particles when the soil has been subjected to boundary stresses that crush particles. In this paper, a new separation method, named the Monash Particle Separation Method (MPSM), is introduced. The new method automatically determines the optimal contrast coefficient, based on a cluster evaluation framework, to produce the most accurate separation outcomes. Finally, the particles which could not be separated with the optimal contrast coefficient were separated by integrating cuboid markers, generated from clustering by Gaussian mixture models, into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for fabric analysis.
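
    For reference, the routine marker-controlled watershed that MPSM augments can be sketched with standard tools; the cluster-derived cuboid-marker extension itself is not reproduced here.

    ```python
    # Sketch of a routine marker-controlled watershed on a binary slice:
    # distance-transform peaks provide one marker per particle, then the
    # inverted distance map is flooded from those markers.
    import numpy as np
    from scipy import ndimage
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def separate_particles(binary):
        distance = ndimage.distance_transform_edt(binary)
        coords = peak_local_max(distance, min_distance=5, labels=binary)
        mask = np.zeros(distance.shape, dtype=bool)
        mask[tuple(coords.T)] = True
        markers, _ = ndimage.label(mask)
        return watershed(-distance, markers, mask=binary)
    ```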

  17. Harmonic analysis of electrified railway based on improved HHT

    NASA Astrophysics Data System (ADS)

    Wang, Feng

    2018-04-01

    In this paper, the causes and harms of harmonics in electric locomotive electrical systems are first studied and analyzed. Based on the characteristics of these harmonics, the Hilbert-Huang transform (HHT) method is introduced. Building on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform, the causes of, and solutions to, the endpoint effect and the modal aliasing problem in the HHT method are explored. For the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; for the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for this auxiliary harmonic. Finally, combining the suppression of the endpoint effect and of modal aliasing, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
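
    Once EMD has produced an intermittency-free IMF, the Hilbert step is straightforward; the sketch below recovers instantaneous frequency for a synthetic 150 Hz (third-harmonic) component, assuming the EMD decomposition and the endpoint-extension preprocessing have already been applied.

    ```python
    # Sketch of the Hilbert step of HHT: the analytic signal of an IMF gives
    # instantaneous amplitude and frequency, from which harmonic content is read.
    import numpy as np
    from scipy.signal import hilbert

    fs = 5000.0                              # sampling rate, Hz
    t = np.arange(0, 0.2, 1 / fs)
    imf = np.sin(2 * np.pi * 150 * t)        # stand-in 3rd-harmonic IMF (150 Hz)

    analytic = hilbert(imf)
    amplitude = np.abs(analytic)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(inst_freq.mean())                  # ~150 Hz away from the endpoints
    ```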

  18. The comparative analysis of the current-meter method and the pressure-time method used for discharge measurements in the Kaplan turbine penstocks

    NASA Astrophysics Data System (ADS)

    Adamkowski, A.; Krzemianowski, Z.

    2012-11-01

    The paper presents experience gathered during many years of using the current-meter and pressure-time methods for flow rate measurements in many hydropower plants. The integration techniques used in both methods differ from the recommendations contained in the relevant international standards, mainly the graphical and arithmetical ones. The results of a comparative analysis of both methods, applied at the same time during the hydraulic performance tests of two Kaplan turbines in a Polish hydropower plant, are presented in the final part of the paper. For the pressure-time method, the concrete penstocks of the tested turbines required installing special measuring instrumentation inside the penstock. The comparison has shown satisfactory agreement between the discharge measurements executed using the two methods: maximum differences between the discharge values did not exceed 1.0 % and the average differences were not greater than 0.5 %.
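
    The principle of the pressure-time method can be sketched from the momentum balance rho*L/A * dQ/dt = -dp(t): integrating the recorded pressure-difference trace over the valve-closure interval recovers the initial discharge. The friction and leakage corrections used in practice are omitted in this simplified sketch.

    ```python
    # Hedged sketch of the pressure-time (Gibson) discharge estimate.
    from scipy.integrate import trapezoid

    def discharge_pressure_time(dp, t, area, length, rho=1000.0, q_final=0.0):
        """Initial discharge Q0 [m^3/s] from the pressure-difference trace
        dp(t) [Pa] between two penstock sections of cross-section area [m^2]
        spaced length [m] apart, recorded during valve closure."""
        return q_final + (area / (rho * length)) * trapezoid(dp, t)
    ```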

  19. Ranking metrics in gene set enrichment analysis: do they matter?

    PubMed

    Zyla, Joanna; Marczyk, Michal; Weiner, January; Polanska, Joanna

    2017-05-12

    There exist many methods for describing the complex relation between changes of gene expression in molecular pathways or gene ontologies under different experimental conditions. Among them, Gene Set Enrichment Analysis seems to be one of the most commonly used (over 10,000 citations). An important parameter, which can affect the final result, is the choice of a metric for the ranking of genes; applying a default ranking metric may lead to poor results. In this work, 28 benchmark data sets were used to evaluate the sensitivity and false positive rate of gene set analysis for 16 different ranking metrics, including new proposals. Furthermore, the robustness of the chosen methods to sample size was tested. Using the k-means clustering algorithm, a group of four metrics with the highest performance in terms of overall sensitivity, overall false positive rate, and computational load was established, i.e., the absolute value of the Moderated Welch Test statistic, the Minimum Significant Difference, the absolute value of the Signal-To-Noise ratio, and the Baumgartner-Weiss-Schindler test statistic. In the case of false positive rate estimation, all selected ranking metrics were robust with respect to sample size. In the case of sensitivity, the absolute value of the Moderated Welch Test statistic and the absolute value of the Signal-To-Noise ratio gave stable results, while the Baumgartner-Weiss-Schindler and Minimum Significant Difference metrics showed better results for larger sample sizes. Finally, the Gene Set Enrichment Analysis method with all tested ranking metrics was parallelised and implemented in MATLAB, and is available at https://github.com/ZAEDPolSl/MrGSEA . Choosing a ranking metric in Gene Set Enrichment Analysis has a critical impact on the results of pathway enrichment analysis. The absolute value of the Moderated Welch Test has the best overall sensitivity and the Minimum Significant Difference has the best overall specificity of gene set analysis. When the number of non-normally distributed genes is high, using the Baumgartner-Weiss-Schindler test statistic gives better outcomes; it also finds more enriched pathways than the other tested metrics, which may lead to new biological discoveries.
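
    One of the four recommended metrics, the absolute Signal-To-Noise ratio, is simple to state; a sketch for a two-group expression matrix follows (genes in rows, samples in columns).

    ```python
    # Sketch of the absolute signal-to-noise ranking metric used before GSEA.
    import numpy as np

    def abs_signal_to_noise(expr, group):
        """|S2N| per gene: |mean1 - mean2| / (sd1 + sd2)."""
        g1, g2 = expr[:, group == 0], expr[:, group == 1]
        num = np.abs(g1.mean(axis=1) - g2.mean(axis=1))
        den = g1.std(axis=1, ddof=1) + g2.std(axis=1, ddof=1)
        return num / den

    # Genes are then ranked by this metric before running the enrichment test.
    ```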

  20. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background: To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods: Using a cost-effectiveness approach from a clinical perspective (i.e. risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of the events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate - the ratio of thrombosis to bleeding. Results: The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but the final results were influenced by the type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions: Using simulation and economic techniques, we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms, by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
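
    The simulation core can be sketched as below, with Beta draws standing in for the meta-analytic event proportions; the counts and the acceptability threshold are placeholders, not values from the paper.

    ```python
    # Sketch of the Monte Carlo risk-benefit comparison for two options.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000

    # Beta(events + 1, non-events + 1) draws for each hypothetical option.
    risk_a = rng.beta(12 + 1, 988 + 1, n)   # major bleeding, option A
    ben_a = rng.beta(60 + 1, 940 + 1, n)    # thrombosis averted, option A
    risk_b = rng.beta(20 + 1, 980 + 1, n)   # major bleeding, option B
    ben_b = rng.beta(45 + 1, 955 + 1, n)    # thrombosis averted, option B

    inc_risk = risk_a - risk_b
    inc_benefit = ben_a - ben_b
    net_benefit = inc_benefit - 2.0 * inc_risk  # threshold: 1 bleed ~ 2 clots
    print((net_benefit > 0).mean())             # P(option A is preferable)
    ```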

  1. Structural evolution in the crystallization of rapid cooling silver melt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z.A., E-mail: ze.tian@gmail.com; Laboratory for Simulation and Modelling of Particulate Systems School of Materials Science and Engineering, University of New South Wales, Sydney, NSW 2052; Dong, K.J.

    2015-03-15

    The structural evolution in a rapid cooling process of silver melt has been investigated at different scales by adopting several analysis methods. The results testify to Ostwald's rule of stages and to Frank's conjecture on the icosahedron, with many specific details. In particular, cluster-scale analysis by a recently developed method called LSCA (the Largest Standard Cluster Analysis) clarified the complex structural evolution that occurs in crystallization: different kinds of local clusters (such as ico-like (ico is the abbreviation of icosahedron), ico-bcc-like (bcc, body-centred cubic), bcc, and bcc-like structures) in turn reach their maximal numbers as temperature decreases. In a rather wide temperature range, the icosahedral short-range order (ISRO) demonstrates a saturated stage (where the amount of ico-like structures remains stable) that breeds metastable bcc clusters. As the precursor of crystallization, bcc clusters finally decrease after reaching their maximal number, resulting in the final solid being a mixture mainly composed of fcc/hcp (face-centred cubic and hexagonal close-packed) clusters and, to a lesser degree, bcc clusters. This detailed geometric picture of the crystallization of a liquid metal is believed to be useful for improving the fundamental understanding of the liquid-solid phase transition. - Highlights: • A comprehensive structural analysis is conducted focusing on crystallization. • The atoms involved in our analysis are more than 90% for all samples concerned. • A series of distinct intermediate states are found in the crystallization of silver melt. • A novel icosahedron-saturated state breeds the metastable bcc state.

  2. Interactive K-Means Clustering Method Based on User Behavior for Different Analysis Target in Medicine.

    PubMed

    Lei, Yang; Yu, Dai; Bin, Zhang; Yang, Yang

    2017-01-01

    Clustering algorithms, as a basis of data analysis, are widely used in analysis systems. However, with high-dimensional data, a clustering algorithm may overlook the business relations between dimensions, especially in the medical field; as a result, the clustering result may not meet the business goals of the users. If the clustering process can incorporate the knowledge of the users, that is, the doctor's knowledge or the analysis intent, the clustering result can be more satisfactory. In this paper, we propose an interactive K-means clustering method to improve the user's satisfaction with the result. The core of this method is to use the user's feedback on the clustering result to optimize it: a particle swarm optimization algorithm optimizes the parameters, especially the weight settings in the clustering algorithm, to make it reflect the user's business preference as closely as possible. After this parameter optimization and adjustment, the clustering result can be closer to the user's requirement. Finally, we use an example from breast cancer to test our method. The experiments show the better performance of our algorithm.
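
    The effect of the optimized weights can be sketched without the PSO loop: rescaling features before K-means is equivalent to clustering under a weighted Euclidean distance. The fixed weight vector below stands in for the PSO-tuned one.

    ```python
    # Sketch of weighted K-means: user-driven feature weights rescale the
    # space so that dimensions the doctor marks as important dominate.
    import numpy as np
    from sklearn.cluster import KMeans

    def weighted_kmeans(X, weights, n_clusters=3, seed=0):
        """K-means in a feature space rescaled by sqrt(weights), which is
        equivalent to using a weighted Euclidean distance."""
        Xw = X * np.sqrt(np.asarray(weights))
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
        return km.fit_predict(Xw)

    rng = np.random.default_rng(4)
    X = rng.random((200, 5))                         # e.g. clinical features
    labels = weighted_kmeans(X, [3.0, 1.0, 1.0, 0.2, 0.2])
    ```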

  3. Effects of gas temperature on nozzle damping experiments on cold-flow rocket motors

    NASA Astrophysics Data System (ADS)

    Sun, Bing-bing; Li, Shi-peng; Su, Wan-xing; Li, Jun-wei; Wang, Ning-fei

    2016-09-01

    In order to explore the impact of gas temperature on the nozzle damping characteristics of solid rocket motors, numerical simulations were carried out for an experimental motor at the Naval Ordnance Test Station at China Lake, California. Using the pulse decay method, different cases were studied numerically in Fluent together with UDFs (User Defined Functions). First, mesh sensitivity analysis and monitor position-independence analysis were carried out to validate the computer code. Then, the numerical method was further validated by comparing the calculated results with experimental data. Finally, the effects of gas temperature on the nozzle damping characteristics were studied. The results indicated that the gas temperature had cooperative effects on the nozzle damping, and that there were great differences between the cold-flow and hot-fire tests. Through discussion and analysis, it was found that the changes in mainstream velocity and natural acoustic frequency resulting from gas temperature were the key factors affecting the nozzle damping, while the alteration of the mean pressure had little effect. Thus, the high-pressure condition could be replaced by a low-pressure one to reduce the difficulty of the test. Finally, the relation between the cold-flow and hot-fire decay coefficients "alpha" was obtained.
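
    The pulse decay reduction itself amounts to fitting a damped oscillation to the post-pulse pressure trace and reading off the decay constant; a sketch on synthetic data follows (the model form is the standard damped-cosine one, not taken from the paper).

    ```python
    # Sketch: extract the decay constant alpha from a pulse-decay trace.
    import numpy as np
    from scipy.optimize import curve_fit

    def damped(t, A, alpha, f, phi):
        return A * np.exp(-alpha * t) * np.cos(2 * np.pi * f * t + phi)

    t = np.linspace(0, 0.1, 2000)
    rng = np.random.default_rng(5)
    p = damped(t, 1.0, 35.0, 300.0, 0.2) + 0.02 * rng.standard_normal(t.size)

    popt, _ = curve_fit(damped, t, p, p0=[1.0, 20.0, 280.0, 0.0])
    print("decay constant alpha =", popt[1])  # 1/s; compare cold vs hot cases
    ```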

  4. Design of A Cyclone Separator Using Approximation Method

    NASA Astrophysics Data System (ADS)

    Sin, Bong-Su; Choi, Ji-Won; Lee, Kwon-Hee

    2017-12-01

    A separator is a device installed in industrial applications to separate mixed substances. The separator of interest in this research is a cyclone type, which is used to separate a steam-brine mixture in a geothermal plant. The most important performance measure of the cyclone separator is the collection efficiency, which in this study is predicted by CFD (Computational Fluid Dynamics) analysis. This research defines six shape design variables to maximize the collection efficiency; thus, the collection efficiency is set as the objective function in the optimization process. Since the CFD analysis requires a great deal of computation time, it is impractical to obtain the optimal solution by coupling it directly to a gradient-based optimization algorithm. Instead, two approximation methods are introduced to obtain an optimum design: an L18 orthogonal array is adopted as the DOE method, and the kriging interpolation method is adopted to generate the metamodel for the collection efficiency. Based on the 18 analysis results, the relative importance of each variable to the collection efficiency is obtained through ANOVA (analysis of variance). The final design is suggested considering the results obtained from the two optimization methods. The fluid flow analysis of the cyclone separator is conducted using the commercial CFD software ANSYS-CFX.
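
    The kriging metamodel step can be sketched with a Gaussian-process surrogate; the 18 sample points below are placeholders for the L18-array CFD results.

    ```python
    # Sketch: fit a kriging (Gaussian-process) surrogate to 18 CFD samples,
    # then search the cheap surrogate instead of running more CFD.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(6)
    X = rng.random((18, 6))   # 6 shape variables, 18 L18-array runs
    y = rng.random(18)        # placeholder CFD collection efficiencies

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                                  normalize_y=True)
    gp.fit(X, y)

    cand = rng.random((10_000, 6))           # surrogate-based design search
    best = cand[np.argmax(gp.predict(cand))]
    ```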

  5. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous sensitivity analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on the system state is of value, as events can cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The more often an input variable occurs during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with an existing entropy-based sensitivity analysis method. The results showed a 10% improvement in computational efficiency with no compromise in accuracy; the computational time to perform the sensitivity analysis was 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. A major purpose of such rigs is to drill boreholes to explore oil or gas reservoirs, with the final scope of recovering their content, both onshore and offshore. Drilling a well is always guided by technical, economic, and security constraints to protect crew, equipment, and the environment from injury, damage, and pollution. Although risk assessment and local practice provide a high degree of security, uncertainty arises from the behaviour of the formation, which may cause crucial situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis to predict, and thus prevent, such crises; the proposed method supports the identification of the data necessary for that.

  6. Automated Production of Movies on a Cluster of Computers

    NASA Technical Reports Server (NTRS)

    Nail, Jasper; Le, Duong; Nail, William L.; Nail, William

    2008-01-01

    A method of accelerating and facilitating production of video and film motion-picture products, and software and generic designs of computer hardware to implement the method, are undergoing development. The method provides for automation of most of the tedious and repetitive tasks involved in editing and otherwise processing raw digitized imagery into final motion-picture products. The method was conceived to satisfy requirements, in industrial and scientific testing, for rapid processing of multiple streams of simultaneously captured raw video imagery into documentation in the form of edited video imagery and video derived data products for technical review and analysis. In the production of such video technical documentation, unlike in production of motion-picture products for entertainment, (1) it is often necessary to produce multiple video derived data products, (2) there are usually no second chances to repeat acquisition of raw imagery, (3) it is often desired to produce final products within minutes rather than hours, days, or months, and (4) consistency and quality, rather than aesthetics, are the primary criteria for judging the products. In the present method, the workflow has both serial and parallel aspects: processing can begin before all the raw imagery has been acquired, each video stream can be subjected to different stages of processing simultaneously on different computers that may be grouped into one or more cluster(s), and the final product may consist of multiple video streams. Results of processing on different computers are shared, so that workers can collaborate effectively.

  7. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate analysis is the identification of the types of investment risk and the effective prevention of each type. As the financial crisis sweeps the world, the real estate industry faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and finally a fuzzy comprehensive evaluation method is presented. The method is not only theoretically sound but also reliable in application, providing an effective means of real estate investment risk assessment and guidance for investors on risk factors and forecasts.
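
    A minimal numeric sketch of fuzzy comprehensive evaluation follows: a weight vector over risk factors is composed with a membership matrix over risk grades, and the grade with the highest resulting membership is the verdict. All numbers are illustrative.

    ```python
    # Sketch of fuzzy comprehensive evaluation: B = W . R.
    import numpy as np

    weights = np.array([0.4, 0.3, 0.2, 0.1])  # e.g. market, policy, finance, other
    # R[i, j]: degree to which risk factor i belongs to grade j
    # (grades: low, medium, high).
    R = np.array([
        [0.2, 0.5, 0.3],
        [0.1, 0.6, 0.3],
        [0.3, 0.4, 0.3],
        [0.5, 0.4, 0.1],
    ])

    B = weights @ R                            # composite membership vector
    grades = ["low", "medium", "high"]
    print(grades[int(np.argmax(B))], B)
    ```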

  8. Structural analysis of zeolite NaA synthesized by a cost-effective hydrothermal method using kaolin and its use as water softener.

    PubMed

    Loiola, A R; Andrade, J C R A; Sasaki, J M; da Silva, L R D

    2012-02-01

    Zeolite 4A (LTA) has been successfully synthesized by a hydrothermal method, where kaolin was used as silica and alumina source. The synthesized zeolite was characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), laser granulometry, and FTIR spectroscopy. XRD data from the Rietveld refinement method confirmed only one crystallographic phase. Zeolite A morphology was observed by SEM analysis, and it showed well-defined crystals with slightly different sizes but with the same cubic shape. Particle size distribution of the crystals was confirmed by laser granulometry, whereas FTIR spectroscopy revealed significant structural differences between the starting material and the final zeolite product used as water softener. Copyright © 2010 Elsevier Inc. All rights reserved.

  9. Efficient three-dimensional resist profile-driven source mask optimization optical proximity correction based on Abbe-principal component analysis and Sylvester equation

    NASA Astrophysics Data System (ADS)

    Lin, Pei-Chun; Yu, Chun-Chang; Chen, Charlie Chung-Ping

    2015-01-01

    As one of the critical stages of a very-large-scale integration fabrication process, postexposure bake (PEB) plays a crucial role in determining the final three-dimensional (3-D) profiles and lessening standing-wave effects. However, full 3-D chemically amplified resist simulation is not widely adopted during postlayout optimization due to the long run-time and huge memory usage. An efficient method is proposed to simulate the PEB while considering standing-wave effects and resolution enhancement techniques, such as source mask optimization and subresolution assist features, based on the Sylvester equation and the Abbe-principal component analysis method. Simulation results show that our algorithm is 20× faster than the conventional Gaussian convolution method.
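
    The Sylvester-equation building block is available directly in SciPy; the sketch below solves A X + X B = Q for a separable diffusion-like operator, which is the kind of step such a method relies on (the full PEB model and the Abbe-PCA decomposition are not reproduced here).

    ```python
    # Sketch: solve the Sylvester equation A X + X B = Q with SciPy.
    import numpy as np
    from scipy.linalg import solve_sylvester

    n = 64
    # 1-D Laplacian stencils along x and y (toy separable diffusion operator).
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    B = A.copy()
    Q = np.random.default_rng(7).random((n, n))

    X = solve_sylvester(A, B, Q)          # satisfies A @ X + X @ B = Q
    print(np.allclose(A @ X + X @ B, Q))
    ```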

  10. Intensity-Based Registration for Lung Motion Estimation

    NASA Astrophysics Data System (ADS)

    Cao, Kunlin; Ding, Kai; Amelon, Ryan E.; Du, Kaifang; Reinhardt, Joseph M.; Raghavan, Madhavan L.; Christensen, Gary E.

    Image registration plays an important role within pulmonary image analysis. The task of registration is to find the spatial mapping that brings two images into alignment. Registration algorithms designed for matching 4D lung scans or two 3D scans acquired at different inflation levels can catch the temporal changes in position and shape of the region of interest. Accurate registration is critical to post-analysis of lung mechanics and motion estimation. In this chapter, we discuss lung-specific adaptations of intensity-based registration methods for 3D/4D lung images and review approaches for assessing registration accuracy. Then we introduce methods for estimating tissue motion and studying lung mechanics. Finally, we discuss methods for assessing and quantifying specific volume change, specific ventilation, strain/stretch information and lobar sliding.

  11. Combined FDTD-Monte Carlo analysis and a novel design for ZnO scintillator rods in polycarbonate membrane for X-ray imaging

    NASA Astrophysics Data System (ADS)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar; Mohammadi, Mohammad

    2017-05-01

    A combination of Finite Difference Time Domain (FDTD) and Monte Carlo (MC) methods is proposed for the simulation and analysis of ZnO microscintillators grown in a polycarbonate membrane. A planar 10 keV X-ray source irradiating the detector is simulated by the MC method, which provides the amount of absorbed X-ray energy in the assembly. The transport of the generated UV scintillation light and its propagation in the detector were studied by the FDTD method. Detector responses for different probable scintillation sites and for X-ray source energies from 10 to 25 keV are reported. Finally, a tapered geometry for the scintillators is proposed, which shows enhanced spatial resolution in comparison to the cylindrical geometry for imaging applications.

  12. Fractal analysis of GPS time series for early detection of disastrous seismic events

    NASA Astrophysics Data System (ADS)

    Filatov, Denis M.; Lyubushin, Alexey A.

    2017-03-01

    A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust - we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been identified in numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally, we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and we provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
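
    For orientation, conventional DFA, the baseline that the new method modifies, can be sketched in a few lines: the scaling exponent is the slope of log F(s) against log s for the detrended fluctuation F(s).

    ```python
    # Compact sketch of conventional (first-order) DFA.
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        y = np.cumsum(x - np.mean(x))          # integrated profile
        F = []
        for s in scales:
            n_seg = len(y) // s
            ms = []
            for i in range(n_seg):
                seg = y[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                ms.append(np.mean((seg - trend) ** 2))
            F.append(np.sqrt(np.mean(ms)))
        slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
        return slope                           # ~0.5 white noise, ~1.5 Brownian

    print(dfa_exponent(np.random.default_rng(8).standard_normal(4096)))
    ```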

  13. Efficient sensitivity analysis method for chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Liao, Haitao

    2016-05-01

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time-averaged quantities of chaotic dynamical systems. The key idea is to recast the time-averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time-averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. Applying the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient which depends on the final state of the Lagrange multipliers. The LU factorization technique used to calculate the Lagrange multipliers improves both convergence behavior and computational expense. Numerical experiments on a set of problems selected from the literature are presented to illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.

  14. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    NASA Astrophysics Data System (ADS)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  15. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold change criteria are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next generation sequencing RNA-seq data analysis.

  16. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: McDonnell-Douglas Helicopter Company achievements

    NASA Technical Reports Server (NTRS)

    Toossi, Mostafa; Weisenburger, Richard; Hashemi-Kia, Mostafa

    1993-01-01

    This paper presents a summary of some of the work performed by McDonnell Douglas Helicopter Company under the NASA Langley-sponsored rotorcraft structural dynamics program known as DAMVIBS (Design Analysis Methods for VIBrationS). A set of guidelines is presented that is applicable to the dynamic modeling, analysis, testing, and correlation of both helicopter airframes and a large variety of structural finite element models. Utilization of these guidelines and the key features of their application to vibration modeling of helicopter airframes are discussed. Correlation studies with test data, together with the development and application of a set of efficient finite element model checkout procedures, are demonstrated on a large helicopter airframe finite element model. Finally, the lessons learned and the benefits resulting from this program are summarized.

  17. Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.

    PubMed

    Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed

    2018-01-01

    The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge base for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system, 3D GAIT, followed by a discussion of how studies in the field of gait biomechanics fit the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning methods-based gait analysis that can be applied to big data analytics. These modern biomechanical gait analysis methods comprise several main modules, such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.

  18. A Two-Stage Method to Determine Optimal Product Sampling considering Dynamic Potential Market

    PubMed Central

    Hu, Zhineng; Lu, Wei; Han, Bing

    2015-01-01

    This paper develops an optimization model for the diffusion effects of free samples under dynamic changes in the potential market, based on the characteristics of an independent product, and presents a two-stage method to determine the sampling level. The impact analysis of the key factors on the sampling level shows that an increase in the external or internal coefficient has a negative influence on the sampling level; the changing rate of the potential market has no significant influence, whereas repeat purchasing has a positive one. Using logistic analysis and regression analysis, a global sensitivity analysis examines the interaction of all parameters, which provides a two-stage method to estimate the impact of the relevant parameters when their values are inaccurate and to construct a 95% confidence interval for the predicted sampling level. Finally, the paper provides the operational steps to improve the accuracy of the parameter estimation and an innovative way to estimate the sampling level. PMID:25821847

  19. Si-O-C materials prepared with a sol-gel method for negative electrode of lithium battery

    NASA Astrophysics Data System (ADS)

    Liu, Xiang; Xie, Kai; Zheng, Chun-man; Wang, Jun; Jing, Zhaoqing

    2012-09-01

    A sol-gel method is employed to prepare high-capacity Si-O-C materials. A blend of polysiloxane and divinylbenzene is uniformly dispersed in an ethanol solution of triethoxysilane and diethoxymethylsilane, which is then hydrolyzed, crosslinked, and finally pyrolyzed at 1000 °C in a hydrogen atmosphere to obtain the final composite materials. The resultant materials, as indicated by elemental analysis, mainly consist of a Si-O-C glass phase, in which the dominant silicon species is identified as SiO4 units by 29Si magic angle spinning nuclear magnetic resonance and Si (2p) X-ray photoelectron spectroscopy. The Si-O-C materials exhibit a stable reversible capacity of ca. 900 mAh g-1, originating from lithium storage in SiO4 units, with a coulombic efficiency of 98.5%.

  20. Research summary

    NASA Technical Reports Server (NTRS)

    Siegmann, W. L.; Robertson, J. S.; Jacobson, M. J.

    1993-01-01

    The final report for progress during the period from 15 Nov. 1988 to 14 Nov. 1991 is presented. Research on methods for analysis of sound propagation through the atmosphere and on results obtained from application of our methods are summarized. Ten written documents of NASA research are listed, and these include publications, manuscripts accepted, submitted, or in preparation for publication, and reports. Twelve presentations of results, either at scientific conferences or at research or technical organizations, since the start of the grant period are indicated. Names of organizations to which software produced under the grant was distributed are provided, and the current arrangement whereby the software is being distributed to the scientific community is also described. Finally, the names of seven graduate students who worked on NASA research and received Rensselaer degrees during the grant period, along with their current employers are given.

  1. Maintenance method and its critical issues for a fast-ignition laser fusion reactor based on a dry wall chamber

    NASA Astrophysics Data System (ADS)

    Someya, Y.; Matsumoto, T.; Okano, K.; Asaoka, Y.; Hiwatari, R.; Goto, T.; Ogawa, Y.

    2008-05-01

    A neutronics analysis has been carried out for a feasibility study of the FALCON-D concept using the Monte Carlo N-Particle transport code (MCNP), in order to inspect the cooling performance of the in-vessel and ex-vessel components and of a connection pipe between the Vacuum Vessel and the reactor room. The nuclear heating rate in the Vacuum Vessel was at the same level as that of the NBI duct of ITER. The temperature of the connection pipe was found to be 345 °C, which is below the melting point of the structural material (F82H). Moreover, the radiation damage of the final optics was also investigated. We propose a sliding changer concept for replacement; this method could be adapted to replace the final optics system once per FPY cycle.

  2. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated using principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content of these samples measured by a Kjeldahl test. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could further be used as a method to predict the final protein content.

  3. Investigation of computational and spectral analysis methods for aeroacoustic wave propagation

    NASA Technical Reports Server (NTRS)

    Vanel, Florence O.

    1995-01-01

Most computational fluid dynamics (CFD) schemes are not sufficiently accurate for solving aeroacoustics problems, whose wave amplitudes are several orders of magnitude smaller, yet whose frequencies are larger, than the flow field variations generating the sound. Hence, a computational aeroacoustics (CAA) algorithm should have minimal dispersion and dissipation. A dispersion-relation-preserving (DRP) scheme is therefore applied to solve the linearized Euler equations in order to simulate the propagation of three types of waves, namely acoustic, vorticity, and entropy waves. The scheme is derived using an optimization procedure to ensure that the numerical derivatives preserve the wave number and angular frequency of the partial differential equations being discretized. Consequently, simulated waves propagate with the correct wave speeds and exhibit their appropriate properties. A set of radiation and outflow boundary conditions, compatible with the DRP scheme and derived from the asymptotic solutions of the governing equations, is also implemented. Numerical simulations are performed to test the effectiveness of the DRP scheme and its boundary conditions. The computed solutions are shown to agree favorably with the exact solutions. The major restriction appears to be that the dispersion relations can be preserved only for waves with wavelengths longer than four or five spacings. The boundary conditions are found to be transparent to outgoing disturbances; however, when the disturbance source is placed close to a boundary, small acoustic reflections start appearing. CAA generates enormous amounts of temporal data which need to be reduced to understand the physical problem being simulated. Spectral analysis is one approach that helps in extracting information which often cannot be easily interpreted in the time domain. Thus, three different methods for the spectral analysis of numerically generated aeroacoustic data are studied. First, the capabilities of two traditional methods for spectral analysis, namely the Blackman-Tukey method and the periodogram method, are compared in estimating the spectra of a simple-periodic process. The periodogram is then applied to analyze transitory-deterministic processes. Finally, these two methods are compared with a more recent method, referred to as the Weighted-Overlapped-Segment-Averaging (WOSA) method, in estimating the spectra of a chaotic (random-like) process. For the spectral analysis of data generated by the simple-periodic process, the periodogram method is found to give a better estimate of steep-sloped spectra than the Blackman-Tukey method; for this problem, the Hanning window also performs better with the periodogram method than with the Blackman-Tukey method. Finally, for the spectral analysis of data generated by the chaotic process, the periodogram method does not perform well, whereas the WOSA and Blackman-Tukey methods give equivalently good results.
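    For two of the spectral estimators compared in this abstract, the snippet below contrasts a plain periodogram with the WOSA approach, which SciPy implements as Welch's method; the test signal and parameters are illustrative only.

    ```python
    # Illustrative comparison: simple periodogram vs. the
    # Weighted-Overlapped-Segment-Averaging (WOSA / Welch) estimator.
    import numpy as np
    from scipy.signal import periodogram, welch

    fs = 1000.0                        # sampling rate (Hz), synthetic signal
    t = np.arange(0, 4.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 80 * t) + 0.5 * np.random.randn(t.size)  # tone + noise

    f_per, P_per = periodogram(x, fs=fs, window="hann")
    f_wosa, P_wosa = welch(x, fs=fs, window="hann", nperseg=512, noverlap=256)
    # The periodogram has finer frequency resolution but high variance;
    # WOSA trades resolution for a much smoother, lower-variance estimate.
    ```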

  4. Geomatic Archaeological Reconstruction and a Hybrid Viewer for the Archaeological Site of Cáparra (Spain)

    NASA Astrophysics Data System (ADS)

    Tejeda-Sánchez, C.; Muñoz-Nieto, A.; Rodríguez-Gonzálvez, P.

    2018-05-01

Visualization and analysis are usually the final steps in Geomatics. This paper shows the workflow followed to set up a hybrid 3D archaeological viewer. Data acquisition for the site survey was done by means of low-cost close-range photogrammetric methods. With the aim of serving not only the general public but also technicians, a wide range of geomatic products has been produced (2D plans, 3D models, orthophotos, CAD models derived from vectorization, virtual anastylosis, and cross-sections). Finally, all these products have been integrated into a three-dimensional archaeological information system. The hybrid archaeological viewer allows a metric, quality-driven approach to the scientific analysis of the ruins and, thanks to the implementation of a queryable database, extends the benefits of an ordinary topographic survey.

  5. Final analysis of proton form factor ratio data at Q2=4.0, 4.8, and 5.6 GeV2

    NASA Astrophysics Data System (ADS)

    Puckett, A. J. R.; Brash, E. J.; Gayou, O.; Jones, M. K.; Pentchev, L.; Perdrisat, C. F.; Punjabi, V.; Aniol, K. A.; Averett, T.; Benmokhtar, F.; Bertozzi, W.; Bimbot, L.; Calarco, J. R.; Cavata, C.; Chai, Z.; Chang, C.-C.; Chang, T.; Chen, J. P.; Chudakov, E.; De Leo, R.; Dieterich, S.; Endres, R.; Epstein, M. B.; Escoffier, S.; Fissum, K. G.; Fonvieille, H.; Frullani, S.; Gao, J.; Garibaldi, F.; Gilad, S.; Gilman, R.; Glamazdin, A.; Glashausser, C.; Gomez, J.; Hansen, J.-O.; Higinbotham, D.; Huber, G. M.; Iodice, M.; de Jager, C. W.; Jiang, X.; Khandaker, M.; Kozlov, S.; Kramer, K. M.; Kumbartzki, G.; LeRose, J. J.; Lhuillier, D.; Lindgren, R. A.; Liyanage, N.; Lolos, G. J.; Margaziotis, D. J.; Marie, F.; Markowitz, P.; McCormick, K.; Michaels, R.; Milbrath, B. D.; Nanda, S. K.; Neyret, D.; Piskunov, N. M.; Ransome, R. D.; Raue, B. A.; Roché, R.; Rvachev, M.; Salgado, C.; Sirca, S.; Sitnik, I.; Strauch, S.; Todor, L.; Tomasi-Gustafsson, E.; Urciuoli, G. M.; Voskanyan, H.; Wijesooriya, K.; Wojtsekhowski, B. B.; Zheng, X.; Zhu, L.

    2012-04-01

Precise measurements of the proton electromagnetic form factor ratio R=μpGEp/GMp using the polarization transfer method at Jefferson Lab have revolutionized the understanding of nucleon structure by revealing the strong decrease of R with momentum transfer Q2 for Q2≳1 GeV2, in strong disagreement with previous extractions of R from cross-section measurements. In particular, the polarization transfer results have exposed the limits of applicability of the one-photon-exchange approximation and highlighted the role of quark orbital angular momentum in the nucleon structure. The GEp-II experiment in Jefferson Lab's Hall A measured R at four Q2 values in the range 3.5 GeV2 ≤ Q2 ≤ 5.6 GeV2. A possible discrepancy between the originally published GEp-II results and more recent measurements at higher Q2 motivated a new analysis of the GEp-II data. This article presents the final results of the GEp-II experiment, including details of the new analysis, an expanded description of the apparatus, and an overview of theoretical progress since the original publication. The key result of the final analysis is a systematic increase in the results for R, improving the consistency of the polarization transfer data in the high-Q2 region. This increase is the result of an improved selection of elastic events which largely removes the systematic effect of the inelastic contamination, underestimated by the original analysis.

  6. Methods for network meta-analysis of continuous outcomes using individual patient data: a case study in acupuncture for chronic pain.

    PubMed

    Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh

    2016-10-06

Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis, and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes, introducing models based on the analysis of covariance (ANCOVA) framework. Comparisons are made between this approach and the change-score and final-score-only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised; consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing analysis of covariance models to be applied instead, thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated as the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methodological development is required to address the challenge of analysing aggregate-level data in the presence of baseline imbalance.
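    A minimal sketch of the ANCOVA formulation for a single pairwise comparison is shown below, assuming individual patient data with a baseline score, a treatment indicator, and a final score; the column names and simulated data are illustrative, not the acupuncture dataset.

    ```python
    # ANCOVA sketch: model the final score on treatment, adjusting for baseline.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 200
    baseline = rng.normal(50, 10, n)              # baseline pain score
    treat = rng.integers(0, 2, n)                 # 0 = control, 1 = treatment
    final = baseline * 0.6 - 5.0 * treat + rng.normal(0, 8, n)

    df = pd.DataFrame({"final": final, "baseline": baseline, "treat": treat})
    fit = smf.ols("final ~ baseline + treat", data=df).fit()
    print(fit.params["treat"])  # baseline-adjusted treatment effect estimate
    ```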

  7. Development of a multianalyte method based on micro-matrix-solid-phase dispersion for the analysis of fragrance allergens and preservatives in personal care products.

    PubMed

    Celeiro, Maria; Guerra, Eugenia; Lamas, J Pablo; Lores, Marta; Garcia-Jares, Carmen; Llompart, Maria

    2014-05-30

An effective, simple and low-cost sample preparation method based on matrix solid-phase dispersion (MSPD) followed by gas chromatography-mass spectrometry (GC-MS) or gas chromatography-triple quadrupole-mass spectrometry (GC-MS/MS) has been developed for the rapid simultaneous determination of 38 cosmetic ingredients: 25 fragrance allergens and 13 preservatives. All target substances are frequently used in cosmetics and personal care products, and they are subject to use restrictions or labeling requirements under the EU Cosmetics Directive. The extraction procedure was optimized on real non-spiked rinse-off and leave-on cosmetic products by means of experimental designs. The final miniaturized process required only 0.1 g of sample and 1 mL of organic solvent, yielding a final extract ready for analysis. The micro-MSPD method was validated, showing satisfactory performance in both GC-MS and GC-MS/MS analysis. The use of GC coupled to triple quadrupole mass detection allowed very low detection limits (low ng g(-1)) to be reached while improving method selectivity. In an attempt to improve the chromatographic analysis of preservatives, the inclusion of a derivatization step was also assessed. The proposed method was applied to a broad range of cosmetics and personal care products (shampoos, body milk, moisturizing milk, toothpaste, hand creams, gloss lipstick, sunblock, deodorants and liquid soaps, among others), demonstrating the widespread use of these substances. Concentration levels ranged from sub-parts-per-million to parts-per-thousand. The number of target fragrance allergens per sample was quite high (up to 16). Several fragrances (linalool, farnesol, hexylcinnamal, and benzyl benzoate) were detected at levels >0.1% (1,000 μg g(-1)). As regards preservatives, phenoxyethanol was the most frequently found additive, reaching quite high concentrations (>1,500 μg g(-1)) in five cosmetic products. BHT was detected in eight samples, in two of them (a baby care product and a lipstick) at high concentrations (>1,000 μg g(-1)). Methyl paraben was also found at high levels (>1,700 μg g(-1)) in three leave-on samples. Finally, triclosan was found at the maximum concentration limit (0.3%) laid down by the European regulation in two deodorant samples, and the total paraben concentration was close to the maximum permitted (0.8%) in one leave-on sample (body milk). Copyright © 2014 Elsevier B.V. All rights reserved.

  8. The study on the Layout of the Charging Station in Chengdu

    NASA Astrophysics Data System (ADS)

    Cai, yun; Zhang, wanquan; You, wei; Mao, pan

    2018-03-01

In this paper, the factors affecting the layout of electric vehicle charging stations are comprehensively analyzed, taking into account the principles of charging station layout. Queuing theory from operations research is used to establish a mathematical model, and the number of sites is optimized on the principles of saving resources and owner convenience. Central-place theory is used to determine the service radius, the gravity method is used to determine the initial locations, and finally the centre-of-gravity method is used to fix each charging station's location, as sketched below.
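    A minimal sketch of the centre-of-gravity siting step, under the assumption that the station location is the demand-weighted centroid of charging demand points, optionally refined with inverse-distance weights (a Weiszfeld-style iteration); the coordinates and demands are invented.

    ```python
    # Centre-of-gravity location sketch for one charging station.
    import numpy as np

    xy = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 9.0], [7.0, 6.0]])  # demand points
    w = np.array([120.0, 80.0, 60.0, 140.0])   # charging demand at each point

    site = (w[:, None] * xy).sum(axis=0) / w.sum()  # initial gravity location
    for _ in range(50):                             # refine with 1/distance weights
        d = np.linalg.norm(xy - site, axis=1)
        coef = w / np.maximum(d, 1e-9)
        site = (coef[:, None] * xy).sum(axis=0) / coef.sum()
    print(site)
    ```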

  9. A Baseline Analysis of In-Transit Shipping Time into and Through the Fifth Fleet Area of Operation With Respect to the Supply Chain Last Nautical Mile

    DTIC Science & Technology

    2011-12-01

Figures: Final Delivery Methods; C-2 Greyhound (From USN, 2004a); SH-60 Seahawk conducting VERTREP (From USN, 2004b). Excerpt: COD aircraft are C-2 Greyhounds that can (only) deliver directly to aircraft carriers. The C-2 has a 1,300 nm range and a payload of… The MSC ships load material in Bahrain, Jebel

  10. Quantitative analysis of phylloquinone (vitamin K1) in soy bean oils by high-performance liquid chromatography.

    PubMed

    Zonta, F; Stancher, B

    1985-07-19

A high-performance liquid chromatographic method for determining phylloquinone (vitamin K1) in soy bean oils is described. Resolution of vitamin K1 from interfering matrix peaks was obtained after enzymatic digestion, extraction and liquid-solid chromatography on alumina. Isocratic reversed-phase chromatography with UV detection was used in the final stage. Quantitation was carried out by the standard addition method, and the recovery of the whole procedure was 88.2%.
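    The standard-addition quantitation mentioned in the final step can be sketched as a straight-line fit of peak area against spiked amount, extrapolated to zero response; the numbers below are invented for illustration.

    ```python
    # Standard addition: fit response vs. added analyte, extrapolate to zero.
    import numpy as np

    added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # ug of K1 added (spikes)
    area = np.array([1.20, 1.85, 2.55, 3.18, 3.86])  # measured peak areas

    slope, intercept = np.polyfit(added, area, 1)
    c0 = intercept / slope    # original amount = magnitude of the x-intercept
    print(f"estimated vitamin K1 in sample: {c0:.2f} ug")
    ```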

  11. Dosimetry and microdosimetry using COTS ICs: A comparative study

    NASA Technical Reports Server (NTRS)

    Scheick, L.; Swift, G.; Guertin, S.; Roth, D.; McNulty, P.; Nguyen, D.

    2002-01-01

A new method using an array of MOS transistors for measuring the dose absorbed from ionizing radiation is compared to previous dosimetric methods. The accuracy and precision of dosimetry based on COTS SRAMs, DRAMs, and WPROMs are compared and contrasted, and applications of these devices in various space missions are discussed. TID results are presented in this summary; microdosimetric results will be added to the full paper. Finally, an analysis of the optimal conditions for a digital dosimeter is presented.

  12. [Application of immunologic methods to the analysis of bio-leaching bacteria].

    PubMed

    Coto, O; Fernández, A I; León, T; Rodríguez, D

    1994-09-01

Pure cultures of Thiobacillus ferrooxidans and mixed cultures of Thiobacillus ferrooxidans and Leptospirillum ferrooxidans isolated from the Matahambre mine (Cuba) were used to adapt immunodiffusion and immunoelectron microscopy to the study of iron-oxidizing bacteria. The possibilities, advantages and limits of these techniques were studied from both the identification and the serological characterization points of view. Finally, the efficiency of these methods was tested by applying them to the identification of microorganisms from acidic mine waters.

  13. New integrated and multiscale decision-aiding framework in a context of imperfect information: application to the assessment of torrent checkdams' effectiveness.

    NASA Astrophysics Data System (ADS)

    Tacnet, Jean-Marc; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille

    2017-04-01

Mountain natural phenomena (e.g. torrential floods) put people and buildings at risk. Civil engineering protection works such as torrent check-dams are designed to mitigate those natural risks. Protection works act on both the causes and the effects of phenomena to reduce consequences and therefore risks. For instance, check-dams control sediment production and the liquid/solid flow of torrential floods: several series of dams are located in the headwaters of a watershed, each having specific functions. All these works are damaged by the passage of time and by flood impacts. Effectiveness assessment is needed to define, compare or choose strategies for investment and maintenance, which are essential issues in the risk management process. Decision support tools are expected to analyze, at different scales, both their technical effectiveness (related to their structural state and functional effects on phenomena such as stopping, braking or guiding flows) and their economic efficiency through comparison between benefits and costs. Several methods, often based on expert knowledge, have already been developed to address decision-making under risk. But uncertainty also has to be considered, since decisions are often taken in a context of lack of information and knowledge about natural phenomena, heterogeneity of available information and, finally, variable reliability of sources. First, methods derived from classical industrial contexts, such as dependability analysis, are used to formalize the expert knowledge used for decision-making. After defining the concept of effectiveness, dependability analysis is used to identify decision contexts and problems: criteria and indicators are identified in relation to structural or functional features. Then, innovative multi-scale multi-criteria decision-making methods (MCDMs) and frameworks are proposed to help assess the effectiveness of protection works. They combine classical MCDM approaches with belief function, fuzzy set and possibility theories. These methods allow decisions to be made on the basis of heterogeneous, imprecise and uncertain evaluations of criteria provided by more or less reliable sources in an uncertain context: COWA-ER (Cautious Ordered Weighted Averaging with Evidential Reasoning), Fuzzy-Cautious OWA and ER-MCDA (Evidential Reasoning for Multi-Criteria Decision Analysis) are thus applied at several scales of torrent check-dams' effectiveness assessment. These methods are then improved for better knowledge representation and final decisions, and the enhanced methods are associated together. Finally, the individual problems and associated methods are integrated into a generic methodology to move from effectiveness assessment of a single torrential protective measure to complete protection systems at the watershed scale.

  14. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions, and meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses can be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method, based on the Apriori algorithm, that can derive brain network connectivity models from activation information in the literature without requiring ROIs. The method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to the corresponding brain areas using the automated anatomical labeling (AAL) atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is computed with the Apriori algorithm. The present study used this method to conduct a mining analysis of the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful for finding brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for corresponding dynamic causal modeling analyses, possibly achieving a significant dimension-reducing effect and thus increasing the efficiency of dynamic causal modeling analysis.
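    A toy sketch of the core Apriori idea as applied here: each study is treated as a "transaction" of activated regions, and region pairs whose co-activation support clears a threshold become candidate edges. The region names, studies, and support cutoff are invented.

    ```python
    # Frequent co-activation pairs across studies (Apriori-style support count).
    from itertools import combinations

    studies = [
        {"IFG", "STG", "MTG"},
        {"IFG", "STG"},
        {"IFG", "MTG", "SMA"},
        {"STG", "MTG"},
    ]
    min_support = 0.5  # pair must co-activate in at least half the studies

    counts = {}
    for regions in studies:
        for pair in combinations(sorted(regions), 2):
            counts[pair] = counts.get(pair, 0) + 1

    frequent = {p: c / len(studies) for p, c in counts.items()
                if c / len(studies) >= min_support}
    print(frequent)  # candidate edges of the connectivity model
    ```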

  15. A SVM-based quantitative fMRI method for resting-state functional network detection.

    PubMed

    Song, Xiaomu; Chen, Nan-kuei

    2014-09-01

Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in the connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variations in data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, the resting-state network mapping is formulated as an outlier detection process that is implemented using a one-class support vector machine (SVM). The results are refined by using a spatial-feature-domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected and unconnected, instead of using a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data, and a comparison study was performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies. Copyright © 2014 Elsevier Inc. All rights reserved.
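    A minimal sketch of the outlier-detection formulation, assuming each voxel is summarized by a small feature vector: a one-class SVM is fit so that the minority of functionally connected voxels is flagged as outliers. The features are synthetic, and the paper's prototype-selection and reclassification refinements are omitted.

    ```python
    # One-class SVM: flag the minority "connected" voxels as outliers.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(2)
    background = rng.normal(0, 1, size=(980, 5))   # unconnected voxel features
    connected = rng.normal(3, 1, size=(20, 5))     # connected voxels (outliers)
    features = np.vstack([background, connected])

    ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(features)
    labels = ocsvm.predict(features)  # +1 = background, -1 = network candidate
    print((labels == -1).sum(), "voxels flagged as network candidates")
    ```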

  16. Influence of preconsolidation on consolidation quality after stamp forming of C/PEEK composites

    NASA Astrophysics Data System (ADS)

    Slange, T. K.; Warnet, L.; Grouve, W. J. B.; Akkerman, R.

    2016-10-01

Stamp forming is a rapid manufacturing technology used to shape flat blanks of thermoplastic composite material into three-dimensional components. Currently, expensive autoclave and press consolidation are used to preconsolidate blanks. This study investigates the influence of preconsolidation on final consolidation quality after stamp forming and explores the potential of alternative blank manufacturing methods that could reduce part costs. Blanks were manufactured using various blank manufacturing methods and were subsequently stamp formed. The consolidation quality before and after stamp forming was compared, with void content as the main measure of consolidation quality. The void content was characterized through thickness and density measurements, as well as by microscopy analysis. Results indicate that preconsolidation quality does influence the final consolidation quality, owing to severe deconsolidation and only limited reconsolidation during stamp forming. Nevertheless, the potential of automated fiber placement and ultrasonic spot welding as alternative blank manufacturing methods was demonstrated.

  17. Methodological Issues in Questionnaire Design.

    PubMed

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed, and the triangulation method in questionnaire development was introduced. Recommended steps for designing questions include appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated, and cultural adaptation performed, prior to administration when using an existing questionnaire. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.

  18. Subunit mass analysis for monitoring antibody oxidation.

    PubMed

    Sokolowska, Izabela; Mo, Jingjie; Dong, Jia; Lewis, Michael J; Hu, Ping

    2017-04-01

    Methionine oxidation is a common posttranslational modification (PTM) of monoclonal antibodies (mAbs). Oxidation can reduce the in-vivo half-life, efficacy and stability of the product. Peptide mapping is commonly used to monitor the levels of oxidation, but this is a relatively time-consuming method. A high-throughput, automated subunit mass analysis method was developed to monitor antibody methionine oxidation. In this method, samples were treated with IdeS, EndoS and dithiothreitol to generate three individual IgG subunits (light chain, Fd' and single chain Fc). These subunits were analyzed by reversed phase-ultra performance liquid chromatography coupled with an online quadrupole time-of-flight mass spectrometer and the levels of oxidation on each subunit were quantitated based on the deconvoluted mass spectra using the UNIFI software. The oxidation results obtained by subunit mass analysis correlated well with the results obtained by peptide mapping. Method qualification demonstrated that this subunit method had excellent repeatability and intermediate precision. In addition, UNIFI software used in this application allows automated data acquisition and processing, which makes this method suitable for high-throughput process monitoring and product characterization. Finally, subunit mass analysis revealed the different patterns of Fc methionine oxidation induced by chemical and photo stress, which makes it attractive for investigating the root cause of oxidation.

  19. [Research progress on mechanical performance evaluation of artificial intervertebral disc].

    PubMed

    Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang

    2018-03-01

The mechanical properties of artificial intervertebral discs (AID) are related to the long-term reliability of the prosthesis. Three testing approaches, based on different tools, are involved in the mechanical performance evaluation of AID: testing with a mechanical simulator, in vitro specimen testing, and finite element analysis. In this study, the testing standards, testing equipment and materials for AID are first introduced. Then the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device push-out tests, core push-out tests, subsidence tests, etc. is reviewed. The experimental techniques of in vitro specimen testing and the test results for available artificial discs are summarized, as are the experimental methods and research status of finite element analysis. Finally, research trends in AID mechanical performance evaluation are forecast. The simulator, load, dynamic cycle, motion mode, specimen and test standard will be important research fields in the future.

  20. An inverse method for determining the spatially resolved properties of viscoelastic–viscoplastic three-dimensional printed materials

    PubMed Central

    Chen, X.; Ashcroft, I. A.; Wildman, R. D.; Tuck, C. J.

    2015-01-01

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic–viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic–viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance. PMID:26730216

  1. An inverse method for determining the spatially resolved properties of viscoelastic-viscoplastic three-dimensional printed materials.

    PubMed

    Chen, X; Ashcroft, I A; Wildman, R D; Tuck, C J

    2015-11-08

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic-viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic-viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance.

  2. Designing Real-Time Systems in Ada (Trademark).

    DTIC Science & Technology

    1986-01-01

(Final Report) Real-Time Systems in Ada. Abstract: Real-time software differs from other kinds of software in the sense that it… Contents excerpt: 1.2.2 Functional Focus; 1.3 Role of Ada in Real-Time Systems Design; 1.4 Scope of This…; Models of Real-Time Systems: 8.1 Requirements for Temporal Behavior Analysis; 8.2 Methods of Temporal Behavior Analysis; 8.3 …

  3. Anatomising proton NMR spectra with pure shift 2D J-spectroscopy: A cautionary tale

    NASA Astrophysics Data System (ADS)

    Kiraly, Peter; Foroozandeh, Mohammadali; Nilsson, Mathias; Morris, Gareth A.

    2017-09-01

    Analysis of proton NMR spectra has been a key tool in structure determination for over 60 years. A classic tool is 2D J-spectroscopy, but common problems are the difficulty of obtaining the absorption mode lineshapes needed for accurate results, and the need for a 45° shear of the final 2D spectrum. A novel 2D NMR method is reported here that allows straightforward determination of homonuclear couplings, using a modified version of the PSYCHE method to suppress couplings in the direct dimension. The method illustrates the need for care when combining pure shift data acquisition with multiple pulse methods.

  4. Untangling Autophagy Measurements: All Fluxed Up

    PubMed Central

    Gottlieb, Roberta A.; Andres, Allen M.; Sin, Jon; Taylor, David

    2015-01-01

Autophagy is an important physiological process in the heart, and alterations in autophagic activity can exacerbate or mitigate injury during various pathological processes. Methods to assess autophagy have changed rapidly as the field of research has expanded. As with any new field, methods and standards for data analysis and interpretation evolve as investigators acquire experience and insight. The purpose of this review is to summarize current methods to measure autophagy, selective mitochondrial autophagy (mitophagy), and autophagic flux. We examine several published studies where confusion arose in data interpretation, in order to illustrate the challenges. Finally, we discuss methods to assess autophagy in vivo and in patients. PMID:25634973

  5. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is to add manufacturing constraints into design optimization. The introduction of manufacturing constraints complicates the optimization process; however, designs that respect manufacturing constraints benefit the manufacturing process and reduce the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process; the durability (fatigue) analysis employs a strength-based model, and the design is subject to stiffness, frequency and durability constraints. Finally, the impact of manufacturing uncertainty on rotor blade aeroelastic behavior is investigated, and a probabilistic design method is proposed to control the impact of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  6. Error Propagation Made Easy--Or at Least Easier

    ERIC Educational Resources Information Center

    Gardenier, George H.; Gui, Feng; Demas, James N.

    2011-01-01

    Complex error propagation is reduced to formula and data entry into a Mathcad worksheet or an Excel spreadsheet. The Mathcad routine uses both symbolic calculus analysis and Monte Carlo methods to propagate errors in a formula of up to four variables. Graphical output is used to clarify the contributions to the final error of each of the…

  7. A novel method for the determination of sildenafil (Viagra(R)) and its metabolite (UK-103,320) in postmortem specimens using LC/MS/MS and LC/MS/MS/MS : final report.

    DOT National Transportation Integrated Search

    2000-05-01

During the investigation of aviation accidents, postmortem samples from victims are submitted to the FAA's Civil Aeromedical Institute for drug analysis. Because new drugs are continually being released to the market, it is our laboratory's respo...

  8. Analysis Of The IJCNN 2011 UTL Challenge

    DTIC Science & Technology

    2012-01-13

…large datasets from various application domains: handwriting recognition, image recognition, video processing, text processing, and ecology. The goal… The validation and final evaluation sets consist of 4096 examples each. Dataset table excerpt (Dataset, Domain, Features, Sparsity, Devel., Transf.): AVICENNA, handwriting, 120 features, 0% sparsity, 150205… documents [3]. Transfer learning methods could accelerate the application of handwriting recognizers to historical manuscripts by reducing the need for

  9. Development and Evaluation of an Analytical Method for the Determination of Total Atmospheric Mercury. Final Report.

    ERIC Educational Resources Information Center

    Chase, D. L.; And Others

Total mercury in ambient air can be collected in iodine monochloride, but the subsequent analysis is relatively complex and tedious, and contamination from reagents and containers is a problem. A silver wool collector, preceded by a catalytic pyrolysis furnace, gives good recovery of mercury and simplifies the analytical step. An instrumental…

  10. A Profile Examination System for Physician Extenders and a Method for Curricular Prescription. Final Report.

    ERIC Educational Resources Information Center

    Smull, Ned W.; And Others

    The basic purposes of the Profile Examination for Physician Extenders (PEPE) project included: (1) development of a computerized test item bank from which Profile Examinations could be generated, (2) review and analysis of curricula for the allied health groups, and (3) assessment of the reliability and validity of the Profile Examinations. The…

  11. Safe Preparation of HCl and DCl for IR Spectroscopy

    ERIC Educational Resources Information Center

    Furlong, William R.; Grubbs, W. Tandy

    2005-01-01

The widely used method of synthesizing HCl and DCl gases for infrared analysis by hydrolysis of benzoyl chloride includes a potentially dangerous final step whereby the frozen product is allowed to warm and expand into an infrared gas cell. The subsequent rapid rise in vapor pressure can "pop" open glass joints in the vacuum line and vent the…

  12. SPREADSHEET METHOD FOR EVALUATION OF BIOCHEMICAL REACTION RATE COEFFICIENTS AND THEIR UNCERTAINTIES BY WEIGHTED NONLINEAR LEAST-SQUARES ANALYSIS OF THE INTEGRATED MONOD EQUATION. (R825689C036)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  13. The feasibility of solar energy usage on Red River Army Depot. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, G.W.

    This feasibility study considers the usage of solar energy to heat and cool the main office buildings on the Red River Army Depot, Texarkana Texas. Solar energy costs are compared with the present heating and cooling system costs with an economic analysis using the annual worth and present worth methods. (GRA)

  14. Status and analysis of test standard for on-board charger

    NASA Astrophysics Data System (ADS)

    Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan

    2018-05-01

This paper analyzes the test standards for on-board chargers (OBCs). In the process of testing, we found several problems in test methods and functional status, such as failure to follow the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.

  15. Statistics-based email communication security behavior recognition

    NASA Astrophysics Data System (ADS)

    Yi, Junkai; Su, Yueyang; Zhao, Xianghui

    2017-08-01

With the development of information technology, e-mail has become a popular communication medium, and it is of great significance to determine the relationship between the two sides of a communication. Firstly, this paper analyses and processes the content and attachments of e-mail using steganalysis and malware analysis techniques. It then performs feature extraction and establishes a behaviour model based on naive Bayesian theory. A behaviour analysis method is then employed to calculate and evaluate the communication security. Finally, experiments on the accuracy of identifying communication behaviour relationships were carried out. The results show that the method is effective, with a correctness of eighty-four percent.
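    A hedged sketch of the naive Bayesian behaviour model: features derived from content and attachment analysis feed a naive Bayes classifier that scores a message as benign or suspicious. The feature set and data are invented for illustration.

    ```python
    # Naive Bayes sketch for e-mail behaviour scoring (synthetic features).
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(3)
    # features: e.g. [steganalysis score, malware-analysis score, attachment ratio]
    X = np.vstack([rng.normal(0.2, 0.10, (100, 3)),   # benign traffic
                   rng.normal(0.7, 0.15, (30, 3))])   # suspicious traffic
    y = np.array([0] * 100 + [1] * 30)

    model = GaussianNB().fit(X, y)
    print(model.predict_proba([[0.65, 0.8, 0.5]]))  # security score for one e-mail
    ```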

  16. High frequency vibration analysis by the complex envelope vectorization.

    PubMed

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

The complex envelope displacement analysis (CEDA) is a procedure for solving high frequency vibration and vibro-acoustic problems that provides the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content, and it has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, outlining the merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity is presented.

  17. Multistability and instability analysis of recurrent neural networks with time-varying delays.

    PubMed

    Zhang, Fanghai; Zeng, Zhigang

    2018-01-01

This paper provides new theoretical results on the multistability and instability analysis of recurrent neural networks with time-varying delays. It is shown that such n-neuronal recurrent neural networks have exactly [Formula: see text] equilibria, [Formula: see text] of which are locally exponentially stable and the others unstable, where k0 is a nonnegative integer such that k0 ≤ n. By using the combination method of two different divisions, recurrent neural networks can possess more dynamic properties. This method improves and extends the existing results in the literature. Finally, one numerical example is provided to show the superiority and effectiveness of the presented results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Extended GTST-MLD for aerospace system safety analysis.

    PubMed

    Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo

    2012-06-01

The hazards caused by complex interactions in the aerospace system have become a problem that urgently needs to be addressed. This article introduces a method for identifying aerospace system hazard interactions during the design stage based on extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the system-interaction description capability of GTST-MLD by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.

  19. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used in the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  20. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine

Through assay analysis into an excess of 1 M H 2SO 4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8 – 13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well-characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO 3, HCl and Na 2CO 3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity at the most intensely absorbing band, with the least impact of temperature.

  1. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE PAGES

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine; ...

    2017-07-18

Through assay analysis into an excess of 1 M H 2SO 4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8 – 13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well-characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO 3, HCl and Na 2CO 3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity at the most intensely absorbing band, with the least impact of temperature.

  2. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-FI technique, is presented in detail.

  3. Bridges for Pedestrians with Random Parameters using the Stochastic Finite Elements Analysis

    NASA Astrophysics Data System (ADS)

    Szafran, J.; Kamiński, M.

    2017-02-01

The main aim of this paper is to present a Stochastic Finite Element Method analysis of the principal design parameters of bridges for pedestrians: the eigenfrequency and the deflection of the bridge span. They are considered with respect to the random thickness of plates in the box-section bridge platform, the Young's modulus of the structural steel, and the static load resulting from a crowd of pedestrians. The influence of the quality of the numerical model in the context of traditional FEM is also shown, using the example of a simple steel shield. Steel structures with random parameters are discretized in exactly the same way as for traditional Finite Element Method analysis. The probabilistic version is provided by the Response Function Method, in which several numerical tests with random parameter values varying around their mean values enable determination of the structural response and, via the Least Squares Method, its final probabilistic moments.
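    A minimal sketch of the Response Function Method idea, under stated assumptions: a closed-form placeholder stands in for the FEM solve, a polynomial response function is fitted from a few runs around the mean parameter value, and the probabilistic moments are estimated here by sampling the fitted polynomial rather than by the paper's least-squares moment formulas.

    ```python
    # Response Function Method sketch: fit response vs. a random parameter,
    # then recover probabilistic moments of the response.
    import numpy as np

    def fem_deflection(E):             # placeholder for a deterministic FEM run
        return 1.0e9 / E               # deflection decreasing with Young's modulus

    E_mean, E_std = 210e9, 10e9        # random Young's modulus (Pa), assumed Gaussian
    z_trials = np.linspace(-2, 2, 9)   # standardized tests around the mean
    u_trials = np.array([fem_deflection(E_mean + E_std * z) for z in z_trials])

    poly = np.polyfit(z_trials, u_trials, deg=3)   # fitted response function u(z)
    z = np.random.default_rng(4).standard_normal(100_000)
    u = np.polyval(poly, z)
    print(u.mean(), u.std())           # first two probabilistic moments of deflection
    ```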

  4. A computationally efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Maughmer, Mark D.

    1988-01-01

The goal of this research is to accurately predict the characteristics of the laminar separation bubble and its effects on airfoil performance. To this end, a model of the bubble is under development and will be incorporated in the analysis section of the Eppler and Somers program. As a first step in this direction, an existing bubble model was inserted into the program. First, it was decided to address the problem of the short bubble before attempting the prediction of the long bubble. Second, an integral boundary-layer method is believed to be more desirable than a finite-difference approach: while the two achieve similar prediction accuracy, finite-difference methods tend to involve significantly longer computer run times than integral methods. Finally, as the boundary-layer analysis in the Eppler and Somers program employs the momentum and kinetic energy integral equations, a short-bubble model compatible with these equations is most preferable.

  5. Multiplexed LC-MS/MS analysis of horse plasma proteins to study doping in sport.

    PubMed

    Barton, Chris; Beck, Paul; Kay, Richard; Teale, Phil; Roberts, Jane

    2009-06-01

The development of protein biomarkers for the indirect detection of doping in horses is a potential solution to doping threats such as gene and protein doping. A method for biomarker candidate discovery in horse plasma is presented, using targeted analysis of proteotypic peptides from horse proteins. These peptides were first identified in a novel list of the abundant proteins in horse plasma. To monitor these peptides, an LC-MS/MS method using multiple reaction monitoring was developed to quantify 49 proteins in horse plasma in a single run. The method was optimised and validated, then applied to a population of racehorses to study protein variance within a population. The method was finally applied to longitudinal time courses of horse plasma collected after administration of an anabolic steroid, to demonstrate utility for hypothesis-driven discovery of doping biomarker candidates.

  6. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    NASA Technical Reports Server (NTRS)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when integrating new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  7. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

Due to the influence of self-absorption of major elements, the scarcity of observable spectral lines of trace elements, and the need for relative efficiency correction of the experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. In order to overcome these difficulties, the standard reference line (SRL) method combined with one-point calibration (OPC) is used to analyze six elements in three stainless steel and five heat-resistant steel samples. The Stark broadening and the Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with the OPC method, and the intercept with the OPC method. The final results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.

  8. Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.

    PubMed

    Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana

    2017-07-01

Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction in the retinal image is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel is extracted from the color retinal image and used to produce a Gabor feature image using the GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step in order to highlight blood vessels. Next, the two vessel-enhanced images are transformed to binary images using automatic thresholding before being combined to produce the final vessel output. Combining the images results in a significant improvement in blood vessel extraction performance compared with using either image individually. The effectiveness of the proposed method was proven via comparative analysis with existing methods, validated using the publicly available DRIVE database.
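    A rough sketch of the pipeline on a sample fundus image: Gabor filtering of the green channel over several orientations, automatic (Otsu) thresholding of both images, and a combination of the two binary maps. The image source, filter frequency, and orientation count are illustrative, not the paper's settings.

    ```python
    # Gabor enhancement + Otsu thresholding of the green channel (sketch).
    import numpy as np
    from skimage import data, filters

    image = data.retina()                       # sample RGB fundus image in skimage
    green = image[::4, ::4, 1].astype(float) / 255.0   # downsampled green channel

    responses = []
    for theta in np.linspace(0, np.pi, 8, endpoint=False):
        real, _ = filters.gabor(green, frequency=0.2, theta=theta)
        responses.append(real)
    gabor_feature = np.max(np.abs(np.stack(responses)), axis=0)  # max over angles

    vessels_gabor = gabor_feature > filters.threshold_otsu(gabor_feature)
    vessels_green = green < filters.threshold_otsu(green)   # vessels appear dark
    vessels = vessels_gabor | vessels_green                  # combined vessel map
    ```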

  9. Fast gradient HPLC/MS separation of phenolics in green tea to monitor their degradation.

    PubMed

    Šilarová, Petra; Česlová, Lenka; Meloun, Milan

    2017-12-15

The degradation of catechins and other phenolics in green tea infusions was monitored using fast HPLC/MS separation. The final separation was performed within 2.5 min using an Ascentis Express C18 column (50 mm × 2.1 mm i.d.) packed with 2 μm porous-shell particles. Degradation was studied in relation to the water temperature (70, 80, 90 °C) and the standing time of the infusion (up to 6 h). Along with chromatographic separation, the antioxidant properties of the infusions were monitored using two spectrophotometric methods. As the green tea infusion stood, degradation of some catechins, probably to gallic acid, was observed. Finally, the influence of tea bag storage on the antioxidant properties of green tea was evaluated; rapid degradation of antioxidants after 3 weeks was observed. Principal component analysis, factor analysis and discriminant analysis were used for the statistical evaluation of the experimental data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. The Relationship Between Serum Endocan Levels With the Presence of Slow Coronary Flow: A Cross-Sectional Study.

    PubMed

    Kundi, Harun; Gok, Murat; Kiziltunc, Emrullah; Topcuoglu, Canan; Cetin, Mustafa; Cicekcioglu, Hulya; Ugurlu, Burcu; Ulusoy, Feridun Vasfi

    2017-07-01

    The aim of this study was to investigate the relationship between endocan levels and the presence of slow coronary flow (SCF). In this cross-sectional study, a total of 88 patients admitted to our hospital were included. Of these, 53 patients with SCF and 35 patients with normal coronary flow were included in the final analysis. Coronary flow rates of all patients were determined by the TIMI frame count (TFC) method. In correlation analysis, endocan levels showed a significant positive correlation with high-sensitivity C-reactive protein and corrected TFC. In multivariate logistic regression analysis, endocan levels were found to be independently associated with the presence of SCF. Finally, using a cutoff level of 2.3, the endocan level predicted the presence of SCF with a sensitivity of 77.2% and a specificity of 75.2%. In conclusion, our study showed that higher endocan levels were significantly and independently related to the presence of SCF.

  11. Determination of Shear Wave Velocity in Offshore Terengganu for Ground Response Analysis

    NASA Astrophysics Data System (ADS)

    Mazlina, M.; Liew, M. S.; Adnan, A.; Harahap, I. S. H.; Hamid, N. A.

    2018-04-01

    The amount of vibration received at any location can be analysed by conducting a ground response analysis. Although three different methods are available, one-dimensional ground response analysis has been the most widely used, and shear wave velocity is one of its key input parameters. Many correlations have been formulated to estimate shear wave velocity from cone penetration test results. In this study, correlations developed for the Quaternary geological age were selected: six equations were adopted, comprising both all-soil and soil-type-dependent correlations. Two platform sites, one consisting of clay and one of a combination of clay and sand, were analysed, and the shear wave velocity to be used in ground response analysis was obtained. Results are illustrated in graphs in which the shear wave velocity for each case is plotted. To avoid under- or over-predicting the shear wave velocity, the average of the all-soil and soil-type-dependent results is used as the final Vs value.
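
    The final averaging is simple enough to sketch. The power-law coefficients below are placeholders, not the six published correlations adopted in the study:

    ```python
    import numpy as np

    # Hypothetical placeholder correlations of the form Vs = a * qc**b
    # (qc in kPa, Vs in m/s) -- NOT the six published equations of the study.
    ALL_SOIL = [(9.0, 0.44), (12.0, 0.40)]
    CLAY = [(3.2, 0.55), (4.5, 0.50)]
    SAND = [(13.0, 0.38), (17.0, 0.33)]

    def final_vs(qc_kpa, soil_type="clay"):
        """Average the all-soil and soil-type-dependent estimates, as the
        study does, to avoid under- or over-predicting Vs."""
        specific = CLAY if soil_type == "clay" else SAND
        estimates = [a * qc_kpa ** b for a, b in ALL_SOIL + specific]
        return float(np.mean(estimates))

    print(final_vs(1500.0, "clay"))  # Vs estimate for one CPT reading
    ```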

  12. Key Spatial Relations-based Focused Crawling (KSRs-FC) for Borderlands Situation Analysis

    NASA Astrophysics Data System (ADS)

    Hou, D. Y.; Wu, H.; Chen, J.; Li, R.

    2013-11-01

    Place names play an important role in borderlands-situation topics, yet current focused crawling methods treat them in the same way as other common keywords, which may lead to the omission of many useful web pages. In this paper, place names in web pages and their spatial relations are first discussed. A focused crawling method named KSRs-FC is then proposed for collecting situation information about borderlands. In this method, place names and common keywords are represented separately, and the spatial relations relevant to web-page crawling are used in the relevance calculation between the given topic and web pages. Furthermore, an information collection system for borderlands situation analysis was developed based on KSRs-FC. Finally, the F-score was adopted to quantitatively evaluate the method against a traditional approach. Experimental results showed that the F-score of the proposed method increased by 11% compared with the traditional method on the same sample data, indicating that KSRs-FC can effectively reduce the misjudgment of relevant web pages.

  13. An image analysis of TLC patterns for quality control of saffron based on soil salinity effect: A strategy for data (pre)-processing.

    PubMed

    Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar

    2018-01-15

    The quality of saffron, a valuable food additive, can considerably affect consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin-layer chromatographic (TLC) patterns is introduced. It comprises a series of image pre-processing steps applied to the TLC images: compression, inversion, elimination of the general baseline using asymmetric least squares (AsLS), removal of spot shift and concavity by correlation optimized warping (COW), and finally conversion to RGB chromatograms. Subsequently, unsupervised multivariate data analysis, including principal component analysis (PCA) and k-means clustering, was used to investigate the effect of soil salinity, as a cultivation parameter, on the saffron TLC patterns. The method serves as a rapid and simple technique for obtaining chemical fingerprints from saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography with diode array detection (HPLC-DAD). On this basis, saffron from different areas of Iran was evaluated and classified by quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
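
    Among the listed pre-processing steps, the AsLS baseline removal is the most algorithmic. Below is the standard asymmetric least squares smoother (iteratively reweighted penalized least squares); the smoothness λ and asymmetry p are typical textbook values, not those used in the paper:

    ```python
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import spsolve

    def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
        """Asymmetric least squares baseline of a 1-D signal y."""
        n = len(y)
        D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))  # 2nd differences
        w = np.ones(n)
        z = y
        for _ in range(n_iter):
            W = sparse.diags(w)
            z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
            w = np.where(y > z, p, 1.0 - p)   # asymmetric weights: points above
        return z                              # the fit are treated as peaks

    # usage: corrected_chromatogram = y - asls_baseline(y)
    ```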

  14. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep-quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform and IIR filters process the signals and identify significant variations. Once these changes have been identified, all significant data are compared, and a quantitative and statistical analysis is carried out to determine a person's level of rest. The statistical analysis showed correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.

  15. Therapeutic change in interaction: conversation analysis of a transforming sequence.

    PubMed

    Voutilainen, Liisa; Perakyla, Anssi; Ruusuvuori, Johanna

    2011-05-01

    A process of change within a single case of cognitive-constructivist therapy is analyzed by means of conversation analysis (CA). The focus is on a process of change in the sequences of interaction, which consist of the therapist's conclusion and the patient's response to it. In the conclusions, the therapist investigates and challenges the patient's tendency to transform her feelings of disappointment and anger into self-blame. Over the course of the therapy, the patient's responses to these conclusions are recast: from the patient first rejecting the conclusion, to then being ambivalent, and finally to agreeing with the therapist. On the basis of this case study, we suggest that an analysis that focuses on sequences of talk that are interactionally similar offers a sensitive method to investigate the manifestation of therapeutic change. It is suggested that this line of research can complement assimilation analysis and other methods of analyzing changes in a client's talk.

  16. LHCb trigger streams optimization

    NASA Astrophysics Data System (ADS)

    Derkach, D.; Kazeev, N.; Neychev, R.; Panin, A.; Trofimov, I.; Ustyuzhanin, A.; Vesterinen, M.

    2017-10-01

    The LHCb experiment stores around 10^11 collision events per year. A typical physics analysis deals with a final sample of up to 10^7 events. Event preselection algorithms (lines) are used for data reduction. Since the data are stored in a format that requires sequential access, the lines are grouped into several output file streams in order to increase the efficiency of user analysis jobs that read these data. The efficiency of the scheme heavily depends on the stream composition: by putting similar lines together and balancing the stream sizes, it is possible to reduce the overhead. We present a method for finding an optimal stream composition. The method is applied to a part of the LHCb data (Turbo stream) at the stage where it is prepared for user physics analysis. This results in an expected improvement of 15% in the speed of user analysis jobs, and will be applied to data recorded in 2017.
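
    As a toy illustration of the objective, grouping lines with overlapping event sets while capping stream sizes, consider the greedy heuristic below; the collaboration's actual optimizer and cost model are more sophisticated than this sketch:

    ```python
    # Toy greedy stream composition: each line is modelled as the set of
    # event IDs it selects; names and sizes here are made up.
    def group_lines(lines, n_streams, max_events_per_stream):
        """lines: dict name -> set of event IDs; returns a list of streams."""
        streams = [{"lines": [], "events": set()} for _ in range(n_streams)]
        # Seed with the largest lines first
        for name, events in sorted(lines.items(), key=lambda kv: -len(kv[1])):
            fitting = [s for s in streams
                       if len(s["events"] | events) <= max_events_per_stream]
            # Prefer the stream that grows least, i.e. the most similar one
            best = min(fitting or streams, key=lambda s: len(events - s["events"]))
            best["lines"].append(name)
            best["events"] |= events
        return streams

    demo = {"line_a": {1, 2, 3}, "line_b": {2, 3, 4}, "line_c": {7, 8, 9}}
    for s in group_lines(demo, n_streams=2, max_events_per_stream=5):
        print(s["lines"], len(s["events"]))
    ```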

  17. Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.

    1998-01-01

    This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well-suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.

  18. Graduate Attributes for Master's Programs in Health Services and Policy Research: Results of a National Consultation

    PubMed Central

    Morgan, Steve; Orr, Karen; Mah, Catherine

    2010-01-01

    Objective: Our objective was to identify desirable attributes to be developed through graduate training in health services and policy research (HSPR) by identifying the knowledge, skills and abilities thought to be keys to success in HSPR-related careers. We aimed for a framework clear enough to serve as a touchstone for HSPR training programs across Canada yet flexible enough to permit diversity of specialization across and within those programs. Methods: Our approach involved several stages of data collection and analysis: a review of literature; telephone interviews with opinion leaders; online surveys of HSPR students, recent graduates and employers; an invitational workshop; and an interactive panel at a national conference. Our final framework was arrived at through an iterative process of thematic analysis, reflection on invited feedback from consultation participants and triangulation with existing competency frameworks. Results: Our final result was a framework that identifies traits, knowledge and abilities of master's-level graduates who are capable of fostering health system improvement through planning, management, analysis or monitoring that is informed by credible evidence and relevant theory. These attributes are organized into three levels: generic graduate attributes, knowledge related to health and health systems and, finally, attributes related to the application of knowledge for health system improvement. The HSPR-specific attributes include not only an understanding of HSPR theories and methods but also the skills related to the practical application of knowledge in the complex environments of health system decision-making and healthcare policy. Conclusion: Master's-level HSPR training programs should prepare students to pose and seek answers to important questions and provide them with the skills necessary to apply their knowledge within complex decision-making environments. PMID:21804839

  19. CT scan range estimation using multiple body parts detection: let PACS learn the CT image content.

    PubMed

    Wang, Chunliang; Lundström, Claes

    2016-02-01

    The aim of this study was to develop an efficient CT scan range estimation method based on analysis of the image data itself rather than of metadata, making it possible to quantitatively compare the scan ranges of two studies. In our method, 3D stacks are first projected to 2D coronal images via a ray-casting-like process. Trained 2D body-part classifiers are then used to recognize different body parts in the projected image, and the detected candidate regions go through a structure-grouping process to eliminate false-positive detections. Finally, the scale and position of the patient relative to a standard human figure are estimated from the detected body parts via structural voting, and the start and end lines of the CT scan are projected onto this figure. The position readout is normalized so that the bottom of the feet represents 0.0 and the top of the head 1.0. Classifiers for 18 body parts were trained using 184 CT scans. The final application was tested on 136 randomly selected heterogeneous CT scans, with ground truth generated by two human observers marking the start and end positions of each scan on the standard human figure. Compared with the human observers, the mean absolute error of the proposed method is 1.2% (max: 3.5%) for the start position and 1.6% (max: 5.4%) for the end position. We proposed a scan range estimation method using multiple body-part detection and relative structure position analysis; in our preliminary tests it delivered promising results.
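
    The final mapping step can be illustrated with ordinary least squares in place of the paper's structural voting: given detected body parts and their (here hypothetical) canonical positions on the standard figure, fit the patient's scale and offset and project the first and last image rows onto the 0.0-1.0 body axis:

    ```python
    import numpy as np

    # Hypothetical canonical body-part positions on the standard figure
    # (0.0 = bottom of feet, 1.0 = top of head); values are illustrative.
    CANONICAL = {"head": 0.95, "chest": 0.72, "pelvis": 0.55, "knee": 0.28}

    def estimate_scan_range(detections, image_height):
        """detections: part -> detected row (pixels from the image top).
        Fit row = a * position + b by least squares, then map the first
        and last image rows back to normalized body positions."""
        parts = [p for p in detections if p in CANONICAL]
        x = np.array([CANONICAL[p] for p in parts])
        y = np.array([float(detections[p]) for p in parts])
        A = np.vstack([x, np.ones_like(x)]).T
        (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        to_body = lambda row: (row - b) / a
        return to_body(image_height - 1), to_body(0)   # (start, end)

    print(estimate_scan_range({"head": 40, "chest": 260, "pelvis": 430}, 512))
    ```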

  20. Large-scale-system effectiveness analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Foster, J.W.

    1979-11-01

    The objective of the research project was to investigate and develop methods for calculating system reliability indices that have absolute, measurable significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization that includes the economic consequences of consumer service interruptions. A further area of investigation was the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance were developed, and the application of modern Monte Carlo simulation methods to compute reliability indices in generating systems was studied.

  1. The method of projected characteristics for the evolution of magnetic arches

    NASA Technical Reports Server (NTRS)

    Nakagawa, Y.; Hu, Y. Q.; Wu, S. T.

    1987-01-01

    A numerical method for solving the fully nonlinear MHD equations is described. In particular, a formulation based on the newly developed method of projected characteristics (Nakagawa, 1981), suitable for studying the evolution of magnetic arches due to motions of their foot-points, is presented. The final formulation is given in the form of difference equations; therefore, an analysis of numerical stability is also presented. Furthermore, the derivation of physically self-consistent, time-dependent boundary conditions (i.e., the evolving boundary equations), which is of central importance, is given in detail, and some results obtained with such boundary equations are reported.

  2. Augmenting the one-shot framework by additional constraints

    DOE PAGES

    Bosse, Torsten

    2016-05-12

    The (multistep) one-shot method for design optimization problems has been successfully implemented for various applications. To this end, a slowly convergent primal fixed-point iteration of the state equation is augmented by an adjoint iteration and a corresponding preconditioned design update. In this paper we present a modification of the method that allows for additional equality constraints besides the usual state equation. Finally, a retardation analysis and the local convergence of the method in terms of necessary and sufficient conditions are given, which depend on key characteristics of the underlying problem and the quality of the utilized preconditioner.

  4. Determination of vitamin C in foods: current state of method validation.

    PubMed

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality-control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities during the period 2000-2014. The main characteristics of vitamin C are described, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of the different analytical methods are discussed. Finally, the main aspects of method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.

  5. Analysis and optimization of gyrokinetic toroidal simulations on homogenous and heterogenous platforms

    DOE PAGES

    Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...

    2013-07-18

    The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU- and GPU-based architectures. Finally, our optimized hybrid parallel implementation of GTC, which uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems and scales efficiently to tens of thousands of cores.

  6. Comparison of Analysis, Simulation, and Measurement of Wire-to-Wire Crosstalk. Part 1

    NASA Technical Reports Server (NTRS)

    Bradley, Arthur T.; Yavoich, Brian James; Hodson, Shame M.; Godley, Richard Franklin

    2010-01-01

    In this investigation, we compare crosstalk analysis, simulation, and measurement results for electrically short configurations. Methods include hand calculations, PSPICE simulations, Microstripes transient field solver, and empirical measurement. In total, four representative physical configurations are examined, including a single wire over a ground plane, a twisted pair over a ground plane, generator plus receptor wires inside a cylindrical conduit, and a single receptor wire inside a cylindrical conduit. Part 1 addresses the first two cases, and Part 2 addresses the final two. Agreement between the analysis, simulation, and test data is shown to be very good.

  7. Exploratory Mediation Analysis via Regularization

    PubMed Central

    Serang, Sarfaraz; Jacobucci, Ross; Brimhall, Kim C.; Grimm, Kevin J.

    2017-01-01

    Exploratory mediation analysis refers to a class of methods used to identify a set of potential mediators of a process of interest. Although the goal is exploratory, conventional approaches are rooted in confirmatory traditions and as such have limitations in exploratory contexts. We propose a two-stage approach called exploratory mediation analysis via regularization (XMed) to better address these concerns. We demonstrate that this approach correctly identifies mediators more often than conventional approaches and that its estimates are unbiased. Finally, the approach is illustrated through an empirical example examining the relationship between college acceptance and enrollment. PMID:29225454
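
    A minimal sketch of a two-stage regularized mediator screen in the spirit of XMed (not the authors' exact estimator): stage 1 keeps mediators whose both paths survive a lasso penalty, stage 2 refits the selected set without penalty to remove shrinkage bias. Data and penalties are synthetic:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso, LinearRegression

    rng = np.random.default_rng(0)
    n, p = 500, 30
    x = rng.normal(size=n)                      # exposure (e.g., acceptance proxy)
    M = np.outer(x, np.r_[0.8, 0.6, np.zeros(p - 2)]) + rng.normal(size=(n, p))
    y = 0.3 * x + 0.5 * M[:, 0] + rng.normal(size=n)   # only mediator 0 is active

    # Stage 1: penalized paths a_j (x -> M_j) and b_j (M_j -> y | x)
    a = np.array([Lasso(alpha=0.1).fit(x[:, None], M[:, j]).coef_[0]
                  for j in range(p)])
    b = Lasso(alpha=0.1).fit(np.column_stack([x, M]), y).coef_[1:]
    selected = np.flatnonzero((a != 0) & (b != 0))  # nonzero indirect path a_j*b_j

    # Stage 2: unpenalized refit on the selected set for unbiased estimates
    refit = LinearRegression().fit(np.column_stack([x, M[:, selected]]), y)
    print(selected, refit.coef_)
    ```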

  8. 75 FR 13 - Alternate Fracture Toughness Requirements for Protection Against Pressurized Thermal Shock Events

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ...The Nuclear Regulatory Commission (NRC) is amending its regulations to provide alternate fracture toughness requirements for protection against pressurized thermal shock (PTS) events for pressurized water reactor (PWR) pressure vessels. This final rule provides alternate PTS requirements based on updated analysis methods. This action is desirable because the existing requirements are based on unnecessarily conservative probabilistic fracture mechanics analyses. This action reduces regulatory burden for those PWR licensees who expect to exceed the existing requirements before the expiration of their licenses, while maintaining adequate safety, and may choose to comply with the final rule as an alternative to complying with the existing requirements.

  9. A New Method for Nonlinear and Nonstationary Time Series Analysis and Its Application to the Earthquake and Building Response Records

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.

    1999-01-01

    A new method for analyzing nonlinear and nonstationary data has been developed. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same number of zero-crossings and extrema, and symmetric envelopes defined by the local maxima and minima, respectively. An IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and therefore highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the Intrinsic Mode Functions yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the results is an energy-frequency-time distribution, designated the Hilbert Spectrum. An example of applying this method to earthquake and building response records is given. The results indicate that low-frequency components totally missed by Fourier analysis are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show that the new method offers much better temporal and frequency resolution.
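
    The core of the decomposition, the sifting loop that isolates one IMF, can be sketched directly from the definition (spline envelopes through the local maxima and minima, subtract their mean, iterate). The stopping criteria and end-effect handling of full EMD implementations are omitted here:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import argrelextrema

    def sift_imf(x, t, n_sift=10):
        """Extract one IMF by sifting (simplified: fixed iteration count)."""
        h = x.copy()
        for _ in range(n_sift):
            maxima = argrelextrema(h, np.greater)[0]
            minima = argrelextrema(h, np.less)[0]
            if len(maxima) < 4 or len(minima) < 4:
                break
            upper = CubicSpline(t[maxima], h[maxima])(t)   # upper envelope
            lower = CubicSpline(t[minima], h[minima])(t)   # lower envelope
            h = h - (upper + lower) / 2.0                  # remove local mean
        return h

    t = np.linspace(0.0, 1.0, 1000)
    x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
    imf1 = sift_imf(x, t)      # fastest oscillation comes out first
    residue = x - imf1         # sift the residue again for the next IMF
    ```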

  10. Robust gene selection methods using weighting schemes for microarray data analysis.

    PubMed

    Kang, Suyeon; Song, Jongwoo

    2017-09-02

    A common task in microarray data analysis is to identify informative genes that are differentially expressed between two different states. Owing to the high-dimensional nature of microarray data, identification of significant genes has been essential in analyzing the data. However, the performance of many gene selection techniques is highly dependent on the experimental conditions, such as the presence of measurement error or a limited number of sample replicates. We propose new filter-based gene selection techniques obtained by applying a simple modification to significance analysis of microarrays (SAM). To demonstrate the effectiveness of the proposed methods, we considered a series of synthetic datasets with different noise levels and sample sizes, along with two real datasets. The following findings were made. First, our proposed methods outperform conventional methods for all simulation set-ups; in particular, they are much better when the given data are noisy and the sample size is small. They showed relatively robust performance regardless of noise level and sample size, whereas the performance of SAM became significantly worse as the noise level increased or the sample size decreased. When sufficient sample replicates were available, SAM and our methods showed similar performance. Finally, our proposed methods are competitive with traditional methods in classification tasks for microarrays. The results of the simulation study and the real data analysis demonstrate that our proposed methods are effective for detecting significant genes and for classification tasks, especially when the given data are noisy or have few sample replicates. By employing weighting schemes, we can obtain robust and reliable results for microarray data analysis.

  11. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
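
    A schematic version of the sub-model idea, assuming scikit-learn's PLS and made-up composition ranges; the actual ChemCam calibration uses its own range definitions and blending rules:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def train_submodels(X, y, edges=(0.0, 25.0, 50.0, 100.0), n_comp=3):
        """Full-range model plus one PLS sub-model per composition range
        (edges in the same units as y, e.g. wt%; values are illustrative)."""
        full = PLSRegression(n_components=n_comp).fit(X, y)
        subs = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (y >= lo) & (y <= hi)          # assumes each subset is large enough
            subs.append(((lo, hi), PLSRegression(n_components=n_comp).fit(X[m], y[m])))
        return full, subs

    def predict_blended(full, subs, X):
        first_pass = full.predict(X).ravel()   # full-range estimate picks the blend
        out = np.empty_like(first_pass)
        for i, fp in enumerate(first_pass):
            ws, ps = [], []
            for (lo, hi), model in subs:
                centre, width = (lo + hi) / 2.0, (hi - lo) / 2.0
                ws.append(np.exp(-((fp - centre) / width) ** 2))  # simple stand-in
                ps.append(model.predict(X[i:i + 1]).ravel()[0])   # for the blending
            out[i] = np.average(ps, weights=ws)
        return out
    ```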

  12. A developed nearly analytic discrete method for forward modeling in the frequency domain

    NASA Astrophysics Data System (ADS)

    Liu, Shaolin; Lang, Chao; Yang, Hui; Wang, Wenshuai

    2018-02-01

    High-efficiency forward modeling methods play a fundamental role in full waveform inversion (FWI). In this paper, the developed nearly analytic discrete (DNAD) method is proposed to accelerate frequency-domain forward modeling. We first derive the discretization of the frequency-domain wave equations via numerical schemes based on the nearly analytic discrete (NAD) method to obtain a linear system. The coefficients of the numerical stencils are optimized to make the linear system easier to solve and to minimize computing time. Wavefield simulation and numerical dispersion analysis are performed to compare the numerical behavior of the DNAD method with that of the conventional NAD method, and the results demonstrate the superiority of the proposed method. Finally, the DNAD method is implemented in frequency-domain FWI, and high-resolution inversion results are obtained.

  13. A method for improving reliability and relevance of LCA reviews: the case of life-cycle greenhouse gas emissions of tap and bottled water.

    PubMed

    Fantin, Valentina; Scalbi, Simona; Ottaviano, Giuseppe; Masoni, Paolo

    2014-04-01

    The purpose of this study is to propose a method for harmonising Life Cycle Assessment (LCA) literature studies on the same product, or on different products fulfilling the same function, to allow a reliable and meaningful comparison of their life-cycle environmental impacts. The method is divided into six main steps, which aim to rationalise and speed up the comparison: 1) clear definition of the goal and scope of the review; 2) critical review of the references; 3) identification of the significant parameters to be harmonised; 4) harmonisation of the parameters; 5) statistical analysis to support the comparison; 6) results and discussion. This approach was then applied to a comparative analysis of published LCA studies on tap and bottled water production, focussing on Global Warming Potential (GWP) results, with the aim of identifying the environmentally preferable alternative. A statistical analysis with Wilcoxon's test confirmed that the difference between the harmonised GWP values of tap and bottled water was significant. The comparison of the harmonised mean GWP results showed that tap water always has the best environmental performance, even with high energy-consuming technologies for drinking water treatment. The strength of the method is that it enables both a deep analysis of the LCA literature and more consistent comparisons across published LCAs; for these reasons, it can be a valuable tool providing useful information for both practitioners and decision makers. Finally, its application to the case study made it possible both to describe the variability of the systems and to evaluate the importance of several key parameters of tap and bottled water production. A comparative review of LCA studies that includes a statistical decision test can validate and strengthen the final statements of the comparison. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Computer-assisted uncertainty assessment of k0-NAA measurement results

    NASA Astrophysics Data System (ADS)

    Bučar, T.; Smodiš, B.

    2008-10-01

    In quantifying the measurement uncertainty of results obtained by k0-based neutron activation analysis (k0-NAA), a number of parameters must be considered and appropriately combined in deriving the final uncertainty budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result—the mass fraction of an element in the measured sample—taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd-subtraction method). The results of the calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable, allowing its incorporation into other applications (e.g., as a DLL or a WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.
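
    ERON propagates uncertainty analytically from the k0 formulae; a generic numerical stand-in for the same first-order (GUM-style) combination is easy to sketch. The model function below is a hypothetical toy, not the k0-NAA expression:

    ```python
    import numpy as np

    def combined_uncertainty(f, values, uncertainties, h=1e-6):
        """u_c^2 = sum_i (df/dx_i * u_i)^2, with partial derivatives
        estimated by central differences; input correlations neglected."""
        values = np.asarray(values, dtype=float)
        u2 = 0.0
        for i, u_i in enumerate(uncertainties):
            step = np.zeros_like(values)
            step[i] = h * max(abs(values[i]), 1.0)
            dfdx = (f(values + step) - f(values - step)) / (2 * step[i])
            u2 += (dfdx * u_i) ** 2
        return np.sqrt(u2)

    # Toy model: result proportional to peak area / (efficiency * k0)
    f = lambda v: v[0] / (v[1] * v[2])
    print(combined_uncertainty(f, [1000.0, 0.02, 1.1], [10.0, 0.0005, 0.02]))
    ```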

  15. Sentiment analysis of Chinese microblogging based on sentiment ontology: a case study of `7.23 Wenzhou Train Collision'

    NASA Astrophysics Data System (ADS)

    Shi, Wei; Wang, Hongwei; He, Shaoyi

    2013-12-01

    Sentiment analysis of microblogging texts can support both organisations' public-opinion monitoring and governments' development of response strategies. Nevertheless, most existing analysis methods target Twitter, so sentiment analysis of Chinese microblogging (Weibo) is lacking, and they generally rely on large amounts of manually annotated training data or machine learning to perform sentiment classification, which makes them difficult to apply. This paper addresses these problems and employs a sentiment ontology model for sentiment analysis of Chinese microblogging. We analyse all public microblogging posts about the '7.23 Wenzhou Train Collision' broadcast by Sina microblogging users between 23 July and 1 August 2011. For every day in this period, we first extract eight dimensions of sentiment (expectation, joy, love, surprise, anxiety, sorrow, anger, and hate); we then build a fuzzy sentiment ontology based on HowNet and semantic similarity for sentiment analysis; we also establish methods for computing the influence and sentiment of microblogging texts; and we finally explore the change in public sentiment after the '7.23 Wenzhou Train Collision'. The results show that the established sentiment analysis method works well in practice and that changes in the different emotional values can reflect the success or failure of the government's guidance of public opinion.

  16. Localization of optic disc and fovea in retinal images using intensity based line scanning analysis.

    PubMed

    Kamble, Ravi; Kokare, Manesh; Deshmukh, Girish; Hussin, Fawnizu Azmadi; Mériaudeau, Fabrice

    2017-08-01

    Accurate detection of diabetic retinopathy (DR) depends mainly on the identification of retinal landmarks such as the optic disc and fovea. Existing methods suffer from limited accuracy and high computational complexity. To address this issue, this paper presents a novel approach for fast and accurate localization of the optic disc (OD) and fovea using one-dimensional scanned intensity profile analysis. The proposed method effectively utilizes both time- and frequency-domain information for localization of the OD. The final OD center is located using signal peak-valley detection in the time domain and discontinuity detection in the frequency domain. With the help of the detected OD location, the fovea center is then located using signal valley analysis. Experiments were conducted on the MESSIDOR dataset, where the OD was successfully located in 1197 of 1200 images (99.75%) and the fovea in 1196 of 1200 images (99.66%), with an average computation time of 0.52 s. A large-scale evaluation was also carried out on nine publicly available databases. Compared with other state-of-the-art methods, the proposed method is highly efficient at quickly and accurately localizing the OD and fovea together. Copyright © 2017 Elsevier Ltd. All rights reserved.
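
    The time-domain half of the OD search, finding the dominant peak of a one-dimensional intensity profile, can be sketched as follows (the paper additionally uses frequency-domain discontinuity detection and a valley search for the fovea; the window and separation values here are placeholders):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def locate_od_column(green, window=15, min_separation=50):
        """Column-wise mean intensity of the green channel: the optic disc
        appears as the strongest bright peak along the horizontal axis."""
        profile = green.astype(float).mean(axis=0)      # one value per column
        smooth = np.convolve(profile, np.ones(window) / window, mode="same")
        peaks, props = find_peaks(smooth, distance=min_separation,
                                  prominence=smooth.std())
        if len(peaks) == 0:
            return int(np.argmax(smooth))               # fall back to global max
        return int(peaks[np.argmax(props["prominences"])])
    ```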

  17. Analysis of polycyclic aromatic hydrocarbons extracted from air particulate matter using a temperature programmable injector coupled to GC-C-IRMS.

    PubMed

    Mikolajczuk, Agnieszka; Przyk, Elzbieta Perez; Geypens, Benny; Berglund, Michael; Taylor, Philip

    2010-03-01

    Compound-specific isotopic analysis (CSIA) can provide information about the origin of analysed compounds - in this case, polycyclic aromatic hydrocarbons (PAHs). In this study, PAHs were extracted from three dust samples: winter and summer filter dust and tunnel dust. The measurements were performed using a method validated in our laboratory with pure solid compounds and the EPA 610 reference mixture. CSIA required an appropriate clean-up method to avoid the unresolved complex mixture usually found in the gas chromatography of PAHs, and extensive sample clean-up for this particular matrix proved necessary to obtain good gas chromatography-combustion-isotope ratio mass spectrometry results. The sample purification method included two steps, in which the sample is cleaned up and the aliphatic and aromatic hydrocarbons are separated. The concentration of PAHs in the measured samples was low, so a large-volume injection technique (100 μl) was applied. The δ13C (relative to VPDB) was measured with a final uncertainty smaller than 1‰. Comparison of the δ13C signatures of PAHs extracted from the different dust samples was feasible with this method, and significant differences were observed.

  18. Singularity analysis based on wavelet transform of fractal measures for identifying geochemical anomaly in mineral exploration

    NASA Astrophysics Data System (ADS)

    Chen, Guoxiong; Cheng, Qiuming

    2016-02-01

    Multi-resolution and scale invariance have been increasingly recognized as two closely related intrinsic properties of geofields such as geochemical and geophysical anomalies, and they are commonly investigated using multiscale- and scaling-analysis methods. In this paper, a wavelet-based multiscale decomposition (WMD) method is proposed to investigate the multiscale nature of geochemical patterns from large to small scales. In light of the wavelet transformation of fractal measures, we demonstrate that the wavelet approximation operator provides a generalization of the box-counting method for scaling analysis of geochemical patterns; specifically, the approximation coefficient acts as the generalized density value in density-area fractal modeling of singular geochemical distributions. Accordingly, we present a novel local singularity analysis (LSA) using the WMD algorithm, which extends conventional moving averaging to a kernel-based operator for implementing LSA. Finally, the novel LSA was validated in a case study of geochemical data (Fe2O3) in stream sediments for mineral exploration in Inner Mongolia, China. Compared with LSA implemented using moving averaging, the novel WMD-based LSA identified improved weak geochemical anomalies associated with mineralization in the covered area.
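
    A moving-average version of local singularity analysis (the baseline that the paper improves on by replacing the windows with wavelet approximation coefficients) can be sketched as follows:

    ```python
    import numpy as np

    def local_singularity(grid, radii=(1, 2, 4, 8)):
        """Estimate the singularity exponent alpha per cell from the scaling
        mu(r) ~ c * r**(alpha - 2): regress log(mean value in a window of
        half-width r) on log(r); alpha = slope + 2 (2 = map dimension)."""
        grid = grid.astype(float)
        rows, cols = grid.shape
        layers = []
        for r in radii:
            p = np.pad(grid, r, mode="edge")
            win = np.empty_like(grid)
            for i in range(rows):
                for j in range(cols):
                    win[i, j] = p[i:i + 2 * r + 1, j:j + 2 * r + 1].mean()
            layers.append(np.log(win + 1e-12))
        log_mu = np.stack(layers).reshape(len(radii), -1)
        slope = np.polyfit(np.log(np.asarray(radii, float)), log_mu, 1)[0]
        return slope.reshape(grid.shape) + 2.0   # alpha < 2 flags enrichment
    ```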

  19. SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.

    PubMed

    Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan

    2017-09-01

    With the rapid development of big data, multi-subject functional magnetic resonance imaging (fMRI) data analysis is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues, and considering that the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation-based GICA with reference (SCGICAR) method that takes advantage of prior information extracted from the group subjects; a multi-objective optimization strategy is then used to implement the method. Finally, post-processing by principal component analysis and anti-reconstruction is used to obtain the group spatial component and the individual temporal components, respectively. The experimental results show that the proposed SCGICAR method outperforms classical methods on both single-subject and multi-subject fMRI data analysis: it not only detects more accurate spatial and temporal components for each subject of the group but also obtains a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method better reflects what the subjects in the group have in common. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Compressive strength of human openwedges: a selection method

    NASA Astrophysics Data System (ADS)

    Follet, H.; Gotteland, M.; Bardonnet, R.; Sfarghiu, A. M.; Peyrot, J.; Rumelhart, C.

    2004-02-01

    A series of 44 samples of bone wedges of human origin, intended for allograft openwedge osteotomy and obtained without particular precautions during hip arthroplasty, was re-examined. After viral-inactivation chemical treatment, lyophilisation and radio-sterilisation (intended to produce optimal health safety), the compressive strength, independent of age, sex and the height of the sample (or angle of cut), proved to be too widely dispersed [10-158 MPa] in the first study. We propose a method for selecting samples which takes into account their geometry (width, length, thicknesses, cortical surface area). Statistical methods (principal components analysis (PCA), hierarchical cluster analysis, multilinear regression) allowed the final selection of 29 samples having a mean compressive strength σmax = 103 ± 26 MPa and a range of [61-158 MPa]. These results are equivalent to or greater than those of materials currently used in openwedge osteotomy.

  1. Mechanical performance and parameter sensitivity analysis of 3D braided composites joints.

    PubMed

    Wu, Yue; Nan, Bo; Chen, Liang

    2014-01-01

    3D braided composite joints are important components in CFRP trusses and have a significant influence on the reliability and lightness of structures. To investigate the mechanical performance of 3D braided composite joints, a numerical method based on microscopic mechanics is put forward; the modeling technologies, including the selection of material constants, element type, grid size, and boundary conditions, are discussed in detail. Secondly, a method for determining the ultimate bearing capacity is established which accounts for strength failure. Finally, the effects of load parameters, geometric parameters, and process parameters on the ultimate bearing capacity of the joints are analyzed by a global sensitivity analysis method. The results show that the ultimate bearing capacity N is most sensitive to the main pipe diameter-to-thickness ratio γ, the main pipe diameter D, and the braiding angle α.

  2. Recent developments in nickel electrode analysis

    NASA Technical Reports Server (NTRS)

    Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.

    1991-01-01

    Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.

  3. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840

  5. #fitspo on Instagram: A mixed-methods approach using Netlytic and photo analysis, uncovering the online discussion and author/image characteristics.

    PubMed

    Santarossa, Sara; Coyne, Paige; Lisinski, Carly; Woodruff, Sarah J

    2016-11-01

    The #fitspo 'tag' is a recent trend on Instagram, used on posts to motivate others towards a healthy lifestyle through exercise and eating habits. This study used a mixed-methods approach consisting of text and network analysis via the Netlytic program (N = 10,000 #fitspo posts) and content analysis of #fitspo images (N = 122) to examine author and image characteristics. Results suggest that #fitspo posts may motivate through appearance-mediated themes, as the largest content categories (based on the associated text) were 'feeling good' and 'appearance'. Furthermore, #fitspo posts may create peer influence/support, as personal (as opposed to non-personal) accounts were associated with higher popularity of images (i.e. number of likes/followers). Finally, most images contained posed individuals with some degree of objectification.

  6. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased for reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for this bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different normalization strategies are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis affects the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation between technical replicates the most in all examined data sets; Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization also performed systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
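
    The paper's main yardstick, reduction of variation between technical replicates, is easy to reproduce for simple normalizers. Below is a minimal sketch assuming log-scale intensities and a plain median normalization (Vsn itself is a more involved affine/arcsinh model, not implemented here):

    ```python
    import numpy as np

    def median_normalize(X):
        """Columns = samples: shift each sample to a common median (log scale)."""
        med = np.nanmedian(X, axis=0)
        return X - med + med.mean()

    def replicate_variance(X, groups):
        """Mean within-group variance across technical-replicate groups."""
        return np.mean([np.nanvar(X[:, g], axis=1).mean() for g in groups])

    rng = np.random.default_rng(1)
    X = rng.normal(20, 2, size=(1000, 6)) + rng.normal(0, 0.5, size=6)  # per-sample bias
    groups = [[0, 1, 2], [3, 4, 5]]
    print(replicate_variance(X, groups),                 # before normalization
          replicate_variance(median_normalize(X), groups))  # after: lower is better
    ```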

  7. Separation and preconcentration of the rare-earth elements and yttrium from geological materials by ion-exchange and sequential acid elution

    USGS Publications Warehouse

    Crock, J.G.; Lichte, F.E.; Riddle, G.O.; Beech, C.L.

    1986-01-01

    The abundance of rare-earth elements (REE) and yttrium in geological materials is generally low, and most samples contain elements that interfere in the determination of the REE and Y, so a separation and/or preconcentration step is often necessary. This is often achieved by ion-exchange chromatography with either nitric or hydrochloric acid. It is advantageous, however, to use both acids sequentially. The final solution thus obtained contains only the REE and Y, with minor amounts of Al, Ba, Ca, Sc, Sr and Ti. Elements that potentially interfere, such as Be, Co, Cr, Fe, Mn, Th, U, V and Zr, are virtually eliminated. Inductively-coupled argon plasma atomic-emission spectroscopy can then be used for a final precise and accurate measurement. The method can also be used with other instrumental methods of analysis. © 1986.

  8. Risk assessment of failure modes of gas diffuser liner of V94.2 siemens gas turbine by FMEA method

    NASA Astrophysics Data System (ADS)

    Mirzaei Rafsanjani, H.; Rezaei Nasab, A.

    2012-05-01

    Failure of the welded connection between the gas diffuser liner and the exhaust casing is one of the failure modes of V94.2 gas turbines observed in some power plants. This defect is one of the uncertainties customers face when deciding whether to accept the final commissioning of the product. Accordingly, the risk priority of this failure was evaluated by the failure modes and effects analysis (FMEA) method to determine whether the failure is catastrophic for turbine performance and harmful to humans. Using the service history of 110 gas turbines of this model operating in several power plants, the severity, occurrence and detection numbers of the failure were determined, and consequently its Risk Priority Number (RPN). Finally, a criticality matrix of the potential failures was created, which showed that the failure modes are located in the safe zone.
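
    The FMEA ranking itself reduces to the product RPN = S × O × D of the three ratings. A small illustration with made-up ratings (not the study's actual numbers for the V94.2):

    ```python
    # RPN bookkeeping for FMEA: severity x occurrence x detection,
    # each rated on a 1-10 scale; the ratings below are illustrative.
    failure_modes = {
        "gas diffuser liner weld crack": {"S": 7, "O": 4, "D": 5},
        "exhaust casing bolt loosening": {"S": 5, "O": 2, "D": 3},
    }

    def rpn(r):
        return r["S"] * r["O"] * r["D"]

    for name, r in sorted(failure_modes.items(), key=lambda kv: -rpn(kv[1])):
        flag = "  <- prioritize" if rpn(r) >= 100 else ""
        print(f"{name}: RPN = {rpn(r)}{flag}")
    ```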

  9. Parametric study on single shot peening by dimensional analysis method incorporated with finite element method

    NASA Astrophysics Data System (ADS)

    Wu, Xian-Qian; Wang, Xi; Wei, Yan-Peng; Song, Hong-Wei; Huang, Chen-Guang

    2012-06-01

    Shot peening is a widely used surface treatment that generates compressive residual stress near the surface of metallic materials to increase fatigue life and resistance to corrosion fatigue, cracking, etc. The compressive residual stress and the dent profile are important factors in evaluating the effectiveness of the shot peening process. In this paper, the influence of dimensionless parameters on the maximum compressive residual stress and the maximum depth of the dent was investigated. Firstly, dimensionless relations between the processing parameters and the maximum compressive residual stress and maximum dent depth were deduced by the dimensional analysis method. Secondly, the influence of each dimensionless parameter on the dimensionless variables was investigated by the finite element method, and related empirical formulas were derived for each dimensionless parameter from the simulation results. Finally, a comparison showed good agreement between the simulation results and the empirical formulas, indicating that this paper provides a useful approach for analyzing the influence of each individual parameter.

  10. Development, validation and determination of multiclass pesticide residues in cocoa beans using gas chromatography and liquid chromatography tandem mass spectrometry.

    PubMed

    Zainudin, Badrul Hisyam; Salleh, Salsazali; Mohamed, Rahmat; Yap, Ken Choy; Muhamad, Halimah

    2015-04-01

    An efficient and rapid method for the analysis of pesticide residues in cocoa beans using gas and liquid chromatography-tandem mass spectrometry was developed, validated and applied to imported and domestic cocoa bean samples collected over two years from smallholders and Malaysian ports. The method is based on solvent extraction and covers 26 pesticides (insecticides, fungicides, and herbicides) of different chemical classes. The recoveries for all pesticides at 10 and 50 μg/kg were in the range of 70-120%, with relative standard deviations of less than 20%. Good selectivity and sensitivity were obtained, with a method limit of quantification of 10 μg/kg. The expanded measurement uncertainties were in the range of 4-25%. Finally, the proposed method was successfully applied to the routine analysis of pesticide residues in cocoa beans in a monitoring study, in which 10% of the samples were found positive for chlorpyrifos, ametryn or metalaxyl. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.

    PubMed

    Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan

    2018-06-05

    Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene-understanding algorithms have been based on whole point clouds, which leads to complicated operations, high computational load and low processing speed. This paper presents novel methods to efficiently extract the location of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted openings. Moreover, a space subdivision method based on the extracted openings and the trajectory of the scanning system is described. Finally, the opening detection and space subdivision results are saved as point cloud labels for further investigation. The method has been tested on a real dataset collected by ZEB-REVO, and the experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.

  12. Error and Complexity Analysis for a Collocation-Grid-Projection Plus Precorrected-FFT Algorithm for Solving Potential Integral Equations with LaPlace or Helmholtz Kernels

    NASA Technical Reports Server (NTRS)

    Phillips, J. R.

    1996-01-01

    In this paper we derive error bounds for a collocation-grid-projection scheme tuned for use in multilevel methods for solving boundary-element discretizations of potential integral equations. The grid-projection scheme is then combined with a precorrected-FFT style multilevel method for solving potential integral equations with 1/r and e^{ikr}/r kernels. A complexity analysis of this combined method is given to show that for homogeneous problems the method is O(n log n), nearly independently of the kernel. In addition, it is shown analytically and experimentally that for an inhomogeneity generated by a very finely discretized surface, the combined method slows to O(n^{4/3}). Finally, examples are given to show that the collocation-based grid-projection plus precorrected-FFT scheme is competitive with fast-multipole algorithms when considering realistic problems and 1/r kernels, but can be used over a range of spatial frequencies with only a small performance penalty.

  13. Combustor kinetic energy efficiency analysis of the hypersonic research engine data

    NASA Astrophysics Data System (ADS)

    Hoose, K. V.

    1993-11-01

    A one-dimensional method for measuring combustor performance is needed to facilitate the design and development of scramjet engines. A one-dimensional kinetic energy efficiency method is used for measuring inlet and nozzle performance. The objective of this investigation was to assess the use of kinetic energy efficiency as an indicator of scramjet combustor performance. A combustor kinetic energy efficiency analysis was performed on the Hypersonic Research Engine (HRE) data. The HRE data were chosen for this analysis due to their thorough documentation and availability. The combustor, inlet, and nozzle kinetic energy efficiency values were utilized to determine an overall engine kinetic energy efficiency. Finally, a kinetic energy effectiveness method was developed to eliminate thermochemical losses from the combustion of fuel and air. All calculated values exhibit consistency over the flight speed range. Effects of fuel injection, altitude, angle of attack, the subsonic-supersonic combustion transition, and inlet spike position are shown and discussed. The results of analyzing the HRE data indicate that the kinetic energy efficiency method is effective as a measure of scramjet combustor performance.

  14. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis

    PubMed Central

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks that currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
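
    As a rough illustration of the grey relational step in this kind of evaluation, the sketch below computes grey relational grades for a few hypothetical EGR operating points. The indicator values, the equal weighting, and the distinguishing coefficient rho = 0.5 are illustrative assumptions, not values from the paper:

        import numpy as np

        def grey_relational_grades(X, rho=0.5):
            """Grey relational analysis: score each alternative (row of X)
            against the ideal reference; rho is the distinguishing coefficient."""
            X = np.asarray(X, dtype=float)
            # Normalize each indicator (column) to [0, 1]; larger-is-better
            # is assumed here for simplicity.
            X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
            reference = X.max(axis=0)              # ideal alternative
            delta = np.abs(X - reference)          # deviation sequences
            dmin, dmax = delta.min(), delta.max()
            xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
            return xi.mean(axis=1)                 # equal-weight relational grades

        # Hypothetical operating points x indicators (e.g. NOx, soot, BSFC scores).
        data = [[0.82, 0.40, 0.91],
                [0.75, 0.55, 0.88],
                [0.90, 0.35, 0.79]]
        print(grey_relational_grades(data))  # higher grade = closer to ideal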

  15. Analysis of Classes of Superlinear Semipositone Problems with Nonlinear Boundary Conditions

    NASA Astrophysics Data System (ADS)

    Morris, Quinn A.

    We study positive radial solutions for classes of steady state reaction diffusion problems on the exterior of a ball with both Dirichlet and nonlinear boundary conditions. We consider p-Laplacian problems (p > 1) with reaction terms which are superlinear at infinity and semipositone. In the case p = 2, using variational methods, we establish the existence of a solution, and via detailed analysis of the Green's function, we prove the positivity of the solution. In the case p ≠ 2, we again use variational methods to establish the existence of a solution, but the positivity of the solution is achieved via sophisticated a priori estimates. In the case p ≠ 2, the Green's function analysis is no longer available. Our results significantly enhance the literature on superlinear semipositone problems. Finally, we provide algorithms for the numerical generation of exact bifurcation curves for one-dimensional problems. In the autonomous case, we extend and analyze a quadrature method, and using nonlinear solvers in Mathematica, generate bifurcation curves. In the nonautonomous case, we employ shooting methods in Mathematica to generate bifurcation curves.
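
    For the autonomous one-dimensional Dirichlet problem (the textbook p = 2 case; the paper's p-Laplacian, exterior-domain and nonautonomous versions generalize it), the quadrature method evaluates a classical time-map identity. With u'' + λf(u) = 0 on (0,1), u(0) = u(1) = 0, F the antiderivative of f with F(0) = 0, and ρ the sup-norm of the solution, the bifurcation curve λ(ρ) satisfies

        \[
          \sqrt{\lambda(\rho)} \;=\; \sqrt{2}\int_0^{\rho} \frac{du}{\sqrt{F(\rho)-F(u)}} ,
        \]

    and plotting ρ against the numerically evaluated λ(ρ) yields the exact bifurcation curve.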

  16. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    PubMed

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks that currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.

  17. The secret lives of experiments: methods reporting in the fMRI literature.

    PubMed

    Carp, Joshua

    2012-10-15

    Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Quantification of Microbial Phenotypes

    PubMed Central

    Martínez, Verónica S.; Krömer, Jens O.

    2016-01-01

    Metabolite profiling technologies have improved to generate close-to-quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and discuss the current challenges in generating fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. We explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take Gibbs energy errors into account. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
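
    The directionality constraints mentioned here rest on a standard thermodynamic relation; with Q the reaction quotient assembled from measured metabolite activities (commonly approximated by concentrations), the reaction Gibbs energy under physiological conditions is

        \[
          \Delta_r G \;=\; \Delta_r G'^{\circ} + RT \ln Q ,
        \]

    and a reaction can proceed in the forward direction only if Δ_rG < 0, which is how quantitative metabolomics data constrain flux directionality in the network.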

  19. [Analysis and experimental verification of sensitivity and SNR of laser warning receiver].

    PubMed

    Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue

    2009-01-01

    To counter the increasingly serious threat from hostile lasers in modern warfare, research on laser warning technology and systems is urgent; the sensitivity and signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on signal statistical detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. First, the probability distributions of the laser signal and the receiver noise are analyzed. Second, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation is established by introducing a detection probability factor and a false-alarm rate factor; the mathematical expressions for sensitivity and SNR are then deduced. Finally, using this method, the sensitivity and SNR of the sinusoidal-grating laser warning receiver developed by our group are analyzed; the theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in the performance analysis of LWRs.
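
    The threshold-detection logic described here can be made concrete with a small hedged example: under the common constant-signal-in-Gaussian-noise assumption, the Neyman-Pearson threshold fixed by the false-alarm rate implies a minimum SNR for a given detection probability (the specific rates below are illustrative, not from the paper):

        import math
        from scipy.stats import norm

        def required_snr(p_fa, p_d):
            """Neyman-Pearson threshold detection of a constant signal in
            additive Gaussian noise: the threshold is set by the false-alarm
            rate p_fa, and the returned value is the voltage SNR A/sigma
            needed to reach detection probability p_d."""
            return norm.isf(p_fa) - norm.isf(p_d)

        # Example: one false alarm per 1e6 decisions at 99% detection probability.
        snr = required_snr(1e-6, 0.99)
        print(f"required SNR: {snr:.2f} (ratio), {20 * math.log10(snr):.1f} dB")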

  20. All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.

    PubMed

    Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R

    2016-09-21

    We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates under ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices such as inkjet printing suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices); in general, the manufacturing yield is much lower than that of the established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows the optimization of the manufacturing process, finally resulting in a yield improvement.

  1. Problems of Mathematical Finance by Stochastic Control Methods

    NASA Astrophysics Data System (ADS)

    Stettner, Łukasz

    The purpose of this paper is to present the main ideas of the mathematics of finance using stochastic control methods. There is an interplay between stochastic control and the mathematics of finance. On the one hand, stochastic control is a powerful tool for studying financial problems. On the other hand, financial applications have stimulated development in several research subareas of stochastic control over the last two decades. We start with the pricing of financial derivatives and the modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider the pricing of defaultable contingent claims. Investments in bonds lead us to term structure modeling problems. Special attention is devoted to the historical static portfolio analysis known as Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions of the Hamilton-Jacobi-Bellman equation, the martingale-convex analysis method, or the stochastic maximum principle together with the backward stochastic differential equation. Finally, long-time portfolio analysis for both risk-neutral and risk-sensitive functionals is introduced.
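
    As a small illustration of the static Markowitz analysis mentioned here, the sketch below evaluates the classical closed-form minimum-variance weights for a target return; the expected returns and covariance matrix are made-up examples, not data from the paper:

        import numpy as np

        def markowitz_weights(mu, Sigma, target):
            """Minimize w' Sigma w subject to w' mu = target and sum(w) = 1
            (classical closed form via Lagrange multipliers)."""
            mu = np.asarray(mu, dtype=float)
            ones = np.ones_like(mu)
            Sinv = np.linalg.inv(Sigma)
            a = ones @ Sinv @ ones
            b = ones @ Sinv @ mu
            c = mu @ Sinv @ mu
            d = a * c - b * b
            lam = (c - b * target) / d
            gam = (a * target - b) / d
            return Sinv @ (lam * ones + gam * mu)

        mu = [0.08, 0.12, 0.15]                 # illustrative expected returns
        Sigma = np.array([[0.04, 0.01, 0.00],
                          [0.01, 0.09, 0.02],
                          [0.00, 0.02, 0.16]])  # illustrative covariance matrix
        w = markowitz_weights(mu, Sigma, target=0.10)
        print(w, w @ mu, w @ Sigma @ w)         # weights, return, variance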

  2. Evaluation of SAGE Electrochromic Devices: Cooperative Research and Development Final Report, CRADA Number CRD-15-579

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tenent, Robert C.

    2017-12-06

    NREL will conduct durability testing of Sage Electrochromics dynamic windows products using American Society for Testing and Materials (ASTM) standard methods and drive parameters as defined by Sage. Window units will be tested and standard analysis performed. Data will be summarized and reported back to Sage at the end of the testing period.

  3. Final Report for Dynamic Models for Causal Analysis of Panel Data. Preface.

    ERIC Educational Resources Information Center

    Hannan, Michael T.; Tuma, Nancy Brandon

    This document introduces research aimed at exploring methods that could be used to make inferences about causal effects of educational change over time when data are from an educational panel. This preface, the first in a series of 14 chapters described in SO 011 760-772, discusses an educational research project designed to examine effects of…

  4. Recovery Act Hospital Alteration Project at Naval Air Station Jacksonville

    DTIC Science & Technology

    2010-12-07

    QMAD Quantitative Methods and Analysis Division; RLF Rogers Lovelock & Fritz, Incorporated; SE Southeast; SF Square Feet; SOW Statement of Work; TMA TRICARE… Finally, the contractor, Rogers Lovelock & Fritz, Incorporated (RLF), reported the recipient information required by the Recovery Act. What We Recommend… Planning: Initially, Project

  5. Nonlinear Dynamics and Control of Flexible Structures

    DTIC Science & Technology

    1991-03-01

    of which might be used for space applications. This project was a collaborative one involving structural, electrical and mechanical engineers and...methods for vibration analysis and new models to analyze chaotic dynamics in nonlinear structures with large deformations and friction forces. Finally... electrical and mechanical engineers and resulted in nine doctoral dissertations and two masters theses wholly or partially supported by this grant

  6. Numerical study of influence of hydrogen backflow on krypton Hall effect thruster plasma focusing

    NASA Astrophysics Data System (ADS)

    Yan, Shilin; Ding, Yongjie; Wei, Liqiu; Hu, Yanlin; Li, Jie; Ning, Zhongxi; Yu, Daren

    2017-03-01

    The influence of backflow hydrogen on plasma plume focusing of a krypton Hall effect thruster is studied via a numerical simulation method. Theoretical analysis indicates that hydrogen participates in the plasma discharge process, changes the potential and ionization distribution in the thruster discharge cavity, and finally affects the plume focusing within a vacuum vessel.

  7. Costs in Serving Handicapped Children in Head Start: An Analysis of Methods and Cost Estimates. Final Report.

    ERIC Educational Resources Information Center

    Syracuse Univ., NY. Div. of Special Education and Rehabilitation.

    An evaluation of the costs of serving handicapped children in Head Start was based on information collected in conjunction with on-site visits to regular Head Start programs, experimental programs, and specially selected model preschool programs, and from questionnaires completed by 1,353 grantees and delegate agencies of regular Head Start…

  8. An effective fuzzy kernel clustering analysis approach for gene expression data.

    PubMed

    Sun, Lin; Xu, Jiucheng; Yin, Jiaojiao

    2015-01-01

    Fuzzy clustering is an important tool for analyzing microarray data. A major problem in applying fuzzy clustering methods to microarray gene expression data is the choice of parameters such as the cluster number and centers. This paper proposes a new approach to fuzzy kernel clustering analysis (FKCA) that identifies the desired cluster number and obtains more stable results for gene expression data. First, to optimize characteristic differences and estimate the optimal cluster number, a Gaussian kernel function is introduced to improve the spectrum analysis method (SAM). By combining subtractive clustering with the max-min distance mean, a maximum distance method (MDM) is proposed to determine cluster centers. The corresponding steps of the improved SAM (ISAM) and MDM are then given, and their superiority and stability are illustrated through experimental comparisons on gene expression data. Finally, by introducing ISAM and MDM into FKCA, an effective improved FKCA algorithm is proposed. Experimental results from public gene expression data and the UCI database show that the proposed algorithms are feasible for cluster analysis, and their clustering accuracy is higher than that of other related clustering algorithms.
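
    The kernel and max-min distance ideas can be sketched generically: a Gaussian kernel matrix plus a farthest-first (max-min distance) choice of cluster centers. This is not the paper's exact ISAM/MDM procedure, only the underlying building blocks, and all data below are synthetic:

        import numpy as np

        def gaussian_kernel(X, sigma=1.0):
            """Pairwise Gaussian (RBF) kernel matrix."""
            sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            return np.exp(-sq / (2 * sigma ** 2))

        def max_min_centers(X, k):
            """Farthest-first selection of k centers: start from the point
            nearest the data mean, then repeatedly add the point whose minimum
            distance to the chosen centers is largest."""
            centers = [int(np.argmin(((X - X.mean(0)) ** 2).sum(1)))]
            d = ((X - X[centers[0]]) ** 2).sum(1)
            for _ in range(k - 1):
                nxt = int(np.argmax(d))
                centers.append(nxt)
                d = np.minimum(d, ((X - X[nxt]) ** 2).sum(1))
            return X[centers]

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(2, 0.3, (30, 2))])
        K = gaussian_kernel(X)            # kernel matrix for the kernelized step
        print(K.shape, max_min_centers(X, 2))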

  9. Observing real-world groups in the virtual field: The analysis of online discussion.

    PubMed

    Giles, David C

    2016-09-01

    This article sets out to establish the naturalistic study of online social communication as a substantive topic in social psychology and to discuss the challenges of developing methods for a formal analysis of the structural and interactional features of message threads on discussion forums. I begin by outlining the essential features of online communication and specifically discussion forum data, and the important ways in which they depart from spoken conversation. I describe the handful of attempts to devise systematic analytic techniques for adapting methods such as conversation and discourse analysis to the study of online discussion. I then present a case study of a thread from the popular UK parenting forum Mumsnet which presents a number of challenges for existing methods, and examine some of the interactive phenomena typical of forums. Finally, I consider ways in which membership categorization analysis and social identity theory can complement one another in the exploration of both group processes and the rhetorical deployment of identities as dynamic phenomena in online discussion. © 2016 The British Psychological Society.

  10. Value of Construction Company and its Dependence on Significant Variables

    NASA Astrophysics Data System (ADS)

    Vítková, E.; Hromádka, V.; Ondrušková, E.

    2017-10-01

    The paper deals with the assessment of a construction company's value, with respect to the usable approaches and determinable variables. The reasons for assessing the value of a construction company differ; the most important are the sale or purchase of the company, its liquidation, or its merger with another entity, among others. Depending on the reason for the valuation, different theoretical approaches can be applied, mainly the yield method of valuation and the proprietary method of valuation. Both approaches depend on detailed input variables, whose quality influences the final assessment of the company's value. The main objective of the paper is to suggest, based on the analysis, possible ways of determining the input variables, mainly in the form of expected cash flows or profit. The paper focuses mainly on the use of time series analysis, regression analysis and mathematical simulation. Finally, the results of the analysis are demonstrated on a case study.
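
    A hedged sketch of the simulation idea discussed here: generate uncertain future cash flows and discount them to a distribution of present values. The base cash flow, growth model, discount rate and horizon are all illustrative assumptions, not figures from the paper:

        import numpy as np

        # Income-approach valuation sketch: company value as the present value
        # of simulated future cash flows (all parameters are assumed).
        rng = np.random.default_rng(1)
        years, n_sims, discount = 5, 10_000, 0.08
        base_cf, growth_mu, growth_sd = 1.2e6, 0.03, 0.05

        growth = rng.normal(growth_mu, growth_sd, size=(n_sims, years))
        cash_flows = base_cf * np.cumprod(1 + growth, axis=1)
        t = np.arange(1, years + 1)
        values = (cash_flows / (1 + discount) ** t).sum(axis=1)

        print(f"mean value {values.mean():,.0f}, 5-95% range "
              f"{np.percentile(values, 5):,.0f} - {np.percentile(values, 95):,.0f}")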

  11. Sentiments Analysis of Reviews Based on ARCNN Model

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoyu; Xu, Ming; Xu, Jian; Zheng, Ning; Yang, Tao

    2017-10-01

    Sentiment analysis of product reviews is designed to help customers understand the status of a product. Traditional sentiment analysis relies on the input of a fixed-length feature vector, which is the performance bottleneck of the basic encoder-decoder architecture. In this paper, we propose an attention mechanism with a BRNN-CNN model, referred to as the ARCNN model. To analyze the semantic relations between words well and to avoid the curse of dimensionality, we use the GloVe algorithm to train the vector representations of words. The ARCNN model is then proposed to deal with the training of deep features. Specifically, the BRNN handles non-fixed-length vectors and preserves time-series information, while the CNN learns deeper semantic connections. Moreover, the attention mechanism automatically learns from the data and optimizes the allocation of weights. Finally, a softmax classifier is designed to complete the sentiment classification of reviews. Experiments show that the proposed method improves the accuracy of sentiment classification compared with benchmark methods.

  12. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    PubMed

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

    Considering that group comparisons are common in social science, we examined two latent group mean testing methods when groups of interest were either at the between or within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how to model the intra-class group correlation (i.e., correlation between random effect factors for groups within cluster). The results of the simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.

  13. Improved sample preparation to determine acrylamide in difficult matrixes such as chocolate powder, cocoa, and coffee by liquid chromatography tandem mass spectroscopy.

    PubMed

    Delatour, Thierry; Périsset, Adrienne; Goldmann, Till; Riediker, Sonja; Stadler, Richard H

    2004-07-28

    An improved sample preparation (extraction and cleanup) is presented that enables the quantification of low levels of acrylamide in difficult matrixes, including soluble chocolate powder, cocoa, coffee, and coffee surrogate. Final analysis is done by isotope-dilution liquid chromatography-electrospray ionization tandem mass spectrometry (LC-MS/MS) using d3-acrylamide as internal standard. Sample pretreatment essentially encompasses (a) protein precipitation with Carrez I and II solutions, (b) extraction of the analyte into ethyl acetate, and (c) solid-phase extraction on a Multimode cartridge. The stability of acrylamide in final extracts and in certain commercial foods and beverages is also reported. This approach provided good performance in terms of linearity, accuracy and precision. Full validation was conducted in soluble chocolate powder, achieving a decision limit (CCalpha) and detection capability (CCbeta) of 9.2 and 12.5 microg/kg, respectively. The method was extended to the analysis of acrylamide in various foodstuffs such as mashed potatoes, crisp bread, and butter biscuit and cookies. Furthermore, the accuracy of the method is demonstrated by the results obtained in three inter-laboratory proficiency tests. Copyright 2004 American Chemical Society

  14. Diagnosis of Strongyloides stercoralis by morphological characteristics combine with molecular biological methods.

    PubMed

    Wang, Li-Fu; Xu, Lian; Luo, Shi-Qi; Xie, Hui; Chen, Wei; Wu, Zhong-Dao; Sun, Xi

    2017-04-01

    Strongyloidiasis is one of the neglected tropical diseases, caused by infection with nematodes of the genus Strongyloides and distributed worldwide. Strongyloidiasis can be fatal in immunosuppressed patients, in whom it can induce hyperinfection or disseminated strongyloidiasis. Unfortunately, owing to the nonspecific clinical symptoms in infected individuals and the low sensitivity of diagnostic methods for strongyloidiasis, many patients are misdiagnosed every year. Furthermore, the larvae of Strongyloides stercoralis (S. stercoralis) are similar to those of other nematodes such as hookworm and Trichostrongylus, which increases the difficulty of diagnosis. In this case, the patient is a 63-year-old male with a nearly 30-year medical history of asthma and emphysema and a 4-5-year medical history of diabetes. Sputum examination found some parasite larvae; we then identified the larvae using clinical observation and morphological characteristics combined with PCR and sequence analysis of the cytochrome oxidase subunit 1 (COX1) and 18S rRNA genes, and finally, after classification by phylogenetic analysis, the larvae were diagnosed as S. stercoralis. Our results show that diagnosis of strongyloidiasis by morphological characteristics combined with molecular biological methods can improve diagnostic sensitivity and provide a final diagnosis for the disease in the clinic.

  15. [Application of chemometrics in composition-activity relationship research of traditional Chinese medicine].

    PubMed

    Han, Sheng-Nan

    2014-07-01

    Chemometrics is a new branch of chemistry that is widely applied in various fields of analytical chemistry. Chemometrics uses theories and methods from mathematics, statistics, computer science and other related disciplines to optimize the chemical measurement process and to extract the maximum chemical and other information on material systems from chemical measurement data. In recent years, traditional Chinese medicine has attracted widespread attention. In the research of traditional Chinese medicine, how to interpret the relationship between the various chemical components and efficacy has been a key problem, one that seriously restricts the modernization of Chinese medicine. As chemometrics brings multivariate analysis methods into chemical research, it has been applied as an effective research tool in the composition-activity relationship research of Chinese medicine. This article reviews the applications of chemometrics methods in composition-activity relationship research in recent years. The applications of multivariate statistical analysis methods (such as regression analysis, correlation analysis and principal component analysis) and artificial neural networks (such as the back-propagation artificial neural network, the radial basis function neural network and the support vector machine) are summarized, including their fundamental principles, research content, and advantages and disadvantages. Finally, the main existing problems are discussed and prospects for future research are proposed.

  16. Comprehensive analysis of β-lactam antibiotics including penicillins, cephalosporins, and carbapenems in poultry muscle using liquid chromatography coupled to tandem mass spectrometry.

    PubMed

    Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F

    2013-09-01

    A comprehensive method for the quantitative residue analysis of trace levels of 22 β-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for β-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. Following detailed isotope-labeling and tandem mass spectrometry studies and exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all β-lactams studied. The main reaction occurring is the hydrolysis of the β-lactam ring with formation of the piperidine-substituted amide. For some β-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds, with trueness between 80 and 110% and within-laboratory reproducibility below 22% at the target level, except for biapenem. For biapenem, the method proved suitable for qualitative analysis only.

  17. Measurements of Reynolds stress profiles in unstratified tidal flow

    USGS Publications Warehouse

    Stacey, M.T.; Monismith, Stephen G.; Burau, J.R.

    1999-01-01

    In this paper we present a method for measuring profiles of turbulence quantities using a broadband acoustic doppler current profiler (ADCP). The method follows previous work on the continental shelf and extends the analysis to develop estimates of the errors associated with the estimation methods. ADCP data was collected in an unstratified channel and the results of the analysis are compared to theory. This comparison shows that the method provides an estimate of the Reynolds stresses, which is unbiased by Doppler noise, and an estimate of the turbulent kinetic energy (TKE) which is biased by an amount proportional to the Doppler noise. The noise in each of these quantities as well as the bias in the TKE match well with the theoretical values produced by the error analysis. The quantification of profiles of Reynolds stresses simultaneous with the measurement of mean velocity profiles allows for extensive analysis of the turbulence of the flow. In this paper, we examine the relation between the turbulence and the mean flow through the calculation of u*, the friction velocity, and Cd, the coefficient of drag. Finally, we calculate quantities of particular interest in turbulence modeling and analysis, the characteristic lengthscales, including a lengthscale which represents the stream-wise scale of the eddies which dominate the Reynolds stresses. Copyright 1999 by the American Geophysical Union.
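
    The beam-variance estimator underlying this approach can be sketched compactly. The beam-pair sign convention and the 20-degree beam angle below are assumptions for illustration, not a transcription of the paper's processing code:

        import numpy as np

        def reynolds_stress(b1, b2, theta_deg=20.0):
            """Variance method for a Janus-pair ADCP: <u'w'> is proportional
            to the difference of along-beam velocity variances of two opposing
            beams inclined theta_deg from vertical. b1, b2 have shape
            (n_pings, n_bins); the output sign depends on the beam-numbering
            convention. Doppler noise is common to both variances and cancels
            in the difference, which is why this stress estimate is unbiased
            while the TKE estimate is not."""
            th = np.radians(theta_deg)
            return (np.var(b1, axis=0) - np.var(b2, axis=0)) / (
                4 * np.sin(th) * np.cos(th))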

  18. Soil hydraulic properties estimate based on numerical analysis of disc infiltrometer three-dimensional infiltration curve

    NASA Astrophysics Data System (ADS)

    Latorre, Borja; Peña-Sancho, Carolina; Angulo-Jaramillo, Rafaël; Moret-Fernández, David

    2015-04-01

    Measurement of soil hydraulic properties is of paramount importance in fields such as agronomy, hydrology and soil science. Based on the analysis of the Haverkamp et al. (1994) model, the aim of this paper is to explain a technique to estimate the soil hydraulic properties (sorptivity, S, and hydraulic conductivity, K) from full-time cumulative infiltration curves. The method (NSH) was validated by means of 12 synthetic infiltration curves generated with HYDRUS-3D from known soil hydraulic properties. The K values used to simulate the synthetic curves were compared to those estimated with the proposed method. A procedure to identify and remove the effect of the contact sand layer on the cumulative infiltration curve was also developed. A sensitivity analysis was performed using the water level measurement as the uncertainty source. Finally, the procedure was evaluated using different infiltration times and data noise. Since a good correlation (R² = 0.98) was obtained between the K used in HYDRUS-3D to model the infiltration curves and that estimated by the NSH method, it can be concluded that this technique is robust enough to estimate the soil hydraulic conductivity from complete infiltration curves. The numerical procedure to detect and remove the influence of the contact sand layer on the K and S estimates proved robust and efficient. An effect of infiltration-curve noise on the K estimate was observed, whose uncertainty increased with increasing noise. Finally, the results showed that infiltration time is an important factor in estimating K: lower values of K or smaller uncertainties require longer infiltration times.
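
    A minimal sketch of fitting hydraulic parameters to a cumulative infiltration curve, using the classical two-term approximation I(t) = S·sqrt(t) + A·t rather than the full Haverkamp quasi-exact formulation analyzed in the paper; all numbers are synthetic and the link between A and K is model-dependent:

        import numpy as np
        from scipy.optimize import curve_fit

        def two_term(t, S, A):
            # S: sorptivity (m/s^0.5); A: conductivity-linked coefficient (m/s)
            return S * np.sqrt(t) + A * t

        t = np.linspace(1, 600, 60)  # s, synthetic measurement times
        I = two_term(t, 2.5e-4, 1.0e-6) \
            + np.random.default_rng(2).normal(0, 2e-5, t.size)  # noisy curve
        (S, A), _ = curve_fit(two_term, t, I, p0=(1e-4, 1e-6))
        print(f"S = {S:.3e} m/s^0.5, A = {A:.3e} m/s (K inferred from A)")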

  19. Accuracy improvement of the H-drive air-levitating wafer inspection stage based on error analysis and compensation

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Liu, Pinkuan

    2018-04-01

    In order to improve the inspection precision of the H-drive air-bearing stage for wafer inspection, in this paper the geometric error of the stage is analyzed and compensated. The relationship between the positioning errors and the error sources is initially modeled, and seven error components are identified that are closely related to the inspection accuracy. The most influential factor affecting the geometric error is identified by error sensitivity analysis. Then, the Spearman rank correlation method is applied to find the correlation between different error components, aiming at guiding the accuracy design and error compensation of the stage. Finally, different compensation methods, including the three-error curve interpolation method, the polynomial interpolation method, the Chebyshev polynomial interpolation method, and the B-spline interpolation method, are employed within the full range of the stage, and their results are compared. Simulation and experiment show that the B-spline interpolation method based on the error model gives better compensation results. In addition, the research result is valuable for improving wafer inspection accuracy and will greatly benefit the semiconductor industry.
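
    A minimal sketch of the B-spline compensation idea, assuming a measured 1-D error map over the stage travel; the error values, grid, and smoothing factor are illustrative, not the stage's actual error map:

        import numpy as np
        from scipy.interpolate import splrep, splev

        # Fit a cubic B-spline to positioning errors measured at calibration
        # points, then subtract the interpolated error from commanded positions.
        pos = np.linspace(0, 300, 31)                    # mm, calibration grid
        err = 2e-3 * np.sin(pos / 40) + 5e-4 * pos / 300 # measured error, mm
        tck = splrep(pos, err, k=3, s=1e-8)              # B-spline error model

        commanded = np.array([12.5, 87.3, 205.0])
        compensated = commanded - splev(commanded, tck)  # apply compensation
        print(compensated)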

  20. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils

    PubMed Central

    Alam, Md Ferdous

    2017-01-01

    An accurate determination of the particle-level fabric of granular soils from tomography data requires a maximally correct separation of particles. The marker-controlled watershed separation method is widely used to separate particles. However, the watershed method alone is not capable of producing the maximum separation of particles when they have been subjected to boundary stresses leading to particle crushing. In this paper, a new separation method, named the Monash Particle Separation Method (MPSM), is introduced. The new method automatically determines the optimal contrast coefficient based on a cluster evaluation framework to produce the most accurate separation outcomes. Finally, the particles which could not be separated with the optimal contrast coefficient were separated by integrating cuboid markers, generated from clustering by Gaussian mixture models, into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for fabric analysis. PMID:29057823
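
    The routine marker-controlled watershed step that the MPSM builds on can be sketched with scikit-image; distance-transform peaks are the standard marker choice, and the min_distance parameter is an illustrative assumption:

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def separate_particles(binary_volume):
            """Baseline marker-controlled watershed on a binary particle image
            (2-D or 3-D boolean array): markers come from peaks of the
            distance transform. This is the routine step the MPSM refines,
            not the MPSM itself."""
            dist = ndi.distance_transform_edt(binary_volume)
            peaks = peak_local_max(dist, labels=binary_volume, min_distance=5)
            markers = np.zeros(binary_volume.shape, dtype=int)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            return watershed(-dist, markers, mask=binary_volume)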

  1. A comparison of simple shear characterization methods for composite laminates

    NASA Technical Reports Server (NTRS)

    Yeow, Y. T.; Brinson, H. F.

    1978-01-01

    Various methods for the shear stress/strain characterization of composite laminates are examined and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the (+/- 45 deg)s tensile test method and the (0/90 deg)s symmetric rail shear test method. It is shown that the first technique indicates the shear properties of the graphite/epoxy laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that these laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress/strain curves utilizing the various different shear behaviour methods as input information are presented and discussed.

  2. A comparison of simple shear characterization methods for composite laminates

    NASA Technical Reports Server (NTRS)

    Yeow, Y. T.; Brinson, H. F.

    1977-01-01

    Various methods for the shear stress-strain characterization of composite laminates are examined, and their advantages and limitations are briefly discussed. Experimental results and the necessary accompanying analysis are then presented and compared for three simple shear characterization procedures. These are the off-axis tensile test method, the (+/- 45 deg)s tensile test method and the (0/90 deg)s symmetric rail shear test method. It is shown that the first technique indicates that the shear properties of the G/E laminates investigated are fundamentally brittle in nature while the latter two methods tend to indicate that the G/E laminates are fundamentally ductile in nature. Finally, predictions of incrementally determined tensile stress-strain curves utilizing the various different shear behavior methods as input information are presented and discussed.

  3. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for the separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example with this AQbD approach. Potential CMAs and potential CMPs were identified from the analytical target profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and it was then applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for samples with complex compositions.

  4. A novel genome signature based on inter-nucleotide distances profiles for visualization of metagenomic data

    NASA Astrophysics Data System (ADS)

    Xie, Xian-Hua; Yu, Zu-Guo; Ma, Yuan-Lin; Han, Guo-Sheng; Anh, Vo

    2017-09-01

    There has been growing interest in the visualization of metagenomic data. The present study focuses on the visualization of metagenomic data using inter-nucleotide distance profiles. We first convert the fragment sequences into inter-nucleotide distance profiles. Then we analyze these profiles by principal component analysis. Finally, the principal components are used to obtain a 2-D scatter plot according to the source species of the fragments. We name our method the inter-nucleotide distance profiles (INP) method. The method is evaluated on three benchmark data sets used in previously published papers. Our results demonstrate that the INP method is a good, efficient alternative for the visualization of metagenomic data.
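
    A simplified sketch of the pipeline described here, assuming a plain gap-histogram definition of the inter-nucleotide distance profile (the published INP construction and normalization may differ in detail; the fragments are toy examples):

        import numpy as np
        from sklearn.decomposition import PCA

        def inp_profile(seq, max_d=30):
            """Histogram, per base, of gaps between consecutive occurrences;
            concatenated over A, C, G, T into one feature vector."""
            prof = []
            for base in "ACGT":
                idx = np.array([i for i, c in enumerate(seq) if c == base])
                gaps = np.diff(idx) if idx.size > 1 else np.array([max_d])
                hist, _ = np.histogram(np.clip(gaps, 1, max_d),
                                       bins=max_d, range=(1, max_d + 1))
                prof.extend(hist / gaps.size)
            return np.array(prof)

        frags = ["ACGTACGGTACCGTAGGCTA",
                 "AAATTTCCCGGGAAATTTCC",
                 "ACACACACGTGTGTGTACAC"]
        X = np.vstack([inp_profile(s) for s in frags])
        coords = PCA(n_components=2).fit_transform(X)  # 2-D scatter coordinates
        print(coords)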

  5. Modelling and Analysis of the Excavation Phase by the Theory of Blocks Method of Tunnel 4 Kherrata Gorge, Algeria

    NASA Astrophysics Data System (ADS)

    Boukarm, Riadh; Houam, Abdelkader; Fredj, Mohammed; Boucif, Rima

    2017-12-01

    The aim of our work is to check stability during tunnel excavation work in the rock mass of Kherrata, on the route connecting the cities of Bejaia and Setif. Characterization using the Q system (Barton method) and the RMR (Bieniawski classification) allowed us to conclude that the quality of the rock mass is average in limestone and poor in fractured limestone. Modelling of the excavation phase using the block theory method (UNWEDGE software), with parameters taken from the recommendations of the classifications, then allowed us to check stability and finally to conclude that the use of geomechanical classification and block theory can be considered reliable for preliminary design.

  6. On convergence and convergence rates for Ivanov and Morozov regularization and application to some parameter identification problems in elliptic PDEs

    NASA Astrophysics Data System (ADS)

    Kaltenbacher, Barbara; Klassen, Andrej

    2018-05-01

    In this paper we provide a convergence analysis of some variational methods alternative to the classical Tikhonov regularization, namely Ivanov regularization (also called the method of quasi solutions) with some versions of the discrepancy principle for choosing the regularization parameter, and Morozov regularization (also called the method of the residuals). After motivating nonequivalence with Tikhonov regularization by means of an example, we prove well-definedness of the Ivanov and the Morozov method, convergence in the sense of regularization, as well as convergence rates under variational source conditions. Finally, we apply these results to some linear and nonlinear parameter identification problems in elliptic boundary value problems.
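
    For orientation, the three variational formulations compared in this line of work differ only in which quantity is minimized and which is constrained (A the forward operator, y^δ the noisy data; generic Hilbert-space norms are assumed here, whereas the paper allows more general functionals):

        \[
        \begin{aligned}
          \text{Tikhonov:} \quad & \min_x \; \|Ax - y^\delta\|^2 + \alpha\|x\|^2, \\
          \text{Ivanov (quasi solutions):} \quad & \min_x \; \|Ax - y^\delta\| \ \text{ s.t. } \|x\| \le \rho, \\
          \text{Morozov (residual method):} \quad & \min_x \; \|x\| \ \text{ s.t. } \|Ax - y^\delta\| \le \delta .
        \end{aligned}
        \]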

  7. A method for simultaneous linear optics and coupling correction for storage rings with turn-by-turn beam position monitor data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xi; Huang, Xiaobiao

    2016-05-13

    Here, we propose a method to simultaneously correct linear optics errors and linear coupling for storage rings using turn-by-turn (TbT) beam position monitor (BPM) data. The independent component analysis (ICA) method is used to isolate the betatron normal modes from the measured TbT BPM data. The betatron amplitudes and phase advances of the projections of the normal modes on the horizontal and vertical planes are then extracted, which, combined with dispersion measurement, are used to fit the lattice model. The fitting results are used for lattice correction. Finally, the method has been successfully demonstrated on the NSLS-II storage ring.
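
    The ICA step can be sketched with a generic FastICA implementation on synthetic turn-by-turn data; the tune, phases and noise level below are illustrative, and the ICA variant developed for accelerator diagnostics differs in detail from sklearn's FastICA:

        import numpy as np
        from sklearn.decomposition import FastICA

        # Synthetic TbT data: one betatron mode sampled at 40 BPMs plus noise.
        rng = np.random.default_rng(3)
        n_turns, n_bpm, tune = 1024, 40, 0.22
        turns = np.arange(n_turns)
        phases = rng.uniform(0, 2 * np.pi, n_bpm)      # BPM betatron phases
        amps = 1 + 0.2 * rng.standard_normal(n_bpm)    # BPM betatron amplitudes
        X = amps * np.cos(2 * np.pi * tune * turns[:, None] + phases) \
            + 0.05 * rng.standard_normal((n_turns, n_bpm))

        ica = FastICA(n_components=4, random_state=0)
        modes = ica.fit_transform(X)   # temporal modes: a cos/sin pair expected
        mixing = ica.mixing_           # per-BPM content -> amplitude and phase
        print(modes.shape, mixing.shape)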

  8. In-depth analysis and characterization of a dual damascene process with respect to different CD

    NASA Astrophysics Data System (ADS)

    Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Kim, Wan-Soo; Thrun, Xaver

    2018-03-01

    In a 200 mm high-volume environment, we studied data from a dual damascene process. Dual damascene is a combination of lithography, etch and CMP that is used to create copper lines and contacts in one single step. During these process steps, different metal CDs are measured by different measurement methods. In this study, we analyze the key figures of the different measurements after the different process steps and develop simple models to predict the electrical behavior. In addition, radial profiles of both inline measurement parameters and electrical parameters have been analyzed. A matching method was developed based on inline and electrical data. Finally, a correlation analysis for radial signatures is presented that can be used to predict excursions in electrical signatures.

  9. Research on public logistics centers of Zhenzhou city based on GIS

    NASA Astrophysics Data System (ADS)

    Zeng, Yuhuai; Chen, Shuisen; Tian, Zhihui; Miao, Quansheng

    2008-10-01

    The regional public logistics center (PLC) is the intermediary that moves goods from producers to wholesalers, retailers and end consumers along whole supply chains. According to the Central Place Theory, PLCs should be multi-centric and organized into several grades. From the road network planning discipline, a unique index, the Importance Degree, is presented to measure the capacity of a PLC. The Importance Degree uses three township criteria, total population, gross industrial product and budget income, as weights to calculate the weight vectors by the principal component analysis method. Finally, through clustering analysis, we obtain the grades of the PLCs. The results show that this research method is effective for the road network planning of Zhengzhou City.

  10. Lie symmetry analysis, conservation laws, solitary and periodic waves for a coupled Burger equation

    NASA Astrophysics Data System (ADS)

    Xu, Mei-Juan; Tian, Shou-Fu; Tu, Jian-Min; Zhang, Tian-Tian

    2017-01-01

    Under investigation in this paper is a generalized (2 + 1)-dimensional coupled Burger equation with variable coefficients, which describes many nonlinear physical phenomena in geophysical fluid dynamics, condensed matter physics and lattice dynamics. By employing the Lie group method, the symmetry reductions and exact explicit solutions are obtained, respectively. Based on a direct method, the conservation laws of the equation are also derived. Furthermore, by virtue of the Painlevé analysis, we successfully obtain the integrability condition on the variable coefficients, which plays an important role in further studying the integrability of the equation. Finally, its auto-Bäcklund transformation as well as some new analytic solutions, including solitary and periodic waves, are also presented via algebraic and differential manipulation.

  11. Morphological feature extraction for the classification of digital images of cancerous tissues.

    PubMed

    Thiran, J P; Macq, B

    1996-10-01

    This paper presents a new method for the automatic recognition of cancerous tissues from an image of a microscopic section. Based on shape and size analysis of the observed cells, this method provides the physician with nonsubjective numerical values for four criteria of malignancy. This automatic approach is based on mathematical morphology, and more specifically on the use of geodesy. This technique is used first to remove the background noise from the image, and then to segment the nuclei of the cells and to analyze their shape, size, and texture. From the values of the extracted criteria, an automatic classification of the image (cancerous or not) is finally performed.

  12. Accuracy Analysis of a Wireless Indoor Positioning System Using Geodetic Methods

    NASA Astrophysics Data System (ADS)

    Wagner, Przemysław; Woźniak, Marek; Odziemczyk, Waldemar; Pakuła, Dariusz

    2017-12-01

    Ubisense RTLS is one of the indoor positioning systems using ultra-wideband (UWB) technology. The AOA and TDOA methods are used as the positioning principles. The accuracy of positioning depends primarily on the accuracy of the determined angles and distance differences. The paper presents the results of accuracy research, which includes a theoretical accuracy prediction and a practical test. Theoretical accuracy was calculated for two variants of the system component geometry, assuming the parameters declared by the system manufacturer. Total station measurements were taken as a reference during the practical test. The results of the analysis are presented in graphical form. A sample implementation (MagMaster) developed by Globema is presented in the final part of the paper.

  13. Algorithm of reducing the false positives in IDS based on correlation Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Jianyi; Li, Sida; Zhang, Ru

    2018-03-01

    This paper proposes an algorithm for reducing false positives in IDS based on correlation analysis. First, the algorithm analyzes the distinguishing characteristics of false positives and real alarms and preliminarily screens out the false positives; it then applies attribute-similarity clustering to the alarms to further reduce their number; finally, according to the characteristics of multi-step attacks, it associates the remaining alarms by causal relationship. The paper also proposes a reverse-causation algorithm based on earlier attack-association methods, which turns alarm information into a complete attack path. Experiments show that the algorithm reduces the number of alarms, improves the efficiency of alarm processing, and contributes to identifying attack purposes and improving alarm accuracy.

  14. Analysis of stationary availability factor of two-level backbone computer networks with arbitrary topology

    NASA Astrophysics Data System (ADS)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method, offered by the author, for calculating the stationary availability factor of two-level backbone computer networks is discussed; it is based on Markov reliability models for a set of independent repairable elements with given failure and repair rates, together with methods of discrete mathematics. A specialized algorithm, offered by the author, for analyzing network connectivity, taking into account different kinds of network equipment failures, is also described. Finally, this paper presents an example of the calculation of the stationary availability factor for a backbone computer network with a given topology.
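
    The per-element building block of such a calculation is the classical two-state Markov availability; the series combination below is a hedged illustration with made-up rates (the paper's method additionally handles arbitrary topologies through connectivity analysis):

        def availability(lam, mu):
            """Two-state Markov model of a repairable element: stationary
            availability A = mu / (lam + mu) for failure rate lam and
            repair rate mu (both per hour here)."""
            return mu / (lam + mu)

        def series_availability(avails):
            """Independent elements that must all work (a series path)."""
            p = 1.0
            for a in avails:
                p *= a
            return p

        # Illustrative backbone path: two switches and one trunk link.
        path = series_availability([
            availability(1 / 8760, 1 / 4),  # switch: ~1 failure/yr, 4 h repair
            availability(1 / 4380, 1 / 8),  # trunk link: ~2 failures/yr, 8 h repair
            availability(1 / 8760, 1 / 4),
        ])
        print(f"stationary path availability: {path:.6f}")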

  15. Optimization design combined with coupled structural-electrostatic analysis for the electrostatically controlled deployable membrane reflector

    NASA Astrophysics Data System (ADS)

    Liu, Chao; Yang, Guigeng; Zhang, Yiqun

    2015-01-01

    The electrostatically controlled deployable membrane reflector (ECDMR) is a promising scheme for constructing large-size, high-precision space deployable reflector antennas. This paper presents a novel design method for large-size, small-F/D ECDMRs considering the coupled structural-electrostatic problem. First, the fully coupled structural-electrostatic system is described by a three-field formulation, in which the structure and the passive electric field are modeled by the finite element method, and the deformation of the electrostatic domain is predicted by a finite element formulation of a fictitious elastic structure. A residual formulation of the structural-electrostatic field finite element model is established and solved by the Newton-Raphson method. The coupled structural-electrostatic analysis procedure is summarized. Then, with the aid of this coupled analysis procedure, an integrated optimization method for membrane shape accuracy and stress uniformity is proposed, which is divided into inner and outer iterative loops. An initial state of relatively high shape accuracy and uniform stress distribution is achieved by applying a uniform prestress to the membrane design shape and optimizing the voltages, in which the optimal voltages are computed by a sensitivity analysis. The shape accuracy is further improved by iterative prestress modification using the reposition balance method. Finally, the results of the uncoupled and coupled methods are compared, and the proposed optimization method is applied to design an ECDMR. The results validate the effectiveness of the proposed methods.

  16. SigEMD: A powerful method for differential gene expression analysis in single-cell RNA sequencing data.

    PubMed

    Wang, Tianyu; Nabavi, Sheida

    2018-04-24

    Differential gene expression analysis is one of the significant efforts in single cell RNA sequencing (scRNAseq) analysis to discover the specific changes in expression levels of individual cell types. Since scRNAseq exhibits multimodality, large amounts of zero counts, and sparsity, it is different from the traditional bulk RNA sequencing (RNAseq) data. The new challenges of scRNAseq data promote the development of new methods for identifying differentially expressed (DE) genes. In this study, we proposed a new method, SigEMD, that combines a data imputation approach, a logistic regression model and a nonparametric method based on the Earth Mover's Distance, to precisely and efficiently identify DE genes in scRNAseq data. The regression model and data imputation are used to reduce the impact of large amounts of zero counts, and the nonparametric method is used to improve the sensitivity of detecting DE genes from multimodal scRNAseq data. By additionally employing gene interaction network information to adjust the final states of DE genes, we further reduce the false positives of calling DE genes. We used simulated datasets and real datasets to evaluate the detection accuracy of the proposed method and to compare its performance with those of other differential expression analysis methods. Results indicate that the proposed method has an overall powerful performance in terms of precision in detection, sensitivity, and specificity. Copyright © 2018 Elsevier Inc. All rights reserved.
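
    The nonparametric core of the method, the Earth Mover's Distance between two expression distributions, can be sketched with SciPy; the zero-inflated samples and the permutation test below are illustrative, and SigEMD's imputation and regression steps are omitted:

        import numpy as np
        from scipy.stats import wasserstein_distance

        # One gene's expression in two cell groups: many zeros plus a
        # lognormal expressed component (synthetic data).
        rng = np.random.default_rng(4)
        group_a = np.concatenate([np.zeros(40), rng.lognormal(2.0, 0.5, 60)])
        group_b = np.concatenate([np.zeros(70), rng.lognormal(1.5, 0.5, 30)])

        obs = wasserstein_distance(group_a, group_b)   # 1-D EMD
        pooled = np.concatenate([group_a, group_b])
        perm = []
        for _ in range(2000):                          # permutation null
            rng.shuffle(pooled)
            perm.append(wasserstein_distance(pooled[:100], pooled[100:]))
        p_value = (np.sum(np.array(perm) >= obs) + 1) / (len(perm) + 1)
        print(f"EMD = {obs:.3f}, permutation p = {p_value:.4f}")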

  17. Rapid Determination of Clenbuterol in Pork by Direct Immersion Solid-Phase Microextraction Coupled with Gas Chromatography-Mass Spectrometry.

    PubMed

    Ye, Diru; Wu, Susu; Xu, Jianqiao; Jiang, Ruifen; Zhu, Fang; Ouyang, Gangfeng

    2016-02-01

    Direct immersion solid-phase microextraction (DI-SPME) coupled with gas chromatography-mass spectrometry (GC-MS) was developed for the rapid analysis of clenbuterol in pork for the first time. In this work, a low-cost homemade 44 µm polydimethylsiloxane (PDMS) SPME fiber was employed to extract clenbuterol from pork. After extraction, derivatization was performed by suspending the fiber in the headspace of a 2 mL sample vial saturated with the vapor of 100 µL of hexamethyldisilazane. Lastly, the fiber was directly introduced into the GC-MS for analysis. All parameters that influence absorption (extraction time), derivatization (derivatization reagent, time and temperature) and desorption (desorption time) were optimized. Under optimized conditions, the method offered a wide linear range (10-1000 ng g(-1)) and a low detection limit (3.6 ng g(-1)). Finally, the method was successfully applied to the analysis of pork from the market, and recoveries of the method for spiked pork were 97.4-105.7%. Compared with the traditional solvent extraction method, the proposed method is much cheaper and faster. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

    PubMed

    Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A

    2017-04-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
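
    A minimal time-resolved decoding loop of the kind the tutorial covers: an independent cross-validated classifier at each time point. The data are synthetic, and the shapes, classifier choice and injected class effect are illustrative assumptions:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # Synthetic MEG-like data: trials x sensors x time points.
        rng = np.random.default_rng(5)
        n_trials, n_sensors, n_times = 200, 30, 50
        X = rng.standard_normal((n_trials, n_sensors, n_times))
        y = np.repeat([0, 1], n_trials // 2)
        X[y == 1, :5, 20:35] += 0.5   # class effect in a late time window

        # Train/test a classifier independently at every time point.
        acc = np.array([
            cross_val_score(LinearDiscriminantAnalysis(),
                            X[:, :, t], y, cv=5).mean()
            for t in range(n_times)
        ])
        print("peak accuracy:", acc.max(), "at time index", int(acc.argmax()))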

  19. Randomized controlled trials and meta-analysis in medical education: what role do they play?

    PubMed

    Cook, David A

    2012-01-01

    Education researchers seek to understand what works, for whom, in what circumstances. Unfortunately, educational environments are complex and research itself is highly context dependent. Faced with these challenges, some have argued that qualitative methods should supplant quantitative methods such as randomized controlled trials (RCTs) and meta-analysis. I disagree. Good qualitative and mixed-methods research are complementary to, rather than exclusive of, quantitative methods. The complexity and challenges we face should not beguile us into ignoring methods that provide strong evidence. What, then, is the proper role for RCTs and meta-analysis in medical education? First, the choice of study design depends on the research question. RCTs and meta-analysis are appropriate for many, but not all, study goals. They have compelling strengths but also numerous limitations. Second, strong methods will not compensate for a pointless question. RCTs do not advance the science when they make confounded comparisons, or make comparison with no intervention. Third, clinical medicine now faces many of the same challenges we encounter in education. We can learn much from other fields about how to handle complexity in RCTs. Finally, no single study will definitively answer any research question. We need carefully planned, theory-building, programmatic research, reflecting a variety of paradigms and approaches, as we accumulate evidence to change the art and science of education.

  20. A stage structure pest management model with impulsive state feedback control

    NASA Astrophysics Data System (ADS)

    Pang, Guoping; Chen, Lansun; Xu, Weijian; Fu, Gang

    2015-06-01

    A stage structure pest management model with impulsive state feedback control is investigated. We obtain a sufficient condition for the existence of the order-1 periodic solution by differential equation geometry theory and the successor function. Further, we obtain a new judgement method for the stability of the order-1 periodic solution of semi-continuous systems by referring to the stability analysis for limit cycles of continuous systems, which differs from the previous method based on the analogue of the Poincaré criterion. Finally, we analyze the theoretical results numerically.
