Sample records for analysis methods produced

  1. Testing an automated method to estimate ground-water recharge from streamflow records

    USGS Publications Warehouse

    Rutledge, A.T.; Daniel, C.C.

    1994-01-01

    The computer program, RORA, allows automated analysis of streamflow hydrographs to estimate ground-water recharge. Output from the program, which is based on the recession-curve-displacement method (often referred to as the Rorabaugh method, for whom the program is named), was compared to estimates of recharge obtained from a manual analysis of 156 years of streamflow record from 15 streamflow-gaging stations in the eastern United States. Statistical tests showed that there was no significant difference between paired estimates of annual recharge by the two methods. Tests of results produced by the four workers who performed the manual method showed that results can differ significantly between workers. Twenty-two percent of the variation between manual and automated estimates could be attributed to having different workers perform the manual method. The program RORA will produce estimates of recharge equivalent to estimates produced manually, greatly increase the speed of analysis, and reduce the subjectivity inherent in manual analysis.
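    As a hedged illustration of the statistical comparison described above, the sketch below runs a paired t-test on manual versus automated recharge estimates. The numbers are invented placeholders, not values from the study.

    ```python
    # Paired comparison of annual recharge estimates (illustrative values only)
    import numpy as np
    from scipy import stats

    manual = np.array([10.2, 8.7, 12.1, 9.5, 11.3])     # manual estimates, in/yr
    automated = np.array([10.0, 9.1, 11.8, 9.9, 11.0])  # RORA estimates, in/yr

    t_stat, p_value = stats.ttest_rel(manual, automated)
    print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
    # A large p-value is consistent with no significant difference between methods.
    ```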

  2. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
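    The first study's setup can be sketched in code: clustering binary code profiles (participants x codes) with hierarchical clustering and K-means. This is a generic illustration under assumed data shapes; the latent class analysis step and the paper's simulation design are not reproduced.

    ```python
    # Clustering binary qualitative-coding data (synthetic stand-in data)
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(50, 12))  # 50 participants, 12 binary codes

    # Hierarchical clustering on Jaccard distances, which suit binary data
    Z = linkage(pdist(X.astype(bool), metric="jaccard"), method="average")
    hier_labels = fcluster(Z, t=3, criterion="maxclust")

    # K-means applied directly to the 0/1 matrix
    km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(hier_labels, km_labels, sep="\n")
    ```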

  3. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.

  4. A comparison of analysis methods to estimate contingency strength.

    PubMed

    Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T

    2018-05-09

    To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.

  5. A Comparison of Component and Factor Patterns: A Monte Carlo Approach.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; And Others

    1982-01-01

    Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion reached is that the three methods produce equivalent results. (Author/JKS)
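    A hedged sketch of this kind of comparison: fit principal component analysis and common factor analysis to the same data and inspect the resulting patterns side by side. The simulated data stand in for the paper's Monte Carlo conditions; image analysis is omitted.

    ```python
    # Comparing component and factor patterns on simulated two-factor data
    import numpy as np
    from sklearn.decomposition import PCA, FactorAnalysis

    rng = np.random.default_rng(1)
    scores = rng.normal(size=(300, 2))   # latent factor scores
    loadings = rng.normal(size=(2, 8))   # true loading pattern
    X = scores @ loadings + 0.5 * rng.normal(size=(300, 8))

    pca = PCA(n_components=2).fit(X)
    fa = FactorAnalysis(n_components=2).fit(X)

    print(np.round(pca.components_, 2))  # component pattern
    print(np.round(fa.components_, 2))   # factor pattern
    ```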

  6. Development of synthetic nuclear melt glass for forensic analysis.

    PubMed

    Molgaard, Joshua J; Auxier, John D; Giminaro, Andrew V; Oldham, C J; Cook, Matthew T; Young, Stephen A; Hall, Howard L

    A method for producing synthetic debris similar to the melt glass produced by nuclear surface testing is demonstrated. Melt glass from the first nuclear weapon test (commonly referred to as trinitite) is used as the benchmark for this study. These surrogates can be used to simulate a variety of scenarios and will serve as a tool for developing and validating forensic analysis methods.

  7. Investigating the Accuracy of Point Clouds Generated for Rock Surfaces

    NASA Astrophysics Data System (ADS)

    Seker, D. Z.; Incekara, A. H.

    2016-12-01

    Point clouds produced by different techniques are widely used to model rocks and to obtain properties of rock surfaces such as roughness, volume and area. These point clouds can be generated by laser scanning and by close range photogrammetry. Laser scanning is the most common method: the scanner produces a 3D point cloud at regular intervals. In close range photogrammetry, a point cloud can be produced from photographs taken under appropriate conditions, a capability that depends on developing hardware and software technology; many photogrammetric software packages, open source or commercial, now support point cloud generation. The two methods are close to each other in terms of accuracy: with a qualified digital camera or laser scanner, sufficient accuracy in the mm to cm range can be obtained. With both methods, field work is completed in less time than with conventional techniques. In close range photogrammetry, any part of a rock surface can be completely represented owing to overlapping oblique photographs. Despite the proximity of the resulting data, the two methods differ considerably in cost. In this study, whether point clouds produced from photographs can be used instead of point clouds produced by a laser scanner was investigated. For this purpose, rock surfaces with complex and irregular shapes located on the İstanbul Technical University Ayazaga Campus were selected as the study object. The selected object is a mixture of different rock types and consists of both partly weathered and fresh parts. The study was performed on a 30 m x 10 m portion of the rock surface. 2D (area-based) and 3D (volume-based) analyses were performed for several regions selected from the point clouds of the surface models. The analyses showed that the two point clouds are similar and can be used as alternatives to each other. This proved that a point cloud produced from photographs, which is more economical and faster to acquire, can be used in several kinds of studies instead of a point cloud produced by a laser scanner.
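    One simple way to quantify the agreement reported above is a cloud-to-cloud nearest-neighbor comparison. The sketch below assumes two hypothetical XYZ files; the study's own area- and volume-based analyses are not reproduced.

    ```python
    # Nearest-neighbor deviation between photogrammetric and laser-scanned clouds
    import numpy as np
    from scipy.spatial import cKDTree

    cloud_photo = np.loadtxt("photogrammetry_points.xyz")  # hypothetical N x 3 file
    cloud_laser = np.loadtxt("laser_scan_points.xyz")      # hypothetical M x 3 file

    tree = cKDTree(cloud_laser)
    dist, _ = tree.query(cloud_photo)  # distance of each photo point to laser cloud
    print(f"mean deviation {dist.mean():.4f} m, "
          f"95th percentile {np.percentile(dist, 95):.4f} m")
    ```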

  8. Appraisal of Biofilm Formation in Diabetic Foot Infections by Comparing Phenotypic Methods With the Ultrastructural Analysis.

    PubMed

    Pugazhendhi, Sugandhi; Dorairaj, Arvind Prasanth

    Diabetic patients are more prone to the development of foot ulcers, because their underlying tissues are exposed to colonization by various pathogenic organisms. Biofilm formation plays a vital role in disease progression by conferring antibiotic resistance on the pathogens found in foot infections. The present study demonstrated the correlation of a biofilm assay with the clinical characteristics of diabetic foot infection. Clinical characteristics such as ulcer duration, size, nature, and grade were associated with biofilm production. Our results suggest that as the size of the ulcer increased under poor glycemic control, the organism was more likely to be positive for biofilm formation. A high degree of antibiotic resistance was exhibited by the biofilm-producing gram-positive isolates for erythromycin and by the gram-negative isolates for cefpodoxime. Biofilm production was compared using 3 different conventional methods. Strong producers in the tube adherence method were also able to produce biofilm in the cover slip assay, whereas weak producers in the tube adherence method had difficulty producing biofilm with the other 2 methods, indicating that the tube adherence method is the best method for assessing biofilm formation. Strong biofilm production by the conventional methods was further confirmed by scanning electron microscopy analysis, in which bacteria were attached as a distinct biofilm layer. Overall, biofilm producers exhibited a higher degree of antibiotic resistance than nonbiofilm producers, and the tube adherence and cover slip assays were found to be the better methods for biofilm evaluation. Copyright © 2018 The American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  9. RESEARCH METHOD FOR SAMPLING AND ANALYSIS OF FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    NRMRL hosted a meeting on July 17-18, 2003 entitled, "Analytical Method for Bulk Analysis of Vermiculite." The purpose of this effort was to produce an interim research method for use by U.S. EPA's Office of Research and Development (ORD) for the analysis of bulk vermiculite for...

  10. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils.

    PubMed

    Alam, Md Ferdous; Haque, Asadul

    2017-10-18

    An accurate determination of the particle-level fabric of granular soils from tomography data requires the maximum correct separation of particles. The popular marker-controlled watershed separation method is widely used to separate particles. However, the watershed method alone cannot produce the maximum separation of particles when the soil has been subjected to boundary stresses that lead to crushing of particles. In this paper, a new separation method, named the Monash Particle Separation Method (MPSM), is introduced. The new method automatically determines the optimal contrast coefficient, based on a cluster evaluation framework, to produce the most accurate separation outcomes. Finally, the particles which could not be separated by the optimal contrast coefficient are separated by integrating cuboid markers, generated from clustering by Gaussian mixture models, into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for fabric analysis.
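    For orientation, the routine step the MPSM builds on can be sketched with scikit-image's marker-controlled watershed. The MPSM's contrast-coefficient optimization and GMM-derived cuboid markers are not shown, and the input file is hypothetical.

    ```python
    # Generic marker-controlled watershed on a binarized particle volume
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    binary = np.load("sand_volume_binary.npy").astype(bool)  # hypothetical CT mask
    distance = ndi.distance_transform_edt(binary)

    # Seed one marker at each local maximum of the distance map
    coords = peak_local_max(distance, min_distance=5, labels=binary)
    peaks = np.zeros(distance.shape, dtype=bool)
    peaks[tuple(coords.T)] = True
    markers, _ = ndi.label(peaks)

    labels = watershed(-distance, markers, mask=binary)  # one label per particle
    print(f"{labels.max()} candidate particles separated")
    ```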

  11. Comparative SIFT-MS, GC-MS and FTIR analysis of methane fuel produced in biogas stations and in artificial photosynthesis over acidic anatase TiO2 and montmorillonite

    NASA Astrophysics Data System (ADS)

    Knížek, Antonín; Dryahina, Ksenyia; Španěl, Patrik; Kubelík, Petr; Kavan, Ladislav; Zukalová, Markéta; Ferus, Martin; Civiš, Svatopluk

    2018-06-01

    The era of fossil fuels is slowly nearing its inevitable end, and basic research, exploration and testing of alternative energy sources become ever more important. Storage and alternative production of energy from fuels such as methane represent one of the many alternative approaches. Natural gas containing methane is a powerful source of energy, but its use produces large volumes of greenhouse gases; methane, however, can also be produced in closed, CO2-neutral cycles. In our study, we compare the detailed chemical composition of CH4 fuel produced by two different routes: classical production of biogas in a rendering station, an industrial wastewater treatment station and a landfill gas station, and the novel approach of artificial photosynthesis from CO2 over acidic anatase TiO2 in an experimental apparatus developed in our laboratory. Analysis of the CH4 fuel produced in these processes is important: trace gaseous components can be corrosive or toxic, low quality of the mixture reduces the efficiency of energy production, and so on. In this analysis, we present a combination of methods: high resolution Fourier transform infrared spectroscopy (HR-FTIR), suitable for analysis of the main components, complemented by the extremely sensitive methods of Selected Ion Flow Tube Mass Spectrometry (SIFT-MS) and gas chromatography (GC-MS), which are in turn best suited for trace analysis. The combination of these methods provides more information than any single one of them and promises a new analytical approach to fuel and gaseous mixture analysis.

  12. A novel radio-tolerant astaxanthin-producing bacterium reveals a new astaxanthin derivative: astaxanthin dirhamnoside.

    PubMed

    Asker, Dalal; Awad, Tarek S; Beppu, Teruhiko; Ueda, Kenji

    2012-01-01

    Astaxanthin is a red ketocarotenoid that exhibits extraordinary health-promoting activities, acting as an antioxidant, anti-inflammatory agent, antitumor agent, and immune booster. The recent discovery of the beneficial roles of astaxanthin against many degenerative diseases, such as cancers, heart diseases, and exercise-induced fatigue, has raised its market demand as a nutraceutical and medicinal ingredient in the aquaculture, food, and pharmaceutical industries. To satisfy the growing demand for this high-value nutraceutical ingredient and consumer interest in natural products, many research efforts are being made to discover novel microbial producers for effective biotechnological production of astaxanthin. Using a rapid screening method based on the 16S rRNA gene and effective HPLC-Diodearray-MS methods for carotenoid analysis, we succeeded in isolating a unique astaxanthin-producing bacterium (strain TDMA-17(T)) that belongs to the family Sphingomonadaceae (Asker et al., Appl Microbiol Biotechnol 77: 383-392, 2007). In this chapter, we provide a detailed description of effective HPLC-Diodearray-MS methods for rapid analysis and identification of the carotenoids produced by strain TDMA-17(T). We also describe the methods of isolation and identification for a novel bacterial carotenoid (an astaxanthin derivative), the major carotenoid produced by strain TDMA-17(T). Finally, we describe the polyphasic taxonomic analysis of strain TDMA-17(T) and the description of a novel species belonging to the genus Sphingomonas.

  13. DendroBLAST: approximate phylogenetic trees in the absence of multiple sequence alignments.

    PubMed

    Kelly, Steven; Maini, Philip K

    2013-01-01

    The rapidly growing availability of genome information has created considerable demand for both fast and accurate phylogenetic inference algorithms. We present a novel method called DendroBLAST for reconstructing phylogenetic dendrograms/trees from protein sequences using BLAST. This method differs from other methods by incorporating a simple model of sequence evolution to test the effect of introducing sequence changes on the reliability of the bipartitions in the inferred tree. Using realistic simulated sequence data we demonstrate that this method produces phylogenetic trees that are more accurate than other commonly used distance-based methods, though not as accurate as maximum likelihood methods applied to good-quality multiple sequence alignments. In addition to tests on simulated data, we use DendroBLAST to generate input trees for a supertree reconstruction of the phylogeny of the Archaea. This independent analysis produces an approximate phylogeny of the Archaea that has both high precision and high recall when compared to a previously published analysis of the same dataset using conventional methods. Taken together, these results demonstrate that approximate phylogenetic trees can be produced in the absence of multiple sequence alignments, and we propose that these trees will provide a platform for improving and informing downstream bioinformatic analysis. A web implementation of the DendroBLAST method is freely available for use at http://www.dendroblast.com/.
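    As a rough stand-in for the tree-building step (DendroBLAST's own BLAST-score distances and tree inference are not reproduced here), the sketch below turns a precomputed pairwise distance matrix into a UPGMA dendrogram. The matrix values are invented.

    ```python
    # Building an approximate tree from a pairwise distance matrix
    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import squareform

    taxa = ["A", "B", "C", "D"]
    D = np.array([[0.00, 0.30, 0.50, 0.60],
                  [0.30, 0.00, 0.40, 0.55],
                  [0.50, 0.40, 0.00, 0.20],
                  [0.60, 0.55, 0.20, 0.00]])

    Z = linkage(squareform(D), method="average")  # UPGMA on condensed distances
    info = dendrogram(Z, labels=taxa, no_plot=True)
    print(info["ivl"])  # leaf order of the inferred tree
    ```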

  14. A Comparison of the Effectiveness of Two Design Methodologies in a Secondary School Setting.

    ERIC Educational Resources Information Center

    Cannizzaro, Brenton; Boughton, Doug

    1998-01-01

    Examines the effectiveness of the analysis-synthesis and generator-conjecture-analysis models of design education. Concludes that the generator-conjecture-analysis design method produced student design products of a slightly higher standard than the analysis-synthesis design method. Discusses the findings in more detail and considers implications.…

  15. [Study on the genuineness and producing area of Panax notoginseng based on infrared spectroscopy combined with discriminant analysis].

    PubMed

    Liu, Fei; Wang, Yuan-zhong; Yang, Chun-yan; Jin, Hang

    2015-01-01

    The genuineness and producing area of Panax notoginseng were studied by infrared spectroscopy combined with discriminant analysis. Infrared spectra of 136 taproots of P. notoginseng from 13 planting points in 11 counties were collected, and second-derivative spectra were calculated with the Omnic 8.0 software. The infrared spectra and their second-derivative spectra in the range 1800-700 cm-1 were used to build models by stepwise discriminant analysis in order to distinguish the genuineness of P. notoginseng. The model built on the second-derivative spectra showed the better recognition of genuineness: the correct rate of the returned classification reached 100%, and the prediction accuracy was 93.4%. The stability of the model was tested by cross validation, and the method was further checked by extrapolation validation. The second-derivative spectra combined with the same discriminant analysis method were then used to distinguish the producing area of P. notoginseng. Comparing models built on different spectral ranges and different numbers of samples showed that recognition was best when the model was built from 8 training samples per planting point using the spectrum in the range 1500-1200 cm-1, with the correct rate of the returned classification reaching 99.0% and a prediction accuracy of 76.5%. The results indicate that infrared spectroscopy combined with discriminant analysis gives good recognition of the genuineness of P. notoginseng and might become a practical new method for its identification. The method can also recognize the producing area of P. notoginseng to some extent and offers a new approach for identification of the producing area of P. notoginseng.
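    The pipeline described above, second-derivative spectra followed by discriminant analysis, can be sketched as below. File names, window settings, and the use of plain (rather than stepwise) linear discriminant analysis are assumptions for illustration.

    ```python
    # Second-derivative IR spectra + linear discriminant analysis (sketch)
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    spectra = np.load("notoginseng_ir_spectra.npy")  # hypothetical (n_samples, n_points)
    areas = np.load("producing_area_labels.npy")     # hypothetical integer labels

    # Savitzky-Golay second derivative along the wavenumber axis
    d2 = savgol_filter(spectra, window_length=11, polyorder=3, deriv=2, axis=1)

    lda = LinearDiscriminantAnalysis().fit(d2, areas)
    print(f"resubstitution accuracy: {lda.score(d2, areas):.3f}")
    ```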

  16. Interlaboratory comparability, bias, and precision for four laboratories measuring constituents in precipitation, November 1982-August 1983

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Malo, B.A.

    1985-01-01

    Four laboratories were evaluated in their analysis of identical natural and simulated precipitation water samples. Interlaboratory comparability was evaluated using analysis of variance coupled with Duncan's multiple range test, and linear-regression models describing the relations between individual laboratory analytical results for natural precipitation samples. Results of the statistical analyses indicate that certain pairs of laboratories produce different results when analyzing identical samples. Analyte bias for each laboratory was examined using analysis of variance coupled with Duncan's multiple range test on data produced by the laboratories from the analysis of identical simulated precipitation samples. Bias for a given analyte produced by a single laboratory is indicated when the laboratory mean for that analyte is shown to be significantly different from the mean for the most-probable analyte concentrations in the simulated precipitation samples. Ion-chromatographic methods for the determination of chloride, nitrate, and sulfate were compared with the colorimetric methods that were also in use during the study period. Comparisons were made using analysis of variance coupled with Duncan's multiple range test for means produced by the two methods. Analyte precision for each laboratory was estimated by calculating a pooled variance for each analyte. Estimated analyte precisions were compared using F-tests, and differences in analyte precisions for laboratory pairs are reported. (USGS)
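    A minimal sketch of two of the statistical steps named above, one-way ANOVA across laboratories and a variance-ratio F-test of precision, is given below with invented placeholder values (Duncan's multiple range test is not included).

    ```python
    # Interlaboratory comparison: ANOVA and a variance-ratio F-test (sketch)
    import numpy as np
    from scipy import stats

    labs = [np.array([1.02, 0.98, 1.01, 0.99]),  # analyte results, lab 1
            np.array([1.10, 1.08, 1.12, 1.09]),  # lab 2
            np.array([1.00, 1.03, 0.97, 1.01]),  # lab 3
            np.array([0.95, 0.97, 0.96, 0.98])]  # lab 4

    F, p = stats.f_oneway(*labs)
    print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")

    # Two-sided F-test comparing the precision (variance) of labs 1 and 2
    ratio = labs[0].var(ddof=1) / labs[1].var(ddof=1)
    p_var = 2 * min(stats.f.cdf(ratio, 3, 3), stats.f.sf(ratio, 3, 3))
    print(f"variance ratio = {ratio:.2f}, two-sided p = {p_var:.4f}")
    ```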

  17. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils

    PubMed Central

    Alam, Md Ferdous

    2017-01-01

    An accurate determination of the particle-level fabric of granular soils from tomography data requires the maximum correct separation of particles. The popular marker-controlled watershed separation method is widely used to separate particles. However, the watershed method alone cannot produce the maximum separation of particles when the soil has been subjected to boundary stresses that lead to crushing of particles. In this paper, a new separation method, named the Monash Particle Separation Method (MPSM), is introduced. The new method automatically determines the optimal contrast coefficient, based on a cluster evaluation framework, to produce the most accurate separation outcomes. Finally, the particles which could not be separated by the optimal contrast coefficient are separated by integrating cuboid markers, generated from clustering by Gaussian mixture models, into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for fabric analysis. PMID:29057823

  18. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  19. [Tobacco quality analysis of producing areas of Yunnan tobacco using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2013-01-01

    In the present study, tobacco quality of different producing areas was analyzed using spectrum projection and correlation methods. The data were industrial-classification near-infrared (NIR) spectra collected in 2010 from the middle parts of tobacco plants by Hongta Tobacco (Group) Co., Ltd. A total of 1276 superior tobacco leaf samples were collected from four producing areas: three areas in Yunnan province (Yuxi, Chuxiong and Zhaotong) growing the tobacco variety K326, and one area (Dali) growing the variety Hongda. When the samples were divided randomly into analysis and verification sets at a ratio of 2:1, the verification set corresponded with the analysis set under spectrum projection, their correlation coefficients for the first- and second-dimensional projections all being above 0.99. The study also discusses a method for obtaining quantitative similarity values between samples from different producing areas. These similarity values are instructive for tobacco planting planning, quality management, acquisition of raw tobacco materials and tobacco leaf blending.

  20. Development of an efficient fungal DNA extraction method to be used in random amplified polymorphic DNA-PCR analysis to differentiate cyclopiazonic acid mold producers.

    PubMed

    Sánchez, Beatriz; Rodríguez, Mar; Casado, Eva M; Martín, Alberto; Córdoba, Juan J

    2008-12-01

    A variety of previously established mechanical and chemical treatments for achieving fungal cell lysis, combined with a semiautomatic system operated by a vacuum pump, were tested to obtain DNA extracts that can be used directly in randomly amplified polymorphic DNA (RAPD)-PCR to differentiate cyclopiazonic acid-producing and -nonproducing mold strains. A DNA extraction method that includes digestion with proteinase K and lyticase prior to grinding with a mortar and pestle and a semiautomatic vacuum system yielded DNA of high quality for all the fungal strains and species tested, at concentrations ranging from 17 to 89 ng/microl in 150 microl of the final DNA extract. Two microliters of DNA extracted with this method was used directly for RAPD-PCR with the primer (GACA)4. Reproducible RAPD fingerprints showing clear differences between producer and nonproducer strains were observed. These differences in the RAPD patterns did not cluster all the strains tested by cyclopiazonic acid production, but they may be very useful for distinguishing cyclopiazonic acid producer strains from nonproducer strains by a simple RAPD analysis. Thus, the DNA extracts obtained could be used directly, without prior purification and quantification, for RAPD analysis to differentiate cyclopiazonic acid producer from nonproducer mold strains. This combined analysis could be adapted to other toxigenic fungal species to enable differentiation of toxigenic and non-toxigenic molds, a procedure of great interest in food safety.

  1. Evaluation of DNA extraction methods for the analysis of microbial community in biological activated carbon.

    PubMed

    Zheng, Lu; Gao, Naiyun; Deng, Yang

    2012-01-01

    It is difficult to isolate DNA from biological activated carbon (BAC) samples used in water treatment plants, owing to the scarcity of microorganisms in BAC samples. The aim of this study was to identify DNA extraction methods suitable for a long-term, comprehensive ecological analysis of BAC microbial communities. To identify a procedure that produces high molecular weight DNA, maximizes detectable diversity, and is relatively free from contaminants, the microwave extraction method, the cetyltrimethylammonium bromide (CTAB) extraction method, a commercial DNA extraction kit, and the ultrasonic extraction method were used for the extraction of DNA from BAC samples. Spectrophotometry, agarose gel electrophoresis and polymerase chain reaction (PCR)-restriction fragment length polymorphism (RFLP) analysis were conducted to compare the yield and quality of DNA obtained using these methods. The results showed that the CTAB method produced the highest yield and genetic diversity of DNA from BAC samples, although the DNA purity was slightly lower than that obtained with the DNA extraction-kit method. This study provides a theoretical basis for establishing and selecting DNA extraction methods for BAC samples.

  2. Isolation, characterization, and diversity of novel radiotolerant carotenoid-producing bacteria.

    PubMed

    Asker, Dalal; Awad, Tarek S; Beppu, Teruhiko; Ueda, Kenji

    2012-01-01

    Carotenoids are natural pigments with many biological functions, acting, for example, as antioxidants (promoting oxidative stress resistance), membrane stabilizers, and precursors of vitamin A. The link between these biological activities and many health benefits (e.g., anticarcinogenic activity and prevention of chronic diseases) has raised the interest of several industrial sectors, especially the cosmetics and pharmaceutical industries. The use of microorganisms in biotechnology to produce carotenoids is viewed favorably by consumers and can help meet the growing demand for these bioactive compounds in the food, feed, and pharmaceutical industries. This methodological chapter details the development of a rapid and selective screening method for the isolation and identification of carotenoid-producing microorganisms based on UV treatment, sequencing analysis of 16S rRNA genes, and carotenoid analysis using rapid and effective high-performance liquid chromatography-diode array-MS methods. The results of a comprehensive 16S rRNA gene-based phylogenetic analysis revealed a diversity of carotenoid-producing microorganisms (104 isolates) that were isolated at high frequency from water samples collected at Misasa (Tottori, Japan), a region known for its high natural radioactivity. These carotenoid-producing isolates were classified into 38 different species belonging to 7 bacterial classes (Flavobacteria, Sphingobacteria, α-Proteobacteria, γ-Proteobacteria, Deinococci, Actinobacteria, and Bacilli). The carotenoids produced by the isolates were zeaxanthin (6 strains), dihydroxyastaxanthin (24 strains), astaxanthin (27 strains), canthaxanthin (10 strains), and unidentified molecular species produced by isolates related to Deinococcus, Exiguobacterium, and Flectobacillus. Here, we describe the methods used to isolate and classify these microorganisms.

  3. Methods of Hematoxylin and Eosin Image Information Acquisition and Optimization in Confocal Microscopy

    PubMed Central

    Yoon, Woong Bae; Kim, Hyunjin; Kim, Kwang Gi; Choi, Yongdoo; Chang, Hee Jin

    2016-01-01

    Objectives: We produced hematoxylin and eosin (H&E) staining-like color images by using confocal laser scanning microscopy (CLSM), which can obtain the same or more information in comparison to conventional tissue staining. Methods: We improved images by using several image converting techniques, including morphological methods, color space conversion methods, and segmentation methods. Results: An image obtained after image processing showed coloring very similar to that in images produced by H&E staining, and it is advantageous to conduct analysis through fluorescent dye imaging and microscopy rather than analysis based on single microscopic imaging. Conclusions: The colors used in CLSM are different from those seen in H&E staining, which is the method most widely used for pathologic diagnosis and is familiar to pathologists. Computer technology can facilitate the conversion of images by CLSM to be very similar to H&E staining images. We believe that the technique used in this study has great potential for application in clinical tissue analysis. PMID:27525165

  4. Security Analysis and Improvements to the PsychoPass Method

    PubMed Central

    2013-01-01

    Background: In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective: To perform a security analysis on the PsychoPass method and outline the limitations of and possible improvements to the method. Methods: We used the brute force analysis and dictionary attack analysis of the PsychoPass method to outline its weaknesses. Results: The first issue with the PsychoPass method is that it requires the password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. Conclusions: The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing powers. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength. PMID:23942458

  5. Analysis Resistant Cipher Method and Apparatus

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2009-01-01

    A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.

  6. High-Resolution Melting-Curve Analysis of Ligation-Mediated Real-Time PCR for Rapid Evaluation of an Epidemiological Outbreak of Extended-Spectrum-Beta-Lactamase-Producing Escherichia coli

    PubMed Central

    Woksepp, Hanna; Jernberg, Cecilia; Tärnberg, Maria; Ryberg, Anna; Brolund, Alma; Nordvall, Michaela; Olsson-Liljequist, Barbro; Wisell, Karin Tegmark; Monstein, Hans-Jürg; Nilsson, Lennart E.; Schön, Thomas

    2011-01-01

    Methods for the confirmation of nosocomial outbreaks of bacterial pathogens are complex, expensive, and time-consuming. Recently, a method based on ligation-mediated PCR (LM/PCR) using a low denaturation temperature which produces specific melting-profile patterns of DNA products has been described. Our objective was to further develop this method for real-time PCR and high-resolution melting analysis (HRM) in a single-tube system optimized in order to achieve results within 1 day. Following the optimization of LM/PCR for real-time PCR and HRM (LM/HRM), the method was applied for a nosocomial outbreak of extended-spectrum-beta-lactamase (ESBL)-producing and ST131-associated Escherichia coli isolates (n = 15) and control isolates (n = 29), including four previous clusters. The results from LM/HRM were compared to results from pulsed-field gel electrophoresis (PFGE), which served as the gold standard. All isolates from the nosocomial outbreak clustered by LM/HRM, which was confirmed by gel electrophoresis of the LM/PCR products and PFGE. Control isolates that clustered by LM/PCR (n = 4) but not by PFGE were resolved by confirmatory gel electrophoresis. We conclude that LM/HRM is a rapid method for the detection of nosocomial outbreaks of bacterial infections caused by ESBL-producing E. coli strains. It allows the analysis of isolates in a single-tube system within a day, and the discriminatory power is comparable to that of PFGE. PMID:21956981

  7. High-resolution melting-curve analysis of ligation-mediated real-time PCR for rapid evaluation of an epidemiological outbreak of extended-spectrum-beta-lactamase-producing Escherichia coli.

    PubMed

    Woksepp, Hanna; Jernberg, Cecilia; Tärnberg, Maria; Ryberg, Anna; Brolund, Alma; Nordvall, Michaela; Olsson-Liljequist, Barbro; Wisell, Karin Tegmark; Monstein, Hans-Jürg; Nilsson, Lennart E; Schön, Thomas

    2011-12-01

    Methods for the confirmation of nosocomial outbreaks of bacterial pathogens are complex, expensive, and time-consuming. Recently, a method based on ligation-mediated PCR (LM/PCR) using a low denaturation temperature which produces specific melting-profile patterns of DNA products has been described. Our objective was to further develop this method for real-time PCR and high-resolution melting analysis (HRM) in a single-tube system optimized in order to achieve results within 1 day. Following the optimization of LM/PCR for real-time PCR and HRM (LM/HRM), the method was applied for a nosocomial outbreak of extended-spectrum-beta-lactamase (ESBL)-producing and ST131-associated Escherichia coli isolates (n = 15) and control isolates (n = 29), including four previous clusters. The results from LM/HRM were compared to results from pulsed-field gel electrophoresis (PFGE), which served as the gold standard. All isolates from the nosocomial outbreak clustered by LM/HRM, which was confirmed by gel electrophoresis of the LM/PCR products and PFGE. Control isolates that clustered by LM/PCR (n = 4) but not by PFGE were resolved by confirmatory gel electrophoresis. We conclude that LM/HRM is a rapid method for the detection of nosocomial outbreaks of bacterial infections caused by ESBL-producing E. coli strains. It allows the analysis of isolates in a single-tube system within a day, and the discriminatory power is comparable to that of PFGE.

  8. Security analysis and improvements to the PsychoPass method.

    PubMed

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. To perform a security analysis on the PsychoPass method and outline the limitations of and possible improvements to the method. We used the brute force analysis and dictionary attack analysis of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires the password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing powers. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.

  9. A simplified procedure for GC/C/IRMS analysis of underivatized 19-norandrosterone in urine following HPLC purification.

    PubMed

    de la Torre, Xavier; Colamonici, Cristiana; Curcio, Davide; Molaioni, Francesco; Pizzardi, Marta; Botrè, Francesco

    2011-04-01

    Nandrolone and/or its precursors are included in the World Anti-Doping Agency (WADA) list of forbidden substances and methods, and as such their use is banned in sport. 19-Norandrosterone (19-NA), the main metabolite of these compounds, can also be produced endogenously. The need to establish the origin of 19-NA in human urine samples obliges antidoping laboratories to use isotope ratio mass spectrometry coupled to gas chromatography (GC/C/IRMS). In this work a simple liquid chromatographic method without any additional derivatization step is proposed, drastically simplifying the urine pretreatment procedure and leading to extracts free of interferences that permit precise and accurate IRMS analysis. The purity of the extracts was verified by parallel gas chromatography-mass spectrometry analysis with GC conditions identical to those of the GC/C/IRMS assay. The method has been validated according to ISO 17025 requirements (within-assay precision of ±0.3‰ and between-assay precision of ±0.4‰) and tested with samples obtained after the administration of synthetic 19-norandrostenediol and with samples collected during pregnancy, when 19-NA is known to be produced endogenously. Twelve drugs and synthetic standards able to produce 19-NA through metabolism have been shown to present quite homogeneous δ(13)C values around -29‰ (-28.8 ± 1.5; mean ± standard deviation), while endogenously produced 19-NA shows values comparable to other endogenously produced steroids, in the range -21 to -24‰, as already reported. The efficacy of the method was tested on real samples from routine antidoping analyses. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. 40 CFR 435.61 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., abbreviations, and methods of analysis set forth in 40 CFR part 401 shall apply to this subpart. (b) The term... well” shall mean any well which produces natural gas in a ratio to the petroleum liquids produced...

  11. The efficiency of parameter estimation of latent path analysis using summated rating scale (SRS) and method of successive interval (MSI) for transformation of score to scale

    NASA Astrophysics Data System (ADS)

    Solimun; Fernandes, Adji Achmad Rinaldo; Arisoesilaningsih, Endang

    2017-12-01

    Research in various fields generally investigates systems and involves latent variables. One method to analyze a model representing such a system is path analysis. Latent variables measured with questionnaires using an attitude-scale model yield data in the form of scores, which should be transformed to scale data before analysis. Path coefficients, the parameter estimators, are calculated from scale data obtained by the method of successive intervals (MSI) or the summated rating scale (SRS). This research identifies which data transformation method is better: path coefficients with smaller variances are said to be more efficient, so the transformation method that produces scale data yielding path coefficients with smaller variances is the better one. Analysis of real data shows that for the influence of the Attitude variable on Entrepreneurship Intention the relative efficiency is ER = 1, meaning that analyses using MSI- and SRS-transformed data are equally efficient. For simulation data with high correlation between items (0.7-0.9), on the other hand, the MSI method is 1.3 times more efficient than the SRS method.
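    The efficiency comparison reduces to a ratio of variances, as in this sketch. The coefficient replicates are invented; in the study they would be path coefficients estimated from MSI- and SRS-transformed data.

    ```python
    # Relative efficiency (ER) of two estimators as a variance ratio
    import numpy as np

    coef_msi = np.array([0.41, 0.43, 0.40, 0.42, 0.44])  # hypothetical replicates
    coef_srs = np.array([0.39, 0.45, 0.38, 0.46, 0.42])

    er = coef_srs.var(ddof=1) / coef_msi.var(ddof=1)
    print(f"ER = {er:.2f}")  # ER > 1 means the MSI-based estimator is more efficient
    ```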

  12. Linear regression based on Minimum Covariance Determinant (MCD) and TELBS methods on the productivity of phytoplankton

    NASA Astrophysics Data System (ADS)

    Gusriani, N.; Firdaniza

    2018-03-01

    The existence of outliers in multiple linear regression analysis causes the Gaussian assumption to be unfulfilled. If the least squares method is nevertheless applied to such data, it produces a model that does not represent most of the data. A regression method robust against outliers is therefore needed. This paper compares the Minimum Covariance Determinant (MCD) method and the TELBS method on secondary data on the productivity of phytoplankton, which contain outliers. Based on the robust coefficient of determination, the MCD method produces a better model than the TELBS method.
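    The effect of outliers on least squares can be illustrated with a generic robust fit (a Huber regressor here; the paper's MCD- and TELBS-based estimators are not implemented in this sketch).

    ```python
    # OLS vs. a robust regression on data with injected outliers
    import numpy as np
    from sklearn.linear_model import LinearRegression, HuberRegressor

    rng = np.random.default_rng(2)
    X = rng.uniform(0, 10, size=(60, 1))
    y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=60)
    y[:5] += 25  # a few gross outliers

    ols = LinearRegression().fit(X, y)
    robust = HuberRegressor().fit(X, y)
    print(f"OLS slope {ols.coef_[0]:.2f} vs robust slope {robust.coef_[0]:.2f}")
    # The robust slope stays near the true value of 2; OLS is pulled by the outliers.
    ```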

  13. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three goals achieved by this innovation are reduced size (under 20 L), reduced weight [under 100 lb (45 kg)], and reduced power consumption (under 100 W). The method can be used in a microscope or macroscope to measure Raman and/or native fluorescence emission spectra, either point by point or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method produces incoherent radiation, while in other implementations it produces laser radiation. In some variations, this is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. The instrument irradiates a sample with deep UV radiation and then uses an improved filter for separating the wavelengths to be detected, providing a multi-stage analysis of the sample. To avoid the difficulties of producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  14. Journal of Naval Science. Volume 2, Number 1

    DTIC Science & Technology

    1976-01-01

    has defined a probability distribution function which fits this type of data and forms the basis for statistical analysis of test results (see...Conditions to Assess the Performance of Fire-Resistant Fluids’. Wear, 28 (1974) 29. J.N.S., Vol. 2, No. 1 APPENDIX A Analysis of Fatigue Test Data...used to produce the impulse response and the equipment required for the analysis is relatively simple. The methods that must be used to produce

  15. Non-Destructive Thermography Analysis of Impact Damage on Large-Scale CFRP Automotive Parts.

    PubMed

    Maier, Alexander; Schmidt, Roland; Oswald-Tranta, Beate; Schledjewski, Ralf

    2014-01-14

    Laminated composites are increasingly used in aeronautics and the wind energy industry, as well as in the automotive industry. In these applications, the construction and processing need to fulfill the highest requirements regarding weight and mechanical properties. Environmental issues, like fuel consumption and the CO₂ footprint, set new challenges in producing lightweight parts that meet the closely monitored standards of these sectors. In the automotive industry, one main aspect of construction is the impact behavior of structural parts. To verify the quality of parts made from composite materials with little effort, cost and time, non-destructive test methods are increasingly used; a highly recommended non-destructive testing method is thermography analysis. In this work, a prototype for a car's base plate was produced using vacuum infusion. For the research work, test specimens were produced with the same multi-layer build-up as the prototypes. These specimens were subjected to defined loads in impact tests to simulate the effect of stone chips. Afterwards, the impacted specimens were investigated with thermography analysis. The results of this work will help in understanding the possible fields of application and the usage of thermography analysis as a first quick and economic failure detection method for automotive parts.

  16. Second Harmonic Generation of Unpolarized Light

    NASA Astrophysics Data System (ADS)

    Ding, Changqin; Ulcickas, James R. W.; Deng, Fengyuan; Simpson, Garth J.

    2017-11-01

    A Mueller tensor mathematical framework was applied for predicting and interpreting the second harmonic generation (SHG) produced with an unpolarized fundamental beam. In deep tissue imaging through SHG and multiphoton fluorescence, partial or complete depolarization of the incident light complicates polarization analysis. The proposed framework has the distinct advantage of seamlessly merging the purely polarized theory based on the Jones or Cartesian susceptibility tensors with a more general Mueller tensor framework capable of handling a partially depolarized fundamental and/or partially depolarized SHG. The predictions of the model are in excellent agreement with experimental measurements of z-cut quartz and mouse tail tendon obtained with polarized and depolarized incident light. The polarization-dependent SHG produced with an unpolarized fundamental allowed determination of collagen fiber orientation in agreement with orthogonal methods based on image analysis. This method has the distinct advantage of being immune to birefringence or depolarization of the fundamental beam for structural analysis of tissues.

  17. Methods of Hematoxylin and Eosin Image Information Acquisition and Optimization in Confocal Microscopy.

    PubMed

    Yoon, Woong Bae; Kim, Hyunjin; Kim, Kwang Gi; Choi, Yongdoo; Chang, Hee Jin; Sohn, Dae Kyung

    2016-07-01

    We produced hematoxylin and eosin (H&E) staining-like color images by using confocal laser scanning microscopy (CLSM), which can obtain the same or more information in comparison to conventional tissue staining. We improved images by using several image converting techniques, including morphological methods, color space conversion methods, and segmentation methods. An image obtained after image processing showed coloring very similar to that in images produced by H&E staining, and it is advantageous to conduct analysis through fluorescent dye imaging and microscopy rather than analysis based on single microscopic imaging. The colors used in CLSM are different from those seen in H&E staining, which is the method most widely used for pathologic diagnosis and is familiar to pathologists. Computer technology can facilitate the conversion of images by CLSM to be very similar to H&E staining images. We believe that the technique used in this study has great potential for application in clinical tissue analysis.

  18. Analysis of 40 conventional and emerging disinfection by-products in fresh-cut produce wash water by modified EPA methods.

    PubMed

    Lee, Wan-Ning; Huang, Ching-Hua; Zhu, Guangxuan

    2018-08-01

    Chlorine sanitizers used in washing fresh and fresh-cut produce can lead to generation of disinfection by-products (DBPs) that are harmful to human health. Monitoring of DBPs is necessary to protect food safety but comprehensive analytical methods have been lacking. This study has optimized three U.S. Environmental Protection Agency methods for drinking water DBPs to improve their performance for produce wash water. The method development encompasses 40 conventional and emerging DBPs. Good recoveries (60-130%) were achieved for most DBPs in deionized water and in lettuce, strawberry and cabbage wash water. The method detection limits are in the range of 0.06-0.58 μg/L for most DBPs and 10-24 ng/L for nitrosamines in produce wash water. Preliminary results revealed the formation of many DBPs when produce is washed with chlorine. The optimized analytical methods by this study effectively reduce matrix interference and can serve as useful tools for future research on food DBPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Nonparametric Estimation of Standard Errors in Covariance Analysis Using the Infinitesimal Jackknife

    ERIC Educational Resources Information Center

    Jennrich, Robert I.

    2008-01-01

    The infinitesimal jackknife provides a simple general method for estimating standard errors in covariance structure analysis. Beyond its simplicity and generality what makes the infinitesimal jackknife method attractive is that essentially no assumptions are required to produce consistent standard error estimates, not even the requirement that the…

  20. Approximate string matching algorithms for limited-vocabulary OCR output correction

    NASA Astrophysics Data System (ADS)

    Lasko, Thomas A.; Hauser, Susan E.

    2000-12-01

    Five methods for matching words mistranslated by optical character recognition to their most likely match in a reference dictionary were tested on data from the archives of the National Library of Medicine. The methods, including an adaptation of the cross correlation algorithm, the generic edit distance algorithm, the edit distance algorithm with a probabilistic substitution matrix, Bayesian analysis, and Bayesian analysis on an actively thinned reference dictionary were implemented and their accuracy rates compared. Of the five, the Bayesian algorithm produced the most correct matches (87%), and had the advantage of producing scores that have a useful and practical interpretation.
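    Of the methods listed, the generic edit distance is the simplest to show. The sketch below is a standard dynamic-programming Levenshtein distance with unit costs; the probabilistic variant would replace the constant substitution cost with a matrix lookup.

    ```python
    # Levenshtein edit distance between an OCR token and a dictionary word
    def edit_distance(a: str, b: str) -> int:
        m, n = len(a), len(b)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i  # delete all of a[:i]
        for j in range(n + 1):
            d[0][j] = j  # insert all of b[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,         # deletion
                              d[i][j - 1] + 1,         # insertion
                              d[i - 1][j - 1] + cost)  # substitution
        return d[m][n]

    print(edit_distance("rnedicine", "medicine"))  # classic OCR confusion rn -> m
    ```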

  1. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force...probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a...Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that

  2. Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.

  3. The use of fluorescence microscopy and image analysis for rapid detection of non-producing revertant cells of Synechocystis sp. PCC6803 and Synechococcus sp. PCC7002.

    PubMed

    Schulze, Katja; Lang, Imke; Enke, Heike; Grohme, Diana; Frohme, Marcus

    2015-04-17

    Ethanol production via genetically engineered cyanobacteria is a promising route to biofuels. Through the introduction of a pyruvate decarboxylase and an alcohol dehydrogenase, direct ethanol production becomes possible within the cells. However, during cultivation, genetic instability can lead to mutations and thus loss of ethanol production; cells then revert to the wild type (wt) phenotype. A method for rapid and simple detection of these non-producing revertant cells in an ethanol-producing cell population is an important quality control measure for predicting the genetic stability and longevity of a producing culture. Several comparable cultivation experiments revealed a difference in pigmentation between non-producing and producing cells: the accessory pigment phycocyanin (PC) is reduced in the ethanol producer, resulting in a yellowish appearance of the culture. Microarray and western blot studies of Synechocystis sp. PCC6803 and Synechococcus sp. PCC7002 confirmed this PC reduction at the levels of RNA and protein. Based on these findings, we developed a fluorescence microscopy method to distinguish producing and non-producing cells by their pigmentation phenotype. With a specific filter set, the emitted fluorescence of a producer cell with reduced PC content appeared orange, the emitted fluorescence of a non-producing cell with a wt pigmentation phenotype was detected in red, and dead cells appeared green. In an automated process, multiple images of each sample were taken and analyzed with a plugin for the image analysis software ImageJ to identify dead (green), non-producing (red) and producing (orange) cells. The validation experiments presented here showed good identification, with 98% red cells in the wt sample and 90% orange cells in the producer sample. Cells with the wt pigmentation phenotype (red) detected in the producer sample had either not yet been fully induced (in cultures induced for 48 h) or had already reverted to non-producing cells (in long-term photobioreactor cultivations), emphasizing the sensitivity and resolution of the method. The fluorescence microscopy method is thus a useful technique for rapid detection of non-producing single cells in an ethanol-producing cell population.
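
    As a rough sketch of the color-based classification step, a toy hue-window classifier for segmented cells; the hue thresholds are invented stand-ins for values that would have to be calibrated against the actual filter set:

    ```python
    import colorsys

    def classify_cell(r: float, g: float, b: float) -> str:
        """Assign a segmented cell to dead / non-producer / producer based on
        the hue of its mean fluorescence colour (RGB components in [0, 1])."""
        hue = colorsys.rgb_to_hsv(r, g, b)[0] * 360.0  # hue in degrees
        if 90 <= hue <= 180:
            return "dead (green)"
        if hue <= 20 or hue >= 330:
            return "non-producer (red)"
        if 20 < hue < 60:
            return "producer (orange)"
        return "unclassified"

    print(classify_cell(0.1, 0.9, 0.1))  # dead (green)
    print(classify_cell(0.9, 0.1, 0.1))  # non-producer (red)
    print(classify_cell(0.9, 0.5, 0.1))  # producer (orange)
    ```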

  4. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…

  5. Resting-state functional magnetic resonance imaging: the impact of regression analysis.

    PubMed

    Yeh, Chia-Jung; Tseng, Yu-Sheng; Lin, Yi-Ru; Tsai, Shang-Yueh; Huang, Teng-Yi

    2015-01-01

    To investigate the impact of regression methods on resting-state functional magnetic resonance imaging (rsfMRI). During rsfMRI preprocessing, regression analysis is considered effective for reducing the interference of physiological noise on the signal time course. However, it is unclear whether the regression method benefits rsfMRI analysis. Twenty volunteers (10 men and 10 women; aged 23.4 ± 1.5 years) participated in the experiments. We used node analysis and functional connectivity mapping to assess the brain default mode network by using five combinations of regression methods. The results show that regressing the global mean plays a major role in the preprocessing steps. When a global regression method is applied, the values of functional connectivity are significantly lower (P ≤ .01) than those calculated without a global regression. This step increases inter-subject variation and produces anticorrelated brain areas. rsfMRI data processed using regression should be interpreted carefully. The significance of the anticorrelated brain areas produced by global signal removal is unclear.
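
    A minimal numpy sketch of global signal regression as described, assuming a (timepoints x voxels) array; a real preprocessing pipeline would add motion and physiological regressors alongside the global mean:

    ```python
    import numpy as np

    def regress_out_global(ts: np.ndarray) -> np.ndarray:
        """Remove the global mean signal from every voxel time course.
        ts has shape (n_timepoints, n_voxels)."""
        global_mean = ts.mean(axis=1, keepdims=True)                  # (T, 1)
        design = np.hstack([np.ones_like(global_mean), global_mean])  # intercept + regressor
        beta, *_ = np.linalg.lstsq(design, ts, rcond=None)            # least-squares fit
        return ts - design @ beta                                     # residuals

    rng = np.random.default_rng(1)
    data = rng.standard_normal((200, 500)) + rng.standard_normal((200, 1))  # shared noise
    cleaned = regress_out_global(data)
    print(np.abs(cleaned.mean(axis=1)).max())  # global mean is (numerically) gone
    ```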

  6. Experimental variability and data pre-processing as factors affecting the discrimination power of some chemometric approaches (PCA, CA and a new algorithm based on linear regression) applied to (+/-)ESI/MS and RPLC/UV data: Application on green tea extracts.

    PubMed

    Iorgulescu, E; Voicu, V A; Sârbu, C; Tache, F; Albu, F; Medvedovici, A

    2016-08-01

    The influence of experimental variability (instrumental repeatability, instrumental intermediate precision and sample preparation variability) and data pre-processing (normalization, peak alignment, background subtraction) on the discrimination power of multivariate data analysis methods (Principal Component Analysis, PCA, and Cluster Analysis, CA) as well as a new algorithm based on linear regression was studied. Data used in the study were obtained through positive or negative ion monitoring electrospray mass spectrometry (+/-ESI/MS) and reversed phase liquid chromatography with UV spectrometric detection (RPLC/UV) applied to green tea extracts. Extraction in ethanol and heated water infusion were used as sample preparation procedures. The multivariate methods were applied directly to mass spectra and chromatograms, involving strictly a holistic comparison of shapes, without assignment of any structural identity to compounds. An alternative data interpretation based on linear regression analysis mutually applied to data series is also discussed. Slopes, intercepts and correlation coefficients produced by linear regression analysis applied to pairs of very large experimental data series successfully retain information resulting from high-frequency instrumental acquisition rates, better defining the profiles being compared. Consequently, each type of sample or comparison between samples produces in Cartesian space an ellipsoidal volume defined by the normal variation intervals of the slope, intercept and correlation coefficient. Distances between volumes graphically illustrate (dis)similarities between the compared data. The instrumental intermediate precision had the major effect on the discrimination power of the multivariate data analysis methods. Mass spectra produced through ionization from the liquid state at atmospheric pressure of bulk complex mixtures, resulting from extracted materials of natural origin, provided an excellent data basis for multivariate analysis methods, equivalent to data resulting from chromatographic separations. The alternative evaluation of very large data series based on linear regression analysis produced information equivalent to the results obtained through application of PCA and CA.
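
    A small sketch of the regression-based comparison of two acquired profiles, using scipy's linregress to obtain the slope, intercept and correlation coefficient that define the ellipsoidal comparison volumes; the profiles here are synthetic stand-ins for ESI/MS or RPLC/UV data:

    ```python
    import numpy as np
    from scipy import stats

    def profile_similarity(x: np.ndarray, y: np.ndarray):
        """Regress one acquired profile (spectrum/chromatogram) on another and
        return the slope, intercept and correlation coefficient: the three
        coordinates used to place a comparison in Cartesian space."""
        res = stats.linregress(x, y)
        return res.slope, res.intercept, res.rvalue

    rng = np.random.default_rng(2)
    reference = rng.random(5000)                     # stand-in for one profile
    replicate = 1.02 * reference + 0.01 + rng.normal(0, 0.01, 5000)
    print(profile_similarity(reference, replicate))  # slope ~1.02, intercept ~0.01, r ~1
    ```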

  7. Asymptotic approximation method of force reconstruction: Application and analysis of stationary random forces

    NASA Astrophysics Data System (ADS)

    Sanchez, J.

    2018-06-01

    In this paper, the asymptotic approximation method of force reconstruction is applied to single degree-of-freedom systems and analyzed. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.

  8. Characterization of biogenic ferrihydrite nanoparticles by means of SAXS, SRD and IBA methods

    NASA Astrophysics Data System (ADS)

    Balasoiu, M.; Kichanov, S.; Pantelica, A.; Pantelica, D.; Stolyar, S.; Iskhakov, R.; Aranghel, D.; Ionescu, P.; Badita, C. R.; Kurkin, S.; Orelovich, O.; Tiutiunikov, S.

    2018-03-01

    Investigations of biogenic ferrihydrite nanoparticles produced by the bacterium Klebsiella oxytoca, carried out by applying small-angle X-ray scattering, synchrotron radiation diffraction and ion beam analysis methods, are reviewed. Different experimental data processing methods are used and analyzed.

  9. Non-Destructive Thermography Analysis of Impact Damage on Large-Scale CFRP Automotive Parts

    PubMed Central

    Maier, Alexander; Schmidt, Roland; Oswald-Tranta, Beate; Schledjewski, Ralf

    2014-01-01

    Laminated composites are increasingly used in aeronautics and the wind energy industry, as well as in the automotive industry. In these applications, the construction and processing need to fulfill the highest requirements regarding weight and mechanical properties. Environmental issues, like fuel consumption and CO2 footprint, set new challenges in producing lightweight parts that meet the closely monitored standards of these industries. In the automotive industry, one main aspect of construction is the impact behavior of structural parts. To verify the quality of parts made from composite materials with little effort, cost and time, non-destructive test methods are increasingly used; a widely recommended non-destructive testing method is thermography analysis. In this work, a prototype for a car’s base plate was produced using vacuum infusion. For the research work, test specimens were produced with the same multi-layer build-up as the prototypes. These specimens were subjected to defined loads in impact tests to simulate the effect of stone chips. Afterwards, the impacted specimens were investigated with thermography analysis. The results of this work will help in understanding the possible fields of application of thermography analysis as a quick and economical first failure detection method for automotive parts. PMID:28788464

  10. An efficient scan diagnosis methodology according to scan failure mode for yield enhancement

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok

    2008-12-01

    Yield has always been a driving consideration in modern semiconductor fabrication. Statistically, the largest portion of wafer yield loss comes from defective scan failures. This paper presents efficient failure analysis methods, based on scan diagnosis, for initial yield ramp-up and ongoing production. Our analysis shows that more than 60% of scan failure dies fall into the category of shift mode in very deep submicron (VDSM) devices. However, localization of scan shift mode failures is very difficult in comparison to capture mode failures because they are caused by malfunction of the scan chain. Addressing this challenge, we propose the most suitable analysis method according to scan failure mode (capture / shift) for yield enhancement. For the capture failure mode, this paper describes a method that integrates the scan diagnosis flow with backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution, efficient backside probing and a signal analysis method. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of the shift failure mode. The combination of the two methods contributes to yield enhancement. We confirm the failure candidates with the physical failure analysis (PFA) method, and the direct feedback from defect visualization helps mass-produce devices in a shorter time. The experimental data on mass products show that our method produces an average reduction of 13.7% in defective SCAN & SRAM-BIST failure rates and an average improvement of 18.2% in wafer yield.

  11. Evidence of validity of the Stress-Producing Life Events (SPLE) instrument.

    PubMed

    Rizzini, Marta; Santos, Alcione Miranda Dos; Silva, Antônio Augusto Moura da

    2018-01-01

    OBJECTIVE Evaluate the construct validity of a list of eight Stressful Life Events in pregnant women. METHODS A cross-sectional study was conducted with 1,446 pregnant women in São Luís, MA, and 1,364 pregnant women in Ribeirão Preto, SP (BRISA cohort), from February 2010 to June 2011. In the exploratory factor analysis, promax oblique rotation was used, and internal consistency was calculated using composite reliability. Construct validity was determined by means of confirmatory factor analysis with mean- and variance-adjusted weighted least squares estimation. RESULTS The model with the best fit in the exploratory analysis was the one that retained three factors with a cumulative variance of 61.1%. The one-factor model did not obtain a good fit in either sample in the confirmatory analysis. The three-factor model, called Stress-Producing Life Events, presented a good fit (RMSEA < 0.05; CFI/TLI > 0.90) for both samples. CONCLUSIONS The Stress-Producing Life Events constitute a second-order construct with three dimensions related to health, personal and financial aspects, and violence. This study found evidence that confirms the construct validity of a list of stressor events, entitled the Stress-Producing Life Events Inventory.

  12. Validated green high-performance liquid chromatographic methods for the determination of coformulated pharmaceuticals: a comparison with reported conventional methods.

    PubMed

    Elzanfaly, Eman S; Hegazy, Maha A; Saad, Samah S; Salem, Maissa Y; Abd El Fattah, Laila E

    2015-03-01

    The introduction of sustainable development concepts to analytical laboratories has recently gained interest; however, most conventional high-performance liquid chromatography methods consider neither the environmental effect of the chemicals used nor the amount of waste produced. The aim of this work was to prove that conventional methods can be replaced by greener ones with the same analytical parameters. The suggested methods were designed so that they neither use nor produce harmful chemicals and generate minimal waste, making them suitable for routine analysis without harming the environment. This was achieved by using green mobile phases and short run times. Four mixtures were chosen as models for this study: clidinium bromide/chlordiazepoxide hydrochloride, phenobarbitone/pipenzolate bromide, mebeverine hydrochloride/sulpiride, and chlorphenoxamine hydrochloride/caffeine/8-chlorotheophylline, either in their bulk powder or in their dosage forms. The methods were validated with respect to linearity, precision, accuracy, system suitability, and robustness. The developed methods were compared to the reported conventional high-performance liquid chromatography methods regarding their greenness profile. The suggested methods were found to be greener and more time- and solvent-saving than the reported ones; hence they can be used for routine analysis of the studied mixtures without harming the environment.

  13. Portable method of measuring gaseous acetone concentrations.

    PubMed

    Worrall, Adam D; Bernstein, Jonathan A; Angelopoulos, Anastasios P

    2013-08-15

    Measurement of acetone in human breath samples has previously been shown to provide significant non-invasive diagnostic insight into the control of a patient's diabetic condition. In patients with diabetes mellitus, the body produces excess amounts of ketones such as acetone, which are then exhaled during respiration. Various breath analysis methods have allowed accurate determination of acetone concentrations in exhaled breath; however, many of these methods require instrumentation and pre-concentration steps not suitable for point-of-care use. We have found that, by immobilizing resorcinol reagent in a perfluorosulfonic acid polymer membrane, a controlled organic synthesis reaction occurs with acetone in a dry carrier gas. The immobilized, highly selective product of this reaction (a flavan) produces a visible color change that can measure acetone at sub-ppm concentrations. Here we demonstrate how this approach can be used to produce a portable optical sensing device for real-time, non-invasive acetone analysis.

  14. An Extreme-Value Approach to Anomaly Vulnerability Identification

    NASA Technical Reports Server (NTRS)

    Everett, Chris; Maggio, Gaspare; Groen, Frank

    2010-01-01

    The objective of this paper is to present a method for importance analysis in parametric probabilistic modeling where the result of interest is the identification of potential engineering vulnerabilities associated with postulated anomalies in system behavior. In the context of Accident Precursor Analysis (APA), under which this method has been developed, these vulnerabilities, designated as anomaly vulnerabilities, are conditions that produce high risk in the presence of anomalous system behavior. The method defines a parameter-specific Parameter Vulnerability Importance measure (PVI), which identifies anomaly risk-model parameter values that indicate the potential presence of anomaly vulnerabilities, and allows them to be prioritized for further investigation. This entails analyzing each uncertain risk-model parameter over its credible range of values to determine where it produces the maximum risk. A parameter that produces high system risk for a particular range of values suggests that the system is vulnerable to the modeled anomalous conditions, if indeed the true parameter value lies in that range. Thus, PVI analysis provides a means of identifying and prioritizing anomaly-related engineering issues that at the very least warrant improved understanding to reduce uncertainty, such that true vulnerabilities may be identified and proper corrective actions taken.
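
    A schematic sketch of the PVI idea, scanning a single uncertain parameter over its credible range for the maximum of a risk function; the risk model itself is invented purely for illustration, not taken from the APA framework:

    ```python
    import numpy as np

    def anomaly_risk(theta: float) -> float:
        """Hypothetical anomaly risk model: risk as a function of one uncertain
        parameter theta (a stand-in for a full anomaly risk model)."""
        return 1e-4 * np.exp(3.0 * theta)  # illustrative only

    def parameter_vulnerability(risk_fn, lo: float, hi: float, n: int = 1000):
        """Scan a parameter over its credible range [lo, hi] and return the
        maximum risk and where it occurs: high risk somewhere in the credible
        range flags a potential anomaly vulnerability worth investigating."""
        grid = np.linspace(lo, hi, n)
        risks = np.array([risk_fn(t) for t in grid])
        k = int(np.argmax(risks))
        return risks[k], grid[k]

    max_risk, at_theta = parameter_vulnerability(anomaly_risk, 0.0, 1.0)
    print(f"max risk {max_risk:.2e} at theta = {at_theta:.3f}")
    ```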

  15. The Use of Multidimensional Image-Based Analysis to Accurately Monitor Cell Growth in 3D Bioreactor Culture

    PubMed Central

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to be >90%. Using the cell distribution mapping process and principal component analysis, we show how cell growth can be quantitatively monitored over a 13-day bioreactor culture period and how changes to manufacturing processes, such as initial cell seeding density, can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes, facilitating the transition towards bioreactor-based manufacture for clinical grade cells. PMID:22028809

  16. The use of multidimensional image-based analysis to accurately monitor cell growth in 3D bioreactor culture.

    PubMed

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to be >90%. Using the cell distribution mapping process and principal component analysis, we show how cell growth can be quantitatively monitored over a 13-day bioreactor culture period and how changes to manufacturing processes, such as initial cell seeding density, can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes, facilitating the transition towards bioreactor-based manufacture for clinical grade cells.

  17. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for the production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One broadly used application of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size), where high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for delineating erosion microrelief structures, and mapping techniques for microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes analysis of the spatial redistribution of soil pollutants and phyto-indication analysis, which complement common DEM modelling and geomorphometric analysis. We use field surveys conducted at the test area, an arable territory with high erosion risk. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for delineating erosion microrelief structures. These methods still need to be formalized for convenient use.

  18. Three-dimensional empirical mode decomposition analysis apparatus, method and article of manufacture

    NASA Technical Reports Server (NTRS)

    Gloersen, Per (Inventor)

    2004-01-01

    An apparatus and method of analysis for three-dimensional (3D) physical phenomena. The physical phenomena may include any varying 3D phenomena such as time-varying polar ice flows. A representation of the 3D phenomena is passed through a Hilbert transform to convert the data into complex form. A spatial variable is separated from the complex representation by producing a time-based covariance matrix. The temporal parts of the principal components are produced by applying Singular Value Decomposition (SVD). Based on the rapidity with which the eigenvalues decay, the first 3-10 complex principal components (CPC) are selected for Empirical Mode Decomposition into intrinsic modes. The intrinsic modes produced are filtered in order to reconstruct the spatial part of the CPC. Finally, a filtered time series may be reconstructed from the first 3-10 filtered complex principal components.
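
    A sketch of the front end of this pipeline (Hilbert transform, time-based covariance, SVD) on synthetic data; the Empirical Mode Decomposition filtering and reconstruction stages are omitted:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Synthetic stand-in for a (time x space) field, e.g. flattened polar maps.
    rng = np.random.default_rng(3)
    t = np.linspace(0, 20 * np.pi, 512)
    field = np.outer(np.sin(t), rng.random(100)) + 0.1 * rng.standard_normal((512, 100))

    analytic = hilbert(field, axis=0)          # complex (analytic) representation
    cov = analytic @ analytic.conj().T         # time-based covariance matrix
    U, s, _ = np.linalg.svd(cov)               # SVD -> temporal principal components
    leading_cpc = U[:, :3]                     # first few complex principal components
    print(leading_cpc.shape, s[:3])            # rapid eigenvalue decay expected
    ```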

  19. [Tobacco quality analysis of industrial classification of different producing area using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2012-10-01

    In this study, tobacco quality was analyzed across industrial classifications and producing areas using spectrum projection and correlation methods. The data were near-infrared (NIR) spectra collected in 2010 from tobacco of different plant parts and colors at Hongta Tobacco (Group) Co., Ltd. In total, 6,064 tobacco leaf samples of 17 classes and 6 industrial classifications, all of variety K326, were collected from Yuxi, Chuxiong and Zhaotong in Yunnan province and measured by near-infrared spectroscopy. The results showed that the probability of correct grade assignment from the first dimension was 84%, and the probability of correct producing-area assignment from the second dimension was 71%. The study explains differences in tobacco quality across industrial classifications and producing areas through a projection method that yields quantitative similarity values, which are instructive for tobacco leaf blending.

  20. Comparison of Video Head Impulse Test (vHIT) Gains Between Two Commercially Available Devices and by Different Gain Analytical Methods.

    PubMed

    Lee, Sang Hun; Yoo, Myung Hoon; Park, Jun Woo; Kang, Byung Chul; Yang, Chan Joo; Kang, Woo Suk; Ahn, Joong Ho; Chung, Jong Woo; Park, Hong Ju

    2018-06-01

    To evaluate whether video head impulse test (vHIT) gains depend on the measuring device and the method of analysis. Prospective study. vHIT was performed in 25 healthy subjects using two devices simultaneously. vHIT gains were compared between these instruments and across five different methods of computing position and velocity gains over the head movement interval. The two devices produced different vHIT gain results with the same method of analysis. There were also significant differences in the vHIT gains measured using different analytical methods. The gain analysis method that compares the areas under the velocity curves (AUC) of the head and eye movements during head movements showed lower vHIT gains than a method that compared the peak velocities of the head and eye movements. The former method produced the vHIT gain with the smallest standard deviation among the five procedures tested in this study. vHIT gains differ in normal subjects depending on the device and method of analysis used, suggesting that it is advisable for each device to have its own normal values. Gain calculations that compare the AUC of the head and eye movements during the head movements show the smallest variance.
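
    For illustration, the two gain definitions at issue, computed on an idealized head impulse; the velocity profile is a toy signal, not clinical data:

    ```python
    import numpy as np

    def vhit_gains(head_vel: np.ndarray, eye_vel: np.ndarray, dt: float):
        """Two of the gain definitions compared in such studies: the ratio of
        areas under the velocity curves over the head-movement interval (AUC)
        versus the ratio of peak velocities."""
        auc_gain = np.trapz(np.abs(eye_vel), dx=dt) / np.trapz(np.abs(head_vel), dx=dt)
        peak_gain = np.max(np.abs(eye_vel)) / np.max(np.abs(head_vel))
        return auc_gain, peak_gain

    t = np.linspace(0, 0.15, 151)                 # a ~150 ms head impulse
    head = 200 * np.sin(np.pi * t / 0.15)         # deg/s, idealized profile
    eye = 0.95 * head + np.random.default_rng(4).normal(0, 2, t.size)
    print(vhit_gains(head, eye, dt=t[1] - t[0]))  # both gains near 0.95
    ```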

  1. Quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria by gas chromatography-mass spectrometry.

    PubMed

    Guan, Wenna; Zhao, Hui; Lu, Xuefeng; Wang, Cong; Yang, Menglong; Bai, Fali

    2011-11-11

    Simple and rapid quantitative determination of fatty-acid-based biofuels is of great importance for the study of genetic engineering progress toward biofuel production by microalgae. Ideal biofuels produced from biological systems should be chemically similar to petroleum, that is, fatty-acid-based molecules including free fatty acids, fatty acid methyl esters, fatty acid ethyl esters, fatty alcohols and fatty alkanes. This study established a gas chromatography-mass spectrometry (GC-MS) method for simultaneous quantification of seven free fatty acids, nine fatty acid methyl esters, five fatty acid ethyl esters, five fatty alcohols and three fatty alkanes produced by wild-type Synechocystis PCC 6803 and its genetically engineered strain. Data obtained from GC-MS analyses were quantified using internal standard peak area comparisons. The linearity, limit of detection (LOD) and precision (RSD) of the method were evaluated. The results demonstrated that fatty-acid-based biofuels can be determined directly by GC-MS without derivatization. Therefore, rapid and reliable quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria can be achieved using the GC-MS method established in this work.
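
    A sketch of the internal-standard arithmetic behind such quantification, with invented peak areas and a hypothetical relative response factor; the paper's calibrated values are not reproduced here:

    ```python
    def quantify(analyte_area: float, is_area: float,
                 is_amount_ug: float, rrf: float = 1.0) -> float:
        """Single-point internal-standard quantification as commonly used with
        GC-MS data: amount = (analyte area / IS area) * IS amount / RRF,
        where RRF is the relative response factor from calibration."""
        return (analyte_area / is_area) * is_amount_ug / rrf

    # Hypothetical peak areas for a free fatty acid against the internal standard.
    amount = quantify(analyte_area=5.2e6, is_area=4.0e6, is_amount_ug=10.0, rrf=1.1)
    print(f"{amount:.2f} ug")
    ```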

  2. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. Two worked examples, thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities, highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty, and their ability to incorporate external evidence may be useful for imprecisely estimated parameters.
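
    For a feel of the normal-approximation approach the abstract critiques, a minimal fixed-effect pooling of logit sensitivities with a 0.5 continuity correction, on invented counts; a binomial-likelihood model would instead fit the 2 × 2 counts directly (e.g., as a GLMM), which is omitted here:

    ```python
    import numpy as np

    def pool_logit_sensitivity(tp: np.ndarray, fn: np.ndarray):
        """Fixed-effect inverse-variance pooling of logit sensitivities under
        the normal approximation, with a 0.5 continuity correction for zero
        cells -- the approach whose bias toward 50% the abstract describes."""
        tp = tp + 0.5
        fn = fn + 0.5
        logit = np.log(tp / fn)
        var = 1.0 / tp + 1.0 / fn            # delta-method variance of the logit
        w = 1.0 / var
        pooled = np.sum(w * logit) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        inv = lambda z: 1.0 / (1.0 + np.exp(-z))
        return inv(pooled), (inv(pooled - 1.96 * se), inv(pooled + 1.96 * se))

    # Hypothetical per-study true-positive / false-negative counts.
    tp = np.array([45, 30, 12, 60])
    fn = np.array([5, 4, 3, 2])
    print(pool_logit_sensitivity(tp, fn))    # pooled sensitivity and 95% CI
    ```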

  3. Geometrically derived difference formulae for the numerical integration of trajectory problems

    NASA Technical Reports Server (NTRS)

    Mcleod, R. J. Y.; Sanz-Serna, J. M.

    1981-01-01

    The term 'trajectory problem' is taken to include problems that can arise, for instance, in connection with contour plotting, or in the application of continuation methods, or during phase-plane analysis. Geometrical techniques are used to construct difference methods for these problems to produce in turn explicit and implicit circularly exact formulae. Based on these formulae, a predictor-corrector method is derived which, when compared with a closely related standard method, shows improved performance. It is found that this latter method produces spurious limit cycles, and this behavior is partly analyzed. Finally, a simple variable-step algorithm is constructed and tested.
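
    As a generic illustration of the predictor-corrector idea (not the paper's circularly exact geometric formulae), Heun's method applied to a system whose exact trajectories are circles:

    ```python
    import numpy as np

    def f(state: np.ndarray) -> np.ndarray:
        """Autonomous system whose trajectories are circles: x' = -y, y' = x."""
        x, y = state
        return np.array([-y, x])

    def heun_step(state: np.ndarray, h: float) -> np.ndarray:
        """One predictor-corrector step: explicit Euler predicts,
        the trapezoidal rule corrects."""
        predictor = state + h * f(state)
        return state + 0.5 * h * (f(state) + f(predictor))

    state = np.array([1.0, 0.0])
    h = 0.01
    for _ in range(int(2 * np.pi / h)):   # integrate one full revolution
        state = heun_step(state, h)
    print(state, np.hypot(*state))        # near (1, 0); radius drift is the error
    ```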

  4. Lactase persistence genotyping on whole blood by loop-mediated isothermal amplification and melting curve analysis.

    PubMed

    Abildgaard, Anders; Tovbjerg, Sara K; Giltay, Axel; Detemmerman, Liselot; Nissen, Peter H

    2018-03-26

    The lactase persistence phenotype is controlled by a regulatory enhancer region upstream of the lactase (LCT) gene. In northern Europe, specifically the -13910C>T variant has been associated with lactase persistence, whereas other persistence variants, e.g. -13907C>G and -13915T>G, have been identified in Africa and the Middle East. The aim of the present study was to compare a previously developed high resolution melting (HRM) assay with a novel method based on loop-mediated isothermal amplification and melting curve analysis (LAMP-MC), with both whole blood and DNA as input material. To evaluate the LAMP-MC method, we used 100 whole blood samples and 93 DNA samples in a two-tiered study. First, we studied the ability of the LAMP-MC method to produce specific melting curves for several variants of the LCT enhancer region. Next, we performed a blinded comparison between the LAMP-MC method and our existing HRM method with clinical samples of unknown genotype. The LAMP-MC method produced specific melting curves for the variants at positions -13909, -13910 and -13913, whereas the -13907C>G and -13915T>G variants produced indistinguishable melting profiles. The LAMP-MC assay is a simple method for lactase persistence genotyping and compares well with our existing HRM method.
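
    A toy stand-in for melting-curve classification: matching a sample's normalized melt profile to the closest reference by correlation. The sigmoid profiles and melting temperatures are synthetic, not instrument data:

    ```python
    import numpy as np

    def match_genotype(curve: np.ndarray, references: dict[str, np.ndarray]) -> str:
        """Assign an unknown sample to the reference variant whose normalized
        melting curve it most closely resembles (highest Pearson correlation)."""
        z = (curve - curve.mean()) / curve.std()
        best, best_r = None, -np.inf
        for name, ref in references.items():
            rz = (ref - ref.mean()) / ref.std()
            r = float(np.mean(z * rz))           # Pearson correlation
            if r > best_r:
                best, best_r = name, r
        return best

    temps = np.linspace(60, 90, 300)
    refs = {                                     # synthetic sigmoidal melt profiles
        "-13910C": 1 / (1 + np.exp(temps - 74.0)),
        "-13910T": 1 / (1 + np.exp(temps - 76.5)),
    }
    sample = refs["-13910T"] + np.random.default_rng(5).normal(0, 0.01, temps.size)
    print(match_genotype(sample, refs))          # -13910T
    ```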

  5. Morphology and physical-chemical properties of celluloses obtained by different methods

    NASA Astrophysics Data System (ADS)

    Anpilova, A. Yu.; Mastalygina, E. E.; Mikhaylov, I. A.; Popov, A. A.; Kartasheva, Z. S.

    2017-12-01

    The morphology and structural characteristics of celluloses obtained by different methods were studied. The objects of the investigation were cellulose from a pulp source, commercial celluloses produced by sodium and acid hydrolysis, laboratory-produced cellulose from bleached birch kraft pulp, and cellulose obtained by thermooxidative catalytic treatment of maple leaves with peroxide. Based on a comprehensive analysis of their characteristics, several types of cellulose are proposed as modifying additives for polymers.

  6. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, Sivanandan S.; Brumfield, Brian E.; LaHaye, Nicole L.

    2018-04-20

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Finally, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  7. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    DOE PAGES

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.; ...

    2018-04-20

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Lastly, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  8. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Finally, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  9. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Lastly, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  10. Optical spectroscopy of laser-produced plasmas for standoff isotopic analysis

    DOE PAGES

    Harilal, S. S.; Brumfield, B. E.; LaHaye, N. L.; ...

    2018-06-01

    This review article covers the present status of isotope detection through emission, absorption, and fluorescence spectroscopy of atoms and molecules in a laser-produced plasma formed from a solid sample. A description of the physics behind isotope shifts in atoms and molecules is presented, followed by the physics behind solid sampling of laser ablation plumes, optical methods for isotope measurements, the suitable physical conditions of laser-produced plasma plumes for isotopic analysis, and the current status. Finally, concluding remarks will be made on the existing gaps between previous works in the literature and suggestions for future work.

  11. Focused ion beam source method and apparatus

    DOEpatents

    Pellin, Michael J.; Lykke, Keith R.; Lill, Thorsten B.

    2000-01-01

    A focused ion beam having a cross section of submicron diameter, a high ion current, and a narrow energy range is generated from a target comprised of particle source material by laser ablation. The method involves directing a laser beam having a cross section of critical diameter onto the target, producing a cloud of laser ablated particles having unique characteristics, and extracting and focusing a charged particle beam from the laser ablated cloud. The method is especially suited for producing focused ion beams for semiconductor device analysis and modification.

  12. Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.

    PubMed

    Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana

    2017-07-01

    Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction in retinal images is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel is extracted from the color retinal image and used to produce a Gabor feature image using the GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step in order to highlight blood vessels. Next, the two vessel-enhanced images are transformed to binary images using automatic thresholding before being combined to produce the final vessel output. Combining the images results in a significant improvement in blood vessel extraction performance compared to using either image alone. The effectiveness of the proposed method was demonstrated via comparative analysis with existing methods, validated on the publicly available DRIVE database.
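
    A compact sketch of the Gabor-plus-Otsu combination on a synthetic image, using scikit-image; the paper's vessel-enhancement step and DRIVE validation are omitted:

    ```python
    import numpy as np
    from skimage.filters import gabor, threshold_otsu

    # Synthetic grey-level image with a vessel-like bright ridge, standing in
    # for the green channel of a retinal photograph.
    img = np.zeros((128, 128))
    img[60:64, :] = 1.0
    img += np.random.default_rng(6).normal(0, 0.05, img.shape)

    real, imag = gabor(img, frequency=0.25, theta=np.pi / 2)  # Gabor feature image
    feature = np.hypot(real, imag)                            # magnitude response

    # Automatic (Otsu) thresholding of both images, then combine the masks.
    mask_raw = img > threshold_otsu(img)
    mask_gabor = feature > threshold_otsu(feature)
    vessels = mask_raw | mask_gabor
    print(vessels.sum(), "pixels flagged as vessel")
    ```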

  13. A Mechanical Power Flow Capability for the Finite Element Code NASTRAN

    DTIC Science & Technology

    1989-07-01

    ...experimental methods, statistical energy analysis, the finite element method, and a finite element analogy using heat conduction equations. Experimental... weights and inertias of the transducers attached to an experimental structure may produce accuracy problems. Statistical energy analysis (SEA) is a...

  14. Supercritical Water Oxidation Total Organic Carbon (TOC) Analysis

    EPA Pesticide Factsheets

    The work presented here is the evaluation of the modified wet‐oxidation method described as Supercritical Water Oxidation (SCWO) for the analysis of total organic carbon (TOC) in very difficult oil/gas produced water sample matrices.

  15. Review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Swickard, S. M.; Stang, D. B.; Deguire, M. R.

    1990-01-01

    A review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials is presented. Initially, a semi-empirical model is developed showing the origin of the linear relationship between ultrasonic velocity and porosity fraction. Then, from a compilation of data produced by many researchers, scatter plots of velocity versus percent porosity data are shown for Al2O3, MgO, porcelain-based ceramics, PZT, SiC, Si3N4, steel, tungsten, UO2,(U0.30Pu0.70)C, and YBa2Cu3O(7-x). Linear regression analysis produced predicted slope, intercept, correlation coefficient, level of significance, and confidence interval statistics for the data. Velocity values predicted from regression analysis for fully-dense materials are in good agreement with those calculated from elastic properties.
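
    A small sketch of the regression statistics listed above, using scipy.stats.linregress on invented velocity-porosity data:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical velocity (km/s) vs. porosity-fraction data for one material.
    porosity = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])
    velocity = np.array([10.8, 10.2, 9.7, 9.1, 8.6, 8.0])

    res = stats.linregress(porosity, velocity)
    print(f"slope = {res.slope:.2f} km/s per unit porosity")
    print(f"intercept (fully dense velocity) = {res.intercept:.2f} km/s")
    print(f"r = {res.rvalue:.3f}, p = {res.pvalue:.2e}, stderr = {res.stderr:.3f}")

    # 95% confidence interval for the slope (n - 2 degrees of freedom).
    tcrit = stats.t.ppf(0.975, df=len(porosity) - 2)
    lo, hi = res.slope - tcrit * res.stderr, res.slope + tcrit * res.stderr
    print(f"slope CI: {lo:.2f} .. {hi:.2f}")
    ```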

  16. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  17. Methods for collection and analysis of geopressured geothermal and oil field waters

    USGS Publications Warehouse

    Lico, Michael S.; Kharaka, Yousif K.; Carothers, William W.; Wright, Victoria A.

    1982-01-01

    Present methods are described for the collection, preservation, and chemical analysis of waters produced from geopressured geothermal and petroleum wells. Detailed procedures for collection include precautions and equipment necessary to ensure that the sample is representative of the water produced. Procedures for sample preservation include filtration, acidification, dilution for silica, methyl isobutyl ketone (MIBK) extraction of aluminum, addition of potassium permanganate to preserve mercury, and precipitation of carbonate species as strontium carbonate for stable carbon isotopes and total dissolved carbonate analysis. Characteristics determined at the well site are sulfide, pH, ammonia, and conductivity. Laboratory procedures are given for the analysis of lithium, sodium, potassium, rubidium, cesium, magnesium, calcium, strontium, barium, iron, manganese, zinc, lead, aluminum, and mercury by atomic absorption and flame emission spectroscopy. Chloride is determined by silver nitrate titration and fluoride by ion-specific electrode. Bromide and iodide concentrations are determined by the hypochlorite oxidation method. Sulfate is analyzed by titration using barium chloride with thorin indicator after pretreatment with alumina. Boron and silica are determined colorimetrically by the carmine and molybdate-blue methods, respectively. Aliphatic acid anions (C2 through C5) are determined by gas chromatography after separation and concentration in a chloroform-butanol mixture.

  18. Bearing faults identification and resonant band demodulation based on wavelet de-noising methods and envelope analysis

    NASA Astrophysics Data System (ADS)

    Abdelrhman, Ahmed M.; Sei Kien, Yong; Salman Leong, M.; Meng Hee, Lim; Al-Obaidi, Salah M. Ali

    2017-07-01

    The vibration signals produced by rotating machinery contain useful information for condition monitoring and fault diagnosis, but fault severity assessment is a challenging task. The Wavelet Transform (WT), as a multivariate analysis tool, is able to compromise between the time and frequency information in the signals and serves as a de-noising method. The CWT scaling function gives different resolutions to the discretized signals, such as very fine resolution at lower scales but coarser resolution at higher scales; however, the computational cost increases as it needs to produce different signal resolutions. The DWT has a lower computational cost, as the dilation function allows the signals to be decomposed through a tree of low- and high-pass filters without further analysing the high-frequency components. In this paper, a method for bearing fault identification is presented by combining the Continuous Wavelet Transform (CWT) and Discrete Wavelet Transform (DWT) with envelope analysis for bearing fault diagnosis. The experimental data were obtained from Case Western Reserve University. The analysis results showed that the proposed method is effective in detecting bearing faults, identifying the exact fault location, and assessing severity, especially for inner race and outer race faults.
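
    A minimal envelope analysis sketch (Hilbert envelope plus FFT) on a synthetic bearing signal; the wavelet de-noising stage is omitted and the fault frequency is an assumed value:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 12_000                                   # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    bpfo = 107.0                                  # assumed outer-race fault frequency, Hz

    # Synthetic bearing signal: a 3 kHz resonance amplitude-modulated at the
    # fault repetition rate, plus noise.
    carrier = np.sin(2 * np.pi * 3000 * t)
    modulation = 1 + 0.8 * np.sin(2 * np.pi * bpfo * t)
    signal = modulation * carrier + 0.3 * np.random.default_rng(7).standard_normal(t.size)

    envelope = np.abs(hilbert(signal))            # demodulate the resonant band
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
    print(f"dominant envelope line at {freqs[spectrum.argmax()]:.1f} Hz")  # ~107 Hz
    ```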

  19. Using IR spectroscopy to determine biodiesel conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tirla, Cornelia; Dooling, Thomas; Smith, Rachel B.

    This project’s aim was to develop a way to use infrared (IR) spectroscopy to determine the extent of the conversion of oil to biodiesel produced during a typical transesterification process. The most reliable method of biodiesel analysis is gas chromatography with flame ionization detector (GC-FID) analysis. This analysis is not ideal for educational or field applications, since it is time intensive and requires expensive, specialized equipment. The use of FT-IR is faster and more affordable. A calibration curve was created by comparing data on the purity of biodiesel samples obtained from the GC-FID analysis to the ratio of the absorbances at 1197 cm⁻¹ and 1166 cm⁻¹ from the FT-IR spectrum. For field application, a similar method was developed using a portable IR spectrometer, the InfraSpec™ VFA-IR spectrometer. As a result, the data collected gave a good linear fit for % purity of the samples and can be used as a method for monitoring the conversion of different types of oils to biodiesel and determining the purity of biodiesel produced from oil.

  20. Using IR spectroscopy to determine biodiesel conversion

    DOE PAGES

    Tirla, Cornelia; Dooling, Thomas; Smith, Rachel B.; ...

    2013-01-01

    This project’s aim was to develop a way to use infrared (IR) spectroscopy to determine the extent of the conversion of oil to biodiesel produced during a typical transesterification process. The most reliable method of biodiesel analysis is gas chromatography with flame ionization detector (GC-FID) analysis. This analysis is not ideal for educational or field applications, since it is time intensive and requires expensive, specialized equipment. The use of FT-IR is faster and more affordable. A calibration curve was created by comparing data on the purity of biodiesel samples obtained from the GC-FID analysis to the ratio of the absorbances at 1197 cm⁻¹ and 1166 cm⁻¹ from the FT-IR spectrum. For field application, a similar method was developed using a portable IR spectrometer, the InfraSpec™ VFA-IR spectrometer. As a result, the data collected gave a good linear fit for % purity of the samples and can be used as a method for monitoring the conversion of different types of oils to biodiesel and determining the purity of biodiesel produced from oil.
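
    A sketch of the calibration arithmetic described in both records, with invented absorbance ratios and GC-FID purities:

    ```python
    import numpy as np

    # Hypothetical calibration set: GC-FID purity (%) vs. the ratio of the
    # IR absorbances at 1197 cm^-1 and 1166 cm^-1 (values invented).
    ratio = np.array([0.42, 0.55, 0.68, 0.80, 0.91])
    purity = np.array([20.0, 40.0, 60.0, 80.0, 97.0])

    slope, intercept = np.polyfit(ratio, purity, deg=1)   # linear calibration

    def predict_purity(absorbance_ratio: float) -> float:
        """Estimate % conversion of a field sample from its IR absorbance ratio."""
        return slope * absorbance_ratio + intercept

    print(f"{predict_purity(0.75):.1f}% estimated biodiesel purity")
    ```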

  1. A Search for tt̄H (H → bb̄) at the Large Hadron Collider with the ATLAS detector using a matrix element method

    NASA Astrophysics Data System (ADS)

    Basye, Austin T.

    A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks, in the lepton-plus-jets channel is presented. Based on 20.3 fb⁻¹ of √s = 8 TeV data produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis uses multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b-quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal-rich regions are analyzed using the matrix element method. The resulting variables are then propagated to two parallel multivariate analyses based on Neural Networks and Boosted Decision Trees, respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is set at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is set at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data yield measured ratios of the signal cross-section to the Standard Model prediction of μ = 1.2 ± 1.3 (total) ± 0.7 (stat.) for the Neural Network analysis and μ = 2.9 ± 1.4 (total) ± 0.8 (stat.) for the Boosted Decision Tree analysis.

  2. Practical considerations for measuring hydrogen concentrations in groundwater

    USGS Publications Warehouse

    Chapelle, F.H.; Vroblesky, D.A.; Woodward, J.C.; Lovley, D.R.

    1997-01-01

    Several practical considerations for measuring concentrations of dissolved molecular hydrogen (H2) in groundwater, including (1) sampling methods, (2) pumping methods, and (3) effects of well casing materials, were evaluated. Three different sampling methodologies (a downhole sampler, a gas-stripping method, and a diffusion sampler) were compared. The downhole sampler and gas-stripping methods gave similar results when applied to the same wells. The diffusion sampler, on the other hand, appeared to overestimate H2 concentrations relative to the downhole sampler. Of these methods, the gas-stripping method is better suited to field conditions because it is faster (~30 min for a single analysis as opposed to 2 h for the downhole sampler or 8 h for the diffusion sampler), the analysis is easier (less sample manipulation is required), and the data computations are more straightforward (H2 concentrations need not be corrected for water sample volume). Measurement of H2 using the gas-stripping method can be affected by different pumping equipment. Peristaltic, piston, and bladder pumps all gave similar results when applied to water produced from the same well. It was observed, however, that peristaltic-pumped water (which is drawn under a negative pressure) enhanced the gas-stripping process and equilibrated slightly faster than water from piston or bladder pumps (which push water under a positive pressure). A direct current (dc) electrically driven submersible pump was observed to produce H2 and was not suitable for measuring H2 in groundwater. Measurements from two field sites indicate that iron or steel well casings produce H2, which masks H2 concentrations in groundwater. PVC-cased wells or wells cased with other materials that do not produce H2 are necessary for measuring H2 concentrations in groundwater.

  3. A meta-analysis of an implicit measure of personality functioning: the Mutuality of Autonomy Scale.

    PubMed

    Graceffo, Robert A; Mihura, Joni L; Meyer, Gregory J

    2014-01-01

    The Mutuality of Autonomy scale (MA) is a Rorschach variable designed to capture the degree to which individuals mentally represent self and other as mutually autonomous versus pathologically destructive (Urist, 1977). Discussions of the MA's validity found in articles and chapters usually claim good support, which we evaluated by a systematic review and meta-analysis of its construct validity. Overall, in a random effects analysis across 24 samples (N = 1,801) and 91 effect sizes, the MA scale was found to maintain a relationship of r = .20, 95% CI [.16, .25], with relevant validity criteria. We hypothesized that MA summary scores that aggregate more MA response-level data would maintain the strongest relationship with relevant validity criteria. Results supported this hypothesis (aggregated scoring method: r = .24, k = 57, S = 24; nonaggregated scoring methods: r = .15, k = 34, S = 10; p = .039, 2-tailed). Across 7 exploratory moderator analyses, only 1 (criterion method) produced significant results. Criteria derived from the Thematic Apperception Test produced smaller effects than clinician ratings, diagnostic differentiation, and self-attributed characteristics; criteria derived from observer reports produced smaller effects than clinician ratings and self-attributed characteristics. Implications of the study's findings are discussed in terms of both research and clinical work.

  4. Estimation of reactogenicity of preparations produced on the basis of photoinactivated live vaccines against brucellosis and tularaemia on the organismic level. 1. Using the LASCA method

    NASA Astrophysics Data System (ADS)

    Ulianova, O. V.; Uianov, S. S.; Li, Pengcheng; Luo, Qingming

    2011-04-01

    A new method of photoinactivation of bacteria aimed at producing prototypes of vaccine preparations against extremely dangerous infections is described. The reactogenicity of the new prophylactic preparations was studied using laser speckle contrast analysis (LASCA). The experimental studies performed show that bacterial suspensions irradiated using different photoinactivation regimes do not have a detrimental effect on blood microcirculation in laboratory animals.

  5. Estimation of reactogenicity of preparations produced on the basis of photoinactivated live vaccines against brucellosis and tularaemia on the organismic level. 1. Using the LASCA method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulianova, O V; Uianov, S S; Li Pengcheng

    2011-04-30

    A new method of photoinactivation of bacteria aimed at producing prototypes of vaccine preparations against extremely dangerous infections is described. The reactogenicity of the new prophylactic preparations was studied using laser speckle contrast analysis (LASCA). The experimental studies performed show that bacterial suspensions irradiated using different photoinactivation regimes do not have a detrimental effect on blood microcirculation in laboratory animals.

  6. Relating marten scat contents to prey consumed

    Treesearch

    William J. Zielinski

    1986-01-01

    A European ferret, Mustela putorius furo, was fed typical marten food items to discover the relationship between prey weight and number of scats produced per unit weight of prey. A correction factor was derived that was used in the analysis of pine marten, Martes americana, scats to produce a method capable of comparing foods on a...

  7. Projecting county pulpwood production with historical production and macro-economic variables

    Treesearch

    Consuelo Brandeis; Dayton M. Lambert

    2014-01-01

    We explored forecasting of county roundwood pulpwood production with county-vector autoregressive (CVAR) and spatial panel vector autoregressive (SPVAR) methods. The analysis used timber products output data for the state of Florida, together with a set of macro-economic variables. Overall, we found the SPVAR specification produced forecasts with lower error rates...

  8. The technical analysis of the stock exchange and physics: Japanese candlesticks for solar activity

    NASA Astrophysics Data System (ADS)

    Dineva, C.; Atanasov, V.

    2013-09-01

    In this article, we take the Japanese candlesticks, a method popular in the technical analysis of the stock and Forex markets, and apply it to a variable in physics: the solar activity. The method was invented for, and has been used exclusively in, economic analysis, and its application to a physical problem produced unexpected results. We found that Japanese candlesticks are a convenient tool for analysing variables in the physics of the Sun. Based on our observations, we differentiated a new cycle in the solar activity.
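
    For readers unfamiliar with the technique, a candlestick summarizes each interval of a series by its open, high, low, and close values. A minimal sketch, assuming a daily activity series such as the sunspot number (synthetic here), of how such candles can be built with pandas:

    ```python
    import numpy as np
    import pandas as pd

    # Synthetic stand-in for a daily solar-activity series (e.g., sunspot number).
    rng = pd.date_range("2000-01-01", periods=3 * 365, freq="D")
    noise = np.random.default_rng(0).normal(0, 1, len(rng)).cumsum()
    activity = pd.Series(
        80 + 40 * np.sin(np.arange(len(rng)) * 2 * np.pi / (11 * 365)) + noise,
        index=rng,
    )

    # One candlestick per month: open, high, low, close of the daily values.
    candles = activity.resample("MS").ohlc()

    # A "bullish" candle (close > open) marks a month of rising activity;
    # runs of same-colour candles are one kind of feature used to pick out cycles.
    candles["bullish"] = candles["close"] > candles["open"]
    print(candles.head())
    ```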

  9. 16th IHIW: Global analysis of registry HLA haplotypes from 20 Million individuals: Report from the IHIW Registry Diversity Group

    PubMed Central

    Maiers, M; Gragert, L; Madbouly, A; Steiner, D; Marsh, S G E; Gourraud, P-A; Oudshoorn, M; Zanden, H; Schmidt, A H; Pingel, J; Hofmann, J; Müller, C; Eberhard, H-P

    2013-01-01

    This project has the goal of validating bioinformatics methods and tools for HLA haplotype frequency analysis, specifically addressing the unique issues of haematopoietic stem cell registry data sets. In addition to generating new methods and tools for the analysis of registry data sets, the intent is to produce a comprehensive analysis of HLA data from 20 million donors in the Bone Marrow Donors Worldwide (BMDW) database. This report summarizes activity on this project as of the 16th IHIW meeting in Liverpool. PMID:23280139

  10. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports and compares the method of analysis proposed therein to the methods used today.

  11. Validity and reliability of bioelectrical impedance analysis and skinfold thickness in predicting body fat in military personnel.

    PubMed

    Aandstad, Anders; Holtberget, Kristian; Hageberg, Rune; Holme, Ingar; Anderssen, Sigmund A

    2014-02-01

    Previous studies show that body composition is related to injury risk and physical performance in soldiers. Thus, valid methods for measuring body composition in military personnel are needed. The frequently used body mass index method is not a valid measure of body composition in soldiers, but the reliability and validity of alternative field methods are less investigated in military personnel. Thus, we carried out test and retest of skinfold (SKF), single-frequency bioelectrical impedance analysis (SF-BIA), and multifrequency bioelectrical impedance analysis measurements in 65 male and female soldiers. Several validated equations were used to predict percent body fat from these methods. Dual-energy X-ray absorptiometry was also measured and acted as the criterion method. Results showed that SF-BIA was the most reliable method in both genders. In women, SF-BIA was also the most valid method, whereas SKF or a combination of SKF and SF-BIA produced the highest validity in men. Reliability and validity varied substantially among the equations examined. The best methods and equations produced test-retest 95% limits of agreement below ±1 percentage point, whereas the corresponding validity figures were ±3.5 percentage points. Each investigator and practitioner must consider whether such measurement errors are acceptable for their specific use. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  12. Convergence analysis of a monotonic penalty method for American option pricing

    NASA Astrophysics Data System (ADS)

    Zhang, Kai; Yang, Xiaoqi; Teo, Kok Lay

    2008-12-01

    This paper is devoted to the convergence analysis of a monotonic penalty method for pricing American options. A monotonic penalty method is first proposed to solve the complementarity problem arising from the valuation of American options, which produces a nonlinear degenerate parabolic PDE with the Black-Scholes operator. Based on variational theory, the solvability and convergence properties of this penalty approach are established in a proper infinite-dimensional space. Moreover, the convergence rate of the combination of two power penalty functions is obtained.

  13. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
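
    The core trick can be illustrated with a control-variate sketch: the first-order Taylor term built from the cheap gradient has a known mean of zero, so subtracting a scaled copy of it reduces the variance of the Monte Carlo estimate without biasing it. The test function and distributions below are illustrative, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.array([1.0, 2.0])
    sigma = np.array([0.1, 0.2])

    def f(x):                    # stand-in for an expensive analysis code
        return x[..., 0] ** 2 * np.exp(0.5 * x[..., 1])

    # Gradient of f at the mean -- assumed available cheaply from the code.
    grad = np.array([2 * mu[0] * np.exp(0.5 * mu[1]),
                     0.5 * mu[0] ** 2 * np.exp(0.5 * mu[1])])

    X = mu + sigma * rng.standard_normal((20_000, 2))
    fX = f(X)
    g = (X - mu) @ grad          # control variate with E[g] = 0 exactly

    beta = np.cov(fX, g)[0, 1] / np.var(g)       # optimal coefficient
    print(fX.mean(), (fX - beta * g).mean())     # plain vs controlled estimate
    print(np.var(fX) / np.var(fX - beta * g))    # variance-reduction factor
    ```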

  14. Event by event analysis and entropy of multiparticle systems

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.

    2000-04-01

    The coincidence method of measuring the entropy of a system, proposed some time ago by Ma, is generalized to include systems out of equilibrium. It is suggested that the method can be adapted to analyze multiparticle states produced in high-energy collisions.
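
    Ma's original recipe is simple enough to sketch: sample configurations, count coincident pairs, and take the entropy as the log of (total pairs)/(coincident pairs). For a nonuniform ensemble this estimator tends toward the order-2 Rényi entropy rather than the Gibbs entropy, which is one motivation for generalizations of the kind the paper discusses. The synthetic equiprobable system below allows a check against the exact answer.

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(2)
    n_states = 50
    events = rng.integers(0, n_states, size=400)   # equiprobable "states"

    pairs = coincidences = 0
    for a, b in combinations(events, 2):
        pairs += 1
        coincidences += int(a == b)

    S_ma = np.log(pairs / coincidences)
    print(S_ma, np.log(n_states))   # the two agree for an equilibrium ensemble
    ```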

  15. Rapid prediction of ochratoxin A-producing strains of Penicillium on dry-cured meat by MOS-based electronic nose.

    PubMed

    Lippolis, Vincenzo; Ferrara, Massimo; Cervellieri, Salvatore; Damascelli, Anna; Epifani, Filomena; Pascale, Michelangelo; Perrone, Giancarlo

    2016-02-02

    The availability of rapid diagnostic methods for monitoring ochratoxigenic species during the seasoning processes for dry-cured meats is crucial and constitutes a key stage in preventing the risk of ochratoxin A (OTA) contamination. A rapid, easy-to-perform and non-invasive method using an electronic nose (e-nose) based on metal oxide semiconductors (MOS) was developed to discriminate dry-cured meat samples into two classes based on fungal contamination: class P (samples contaminated by OTA-producing Penicillium strains) and class NP (samples contaminated by OTA non-producing Penicillium strains). Two OTA-producing strains of Penicillium nordicum and two OTA non-producing strains of Penicillium nalgiovense and Penicillium salamii were tested. The feasibility of this approach was initially evaluated by e-nose analysis of 480 samples of both yeast extract sucrose (YES) and meat-based agar media inoculated with the tested Penicillium strains and incubated for up to 14 days. The high recognition percentages (higher than 82%) obtained by Discriminant Function Analysis (DFA), both in calibration and in cross-validation (leave-more-out approach), for both YES and meat-based samples demonstrated the validity of the approach. The e-nose method was subsequently developed and validated for the analysis of dry-cured meat samples. A total of 240 e-nose analyses were carried out using inoculated sausages, seasoned by a laboratory-scale process and sampled at 5, 7, 10 and 14 days. DFA provided calibration models that permitted discrimination of dry-cured meat samples after only 5 days of seasoning, with mean recognition percentages in calibration and cross-validation of 98 and 88%, respectively. A further validation of the developed e-nose method was performed using 60 dry-cured meat samples produced by an industrial-scale seasoning process, showing a total recognition percentage of 73%. The pattern of volatile compounds of dry-cured meat samples was identified and characterized by a developed HS-SPME/GC-MS method. Seven volatile compounds (2-methyl-1-butanol, octane, 1R-α-pinene, d-limonene, undecane, tetradecanal, 9-(Z)-octadecenoic acid methyl ester) allowed discrimination between dry-cured meat samples of classes P and NP. These results demonstrate that the MOS-based electronic nose can be a useful tool for rapid screening to prevent OTA contamination in the cured meat supply chain. Copyright © 2015 Elsevier B.V. All rights reserved.
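
    As a rough illustration of the chemometric step, the sketch below runs a linear discriminant analysis (a common implementation of DFA) with repeated random hold-outs standing in for the leave-more-out cross-validation; the 10-sensor response matrix and class separation are fabricated, not the study's data.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import ShuffleSplit, cross_val_score

    rng = np.random.default_rng(3)
    X_P = rng.normal(1.0, 0.4, size=(120, 10))    # OTA-producing class
    X_NP = rng.normal(0.4, 0.4, size=(120, 10))   # non-producing class
    X = np.vstack([X_P, X_NP])
    y = np.array([1] * 120 + [0] * 120)

    # Repeated random hold-outs approximate the "leave-more-out" validation.
    cv = ShuffleSplit(n_splits=50, test_size=0.25, random_state=0)
    scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=cv)
    print(f"cross-validated recognition: {100 * scores.mean():.1f}%")
    ```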

  16. Demodulation circuit for AC motor current spectral analysis

    DOEpatents

    Hendrix, Donald E.; Smith, Stephen F.

    1990-12-18

    A motor current analysis method for the remote, noninvasive inspection of electric motor-operated systems. Synchronous amplitude demodulation and phase demodulation circuits are used singly and in combination along with a frequency analyzer to produce improved spectral analysis of load-induced frequencies present in the electric current flowing in a motor-driven system.
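
    A software analogue of the synchronous amplitude demodulation described in the patent can be sketched as follows: mix the measured motor current with a reference at the line frequency, low-pass filter, and inspect the spectrum of the recovered envelope. All signal parameters below are invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs, line = 10_000, 60.0                 # sample rate and line frequency (Hz)
    t = np.arange(0, 5, 1 / fs)
    load_mod = 0.05 * np.sin(2 * np.pi * 7.3 * t)    # 7.3 Hz load-induced wobble
    i_motor = (1 + load_mod) * np.sin(2 * np.pi * line * t)

    # Synchronous demodulation: mix with the carrier, keep only the baseband.
    mixed = i_motor * np.sin(2 * np.pi * line * t)
    b, a = butter(4, 30 / (fs / 2))          # 4th-order 30 Hz low-pass
    envelope = filtfilt(b, a, mixed)

    spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
    print(freqs[spec.argmax()])              # ~7.3 Hz: the load-induced line
    ```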

  17. Explanation production by expert planners

    NASA Technical Reports Server (NTRS)

    Bridges, Susan; Jhannes, James D.

    1988-01-01

    Although the explanation capability of expert systems is usually listed as one of the distinguishing characteristics of these systems, the explanation facilities of most existing systems are quite primitive. Computer-generated explanations are typically produced from canned text or by direct translation of the knowledge structures. Explanations produced in this manner bear little resemblance to those produced by humans for similar tasks. The focus of our research in explanation is the production of justifications for decisions by expert planning systems. An analysis of justifications written by people for planning tasks has been taken as the starting point. The purpose of this analysis is twofold. First, analysis of the information content of the justifications will provide a basis for deciding what knowledge must be represented if human-like justifications are to be produced. Second, an analysis of the textual organization of the justifications will be used in the development of a mechanism for selecting and organizing the knowledge to be included in a computer-produced explanation. This paper describes a preliminary analysis of justifications written by people for a planning task. It is clear that these justifications differ significantly from those that would be produced by an expert system by tracing the firing of production rules. The results from the text analysis have been used to develop an augmented phrase-structure grammar (APSG) describing the organization of the justifications. The grammar was designed to provide a computationally feasible method for determining textual organization that will allow the necessary information to be communicated in a cohesive manner.

  18. Optical properties of thin fibrous PVP/SiO2 composite mats prepared via the sol-gel and electrospinning methods

    NASA Astrophysics Data System (ADS)

    Tański, Tomasz; Matysiak, Wiktor; Krzemiński, Łukasz; Jarka, Paweł; Gołombek, Klaudiusz

    2017-12-01

    The aim of the research was to create thin, nanofibrous composite mats with a polyvinylpyrrolidone (PVP) matrix, with the reinforcing phase in the form of silicon oxide (SiO2) nanoparticles. SiO2 nanopowder was obtained using the sol-gel method with a mixture of tetraethyl orthosilicate (TEOS, Si(OC2H5)4), hydrochloric acid (HCl), ethanol (C2H5OH) and distilled water. The produced colloidal suspension was subjected to drying and calcination at 550 °C, resulting in an amorphous silica nanopowder with an average particle diameter of 20 nm. The morphology and structure of the manufactured SiO2 nanoparticles were examined using transmission electron microscopy (TEM) and X-ray diffraction analysis (XRD). Then, using the electrospinning method with a 15% (by weight) solution of PVP in ethanol and a 15% PVP/EtOH solution containing the produced nanoparticles at 5% mass concentration relative to the polymer matrix, polymer PVP nanofibres and PVP/SiO2 composite nanofibres were produced. The morphology and chemical composition of the produced polymer and composite nanofibres were examined using a scanning electron microscope (SEM) with an energy dispersive spectrometer (EDS). The analysis of the impact of the reinforcing phase on the absorption of electromagnetic radiation was conducted on the basis of UV-vis spectra, from which the band gap values of the produced thin fibrous mats were estimated.

  19. Screening of the key volatile organic compounds of Tuber melanosporum fermentation by aroma sensory evaluation combination with principle component analysis

    PubMed Central

    Liu, Rui-Sang; Jin, Guang-Huai; Xiao, Deng-Rong; Li, Hong-Mei; Bai, Feng-Wu; Tang, Ya-Jie

    2015-01-01

    Aroma results from the interplay of volatile organic compounds (VOCs), and the attributes of microbially produced aromas are significantly affected by fermentation conditions. Among the VOCs, only a few contribute to aroma. Thus, screening and identification of the key VOCs is critical for microbial aroma production. The traditional method is based on gas chromatography-olfactometry (GC-O), which is time-consuming and laborious. Taking the Tuber melanosporum fermentation system as an example, a new method to screen and identify the key VOCs by combining an aroma evaluation method with principal component analysis (PCA) was developed in this work. First, an aroma sensory evaluation method was developed to screen 34 potential favorite aroma samples from 504 fermentation samples. Second, PCA was employed to screen nine common key VOCs from these 34 samples. Third, seven key VOCs were identified by the traditional method. Finally, all seven key VOCs identified by the traditional method were also identified, along with four others, by the new strategy. These results indicate the reliability of the new method and demonstrate it to be a viable alternative to the traditional method. PMID:26655663
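
    The PCA screening step can be sketched generically: decompose the samples-by-VOC abundance matrix and rank compounds by their loadings on the leading components. The matrix below is synthetic, and the top-9 cut-off simply mirrors the nine common key VOCs reported; this is not the authors' pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    n_samples, n_vocs = 34, 60            # 34 shortlisted fermentation samples
    X = rng.lognormal(mean=1.0, sigma=0.5, size=(n_samples, n_vocs))

    pca = PCA(n_components=2)
    pca.fit(StandardScaler().fit_transform(X))

    # Rank each VOC by its total loading magnitude on the first two PCs.
    loading_score = np.abs(pca.components_).sum(axis=0)
    key_vocs = np.argsort(loading_score)[::-1][:9]    # keep a top-9 shortlist
    print("candidate key VOC columns:", key_vocs)
    ```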

  20. Translations on Eastern Europe, Scientific Affairs, No. 562

    DTIC Science & Technology

    1977-10-28

    remodeling and modernization of the institute's facilities resulted in an increase in the reactor's neutron flux and power output capacity and ... research technique involving the use of the experimental reactor is neutron activation analysis. Using this method it is possible to produce ... artificial radioactivity through the bombardment of non-active substances with neutrons. This is one of the most sensitive methods of chemical analysis

  1. Results of the Round Robin on opening-load measurement conducted by ASTM Task Group E24.04.04 on Crack Closure Measurement and Analysis

    NASA Technical Reports Server (NTRS)

    Phillips, Edward P.

    1989-01-01

    An experimental Round Robin on the measurement of the opening load in fatigue crack growth tests was conducted by ASTM Task Group E24.04.04 on Crack Closure Measurement and Analysis. The Round Robin evaluated the current level of consistency of opening-load measurements among laboratories and sought to identify causes for the observed inconsistency. Eleven laboratories participated in the testing of compact and middle-crack specimens. Opening-load measurements were made for crack growth at two stress-intensity factor levels, three crack lengths, and following an overload. All opening-load measurements were based on the analysis of specimen compliance data. When all of the results reported (from all participants, all measurement methods, and all data analysis methods) for a given test condition were pooled, the range of opening loads was very large--typically spanning the lower half of the fatigue loading cycle. Part of the large scatter in the reported opening-load results was ascribed to consistent differences in results produced by the various methods used to measure specimen compliance and to evaluate the opening load from the compliance data. Another significant portion of the scatter was ascribed to lab-to-lab differences in producing the compliance data when using nominally the same method of measurement.

  2. Counting pollen grains using readily available, free image processing and analysis software.

    PubMed

    Costa, Clayton M; Yang, Suann

    2009-10-01

    Although many methods exist for quantifying the number of pollen grains in a sample, there are few standard methods that are user-friendly, inexpensive and reliable. The present contribution describes a new method of counting pollen using readily available, free image processing and analysis software. Pollen was collected from anthers of two species, Carduus acanthoides and C. nutans (Asteraceae), then illuminated on slides and digitally photographed through a stereomicroscope. Using ImageJ (NIH), these digital images were processed to remove noise and sharpen individual pollen grains, then analysed to obtain a reliable total count of the number of grains present in the image. A macro was developed to analyse multiple images together. To assess the accuracy and consistency of pollen counting by ImageJ analysis, counts were compared with those made by the human eye. Image analysis produced pollen counts in 60 s or less per image, considerably faster than counting with the human eye (5-68 min). In addition, counts produced with the ImageJ procedure were similar to those obtained by eye. Because count parameters are adjustable, this image analysis protocol may be used for many other plant species. Thus, the method provides a quick, inexpensive and reliable solution to counting pollen from digital images, not only reducing the chance of error but also substantially lowering labour requirements.
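
    The published workflow is an ImageJ macro; the sketch below reproduces the same three steps (denoise, threshold, count particles) in Python with scikit-image so they are explicit, using a synthetic image of bright grains in place of a photomicrograph. It is an analogue of the approach, not the authors' macro.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import filters, morphology

    rng = np.random.default_rng(5)
    img = np.zeros((512, 512))
    rr, cc = np.ogrid[:512, :512]
    for _ in range(80):                       # draw 80 fake pollen grains
        r, c = rng.integers(20, 492, size=2)
        img[(rr - r) ** 2 + (cc - c) ** 2 < 36] = 1.0
    img += rng.normal(0, 0.1, img.shape)      # camera noise

    smooth = filters.gaussian(img, sigma=1)            # "remove noise"
    binary = smooth > filters.threshold_otsu(smooth)   # global threshold
    binary = morphology.remove_small_objects(binary, min_size=20)
    _, count = ndi.label(binary)
    print("grains counted:", count)   # close to 80 unless grains overlap
    ```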

  3. Determination of acrylamide in brewed coffee and coffee powder using polymeric ionic liquid-based sorbent coatings in solid-phase microextraction coupled to gas chromatography-mass spectrometry.

    PubMed

    Cagliero, Cecilia; Ho, Tien D; Zhang, Cheng; Bicchi, Carlo; Anderson, Jared L

    2016-06-03

    This study describes a simple and rapid sampling method employing a polymeric ionic liquid (PIL) sorbent coating in direct-immersion solid-phase microextraction (SPME) for the trace-level analysis of acrylamide in brewed coffee and coffee powder. The crosslinked PIL sorbent coating demonstrated superior sensitivity in the extraction of acrylamide compared to all commercially available SPME coatings. A spin-coating method was developed to evenly distribute the PIL coating on the SPME support and reproducibly produce fibers with a large film thickness. Ninhydrin was employed as a quenching reagent during extraction to inhibit the formation of interfering acrylamide. The PIL fiber produced a limit of quantitation for acrylamide of 10 μg L(-1) and achieved results comparable to the ISO method in the analysis of six coffee powder samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Representativeness of direct observations selected using a work-sampling equation.

    PubMed

    Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas

    2015-01-01

    Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.

  5. Comparison Through Image Analysis Between Al Foams Produced Using Two Different Methods

    NASA Astrophysics Data System (ADS)

    Boschetto, A.; Campana, F.; Pilone, D.

    2014-02-01

    Several methods are available for making metal foams. They make it possible to tailor the mechanical, thermal, acoustic, and electrical properties of a foam for specific applications by varying the relative density as well as the cell size and morphology. Foams have a very heterogeneous structure, so their properties may show large scatter. In this paper, an aluminum foam produced by foaming of powder compacts and another prepared via the infiltration process were analyzed and compared. Image analysis was used as a tool to determine the size, morphology, and distribution of cells in both foams and to correlate cell morphology with the manufacturing process considered. The results highlighted that cell size and morphology are strictly dependent upon the manufacturing method. This paper shows how some standard 2D morphological indicators may be usefully adopted to characterize foams whose structure derives from the specific manufacturing process.

  6. Time-Series Analysis: Assessing the Effects of Multiple Educational Interventions in a Small-Enrollment Course

    NASA Astrophysics Data System (ADS)

    Warren, Aaron R.

    2009-11-01

    Time-series designs are an alternative to pretest-posttest methods that are able to identify and measure the impacts of multiple educational interventions, even for small student populations. Here, we use an instrument employing standard multiple-choice conceptual questions to collect data from students at regular intervals. The questions are modified by asking students to distribute 100 Confidence Points among the options in order to indicate the perceived likelihood of each answer option being the correct one. Tracking the class-averaged ratings for each option produces a set of time-series. ARIMA (autoregressive integrated moving average) analysis is then used to test for, and measure, changes in each series. In particular, it is possible to discern which educational interventions produce significant changes in class performance. Cluster analysis can also identify groups of students whose ratings evolve in similar ways. A brief overview of our methods and an example are presented.
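
    One way to realize the intervention test is an ARIMA model with a step regressor that switches on at the intervention being evaluated; a significant step coefficient then flags an effective intervention. A minimal sketch with statsmodels and a synthetic confidence-rating series (the week-15 intervention date and all values are invented):

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    n = 30                                       # 30 weekly administrations
    step = (np.arange(n) >= 15).astype(float)    # intervention at week 15
    rating = pd.Series(40 + 10 * step + 0.3 * rng.normal(0, 3, n).cumsum())

    exog = pd.DataFrame({"intervention": step})
    fit = ARIMA(rating, exog=exog, order=(1, 0, 0)).fit()

    # A significant step coefficient indicates the intervention shifted the
    # class-averaged confidence rating.
    print(fit.params["intervention"], fit.pvalues["intervention"])
    ```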

  7. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach that uses AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, represent more precisely the situations of nonconformity with geodata quality that may occur during execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.

  8. Interferometric Laser Scanner for Direction Determination

    PubMed Central

    Kaloshin, Gennady; Lukin, Igor

    2016-01-01

    In this paper, we explore the potential capabilities of a new laser scanning-based method for direction determination. The method for fully coherent beams is extended to the case where the interference pattern is produced in the turbulent atmosphere by two partially coherent sources. The theoretical analysis performed identified the conditions under which a stable pattern may form on extended paths of 0.5–10 km in length. We describe a method for selecting laser scanner parameters that ensures the necessary operating range in the atmosphere for any possible turbulence characteristics. The method is based on analysis of the mean intensity of the interference pattern formed by two partially coherent sources of optical radiation. Visibility of the interference pattern is estimated as a function of propagation path length, the structure parameter of atmospheric turbulence, and the spacing of the radiation sources producing the interference pattern. It is shown that, when atmospheric turbulence is moderately strong, the contrast of the interference pattern of the laser scanner may ensure its applicability at ranges up to 10 km. PMID:26805841

  9. Interferometric Laser Scanner for Direction Determination.

    PubMed

    Kaloshin, Gennady; Lukin, Igor

    2016-01-21

    In this paper, we explore the potential capabilities of a new laser scanning-based method for direction determination. The method for fully coherent beams is extended to the case where the interference pattern is produced in the turbulent atmosphere by two partially coherent sources. The theoretical analysis performed identified the conditions under which a stable pattern may form on extended paths of 0.5-10 km in length. We describe a method for selecting laser scanner parameters that ensures the necessary operating range in the atmosphere for any possible turbulence characteristics. The method is based on analysis of the mean intensity of the interference pattern formed by two partially coherent sources of optical radiation. Visibility of the interference pattern is estimated as a function of propagation path length, the structure parameter of atmospheric turbulence, and the spacing of the radiation sources producing the interference pattern. It is shown that, when atmospheric turbulence is moderately strong, the contrast of the interference pattern of the laser scanner may ensure its applicability at ranges up to 10 km.

  10. Candida Species From Eye Infections: Drug Susceptibility, Virulence Factors, and Molecular Characterization.

    PubMed

    Ranjith, Konduri; Sontam, Bhavani; Sharma, Savitri; Joseph, Joveeta; Chathoth, Kanchana N; Sama, Kalyana C; Murthy, Somasheila I; Shivaji, Sisinthy

    2017-08-01

    To determine the types of Candida species in ocular infections and to investigate the relationship of antifungal susceptibility profiles to virulence factors. Fifty isolates of yeast-like fungi from patients with keratitis, endophthalmitis, and orbital cellulitis were identified by the Vitek-2 compact system and by DNA sequencing of the ITS1-5.8S-ITS2 regions of the rRNA gene, followed by phylogenetic analysis, for phenotypic and genotypic identification, respectively. Minimum inhibitory concentrations of six antifungal drugs were determined by E-test/microbroth dilution methods. Phenotypic and genotypic methods were used to determine the virulence factors. Phylogenetic analysis showed the clustering of all isolates into eight distinct groups, with the major cluster formed by Candida parapsilosis (n = 21), which was the most common species by both Vitek 2 and DNA sequencing. Using the χ2 test, no significant difference was noted between the techniques, except that Vitek 2 did not identify C. viswanathii, C. orthopsilosis, and two non-Candida genera. Of the 43 tested Candida isolates, high susceptibility to amphotericin B (39/43, 90.6%) and natamycin (43/43, 100%) was noted. While none of the isolates produced coagulase, all produced esterase and catalase. The potential to form biofilm was detected in 23/43 (53.4%) isolates. Distribution of virulence factors by heat-map analysis showed a difference in the metabolic activity of biofilm producers from non-biofilm producers. C. parapsilosis, identified by both Vitek 2 and DNA sequencing, was the most common species associated with eye infections. Irrespective of the virulence factors elaborated, the Candida isolates were susceptible to commonly used antifungal drugs such as amphotericin B and natamycin.

  11. Research on Health Inequalities in Latin America and the Caribbean: Bibliometric Analysis (1971–2000) and Descriptive Content Analysis (1971–1995)

    PubMed Central

    Almeida-Filho, Naomar; Kawachi, Ichiro; Filho, Alberto Pellegrini; Dachs, J. Norberto W.

    2003-01-01

    We conducted a bibliometric and content analysis of research on health inequalities produced in Latin American and Caribbean countries. In our bibliometric analysis (n = 576), we used indexed material published between 1971 and 2000. The content analysis (n = 269) covered the period 1971 to 1995 and included unpublished material. We found recent rapid growth in overall output. Brazil, Chile, and Mexico contributed mostly empirical research, while Ecuador and Argentina produced more conceptual studies. We found, in the literature reviewed, a relative neglect of gender, race, and ethnicity issues. We also found remarkable diversity in research designs, however, along with strong consideration of ecological and ethnographic methods absent in other research traditions. PMID:14652329

  12. Controller design via structural reduced modeling by FETM

    NASA Technical Reports Server (NTRS)

    Yousuff, A.

    1986-01-01

    The Finite Element-Transfer Matrix (FETM) method was developed to reduce the computations involved in the analysis of structures. This widely accepted method, however, has certain limitations and does not directly produce reduced models for control design. To overcome these shortcomings, a modification of the FETM method has been developed. The modified FETM method easily produces reduced models that are tailored toward subsequent control design. Other features of this method are its ability to: (1) extract open-loop frequencies and mode shapes with fewer computations, (2) overcome limitations of the original FETM method, and (3) simplify the procedures for output feedback, constrained compensation, and decentralized control. This semiannual report presents the development of the modified FETM method and, through an example, illustrates its applicability to output feedback and decentralized control design.

  13. Comparative Modeling of Proteins: A Method for Engaging Students' Interest in Bioinformatics Tools

    ERIC Educational Resources Information Center

    Badotti, Fernanda; Barbosa, Alan Sales; Reis, André Luiz Martins; do Valle, Ítalo Faria; Ambrósio, Lara; Bitar, Mainá

    2014-01-01

    The huge increase in data being produced in the genomic era has produced a need to incorporate computers into the research process. Sequence generation, its subsequent storage, interpretation, and analysis are now entirely computer-dependent tasks. Universities from all over the world have been challenged to seek a way of encouraging students to…

  14. Social Network Analysis: A New Methodology for Counseling Research.

    ERIC Educational Resources Information Center

    Koehly, Laura M.; Shivy, Victoria A.

    1998-01-01

    Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…

  15. A comparative study on generating simulated Landsat NDVI images using data fusion and regression method-the case of the Korean Peninsula.

    PubMed

    Lee, Mi Hee; Lee, Soo Bong; Eo, Yang Dam; Kim, Sun Woong; Woo, Jung-Hun; Han, Soo Hee

    2017-07-01

    Landsat optical images have enough spatial and spectral resolution to analyze vegetation growth characteristics. However, clouds and water vapor quite often degrade image quality, which limits the availability of usable images for time-series vegetation vitality measurement. To overcome this shortcoming, simulated images are used as an alternative. In this study, the weighted average method, the spatial and temporal adaptive reflectance fusion model (STARFM) method, and the multilinear regression analysis method were tested to produce simulated Landsat normalized difference vegetation index (NDVI) images of the Korean Peninsula. The test results showed that the weighted average method produced the images most similar to the actual images, provided that images were available within 1 month before and after the target date. The STARFM method gives good results when the input image date is close to the target date. Careful regional and seasonal consideration is required in selecting input images. During the summer season, due to clouds, it is very difficult to get images close enough to the target date. Multilinear regression analysis gives meaningful results even when the input image date is not so close to the target date. Average R2 values for the weighted average method, STARFM, and multilinear regression analysis were 0.741, 0.70, and 0.61, respectively.
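
    A minimal sketch of the best-performing approach, assuming (the abstract does not spell this out) that the weights are inversely proportional to the temporal distance of the two nearest usable scenes:

    ```python
    import numpy as np

    def simulate_ndvi(ndvi_before, days_before, ndvi_after, days_after):
        """Weighted average of the two nearest scenes, weights = 1/day offset."""
        w_b, w_a = 1.0 / days_before, 1.0 / days_after
        return (w_b * ndvi_before + w_a * ndvi_after) / (w_b + w_a)

    before = np.full((3, 3), 0.55)    # scene acquired 10 days before the target
    after = np.full((3, 3), 0.70)     # scene acquired 20 days after the target
    print(simulate_ndvi(before, 10, after, 20))   # pulled toward the nearer scene
    ```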

  16. Impact desolvation of electrosprayed microdroplets--a new ionization method for mass spectrometry of large biomolecules.

    PubMed

    Aksyonov, S A; Williams, P

    2001-01-01

    Impact desolvation of electrosprayed microdroplets (IDEM) is a new method for producing gas-phase ions of large biomolecules. Analytes are dissolved in an electrolyte solution which is electrosprayed in vacuum, producing highly charged micron- and sub-micron-sized droplets (microdroplets). These microdroplets are accelerated through potential differences of approximately 5-10 kV to velocities of several km/s and allowed to impact a target surface. The energetic impacts vaporize the droplets and release desolvated gas-phase ions of the analyte molecules. Oligonucleotides (2- to 12-mer) and peptides (bradykinin, neurotensin) yield singly and doubly charged molecular ions with no detectable fragmentation. Because the extent of multiple charging is significantly less than in atmospheric pressure electrospray ionization, and the method produces ions largely free of adducts from solutions of high ionic strength, IDEM has some promise as a method for coupling to liquid chromatographic techniques and for mixture analysis. Ions are produced in vacuum at a flat equipotential surface, potentially allowing efficient ion extraction. Copyright 2001 John Wiley & Sons, Ltd.

  17. Method for making a bio-compatible scaffold

    DOEpatents

    Cesarano, III, Joseph; Stuecker, John N [Albuquerque, NM; Dellinger, Jennifer G [Champaigne, IL; Jamison, Russell D [Urbana, IL

    2006-01-31

    A method for forming a three-dimensional, biocompatible, porous scaffold structure using a solid freeform fabrication technique (referred to herein as robocasting) that can be used as a medical implant into a living organism, such as a human or other mammal. Imaging technology and analysis is first used to determine the three-dimensional design required for the medical implant, such as a bone implant or graft, fashioned as a three-dimensional, biocompatible scaffold structure. The robocasting technique is used to either directly produce the three-dimensional, porous scaffold structure or to produce an over-sized three-dimensional, porous scaffold lattice which can be machined to produce the designed three-dimensional, porous scaffold structure for implantation.

  18. Accuracy and efficiency of published film dosimetry techniques using a flat-bed scanner and EBT3 film.

    PubMed

    Spelleken, E; Crowe, S B; Sutherland, B; Challens, C; Kairn, T

    2018-03-01

    Gafchromic EBT3 film is widely used for patient-specific quality assurance of complex treatment plans. Film dosimetry techniques commonly involve the use of transmission scanning to produce TIFF files, which are analysed using a non-linear calibration relationship between dose and red-channel net optical density (netOD). Numerous film calibration techniques featured in the literature have not been independently verified or evaluated. A range of previously published film dosimetry techniques was re-evaluated to identify whether these methods produce better results than the commonly used non-linear netOD method. EBT3 film was irradiated at calibration doses between 0 and 4000 cGy, and 25 pieces of film were irradiated at 200 cGy to evaluate uniformity. The film was scanned using two different scanners: the Epson Perfection V800 and the Epson Expression 10000XL. Calibration curves, uncertainty in the fit of the curve, overall uncertainty, and uniformity were calculated following the methods described by the different calibration techniques. It was found that protocols based on a conventional film dosimetry technique produced results that were accurate and uniform to within 1%, while some of the unconventional techniques produced much higher uncertainties (> 25% for some techniques). Some of the uncommon methods produced reliable results at standard treatment doses (< 400 cGy); however, none could be recommended as an efficient or accurate replacement for the common film analysis technique, which uses transmission scanning, red-channel analysis, netOD, and a non-linear calibration curve, for measuring doses up to 4000 cGy with EBT3 film.
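
    The conventional baseline the study compares against can be sketched compactly: red-channel net optical density from transmission scans, fitted with the widely used calibration form dose = a*netOD + b*netOD^n. Pixel values and fitted constants below are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def net_od(pv_exposed, pv_unexposed):
        """Red-channel net optical density from transmission-scan pixel values."""
        return np.log10(pv_unexposed / pv_exposed)

    def cal(x, a, b, n):               # the usual non-linear calibration form
        return a * x + b * x ** n

    dose_cGy = np.array([0, 50, 100, 200, 400, 800, 1600, 3200])
    pv_unexp = 42000.0                 # invented 16-bit scanner values
    pv_exp = np.array([42000, 38500, 35800, 31900, 26900, 21400, 16200, 12100])

    x = net_od(pv_exp, pv_unexp)
    popt, _ = curve_fit(cal, x, dose_cGy, p0=[500.0, 4000.0, 2.5],
                        bounds=([0, 0, 1], [np.inf, np.inf, 5]))
    print(popt)                                      # fitted a, b, n
    print(cal(net_od(30000.0, pv_unexp), *popt))     # dose for a new film piece
    ```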

  19. Interactive visual exploration and refinement of cluster assignments.

    PubMed

    Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R

    2017-09-12

    With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms do not properly account for ambiguity in the source data, as records are often assigned to discrete clusters even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results, and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in the context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.

  20. Coherent Lienard-Wiechert fields produced by free electron lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elias, L.R.; Gallardo, J.C.

    1981-12-01

    Results are presented here of a three-dimensional numerical analysis of the radiation fields produced in a free electron laser. The method used here to obtain the spatial and temporal behavior of the radiated fields is based on the coherent superposition of the exact Lienard-Wiechert fields produced by each electron in the beam. Interference effects are responsible for the narrow angular radiation patterns obtained and for the high degree of monochromaticity of the radiated field.

  1. Analysis of metal-laden water via portable X-ray fluorescence spectrometry

    NASA Astrophysics Data System (ADS)

    Pearson, Delaina; Weindorf, David C.; Chakraborty, Somsubhra; Li, Bin; Koch, Jaco; Van Deventer, Piet; de Wet, Jandre; Kusi, Nana Yaw

    2018-06-01

    A rapid method for in-situ elemental composition analysis of metal-laden water would be indispensable for studying polluted water. Current analytical lab methods to determine water quality include flame atomic absorption spectrometry (FAAS), atomic absorption spectrophotometry (AAS), electrothermal atomic absorption spectrometry (EAAS), and inductively coupled plasma (ICP) spectroscopy. However, only two field methods, colorimetry and absorptiometry, exist for elemental analysis of water. Portable X-ray fluorescence (PXRF) spectrometry is an effective method for elemental analysis of soil, sediment, and other matrices. However, the accuracy of PXRF is known to be affected when scanning moisture-laden soil samples. This study sought to statistically establish PXRF's predictive ability for various elements in water at different concentrations relative to inductively coupled plasma atomic emission spectroscopy (ICP-AES). A total of 390 metal-laden water samples collected from leaching columns of mine tailings in South Africa were analyzed via PXRF and ICP-AES. The PXRF showed differential effectiveness in elemental quantification. For the collected water samples, the best relationships between ICP and PXRF elemental data were obtained for K and Cu (R2 = 0.92). However, when scanning ICP calibration solutions with elements in isolation, PXRF results indicated near-perfect agreement; Ca, K, Fe, Cu and Pb produced an R2 of 0.99, while Zn and Mn produced an R2 of 1.00. The utilization of multiple (stacked) PXRF beams produced stronger correlation to ICP relative to the use of a single beam in isolation. The results of this study demonstrate PXRF's ability to satisfactorily predict the composition of metal-laden water as reported by ICP for several elements. Additionally, this study indicated the need for a "Water Mode" calibration for the PXRF and demonstrates the potential of PXRF for future study of polluted or contaminated waters.

  2. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.

  3. Retinal vasculature classification using novel multifractal features

    NASA Astrophysics Data System (ADS)

    Ding, Y.; Ward, W. O. C.; Duan, Jinming; Auer, D. P.; Gowland, Penny; Bai, L.

    2015-11-01

    Retinal blood vessels have been implicated in a large number of diseases, including diabetic retinopathy and cardiovascular diseases, which cause damage to retinal blood vessels. The availability of retinal vessel imaging provides an excellent opportunity for monitoring and diagnosis of retinal diseases, and automatic analysis of retinal vessels will help with these processes. However, state-of-the-art vascular analysis methods such as counting the number of branches or measuring the curvature and diameter of individual vessels are unsuitable for the microvasculature. There has been published research using fractal analysis to calculate fractal dimensions of retinal blood vessels, but so far there has been no systematic research on extracting discriminant features from retinal vessels for classification. This paper introduces new methods for feature extraction from multifractal spectra of retinal vessels for classification. Two publicly available retinal vascular image databases are used for the experiments, and the proposed methods have produced accuracies of 85.5% and 77% for classification of healthy and diabetic retinal vasculatures. Experiments show that classification with multiple fractal features produces better rates than methods using a single fractal dimension value. In addition, experiments show that classification accuracy can be affected by the accuracy of vessel segmentation algorithms.

  4. [Diversity and enzyme-producing activity of culturable halophilic bacteria in Daishan Saltern of East China].

    PubMed

    Yang, Dan-Dan; Li, Qian; Huang, Jing-Jing; Chen, Min

    2012-11-01

    Soil and saline water samples were collected from the Daishan Saltern of East China, and halophilic bacteria were isolated and cultured using selective media, with the aim of investigating the diversity and enzyme-producing activity of culturable halophilic bacteria in a saltern environment. A total of 181 strains were isolated by the culture-dependent method. Specific primers were used to amplify the 16S rRNA genes of bacteria and archaea. Operational taxonomic units (OTUs) were determined by the ARDRA method, and the representative strain of each OTU was sequenced. The phylogenetic position of all isolated strains was determined by 16S rRNA sequencing. The results showed that the 181 isolated strains fell into 21 OTUs, of which 12 belonged to halophilic bacteria and the others to halophilic archaea. Phylogenetic analysis indicated that 7 genera were present among the halophilic bacteria and 4 genera among the halophilic archaea. The dominant halophilic strains were of Halomonas and Haloarcula, accounting for 46.8% of the halophilic bacteria and 49.1% of the halophilic archaea, respectively. Enzyme-production analysis indicated that most strains displayed enzyme-producing activity, including amylase, proteinase, and lipase activities, and the dominant enzyme-producing strains were of Haloarcula. Our results show that the Daishan Saltern harbors a high diversity of halophilic bacteria, making it a source for screening enzyme-producing bacterial strains.

  5. Method of analysis of polymerizable monomeric species in a complex mixture

    DOEpatents

    Hermes, Robert E

    2014-03-18

    Method of selective quantitation of a polymerizable monomeric species in a well spacer fluid, said method comprising the steps of adding at least one solvent having a refractive index of less than about 1.33 to a sample of the complex mixture to produce a solvent phase, and measuring the refractive index of the solvent phase.

  6. Range of sound levels in the outdoor environment

    Treesearch

    Lewis S. Goodfriend

    1977-01-01

    Current methods of measuring and rating noise in a metropolitan area are examined, including real-time spectrum analysis and sound-level integration, producing a single-number value representing the noise impact for each hour or each day. Methods of noise rating for metropolitan areas are reviewed, and the various measures from multidimensional rating methods such as...

  7. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, M.D.; Pierce, B.L.

    This report presents results of tests of different final site selection methods used for siting large-scale facilities such as nuclear power plants. Test data are adapted from a nuclear power plant siting study conducted on Long Island, New York. The purpose of the tests is to determine whether or not different final site selection methods produce different results, and to obtain some understanding of the nature of any differences found. Decision rules and weighting methods are included. Decision rules tested are Weighting Summation, Power Law, Decision Analysis, Goal Programming, and Goal Attainment; weighting methods tested are Categorization, Ranking, Rating Ratio Estimation, Metfessel Allocation, Indifferent Tradeoff, Decision Analysis lottery, and Global Evaluation. Results show that different methods can, indeed, produce different results, but that the probability that they will do so is controlled by the structure of differences among the sites being evaluated. Differences in weights and suitability scores attributable to methods have reduced significance if the alternatives include one or two sites that are superior to all others in many attributes. The more tradeoffs there are among good and bad levels of different attributes at different sites, the more important are the specifics of methods to the final decision. 5 refs., 14 figs., 19 tabs.

  9. Betweenness-Based Method to Identify Critical Transmission Sectors for Supply Chain Environmental Pressure Mitigation.

    PubMed

    Liang, Sai; Qu, Shen; Xu, Ming

    2016-02-02

    To develop industry-specific policies for mitigating environmental pressures, previous studies have primarily focused on identifying sectors that directly generate large amounts of environmental pressures (the production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., the consumption-based method). In addition to those sectors that are important environmental pressure producers or drivers, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving the production efficiency of these key transmission sectors, that is, using fewer upstream inputs to produce unit output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths, extracted by structural path analysis, that pass through a particular sector. We take China as an example and find that the critical transmission sectors identified by the betweenness-based method are not always identifiable by existing methods. This indicates that the betweenness-based method can provide additional insights, not obtainable with existing methods, into the roles individual sectors play in generating economy-wide environmental pressures. The betweenness-based method proposed here can therefore complement existing methods in guiding sector-level environmental pressure mitigation strategies.
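
    As a toy illustration of the concept (not the paper's structural-path-analysis weighting), betweenness can be computed on a small directed input-output network; sectors scoring high sit on many supplier-to-user paths:

    ```python
    import networkx as nx

    # Toy inter-sector flow network; edge direction is supplier -> user.
    flows = [
        ("mining", "metals"), ("metals", "machinery"),
        ("machinery", "construction"), ("power", "metals"),
        ("power", "machinery"), ("mining", "power"),
    ]
    G = nx.DiGraph(flows)

    # Sectors with high betweenness sit on many supply-chain paths and are
    # candidate "transmission centers" for efficiency improvements.
    print(nx.betweenness_centrality(G))
    ```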

  10. Analysis of Cement-Based Pastes Mixed with Waste Tire Rubber

    NASA Astrophysics Data System (ADS)

    Sola, O. C.; Ozyazgan, C.; Sayin, B.

    2017-03-01

    Using the methods of thermal gravimetry, differential thermal analysis, Fourier transform infrared analysis, and capillary absorption, the properties of a cement composite produced by introducing waste tyre rubber into a cement mixture were investigated. It was found that the composite filled with the rubber had a much lower water absorption ability than the unfilled one.

  11. The Performance Analysis of the Naive Bayes and SSVM Methods to Determine Pattern Groups of Disease

    NASA Astrophysics Data System (ADS)

    Sitanggang, Rianto; Tulus; Situmorang, Zakarias

    2017-12-01

    Information is a vital element of daily life, but obtaining precise and accurate information is not easy; this research can help decision makers by providing a comparison. The researchers applied data mining techniques to analyze the performance of the naïve Bayes method and the Smooth Support Vector Machine (SSVM) algorithm in grouping diseases. Patterns of disease frequently suffered by people in an area can be detected in the information contained in medical records. Medical records hold patients' disease information coded according to the WHO standard. The processing of medical record data to find patterns of disease groups that frequently occur in the community uses the attributes address, sex, type of disease, and age. The analysis then groups these four attributes. From the research conducted on the fever and diabetes mellitus dataset, the naïve Bayes method produced an average accuracy of 99% and the SSVM method produced an average accuracy of 93%.
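
    A sketch of the naïve Bayes half of the comparison on coded, categorical record attributes; the records are fabricated, and CategoricalNB stands in for the paper's implementation (scikit-learn ships no smooth SVM, so the SSVM side is omitted):

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import CategoricalNB

    rng = np.random.default_rng(7)
    n = 300
    X = np.column_stack([
        rng.integers(0, 5, n),    # address (district code)
        rng.integers(0, 2, n),    # sex
        rng.integers(0, 3, n),    # disease-type code
        rng.integers(0, 4, n),    # age group
    ])
    y = ((X[:, 2] == 1) & (X[:, 3] >= 2)).astype(int)   # toy group label

    scores = cross_val_score(CategoricalNB(), X, y, cv=5)
    print(f"naive Bayes accuracy: {100 * scores.mean():.1f}%")
    ```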

  12. Interactive Correlation Analysis and Visualization of Climate Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu

    The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, and the number of different simulations performed with a climate model or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.

  13. Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis

    NASA Technical Reports Server (NTRS)

    Lindstrom, D. G.; Normand, E.; Wilcox, A. D.

    1972-01-01

    In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.

  14. Myocardial strains from 3D displacement encoded magnetic resonance imaging

    PubMed Central

    2012-01-01

    Background: The ability to measure and quantify myocardial motion and deformation provides a useful tool to assist in the diagnosis, prognosis and management of heart disease. The recent development of magnetic resonance imaging methods, such as harmonic phase analysis of tagging and displacement encoding with stimulated echoes (DENSE), makes detailed non-invasive 3D kinematic analyses of human myocardium possible in the clinic and for research purposes. A robust analysis method is required, however. Methods: We propose to estimate strain using a polynomial function which produces local models of the displacement field obtained with DENSE. Given a specific polynomial order, the model is obtained as the least squares fit of the acquired displacement field. These local models are subsequently used to produce estimates of the full strain tensor. Results: The proposed method was evaluated on a numerical phantom as well as in vivo on a healthy human heart. The evaluation showed that the proposed method produced accurate results and low sensitivity to noise in the numerical phantom. The method was also demonstrated in vivo by assessment of the full strain tensor and its ability to resolve transmural strain variations. Conclusions: Strain estimation within a 3D myocardial volume based on polynomial functions yields accurate and robust results when validated on an analytical model. The polynomial field is capable of resolving the measured material positions from the in vivo data, and the obtained in vivo strain values agree with previously reported myocardial strains in normal human hearts. PMID:22533791
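
    A minimal sketch of the core idea, assuming a first-order (affine) polynomial model: fit displacements in a local neighbourhood by least squares, take the fitted displacement gradient, and form the Green-Lagrange strain tensor. The synthetic displacement field below is invented, not DENSE data, and the paper's full pipeline is more elaborate.

```python
# Sketch: polynomial fit of a local displacement field -> strain tensor.
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(50, 3))            # material points in a patch

true_grad = np.array([[0.10, 0.02, 0.00],         # prescribed du/dX for testing
                      [0.00, -0.05, 0.01],
                      [0.03, 0.00, 0.04]])
disp = pts @ true_grad.T + 0.001 * rng.standard_normal((50, 3))

# First-order polynomial model u(X) = c + G X, fitted by least squares.
basis = np.hstack([np.ones((len(pts), 1)), pts])  # columns: [1, x, y, z]
coef, *_ = np.linalg.lstsq(basis, disp, rcond=None)
G = coef[1:].T                                    # fitted displacement gradient

F = np.eye(3) + G                                 # deformation gradient
E = 0.5 * (F.T @ F - np.eye(3))                   # Green-Lagrange strain tensor
print(np.round(E, 4))
```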

  15. An evaluation of tyramide signal amplification and archived fixed and frozen tissue in microarray gene expression analysis

    PubMed Central

    Karsten, Stanislav L.; Van Deerlin, Vivianna M. D.; Sabatti, Chiara; Gill, Lisa H.; Geschwind, Daniel H.

    2002-01-01

    Archival formalin-fixed, paraffin-embedded and ethanol-fixed tissues represent a potentially invaluable resource for gene expression analysis, as they are the most widely available material for studies of human disease. Little data are available evaluating whether RNA obtained from fixed (archival) tissues could produce reliable and reproducible microarray expression data. Here we compare the use of RNA isolated from human archival tissues fixed in ethanol and formalin to frozen tissue in cDNA microarray experiments. Since an additional factor that can limit the utility of archival tissue is the often small quantities available, we also evaluate the use of the tyramide signal amplification method (TSA), which allows the use of small amounts of RNA. Detailed analysis indicates that TSA provides a consistent and reproducible signal amplification method for cDNA microarray analysis, across both arrays and the genes tested. Analysis of this method also highlights the importance of performing non-linear channel normalization and dye switching. Furthermore, archived, fixed specimens can perform well, but not surprisingly, produce more variable results than frozen tissues. Consistent results are more easily obtainable using ethanol-fixed tissues, whereas formalin-fixed tissue does not typically provide a useful substrate for cDNA synthesis and labeling. PMID:11788730

  16. Development of higher-order modal methods for transient thermal and structural analysis

    NASA Technical Reports Server (NTRS)

    Camarda, Charles J.; Haftka, Raphael T.

    1989-01-01

    A force-derivative method which produces higher-order modal solutions to transient problems is evaluated. These higher-order solutions converge to an accurate response using fewer degrees-of-freedom (eigenmodes) than lower-order methods such as the mode-displacement or mode-acceleration methods. Results are presented for non-proportionally damped structural problems as well as thermal problems modeled by finite elements.

  17. Simple, effective protein extraction method and proteomics analysis from polyunsaturated fatty acids-producing micro-organisms.

    PubMed

    Ling, Xueping; Guo, Jing; Zheng, Chuqiang; Ye, Chiming; Lu, Yinghua; Pan, Xueshan; Chen, Zhengqi; Ng, I-Son

    2015-12-01

    Polyunsaturated fatty acids (PUFAs) are valuable ingredients in food and pharmaceutical products due to their beneficial influence on human health. Most studies have focused on the production of PUFAs from oleaginous micro-organisms but seldom on the comparative proteomics of the cells. In this study, three methods for removing lipids from crude protein extracts (cold shock, acetone precipitation, and ethanol precipitation) were applied to different PUFAs-producing micro-organisms. Among the selected strains, Schizochytrium served as the oleaginous strain, with a high lipid content of 60.3% (w/w) in biomass. Mortierella alpina and Cunninghamella echinulata were chosen as the low-lipid-content strains, with 25.8% (w/w) and 21.8% (w/w) lipid in biomass, respectively. Cold shock proved the most effective method for lipid removal and thus yielded the highest protein amount for Schizochytrium. Moreover, the comparative proteomics of the three PUFAs-producing strains revealed more significantly up- or down-regulated proteins under the cold shock treatment. Accordingly, the essential proteins (i.e., polyunsaturated fatty acid synthase) and regulatory proteins were observed. In conclusion, this study provides a valuable and practical approach for analyzing high PUFAs-producing strains at the proteomic level, which should further accelerate understanding of the metabolic flux in oleaginous micro-organisms.

  18. Quality control of murine monoclonal antibodies using isoelectric focusing affinity immunoblot analysis

    NASA Technical Reports Server (NTRS)

    Hamilton, Robert G.; Rodkey, L. Scott; Reimer, Charles B.

    1987-01-01

    The quality control of murine hybridoma secretory products has been performed using two approaches to isoelectric focusing affinity immunoblot analysis: (1) a method in which antigen-coated nitrocellulose is placed on top of an acrylamide gel containing isoelectrically focused ascites to bind the antigen-specific monoclonal antibody; and (2) a method in which focused ascites proteins are passively blotted onto nitrocellulose and specific monoclonal antibodies are detected with enzyme-conjugated antigen. Analysis by both methods of batches of ascites containing antihuman IgG antibodies produced by six hybridomas permitted effective monitoring of immunoreactive antibodies for pI microheterogeneity.

  19. Warpage analysis on thin shell part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    Moulding parameters were optimised to reduce warpage defects using Autodesk Moldflow Insight (AMI) 2012 software. The product is injected using Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis varies the processing parameters of melt temperature, mould temperature, packing pressure, and packing time. Design of Experiments (DOE) was integrated to obtain a polynomial model using Response Surface Methodology (RSM). The Glowworm Swarm Optimisation (GSO) method is then used to predict the best combination of parameters for minimising warpage defects in order to produce high-quality parts.
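
    A hedged sketch of the RSM step described above: fit a quadratic response surface to DOE runs and minimise the fitted model. scipy's gradient-based minimize stands in for the Glowworm Swarm Optimisation used in the study, and the parameter ranges and warpage responses below are invented.

```python
# Sketch: quadratic response surface from DOE runs, then minimisation.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
# Columns: melt temperature, mould temperature, packing pressure, packing time
X = rng.uniform([200, 40, 60, 5], [260, 80, 100, 15], size=(30, 4))
warpage = (0.002 * (X[:, 0] - 230) ** 2 + 0.01 * (X[:, 1] - 60) ** 2
           - 0.02 * X[:, 2] + 0.05 * X[:, 3] + rng.normal(0, 0.05, 30))

poly = PolynomialFeatures(degree=2)
model = LinearRegression().fit(poly.fit_transform(X), warpage)

def objective(x):
    return model.predict(poly.transform(x.reshape(1, -1)))[0]

res = minimize(objective, x0=[230, 60, 80, 10],
               bounds=[(200, 260), (40, 80), (60, 100), (5, 15)])
print("predicted-best parameters:", res.x.round(2))
```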

  20. Analytical Strategies Involved in the Detailed Componential Characterization of Biooil Produced from Lignocellulosic Biomass

    PubMed Central

    Li, Guo-Sheng; Wei, Xian-Yong

    2017-01-01

    Elucidating the chemical composition of biooil is essential for evaluating the process of lignocellulosic biomass (LCBM) conversion and its upgrading, and for suggesting proper value-added utilization such as producing fuel and feedstock for fine chemicals. Although the main components of LCBM are cellulose, hemicelluloses, and lignin, the chemicals derived from LCBM differ significantly due to the various feedstocks and decomposition methods used. Biooil, produced from pyrolysis of LCBM, contains hundreds of organic chemicals of various classes. This review covers the methodologies used for the componential analysis of biooil, including pretreatments and instrumental analysis techniques. The use of chromatographic and spectrometric methods is highlighted, covering conventional techniques such as gas chromatography, high performance liquid chromatography, Fourier transform infrared spectroscopy, nuclear magnetic resonance, and mass spectrometry. The combination of preseparation methods and instrumental technologies is a robust pathway for the detailed componential characterization of biooil. The organic species in biooils can be classified into alkanes, alkenes, alkynes, benzene-ring-containing hydrocarbons, ethers, alcohols, phenols, aldehydes, ketones, esters, carboxylic acids, and other heteroatomic organic compounds. The recent development of high resolution mass spectrometry and multidimensional hyphenated chromatographic and spectrometric techniques has considerably advanced the elucidation of biooil composition. PMID:29387086

  1. Hearsay Ethnography: Conversational Journals as a Method for Studying Culture in Action.

    PubMed

    Watkins, Susan Cotts; Swidler, Ann

    2009-04-01

    Social scientists have long struggled to develop methods adequate to their theoretical understanding of meaning as collective and dynamic. While culture is widely understood as an emergent property of collectivities, the methods we use keep pulling us back towards interview-situated accounts and an image of culture as located in individual experience. Scholars who seek to access supra-individual semiotic structures by studying public rituals and other collectively-produced texts then have difficulty capturing the dynamic processes through which such meanings are created and changed in situ. To try to capture more effectively the way meaning is produced and re-produced in everyday life, we focus here on conversational interactions-the voices and actions that constitute the relational space among actors. Conversational journals provide us with a method: the analysis of texts produced by cultural insiders who keep journals of who-said-what-to-whom in conversations they overhear or events they participate in during the course of their daily lives. We describe the method, distinguishing it from other approaches and noting its drawbacks. We then illustrate the methodological advantages of conversational journals with examples from our texts. We end with a discussion of the method's potential in our setting as well as in other places and times.

  2. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  3. Analysis of fuel using the Direct LSC method determination of bio-originated fuel in the presence of quenching

    DOE PAGES

    Doll, Charles G.; Wright, Cherylyn W.; Morley, Shannon M.; ...

    2017-02-01

    In this paper, a modified version of the Direct LSC method to correct for the quenching effect was investigated for determining the bio-originated fuel content of fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percent bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Finally, analysis of highly quenched samples was possible after dilution, with the exception of one sample with a 100% quenching effect.

  4. Analysis of fuel using the Direct LSC method determination of bio-originated fuel in the presence of quenching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doll, Charles G.; Wright, Cherylyn W.; Morley, Shannon M.

    A modified version of the Direct LSC method to correct for the quenching effect was investigated for determining the bio-originated fuel content of fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percent bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Analysis of highly quenched samples was possible after dilution, with the exception of one sample with a 100% quenching effect.

  5. Analysis of fuel using the Direct LSC method determination of bio-originated fuel in the presence of quenching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doll, Charles G.; Wright, Cherylyn W.; Morley, Shannon M.

    In this paper, a modified version of the Direct LSC method to correct for the quenching effect was investigated for determining the bio-originated fuel content of fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percent bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Finally, analysis of highly quenched samples was possible after dilution, with the exception of one sample with a 100% quenching effect.

  6. Finite element analysis and computer graphics visualization of flow around pitching and plunging airfoils

    NASA Technical Reports Server (NTRS)

    Bratanow, T.; Ecer, A.

    1973-01-01

    A general computational method for analyzing unsteady flow around pitching and plunging airfoils was developed. The finite element method was applied in developing an efficient numerical procedure for the solution of equations describing the flow around airfoils. The numerical results were employed in conjunction with computer graphics techniques to produce visualization of the flow. The investigation involved mathematical model studies of flow in two phases: (1) analysis of a potential flow formulation and (2) analysis of an incompressible, unsteady, viscous flow from Navier-Stokes equations.

  7. The Tracer Method of Curriculum Analysis in Cancer Education

    ERIC Educational Resources Information Center

    Mahan, J. Maurice; And Others

    1976-01-01

    To assist faculty involved in cancer education in various courses in the curriculum, rather than instituting a new course in oncology, a method was developed for identifying and assessing cancer-related content (a clinical clerk attended lectures, interviewed instructors, reviewed syllabi, etc.) and a comprehensive description was produced and…

  8. Evaluation of isolation methods for bacterial RNA quantitation in Dickeya dadantii

    USDA-ARS?s Scientific Manuscript database

    Dickeya dadantii is a difficult source for RNA of a sufficient quality for real-time qRT-PCR analysis of gene expression. Three RNA isolation methods were evaluated for their ability to produce high-quality RNA from this bacterium. Bacterial lysis with Trizol using standard protocols consistently ga...

  9. Analysis of Publications and Citations from a Geophysics Research Institute.

    ERIC Educational Resources Information Center

    Frohlich, Cliff; Resler, Lynn

    2001-01-01

    Performs an analysis of all 1128 publications produced by scientists during their employment at the University of Texas Institute for Geophysics, assessing research performance with bibliometric indicators such as publications per year, citations per paper, and cited half-lives. Evaluates five different methods for determining…

  10. A Guide to Job Analysis for the Preparation of Job Training Programmes.

    ERIC Educational Resources Information Center

    Ceramics, Glass, and Mineral Products Industry Training Board, Harrow (England).

    The paper deals with job analysis for the preparation of job training programs. The analytical approach involves five steps: enlisting support, examining the job, describing the job, analyzing training requirements, and planning the programs. Appendixes include methods of producing training schemes--the simple job breakdown, straightforward…

  11. Plasmons: Why Should We Care?

    ERIC Educational Resources Information Center

    Campbell, Dean J.; Xia, Younan

    2007-01-01

    The physical phenomenon of plasmons and the techniques that build upon them are discussed. Plasmon-enhanced applications are well-suited for introduction in physical chemistry and instrumental analysis classes and some methods of fabrication and analysis of plasmon-producing structures are simple for use in labs in general, physical and inorganic…

  12. Comparison of potential method in analytic hierarchy process for multi-attribute of catering service companies

    NASA Astrophysics Data System (ADS)

    Mamat, Siti Salwana; Ahmad, Tahir; Awang, Siti Rahmah

    2017-08-01

    Analytic Hierarchy Process (AHP) is a method used in structuring, measuring and synthesizing criteria, in particular for ranking multiple criteria in decision-making problems. The Potential Method, on the other hand, is a ranking procedure that utilizes a preference graph ς(V, A). Two nodes are adjacent if they are compared in a pairwise comparison, with the assigned arc oriented towards the more preferred node. In this paper, the Potential Method is used to solve a catering service selection problem. The results of the Potential Method are compared with those of Extent Analysis. The Potential Method is found to produce the same ranking as Extent Analysis in AHP.
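
    For orientation, the sketch below derives priority weights from a pairwise-comparison matrix via the classic AHP principal-eigenvector procedure; the Potential Method itself works on the preference graph and is not reproduced here. The 3-alternative matrix is an invented example.

```python
# Sketch: AHP priority weights from a pairwise-comparison matrix.
import numpy as np

# pairwise[i, j] > 1 means alternative i is preferred to alternative j
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()      # normalised Perron eigenvector

print("priority weights:", weights.round(3))
print("ranking (best first):", np.argsort(weights)[::-1])
```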

  13. Coupled Electro-Magneto-Mechanical-Acoustic Analysis Method Developed by Using 2D Finite Element Method for Flat Panel Speaker Driven by Magnetostrictive-Material-Based Actuator

    NASA Astrophysics Data System (ADS)

    Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou

    In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by a flat panel speaker driven by a GMM actuator depends on the vibration of the flat panel, which results from the magnetostrictive property of the GMM. In this case, to predict the sound pressure level (SPL) in the audio-frequency range, it is necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method is presented; this method was developed using the finite element method (FEM). The analysis method is used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with measurements of a prototype speaker.

  14. A comparison of carbon stock estimates and projections for the northeastern United States

    Treesearch

    Richard G. MacLean; Mark J. Ducey; Coeli M. Hoover

    2014-01-01

    We conducted a comparison of carbon stock estimates produced by three different methods using regional data from the USDA Forest Service Forest Inventory and Analysis (FIA). Two methods incorporated by the Forest Vegetation Simulator (FVS) were compared to each other and to the current FIA component ratio method. We also examined the uncalibrated performance of FVS...

  15. Direct analysis of prostaglandin-E2 and -D2 produced in an inflammatory cell reaction and its application for activity screening and potency evaluation using turbulent flow chromatography liquid chromatography-high resolution mass spectrometry.

    PubMed

    Shin, Jeong-Sook; Peng, Lei; Kang, Kyungsu; Choi, Yongsoo

    2016-09-09

    Direct analysis of prostaglandin-E2 (PGE2) and -D2 (PGD2) produced from a RAW264.7 cell-based reaction was performed by liquid chromatography high-resolution mass spectrometry (LC-HRMS), coupled online with turbulent flow chromatography (TFC). The capability of this method to accurately measure PG levels in cell reaction medium containing cytokines or proteins as reaction byproducts was cross-validated by two conventional methods: an LC-HRMS method after liquid-liquid extraction (LLE) of the sample and a commercial PGE2 enzyme-linked immunosorbent assay (ELISA). Both showed PGE2 and/or PGD2 levels similar to those obtained by TFC LC-HRMS over the reaction time after LPS stimulation. After the cross-validation, significant analytical throughput, allowing simultaneous screening and potency evaluation of 80 natural products (60 phytochemicals and 20 natural product extracts) for inhibition of the PGD2 produced in the cell-based inflammatory reaction, was achieved using the TFC LC-HRMS method developed. Among the 60 phytochemicals screened, licochalcone A and formononetin inhibited PGD2 production the most, with IC50 values of 126 and 151 nM, respectively. For reference activity, indomethacin and diclofenac were used, giving IC50 values of 0.64 and 0.21 nM, respectively. This method also identified a butanol extract of Akebia quinata Decne (AQ) stem as a promising natural product for PGD2 inhibition. Direct and accurate analysis of PGs in the inflammatory cell reaction using the TFC LC-HRMS method enables the high-throughput screening and potency evaluation of as many as 320 samples in less than 48 h without changing a TFC column. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Identification of atmospheric organic sources using the carbon hollow tube-gas chromatography method and factor analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cobb, G.P.; Braman, R.S.; Gilbert, R.A.

    Atmospheric organics were sampled and analyzed by using the carbon hollow tube-gas chromatography method. Chromatograms from spice mixtures, cigarettes, and ambient air were analyzed. Principal factor analysis of row order chromatographic data produces factors which are eigenchromatograms of the components in the samples. Component sources are identified from the eigenchromatograms in all experiments, and the individual eigenchromatogram corresponding to a particular source is determined in most cases. Organic sources in ambient air and in cigarettes are identified with 87% certainty. Analysis of clove cigarettes allows the determination of the relative amount of clove in different cigarettes. A new nondestructive quality control method using the hollow tube-gas chromatography analysis is discussed.

  17. Meshfree truncated hierarchical refinement for isogeometric analysis

    NASA Astrophysics Data System (ADS)

    Atri, H. R.; Shojaee, S.

    2018-05-01

    In this paper, the truncated hierarchical B-spline (THB-spline) is coupled with the reproducing kernel particle method (RKPM) to blend the advantages of isogeometric analysis and meshfree methods. Since, under certain conditions, the isogeometric B-spline and NURBS basis functions are exactly represented by reproducing kernel meshfree shape functions, the recursive process of producing isogeometric bases can be omitted. More importantly, a seamless link between meshfree methods and isogeometric analysis can easily be defined, which provides an authentic meshfree approach to refining the model locally in isogeometric analysis. This can be accomplished by using truncated hierarchical B-splines to construct new bases and adaptively refine them. It is also shown that the THB-RKPM method provides efficient approximation schemes for numerical simulations and promising performance in the adaptive refinement of partial differential equations via isogeometric analysis. The proposed approach for adaptive local refinement is presented in detail and its effectiveness is investigated through well-known benchmark examples.

  18. Nondestructive Analysis of Tumor-Associated Membrane Protein Integrating Imaging and Amplified Detection in situ Based on Dual-Labeled DNAzyme.

    PubMed

    Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi

    2018-01-01

    Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e., ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs, respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to its two-in-one character, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive: cells retain their physiological activity after analysis and can be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs and offering potential application as a toolbox in the future.

  19. Comparison of data analysis strategies for intent-to-treat analysis in pre-test-post-test designs with substantial dropout rates.

    PubMed

    Salim, Agus; Mackinnon, Andrew; Christensen, Helen; Griffiths, Kathleen

    2008-09-30

    The pre-test-post-test design (PPD) is predominant in trials of psychotherapeutic treatments. Missing data due to withdrawals present an even bigger challenge in assessing treatment effectiveness under the PPD than under designs with more observations since dropout implies an absence of information about response to treatment. When confronted with missing data, often it is reasonable to assume that the mechanism underlying missingness is related to observed but not to unobserved outcomes (missing at random, MAR). Previous simulation and theoretical studies have shown that, under MAR, modern techniques such as maximum-likelihood (ML) based methods and multiple imputation (MI) can be used to produce unbiased estimates of treatment effects. In practice, however, ad hoc methods such as last observation carried forward (LOCF) imputation and complete-case (CC) analysis continue to be used. In order to better understand the behaviour of these methods in the PPD, we compare the performance of traditional approaches (LOCF, CC) and theoretically sound techniques (MI, ML), under various MAR mechanisms. We show that the LOCF method is seriously biased and conclude that its use should be abandoned. Complete-case analysis produces unbiased estimates only when the dropout mechanism does not depend on pre-test values even when dropout is related to fixed covariates including treatment group (covariate-dependent: CD). However, CC analysis is generally biased under MAR. The magnitude of the bias is largest when the correlation of post- and pre-test is relatively low.
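
    A small simulation in the spirit of this comparison, under invented parameters: pre/post scores with dropout that depends on the pre-test (so complete-case analysis is biased, as the abstract states), comparing LOCF, complete-case, and a simple regression-based estimate of the mean change.

```python
# Sketch: bias of LOCF and complete-case analysis under MAR-on-pre-test dropout.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
pre = rng.normal(50, 10, n)
post = 0.6 * pre + 20 + rng.normal(0, 8, n)       # mean change = 0 at pre = 50
true_change = (0.6 * 50 + 20) - 50                # population mean change = 0

# MAR dropout: subjects with low pre-test scores are more likely to drop out.
dropout = rng.random(n) < 1 / (1 + np.exp((pre - 45) / 5))
observed = ~dropout

locf_change = np.where(observed, post, pre).mean() - pre.mean()   # LOCF
cc_change = post[observed].mean() - pre[observed].mean()          # complete case
# Regression-based (ML-flavoured) estimate: fit post ~ pre on completers,
# then predict for everyone -- unbiased when dropout depends only on pre.
b, a = np.polyfit(pre[observed], post[observed], 1)
ml_change = (a + b * pre).mean() - pre.mean()

print(f"true {true_change:.2f} | LOCF {locf_change:.2f} | "
      f"CC {cc_change:.2f} | regression {ml_change:.2f}")
```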

  20. Magnesium stearine production via direct reaction of palm stearine and magnesium hydroxide

    NASA Astrophysics Data System (ADS)

    Pratiwi, M.; Ylitervo, P.; Pettersson, A.; Prakoso, T.; Soerawidjaja, T. H.

    2017-06-01

    Fossil oil production cannot keep pace with the growth of its consumption; for this reason, renewable alternative energy sources are needed to meet fuel demand. One method of producing hydrocarbons is the decarboxylation of fatty acids. Vegetable oils and fats are the greatest source of fatty acids, so they can be used as raw material for biohydrocarbon production. Previous research has shown that heating basic soaps of divalent metals causes the metal salts to decarboxylate and produce hydrocarbons. This study investigates the process and characterization of magnesium soaps made from palm stearine by the Blachford method. The metal soaps are synthesized by direct reaction of palm stearine and magnesium hydroxide to produce magnesium stearine and magnesium stearine base soaps at 140-180°C and 6-10 bar for 3-6 hours. The operating conditions that successfully yielded metal soaps were 180°C and 10 bar for 3-6 hours. These metal soaps were then compared with commercial magnesium stearate. Based on thermogravimetric analysis (TGA) results, the decomposition temperature of all the metal soaps was 250°C. Scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX) showed traces of sodium sulphate in the commercial magnesium stearate and of magnesium hydroxide in both types of magnesium stearine soap. Microwave plasma-atomic emission spectrometry (MP-AES) showed that the magnesium content of the magnesium stearine is close to that of commercial magnesium stearate and lower than that of the magnesium stearine base soaps. These experiments suggest that the presented saponification method can produce metal soaps comparable with commercial metal soaps.

  1. Performance comparisons on spatial lattice algorithm and direct matrix inverse method with application to adaptive arrays processing

    NASA Technical Reports Server (NTRS)

    An, S. H.; Yao, K.

    1986-01-01

    The lattice algorithm has been employed in numerous adaptive filtering applications such as speech analysis/synthesis, noise canceling, spectral analysis, and channel equalization. In this paper, its application to adaptive-array processing is discussed. The advantages are a fast convergence rate and computational accuracy independent of the noise and interference conditions. The results produced by this technique are compared to those obtained by the direct matrix inverse method.
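
    A minimal sketch of the direct matrix-inverse side of the comparison: sample-matrix-inversion beamforming, with weights proportional to R⁻¹s for sample covariance R and steering vector s. The array geometry, interferer, and noise below are invented; the lattice algorithm itself is not implemented here.

```python
# Sketch: sample-matrix-inversion (direct matrix inverse) adaptive beamformer.
import numpy as np

rng = np.random.default_rng(8)
n_el, n_snap = 8, 200
d = 0.5                                        # element spacing in wavelengths

def steering(theta_deg):
    k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_el))

s = steering(0.0)                              # look direction
interference = np.outer(steering(35.0), rng.standard_normal(n_snap))
noise = (rng.standard_normal((n_el, n_snap))
         + 1j * rng.standard_normal((n_el, n_snap))) / np.sqrt(2)
x = 5 * interference + noise                   # interference-plus-noise snapshots

R = x @ x.conj().T / n_snap                    # sample covariance matrix
w = np.linalg.solve(R, s)                      # direct matrix-inverse weights
w /= w.conj() @ s                              # unity gain toward look direction

gain = lambda th: abs(w.conj() @ steering(th))
print(f"gain at 0 deg: {gain(0):.2f}, at 35 deg (interferer): {gain(35):.4f}")
```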

  2. A simplified and efficient method for the analysis of fatty acid methyl esters suitable for large clinical studies.

    PubMed

    Masood, Athar; Stark, Ken D; Salem, Norman

    2005-10-01

    Conventional sample preparation for fatty acid analysis is a complicated, multiple-step process, and gas chromatography (GC) analysis alone can require >1 h per sample to resolve fatty acid methyl esters (FAMEs). Fast GC analysis was adapted to human plasma FAME analysis using a modified polyethylene glycol column with smaller internal diameters, thinner stationary phase films, increased carrier gas linear velocity, and faster temperature ramping. Our results indicated that fast GC analyses were comparable to conventional GC in peak resolution. A conventional transesterification method based on Lepage and Roy was simplified to a one-step method with the elimination of the neutralization and centrifugation steps. A robotics-amenable method was also developed, with lower methylation temperatures and in an open-tube format using multiple reagent additions. The simplified methods produced results that were quantitatively similar and with similar coefficients of variation as compared with the original Lepage and Roy method. The present streamlined methodology is suitable for the direct fatty acid analysis of human plasma, is appropriate for research studies, and will facilitate large clinical trials and make possible population studies.

  3. Job Performance as Multivariate Dynamic Criteria: Experience Sampling and Multiway Component Analysis.

    PubMed

    Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz

    2010-08-06

    Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.

  4. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalonde, Michel, E-mail: mlalonde15@rogers.com; Wassenaar, Richard; Wells, R. Glenn

    2014-07-15

    Purpose: Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential at predicting CRT outcome as valuable information may be lost by assuming that time-activity curves (TAC) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Methods: About 49 patients (N = 27 ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster size and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Results: Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT results in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal chance ROC AUC = 0.50) with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). Conclusions: A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analysis.
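
    A toy sketch of the clustering step, assuming K-means applied directly to time-activity curves: two synthetic TAC populations, one phase-delayed, are clustered, and the minority-cluster fraction is read off as a crude dyssynchrony measure. The curves are simulated, not SPECT RNA data.

```python
# Sketch: K-means clustering of synthetic time-activity curves (TACs).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 16, endpoint=False)          # 16 gates per cardiac cycle

def tac(phase, n_curves):
    """Sinusoid-like TACs with a region-specific phase delay plus noise."""
    return (np.sin(2 * np.pi * (t[None, :] - phase))
            + 0.2 * rng.standard_normal((n_curves, t.size)))

curves = np.vstack([tac(0.00, 300),                # synchronous segments
                    tac(0.15, 268)])               # delayed (dyssynchronous)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(curves)
sizes = np.bincount(labels)
print("cluster sizes:", sizes,
      "-> dyssynchrony fraction:", sizes.min() / len(curves))
```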

  5. Molecular characterization of patulin producing and non-producing Penicillium species in apples from Morocco.

    PubMed

    Rharmitt, Sanae; Hafidi, Majida; Hajjaj, Hassan; Scordino, Fabio; Giosa, Domenico; Giuffrè, Letterio; Barreca, Davide; Criseo, Giuseppe; Romeo, Orazio

    2016-01-18

    The isolation of patulin-producing Penicillia from apples collected in different markets in four localities in Morocco is reported. Fungi were identified by β-tubulin sequencing and further characterized using a specific PCR-based method targeting the isoepoxydon dehydrogenase (IDH) gene to discriminate between patulin-producing and non-producing strains. Production of patulin was also evaluated using standard cultural and biochemical methods. Results showed that 79.5% of contaminant fungi belonged to the genus Penicillium and that Penicillium expansum was the most frequently isolated species (83.9%), followed by Penicillium chrysogenum (~9.7%) and Penicillium crustosum (~6.4%). Molecular analysis revealed that 64.5% of the Penicillium species produced the expected IDH amplicon, denoting patulin production in these strains. However, patulin production was not chemically confirmed in all P. expansum strains. The isolation of IDH(-)/patulin(+) strains raises the hypothesis that gentisylaldehyde is not a direct patulin precursor, supporting previous observations that highlighted the importance of gentisyl alcohol in the production of this mycotoxin. Total agreement between IDH-gene detection and the cultural/chemical methods employed was observed in 58% of P. expansum strains and 100% of the other species isolated. Overall, the data reported here show substantial genetic variability within the P. expansum population from Morocco. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Environmental impact assessment for alternative-energy power plants in México.

    PubMed

    González-Avila, María E; Beltrán-Morales, Luis Felipe; Braker, Elizabeth; Ortega-Rubio, Alfredo

    2006-07-01

    Ten Environmental Impact Assessment Reports (EIAR) were reviewed for projects involving alternative-energy power plants in Mexico developed during the last twelve years. Our analysis focused on the methods used to assess the impacts produced by hydroelectric and geothermal power projects. The methods used to assess impacts in the EIARs ranged from simple descriptive criteria to quantitative models. These methods are not concordant with the level of the EIAR required by the environmental authority or even with the kind of project developed. It is concluded that there is no correlation between the tools used to assess impacts and the assigned type of EIAR. Because the methods used to assess the impacts produced by these power projects have remained essentially unchanged, we propose a quantitative method, based on ecological criteria and tools, to assess the impacts produced by hydroelectric and geothermal plants according to the specific characteristics of the project. The proposed method is supported by environmental norms and can assist environmental authorities in assigning the correct level and tools to be applied to hydroelectric and geothermal projects. The proposed method can be adapted to other production activities in Mexico and to other countries.

  7. Application of the Taguchi Method for Optimizing the Process Parameters of Producing Lightweight Aggregates by Incorporating Tile Grinding Sludge with Reservoir Sediments

    PubMed Central

    Chen, How-Ji; Chang, Sheng-Nan; Tang, Chao-Wei

    2017-01-01

    This study aimed to apply the Taguchi optimization technique to determine the process conditions for producing synthetic lightweight aggregate (LWA) by incorporating tile grinding sludge powder with reservoir sediments. An orthogonal array L16(4⁵) was adopted, which consisted of five controllable four-level factors (i.e., sludge content, preheat temperature, preheat time, sintering temperature, and sintering time). Moreover, the analysis of variance method was used to explore the effects of the experimental factors on the particle density, water absorption, bloating ratio, and loss on ignition of the produced LWA. Overall, the produced aggregates had particle densities ranging from 0.43 to 2.1 g/cm³ and water absorption ranging from 0.6% to 13.4%. These values are comparable to the requirements for ordinary and high-performance LWAs. The results indicated that it is considerably feasible to produce high-performance LWA by incorporating tile grinding sludge with reservoir sediments. PMID:29125576

  8. Application of the Taguchi Method for Optimizing the Process Parameters of Producing Lightweight Aggregates by Incorporating Tile Grinding Sludge with Reservoir Sediments.

    PubMed

    Chen, How-Ji; Chang, Sheng-Nan; Tang, Chao-Wei

    2017-11-10

    This study aimed to apply the Taguchi optimization technique to determine the process conditions for producing synthetic lightweight aggregate (LWA) by incorporating tile grinding sludge powder with reservoir sediments. An orthogonal array L16(4⁵) was adopted, which consisted of five controllable four-level factors (i.e., sludge content, preheat temperature, preheat time, sintering temperature, and sintering time). Moreover, the analysis of variance method was used to explore the effects of the experimental factors on the particle density, water absorption, bloating ratio, and loss on ignition of the produced LWA. Overall, the produced aggregates had particle densities ranging from 0.43 to 2.1 g/cm³ and water absorption ranging from 0.6% to 13.4%. These values are comparable to the requirements for ordinary and high-performance LWAs. The results indicated that it is considerably feasible to produce high-performance LWA by incorporating tile grinding sludge with reservoir sediments.
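
    A hedged sketch of the Taguchi analysis step shared by the two records above: smaller-the-better signal-to-noise ratios are computed per run and averaged per factor level to gauge a factor's effect. The single factor column and the water-absorption responses are invented, not the study's L16 data.

```python
# Sketch: smaller-the-better S/N ratios and level means for one Taguchi factor.
import numpy as np

rng = np.random.default_rng(5)
levels = np.repeat(np.arange(4), 4)                  # one 4-level factor, 16 runs
water_absorption = rng.uniform(0.6, 13.4, size=(16, 3))  # 3 repeats per run

# Smaller-the-better: S/N = -10 * log10(mean(y^2))
sn = -10 * np.log10((water_absorption ** 2).mean(axis=1))

for level in range(4):
    print(f"level {level}: mean S/N = {sn[levels == level].mean():+.2f} dB")
```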

  9. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.

  10. Improvement of reverse-phase high pressure liquid chromatographic resolution of benzo(a)pyrene metabolites using organic amines: application to metabolites produced by fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tjessum, K.; Stegeman, J.J.

    1979-10-15

    Addition of primary organic amines, such as n-butylamine, to the mobile phase altered the capacity factors and selectivity of benzo(a)pyrene metabolites obtained with reverse-phase high pressure liquid chromatography on an ODS column. Separation of benzo(a)pyrene phenols in particular was improved, with 8 of the 10 available metabolites resolved, including those known to be biologically produced. The method offers sufficiently improved resolution or convenience that it should prove useful in comparative studies of metabolism of benzo(a)pyrene and other polynuclear aromatic hydrocarbons. Applying the method to analysis of benzo(a)pyrene metabolites produced in vitro by hepatic microsomes from the marine fish Stenotomus versicolor indicated the principal phenolic derivatives produced by this fish were 1-hydroxy-, 3-hydroxy-, 7-hydroxy-, and 9-hydroxybenzo(a)pyrene.

  11. Bacteriophage cell lysis of Shiga toxin-producing Escherichia coli for top-down proteomic identification of Shiga toxin 1 & 2 using matrix-assisted laser desorption/ionization tandem time-of-light mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    RATIONALE: Analysis of bacteria by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) often relies upon sample preparation methods that result in cell lysis, e.g. bead-beating. However, Shiga toxin-producing Escherichia coli (STEC) can undergo bacteriophage...

  12. Pathway-based factor analysis of gene expression data produces highly heritable phenotypes that associate with age.

    PubMed

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-03-09

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 "pathway phenotypes" that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. Copyright © 2015 Brown et al.

  13. Pathway-Based Factor Analysis of Gene Expression Data Produces Highly Heritable Phenotypes That Associate with Age

    PubMed Central

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-01-01

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 “pathway phenotypes” that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold (P < 5.38 × 10⁻⁵). These phenotypes are more heritable (h² = 0.32) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. PMID:25758824
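
    A minimal sketch of the pathway-phenotype construction described in the two records above, assuming one pathway's gene set: factor analysis is run on the subjects-by-genes expression block and a few factors are kept as phenotypes. The expression matrix is simulated, not MuTHER data.

```python
# Sketch: factor analysis of one pathway's expression -> pathway phenotypes.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
n_subjects, n_genes = 647, 40                     # one pathway's genes

# Simulate a shared pathway-level signal plus gene-specific noise.
signal = rng.standard_normal((n_subjects, 1))
loadings = rng.uniform(0.3, 1.0, (1, n_genes))
expression = signal @ loadings + rng.standard_normal((n_subjects, n_genes))

fa = FactorAnalysis(n_components=5, random_state=0)   # 5 phenotypes per pathway
pathway_phenotypes = fa.fit_transform(expression)     # shape (647, 5)
print(pathway_phenotypes.shape)
```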

  14. An Empirical Comparison of Heterogeneity Variance Estimators in 12,894 Meta-Analyses

    ERIC Educational Resources Information Center

    Langan, Dean; Higgins, Julian P. T.; Simmonds, Mark

    2015-01-01

    Heterogeneity in meta-analysis is most commonly estimated using a moment-based approach described by DerSimonian and Laird. However, this method has been shown to produce biased estimates. Alternative methods to estimate heterogeneity include the restricted maximum likelihood approach and those proposed by Paule and Mandel, Sidik and Jonkman, and…
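
    For reference, the DerSimonian-Laird moment estimator mentioned above is short enough to implement directly; the per-study effects and variances below are invented.

```python
# The DerSimonian-Laird moment estimator of the heterogeneity variance tau^2.
import numpy as np

effects = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # per-study estimates
variances = np.array([0.04, 0.02, 0.06, 0.03, 0.05]) # within-study variances

w = 1.0 / variances                                  # fixed-effect weights
pooled = (w * effects).sum() / w.sum()
Q = (w * (effects - pooled) ** 2).sum()              # Cochran's Q
k = len(effects)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
print(f"Q = {Q:.3f}, tau^2 (DL) = {tau2:.4f}")
```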

  15. Exploring metabolic pathways in genome-scale networks via generating flux modes.

    PubMed

    Rezola, A; de Figueiredo, L F; Brock, M; Pey, J; Podhorski, A; Wittmann, C; Schuster, S; Bockmayr, A; Planes, F J

    2011-02-15

    The reconstruction of metabolic networks at the genome scale has allowed the analysis of metabolic pathways at an unprecedented level of complexity. Elementary flux modes (EFMs) are an appropriate concept for such analysis. However, their number grows combinatorially as the size of the metabolic network increases, which makes applying the EFM approach to large metabolic networks difficult. Novel methods are needed to deal with such complexity. In this article, we present a novel optimization-based method for determining a minimal generating set of EFMs, i.e., a convex basis. We show that a subset of elements of this convex basis can be effectively computed even in large metabolic networks. Our method was applied to examine the structure of pathways producing lysine in Escherichia coli. We obtained a more varied and informative set of pathways in comparison with existing methods. In addition, an alternative pathway to produce lysine was identified using a detour via propionyl-CoA, which shows the predictive power of our novel approach. The source code in C++ is available upon request.

  16. Combination of ray-tracing and the method of moments for electromagnetic radiation analysis using reduced meshes

    NASA Astrophysics Data System (ADS)

    Delgado, Carlos; Cátedra, Manuel Felipe

    2018-05-01

    This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.

  17. Multivariate pattern analysis of fMRI: the early beginnings.

    PubMed

    Haxby, James V

    2012-08-15

    In 2001, we published a paper on the representation of faces and objects in ventral temporal cortex that introduced a new method for fMRI analysis, which subsequently came to be called multivariate pattern analysis (MVPA). MVPA now refers to a diverse set of methods that analyze neural responses as patterns of activity that reflect the varying brain states that a cortical field or system can produce. This paper recounts the circumstances and events that led to the original study and later developments and innovations that have greatly expanded this approach to fMRI data analysis, leading to its widespread application. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Standard Method for Analyzing Gases in Titanium and Titanium Alloys. Standard Method for the Chemical Analysis of Titanium Alloys.

    DTIC Science & Technology

    1982-10-28

    form a non-soluble complex. After filtering and burning, the impure molybdenum trioxide is weighed. Ammonia water is used to dissolve the molybdenum... Niobium and tantalum should use the methyl alcohol distillation-curcumin absorption luminosity method for determination. II. The Methyl Alcohol Distillation-Curcumin Absorption Luminosity Method. 1. Summary of Method: In a phosphorus sulfate medium, boron and methyl alcohol produce methyl borate

  19. Use of cyclodextrin-based polymer for patulin analysis in apple juice

    USDA-ARS?s Scientific Manuscript database

    Penicillium expansum, one of the patulin producing fungi that causes decay on apple, is recognized as the main source of patulin contamination on apple and apple products. The widely used method for patulin analysis in apple juice is liquid-liquid extraction with ethyl acetate followed by HPLC-UV or...

  20. Meta-Analysis of Reading Comprehension Interventions for Students with Learning Disabilities: Strategies and Implications

    ERIC Educational Resources Information Center

    Sencibaugh, Joseph M.

    2005-01-01

    This paper examines research studies, which focus on interventions commonly used with students who are learning disabled, and identifies effective methods that produce substantial benefits concerning reading comprehension. This paper synthesizes previous observation studies by conducting a meta-analysis of strategies used to improve the reading…

  1. Meta-Analysis of Reading Comprehension Interventions for Students with Learning Disabilities: Strategies and Implications

    ERIC Educational Resources Information Center

    Sencibaugh, Joseph M.

    2007-01-01

    This paper examines research studies, which focus on interventions commonly used with students who are learning disabled and identifies effective methods that produce substantial benefits concerning reading comprehension. This paper synthesizes previous observation studies by conducting a meta-analysis of strategies used to improve the reading…

  2. Analysis of the Technical Writing Profession through the DACUM Process.

    ERIC Educational Resources Information Center

    Nolan, Timothy; Green, Marc

    To help develop a curriculum program for technical writers, Cincinnati Technical College used the Developing a Curriculum (DACUM) method to produce a technical writing skills profile. DACUM develops an occupation analysis through a modified brainstorming process by a panel of expert workers under the direction of a qualified coordinator. This…

  3. Environmental Gradient Analysis, Ordination, and Classification in Environmental Impact Assessments.

    DTIC Science & Technology

    1987-09-01

    agglomerative clustering algorithms for mainframe computers: (1) the unweighted pair-group method that uses arithmetic averages (UPGMA), (2) the... hierarchical agglomerative unweighted pair-group method using arithmetic averages (UPGMA), which is also called average linkage clustering. This method was... dendrograms produced by weighted clustering (93). Sneath and Sokal (94), Romesburg (84), and Seber (90) also strongly recommend the UPGMA. A dendrogram
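
    A brief sketch of UPGMA with modern tools rather than the mainframe programs discussed above: scipy's average-linkage hierarchical clustering on an invented site-by-species abundance matrix.

```python
# Sketch: UPGMA (average-linkage) clustering of sites by species abundances.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
sites = np.vstack([rng.poisson(5, (4, 6)),    # four sites from one community
                   rng.poisson(1, (4, 6))])   # four from a sparser community

dist = pdist(sites, metric="euclidean")
tree = linkage(dist, method="average")        # method="average" is UPGMA
print(fcluster(tree, t=2, criterion="maxclust"))
```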

  4. The Coplane Analysis Technique for Three-Dimensional Wind Retrieval Using the HIWRAP Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.

    2015-01-01

    The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
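    A minimal sketch (not the authors' code) of the final retrieval step described above: integrating anelastic mass continuity upward to recover the unobserved wind component once the two directly solved components are known. The grid layout, density profile, and surface boundary condition are illustrative assumptions.

    import numpy as np

    def integrate_continuity(u, v, rho, dx, dy, dz, w_bottom=0.0):
        """Integrate d(rho*w)/dz = -rho*(du/dx + dv/dy) upward from an
        assumed boundary condition at the lowest level; arrays are (nz, ny, nx)."""
        w = np.zeros_like(u)
        w[0] = w_bottom  # assumed boundary-condition winds near the surface
        for k in range(1, u.shape[0]):
            div = (np.gradient(u[k], dx, axis=1)      # du/dx
                   + np.gradient(v[k], dy, axis=0))   # dv/dy
            # discrete upward step of the anelastic continuity equation
            w[k] = (rho[k - 1] * w[k - 1] - dz * rho[k] * div) / rho[k]
        return w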

  5. Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes

    DOEpatents

    Smith, R.E.; Dolbeare, F.A.

    1980-10-21

    Analysis of proteinases is accomplished using conventional amino acid-containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 4-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couple with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.

  6. Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes

    DOEpatents

    Smith, Robert E.; Dolbeare, Frank A.

    1980-10-21

    Analysis of proteinases is accomplished using conventional amino acid-containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 4-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couple with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.

  7. Fluorescence method for enzyme analysis which couples aromatic amines with aromatic aldehydes

    DOEpatents

    Smith, Robert E.; Dolbeare, Frank A.

    1979-01-01

    Analysis of proteinases is accomplished using conventional amino acid-containing aromatic amine substrates. Aromatic amines such as 4-methoxy-2-naphthylamine (4M2NA), 2-naphthylamine, aminoisophthalic acid dimethyl ester, p-nitroaniline, 5-methoxy-1-aminofluorene and coumarin derivatives resulting from enzymatic hydrolysis of the substrate couple with aromatic aldehydes such as 5-nitrosalicylaldehyde (5-NSA), benzaldehyde and p-nitrobenzaldehyde to produce Schiff-base complexes which are water insoluble. Certain Schiff-base complexes produce a shift from blue to orange-red (visible) fluorescence. Such complexes are useful in the assay of enzymes.

  8. Effect of Metakaolin on Strength and Efflorescence Quantity of Cement-Based Composites

    PubMed Central

    Weng, Tsai-Lung; Lin, Wei-Ting; Cheng, An

    2013-01-01

    This study investigated the basic mechanical and microscopic properties of cement produced with metakaolin and quantified the production of residual white efflorescence. Cement mortar was produced at various replacement ratios of metakaolin (0, 5, 10, 15, 20, and 25% by weight of cement) and exposed to various environments. Compressive strength, efflorescence quantity (measured using MATLAB (Matrix Laboratory) image analysis and the curettage method), scanning electron microscopy, and X-ray diffraction analysis results were reported in this study. Specimens with metakaolin as a replacement for Portland cement present higher compressive strength and greater resistance to efflorescence; however, the addition of more than 20% metakaolin has a detrimental effect on strength and efflorescence. This may be explained by the microstructure and hydration products. The quantity of efflorescence determined using MATLAB image analysis is close to the result obtained using the curettage method. The results demonstrate that replacing Portland cement with metakaolin is most effective at a 15% replacement ratio by weight. PMID:23737719

  9. Statistically optimal analysis of state-discretized trajectory data from multiple thermodynamic states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Hao; Mey, Antonia S. J. S.; Noé, Frank

    2014-12-07

    We propose a discrete transition-based reweighting analysis method (dTRAM) for analyzing configuration-space-discretized simulation trajectories produced at different thermodynamic states (temperatures, Hamiltonians, etc.). dTRAM provides maximum-likelihood estimates of stationary quantities (probabilities, free energies, expectation values) at any thermodynamic state. In contrast to the weighted histogram analysis method (WHAM), dTRAM does not require data to be sampled from global equilibrium, and can thus produce superior estimates for enhanced sampling data such as parallel/simulated tempering, replica exchange, umbrella sampling, or metadynamics. In addition, dTRAM provides optimal estimates of Markov state models (MSMs) from the discretized state-space trajectories at all thermodynamic states. Under suitable conditions, these MSMs can be used to calculate kinetic quantities (e.g., rates, timescales). In the limit of a single thermodynamic state, dTRAM estimates a maximum-likelihood reversible MSM, while in the limit of uncorrelated sampling data, dTRAM is identical to WHAM. dTRAM is thus a generalization of both estimators.
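    Since dTRAM reduces to WHAM for uncorrelated data, a minimal sketch of the standard WHAM self-consistency iteration over binned data may help fix ideas (this is not the dTRAM implementation; the array shapes and convergence settings are assumptions):

    import numpy as np

    def wham(hist, bias, n_samples, tol=1e-10, max_iter=100000):
        """hist: (K, B) histogram counts from K thermodynamic states over B bins;
        bias: (K, B) reduced bias energies beta*U_k(x_b); n_samples: (K,) counts.
        Returns unbiased bin probabilities p (B,) and reduced free energies f (K,)."""
        f = np.zeros(hist.shape[0])
        total = hist.sum(axis=0)                      # pooled counts per bin
        for _ in range(max_iter):
            # p_b = sum_k h_kb / sum_k N_k exp(f_k - b_kb)
            denom = (n_samples[:, None] * np.exp(f[:, None] - bias)).sum(axis=0)
            p = total / denom
            p /= p.sum()
            # f_k = -ln sum_b p_b exp(-b_kb)
            f_new = -np.log((p[None, :] * np.exp(-bias)).sum(axis=1))
            if np.max(np.abs(f_new - f)) < tol:
                break
            f = f_new
        return p, f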

  10. Reply

    NASA Astrophysics Data System (ADS)

    Wang, Zhenming; Shi, Baoping; Kiefer, John D.; Woolery, Edward W.

    2004-06-01

    Musson's comments on our article, "Communicating with uncertainty: A critical issue with probabilistic seismic hazard analysis," are an example of myths and misunderstandings. We did not say that probabilistic seismic hazard analysis (PSHA) is a bad method, but we did say that it has some limitations that have significant implications. Our response to these comments follows. There is no consensus on exactly how to select seismological parameters and to assign weights in PSHA. This was one of the conclusions reached by a senior seismic hazard analysis committee [SSHAC, 1997] that included C. A. Cornell, founder of the PSHA methodology. The SSHAC report was reviewed by a panel of the National Research Council and was well accepted by seismologists and engineers. As an example of the lack of consensus, Toro and Silva [2001] produced seismic hazard maps for the central United States region that are quite different from those produced by Frankel et al. [2002] because they used different input seismological parameters and weights (see Table 1). We disagree with Musson's conclusion that "because a method may be applied badly on one occasion does not mean the method itself is bad." We do not say that the method is poor, but rather that those who use PSHA need to document their inputs and communicate them fully to the users. It seems that Musson is trying to create a myth by suggesting that his own methods should be used.

  11. Uniform-droplet spray forming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blue, C.A.; Sikka, V.K.; Chun, Jung-Hoon

    1997-04-01

    The uniform-droplet process is a new method of liquid-metal atomization that produces single droplets that can be used to make mono-size powders or be sprayed onto substrates to produce near-net shapes with tailored microstructures. The mono-sized powder-production capability of the uniform-droplet process also has the potential of permitting engineered powder blends to produce components of controlled porosity. Metal and alloy powders are commercially produced by at least three different methods: gas atomization, water atomization, and rotating disk. All three methods produce powders with a broad range of sizes and a very small yield of fine powders. An economic analysis has shown the process to have the potential of reducing capital cost by 50% and operating cost by 37.5% when applied to powder making. For the spray-forming process, a 25% savings is expected in both the capital and operating costs. The project is jointly carried out at the Massachusetts Institute of Technology (MIT), Tufts University, and Oak Ridge National Laboratory (ORNL). Preliminary interactions with both finished-part and powder producers have shown a strong interest in the uniform-droplet process. Systematic studies are being conducted to optimize the process parameters, understand the solidification of droplets and spray deposits, and develop a uniform-droplet-system (UDS) apparatus appropriate for processing engineering alloys.

  12. Insulation commonality assessment (phase 1). Volume 2: Section 7.0 through 16.0. [evaluation of materials used for spacecraft thermal insulation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The heat transfer characteristics of various materials used for the thermal insulation of spacecraft are discussed. Techniques for conducting thermal performance analysis, structural performance analysis, and dynamic analysis are described. Processes for producing and finishing the materials are explained. The methods for determining reliability, system safety, materials tests, and design effectiveness are explained.

  13. Development of an analytical microbial consortia method for enhancing performance monitoring at aerobic wastewater treatment plants.

    PubMed

    Razban, Behrooz; Nelson, Kristina Y; McMartin, Dena W; Cullimore, D Roy; Wall, Michelle; Wang, Dunling

    2012-01-01

    An analytical method to produce profiles of bacterial biomass fatty acid methyl esters (FAME) was developed employing rapid agitation followed by static incubation (RASI) using selective media of wastewater microbial communities. The results were compiled to produce a unique library for comparison and performance analysis at a wastewater treatment plant (WWTP). A total of 146 samples from the aerated WWTP, comprising 73 samples each of secondary and tertiary effluent, were analyzed. For comparison purposes, all samples were evaluated via a similarity index (SI), with secondary effluents producing an SI of 0.88 with 2.7% variation and tertiary samples producing an SI of 0.86 with 5.0% variation. The results also highlighted significant differences between the fatty acid profiles of the tertiary and secondary effluents, indicating considerable shifts in the bacterial community profile between these treatment phases. The WWTP performance results using this method were highly replicable and reproducible, indicating that the protocol has potential as a performance-monitoring tool for aerated WWTPs. The results quickly and accurately reflect shifts in dominant bacterial communities that occur when process operations and performance change.

  14. Sample preparation of metal alloys by electric discharge machining

    NASA Technical Reports Server (NTRS)

    Chapman, G. B., II; Gordon, W. A.

    1976-01-01

    Electric discharge machining was investigated as a noncontaminating method of comminuting alloys for subsequent chemical analysis. Particulate dispersions in water were produced from bulk alloys at a rate of about 5 mg/min by using a commercially available machining instrument. The utility of this approach was demonstrated by results obtained when acidified dispersions were substituted for true acid solutions in an established spectrochemical method. The analysis results were not significantly different for the two sample forms. Particle size measurements and preliminary results from other spectrochemical methods which require direct aspiration of liquid into flame or plasma sources are reported.

  15. Analysis of fuel using the Direct LSC method: determination of bio-originated fuel in the presence of quenching.

    PubMed

    Doll, Charles G; Wright, Cherylyn W; Morley, Shannon M; Wright, Bob W

    2017-04-01

    A modified version of the Direct LSC method that corrects for the quenching effect was investigated for the determination of bio-originated fuel content in fuel samples produced from multiple biological starting materials. The modified method was found to be accurate in determining the percent bio-originated fuel to within 5% of the actual value for samples with quenching effects ≤43%. Analysis of highly quenched samples was possible when they were diluted, with the exception of one sample with a 100% quenching effect. Copyright © 2017. Published by Elsevier Ltd.

  16. Traceable Coulomb blockade thermometry

    NASA Astrophysics Data System (ADS)

    Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.

    2017-02-01

    We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that, using either analysis method, the relative combined standard uncertainty (k = 1) in determining the thermodynamic temperature in the temperature range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated by 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.
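    A minimal sketch of the dip-height route to temperature (not the authors' analysis code). It uses the standard lowest-order CBT relation for a uniform array of N tunnel junctions, V_half = 5.439*N*k_B*T/e, where V_half is the full width of the conductance dip at half its depth; the sensor parameters in the example are invented for illustration.

    from scipy.constants import k as k_B, e

    def cbt_temperature(v_half, n_junctions):
        """Thermodynamic temperature (K) from the measured dip half-width (V)."""
        return e * v_half / (5.439 * n_junctions * k_B)

    # e.g. a hypothetical 100-junction sensor with a 2.34 mV half-width reads ~50 mK
    print(cbt_temperature(2.34e-3, 100))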

  17. Coherent Lienard-Wiechert fields produced by FELs (free-electron lasers). Technical report, 14 January 1981-13 January 1982

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elias, L.R.

    1981-12-01

    Results are presented of a three-dimensional numerical analysis of the radiation fields produced in a free-electron laser. The method used here to obtain the spatial and temporal behavior of the radiated fields is based on the coherent superposition of the exact Lienard-Wiechert fields produced by each electron in the beam. Interference effects are responsible for the narrow angular radiation patterns obtained and for the high degree of monochromaticity of the radiated fields.

  18. Microstructure and properties of Ti-Al intermetallic/Al2O3 layers produced on Ti6Al2Mo2Cr titanium alloy by PACVD method

    NASA Astrophysics Data System (ADS)

    Sitek, R.; Bolek, T.; Mizera, J.

    2018-04-01

    The paper presents an investigation of the microstructure and corrosion resistance of multi-component surface layers built of intermetallic phases of the Ti-Al system with an outer Al2O3 ceramic sub-layer. The layers were produced on a two-phase (α + β) Ti6Al2Mo2Cr titanium alloy using the PACVD method with the participation of trimethylaluminum vapors. The layers are characterized by high surface hardness and good corrosion resistance, better than those of the material in its starting state. In order to find the correlation between their structure and properties, the layers were examined using optical microscopy, X-ray diffraction analysis (XRD), surface analysis by XPS, scanning electron microscopy (SEM), and analyses of the chemical composition (EDS). The properties examined included corrosion resistance and hydrogen absorptiveness. Moreover, the growth of the Al2O3 ceramic layer and its influence on the residual stress distribution were simulated using the finite element method (FEM). The results showed that the produced layer has an amorphous-nanocrystalline structure, improved corrosion resistance, and reduced hydrogen permeability compared with the Ti6Al2Mo2Cr base alloy.

  19. Morphology and crystallinity of sisal nanocellulose after sonication

    NASA Astrophysics Data System (ADS)

    Sosiati, H.; Wijayanti, D. A.; Triyana, K.; Kamiel, B.

    2017-09-01

    Different preparation methods applied to natural fibers result in different morphologies. However, the relationships among the type of natural fiber, the preparation method, and the morphology of the produced nanocellulose have not been exactly defined. In this work, sisal nanocellulose was prepared by alkalization and bleaching followed by sonication to verify changes in the morphology and crystallinity of the nanocellulose related to its formation mechanism. The extracted microcellulose was subjected to scanning electron microscopy (SEM) and X-ray diffraction (XRD) analysis. The isolated cellulose nanospheres were examined with respect to morphology by SEM and transmission electron microscopy (TEM), and to crystallinity by electron diffraction analysis. Bleaching after alkalization separated the microfibrils clearly from each other into individual fibers whose single-fiber widths ranged from 6 to 13 µm. The XRD crystallinity index (CI) of the microcellulose gradually increased with the chemical treatments: 83.12% for raw sisal fiber, 88.57% for alkali-treated fiber, and 94.03% for bleached fibers. Ultrasonic agitation after bleaching, carried out at 750 W, 20 kHz, and an amplitude of 39% for 2 h, produced homogeneous cellulose nanospheres less than 50 nm in diameter with relatively low crystallinity. The electron diffraction analysis confirmed that the low crystallinity of the produced nanocellulose is related to the effect of the chemical treatment performed before sonication.

  20. Symbiotic Fungus of Marine Sponge Axinella sp. Producing Antibacterial Agent

    NASA Astrophysics Data System (ADS)

    Trianto, A.; Widyaningsih, S.; Radjasa, OK; Pribadi, R.

    2017-02-01

    The emergence of multidrug-resistant (MDR) pathogenic bacteria has made the treatment of the diseases they cause ineffective. Therefore, the discovery of new drugs with novel modes of action is essential for curing diseases caused by MDR pathogens. Marine fungi are a prolific source of bioactive compounds that has not been well explored. This study aimed to obtain marine sponge-associated fungi that produce anti-MDR-bacterial substances. We collected the sponge from Riung waters, NTT, Indonesia. The fungus was isolated with the affixed method, followed by purification with the streak method. The overlay and disk diffusion agar methods were applied as bioactivity tests for the isolate and the extract, respectively. Molecular analysis was employed for identification of the isolate. The sponge was identified based on morphological and spicular analysis. The overlay test showed that isolate KN15-3 was active against MDR Staphylococcus aureus and Escherichia coli. The extract of the cultured KN15-3 also inhibited S. aureus and E. coli, with inhibition zones of 2.95 mm and 4.13 mm, respectively. Based on the molecular analysis, the fungus was identified as Aspergillus sydowii, while the sponge was identified as Axinella sp.

  1. A comprehensive evaluation of popular proteomics software workflows for label-free proteome quantification and imputation.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2017-05-31

    Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software packages exist to process the raw MS data into quantified protein abundances, including open-source and commercial solutions. Each package includes a set of unique algorithms for different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the number of missing values produced by the different proteomics software packages and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software packages decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
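    A minimal sketch, under assumptions, of the local least squares (lls) imputation that performed best above (not the code used in the study): each protein row with missing values is regressed on the k most correlated complete rows over the observed samples, and the fit predicts the gaps.

    import numpy as np

    def lls_impute(X, k=10):
        """X: (proteins, samples) array with np.nan marking missing abundances."""
        X = X.copy()
        complete = ~np.isnan(X).any(axis=1)
        donors = X[complete]                       # rows with no missing values
        for i in np.where(~complete)[0]:
            obs = ~np.isnan(X[i])
            # similarity of row i to each donor row on the observed samples
            corr = np.array([abs(np.corrcoef(X[i, obs], d[obs])[0, 1])
                             for d in donors])
            nn = donors[np.argsort(corr)[-k:]]     # k most correlated donors
            # least-squares fit on observed columns, prediction on missing ones
            coef, *_ = np.linalg.lstsq(nn[:, obs].T, X[i, obs], rcond=None)
            X[i, ~obs] = nn[:, ~obs].T @ coef
        return X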

  2. A comparison of five standard methods for evaluating image intensity uniformity in partially parallel imaging MRI

    PubMed Central

    Goerner, Frank L.; Duong, Timothy; Stafford, R. Jason; Clarke, Geoffrey D.

    2013-01-01

    Purpose: To investigate the utility of five different standard measurement methods for determining image uniformity for partially parallel imaging (PPI) acquisitions in terms of consistency across a variety of pulse sequences and reconstruction strategies. Methods: Images were produced with a phantom using a 12-channel head matrix coil in a 3T MRI system (TIM TRIO, Siemens Medical Solutions, Erlangen, Germany). Images produced using echo-planar, fast spin echo, gradient echo, and balanced steady-state free precession pulse sequences were evaluated. Two different PPI reconstruction methods were investigated, the generalized autocalibrating partially parallel acquisition algorithm (GRAPPA) and modified sensitivity encoding (mSENSE), with acceleration factors (R) of 2, 3, and 4. Additionally, images were acquired with conventional two-dimensional Fourier imaging methods (R = 1). Five measurement methods of uniformity, recommended by the American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA), were considered. The methods investigated were (1) an ACR method and (2) a NEMA method for calculating the peak deviation nonuniformity, (3) a modification of a NEMA method used to produce a gray-scale uniformity map, (4) determining the normalized absolute average deviation uniformity, and (5) a NEMA method that focused on 17 areas of the image to measure uniformity. Changes in uniformity as a function of reconstruction method at the same R-value were also investigated. Two-way analysis of variance (ANOVA) was used to determine whether R-value or reconstruction method had a greater influence on signal intensity uniformity measurements for partially parallel MRI. Results: Two of the methods studied had consistently negative slopes when signal intensity uniformity was plotted against R-value. The results obtained comparing mSENSE against GRAPPA found no consistent difference between GRAPPA and mSENSE with regard to signal intensity uniformity. The results of the two-way ANOVA analysis suggest that R-value and pulse sequence type produce the largest influences on uniformity and that the PPI reconstruction method had relatively little effect. Conclusions: Two of the methods of measuring signal intensity uniformity, described by the NEMA MRI standards, consistently indicated a decrease in uniformity with an increase in R-value. The other methods investigated did not demonstrate consistent results for evaluating signal uniformity in MR images obtained by partially parallel methods. However, because the spatial distribution of noise affects uniformity, it is recommended that additional uniformity quality metrics be investigated for partially parallel MR images. PMID:23927345
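    For reference, the ACR/NEMA peak-deviation measure referred to above is commonly written as PIU = 100 x (1 - (Smax - Smin)/(Smax + Smin)); a minimal sketch of that calculation over a region of interest (not the study's code, and only one of the five methods compared):

    import numpy as np

    def peak_deviation_uniformity(roi):
        """Percent integral uniformity of the signal in a region of interest."""
        s_max, s_min = float(np.max(roi)), float(np.min(roi))
        return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))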

  3. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a solution identification and concentration quantification method based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, from which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of quantitative concentration analysis.
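    A minimal sketch of the two-stage pipeline described above (not the authors' code): linear discriminant functions identify the solution type, a single representative parameter is extracted, and a cubic spline maps it to concentration. The synthetic stand-in library and the scikit-learn/SciPy calls are assumptions for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.decomposition import PCA
    from scipy.interpolate import CubicSpline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 8))                 # stand-in: 8 COT values per sample
    y = np.repeat([0, 1], 20)                    # two solution types

    lda = LinearDiscriminantAnalysis().fit(X, y)      # stage 1: identify the solution

    conc = np.linspace(1, 10, 20)                # known training concentrations
    pca = PCA(n_components=1).fit(X[y == 0])     # stage 2: one representative parameter
    param = pca.transform(X[y == 0]).ravel()
    order = np.argsort(param)
    spline = CubicSpline(param[order], conc[order])   # parameter -> concentration

    x_new = rng.normal(size=(1, 8))              # unknown test solution
    print(lda.predict(x_new), spline(pca.transform(x_new).ravel()))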

  4. Initial Experiences with Machine-Assisted Reconsiderative Test Scoring: A New Method for Partial Credit and Multiple Correct Responses.

    ERIC Educational Resources Information Center

    Anderson, Paul S.

    Initial experiences with computer-assisted reconsiderative scoring are described. Reconsiderative scoring occurs when student responses are received and reviewed by the teacher before points for correctness are assigned. Manually scored completion-style questions are reconsiderative. A new method of machine assistance produces an item analysis on…

  5. 40 CFR Appendix C to Part 136 - Inductively Coupled Plasma-Atomic Emission Spectrometric Method for Trace Element Analysis of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Appendix C to Part 136—Inductively Coupled Plasma—Atomic Emission Spectrometric Method for Trace Element... technique. Samples are nebulized and the aerosol that is produced is transported to the plasma torch where...

  6. 40 CFR Appendix C to Part 136 - Inductively Coupled Plasma-Atomic Emission Spectrometric Method for Trace Element Analysis of...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Appendix C to Part 136—Inductively Coupled Plasma—Atomic Emission Spectrometric Method for Trace Element... technique. Samples are nebulized and the aerosol that is produced is transported to the plasma torch where...

  7. A comparison of two adaptive multivariate analysis methods (PLSR and ANN) for winter wheat yield forecasting using Landsat-8 OLI images

    NASA Astrophysics Data System (ADS)

    Chen, Pengfei; Jing, Qi

    2017-02-01

    An assumption that a non-linear method is more reasonable than a linear method when canopy reflectance is used to establish a yield prediction model was proposed and tested in this study. For this purpose, partial least squares regression (PLSR) and artificial neural networks (ANN), representing linear and non-linear analysis methods, respectively, were applied and compared for wheat yield prediction. Multi-period Landsat-8 OLI images were collected at two different wheat growth stages, and a field campaign was conducted to obtain grain yields at selected sampling sites in 2014. The field data were divided into a calibration database and a testing database. Using the calibration data, a cross-validation concept was introduced for the PLSR and ANN model construction to prevent over-fitting. All models were tested using the test data. The ANN yield-prediction model produced R2, RMSE, and RMSE% values of 0.61, 979 kg ha-1, and 10.38%, respectively, in the testing phase, performing better than the PLSR yield-prediction model, which produced R2, RMSE, and RMSE% values of 0.39, 1211 kg ha-1, and 12.84%, respectively. The non-linear method was therefore suggested as the better method for yield prediction.
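    A minimal sketch of the linear-versus-non-linear comparison described above (not the authors' code); the synthetic reflectance data, component count, and network size are illustrative assumptions:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 12))               # stand-in multi-band reflectance
    y = (X @ rng.normal(size=12) + 0.5 * np.sin(X[:, 0])
         + rng.normal(scale=0.1, size=120))      # yield with a mild non-linearity

    models = [("PLSR", PLSRegression(n_components=4)),
              ("ANN", MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0))]
    for name, model in models:
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(name, round(float(r2), 3))         # cross-validated R2 for each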

  8. Vegetation classification of Coffea on Hawaii Island using WorldView-2 satellite imagery

    NASA Astrophysics Data System (ADS)

    Gaertner, Julie; Genovese, Vanessa Brooks; Potter, Christopher; Sewake, Kelvin; Manoukis, Nicholas C.

    2017-10-01

    Coffee is an important crop in tropical regions of the world; about 125 million people depend on coffee agriculture for their livelihoods. Understanding the spatial extent of coffee fields is useful for management and control of coffee pests such as Hypothenemus hampei and other pests that use coffee fruit as a host for immature stages such as the Mediterranean fruit fly, for economic planning, and for following changes in coffee agroecosystems over time. We present two methods for detecting Coffea arabica fields using remote sensing and geospatial technologies on WorldView-2 high-resolution spectral data of the Kona region of Hawaii Island. The first method, a pixel-based method using a maximum likelihood algorithm, attained 72% producer accuracy and 69% user accuracy (68% overall accuracy) based on analysis of 104 ground truth testing polygons. The second method, an object-based image analysis (OBIA) method, considered both spectral and textural information and improved accuracy, resulting in 76% producer accuracy and 94% user accuracy (81% overall accuracy) for the same testing areas. We conclude that the OBIA method is useful for detecting coffee fields grown in the open and use it to estimate the distribution of about 1050 hectares under coffee agriculture in the Kona region in 2012.
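    The producer and user accuracies quoted above are standard confusion-matrix summaries; a minimal sketch (the two-class matrix below is a toy example, not the study's data):

    import numpy as np

    def accuracies(cm):
        """cm[i, j]: reference-class-i pixels mapped to class j."""
        diag = np.diag(cm).astype(float)
        producer = diag / cm.sum(axis=1)   # 1 - omission error, per class
        user = diag / cm.sum(axis=0)       # 1 - commission error, per class
        overall = diag.sum() / cm.sum()
        return producer, user, overall

    cm = np.array([[76, 24],               # toy: coffee vs. non-coffee
                   [ 5, 95]])
    print(accuracies(cm))                  # coffee: 76% producer, ~94% user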

  9. A two dimensional power spectral estimate for some nonstationary processes. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Smith, Gregory L.

    1989-01-01

    A two-dimensional estimate of the power spectral density of a nonstationary process is being developed. The estimate will be applied to helicopter noise data, which are clearly nonstationary. The acoustic pressure from the isolated main rotor and isolated tail rotor is known to be periodically correlated (PC), and the combined noise from the main and tail rotors is assumed to be correlation autoregressive (CAR). The results of this nonstationary analysis will be compared with the current method of assuming that the data are stationary and analyzing them as such. Another method of analysis is to introduce a random phase shift into the data, as shown by Papoulis, to produce a time history which can then be accurately modeled as stationary. This method will also be investigated for the helicopter data. A method used to determine the period of a PC process when the period is not known is discussed. The period of a PC process must be known in order to produce an accurate spectral representation of the process. The spectral estimate is developed, and the bias and variability of the estimate are discussed. Finally, the current method for analyzing nonstationary data is compared to that of using a two-dimensional spectral representation. In addition, the method of phase shifting the data is examined.

  10. Using multi-criteria analysis of simulation models to understand complex biological systems

    Treesearch

    Maureen C. Kennedy; E. David Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...

  11. Analysis of standard reference materials by absolute INAA

    NASA Astrophysics Data System (ADS)

    Heft, R. E.; Koszykowski, R. F.

    1981-07-01

    Three standard reference materials (fly ash, soil, and AISI 4340 steel) are analyzed by a method of absolute instrumental neutron activation analysis. Two different light-water pool-type reactors were used and produced equivalent analytical results even though the epithermal-to-thermal flux ratio in one reactor was higher than that in the other by a factor of two.

  12. Analysis of Job Announcements and the Required Competencies for Instructional Technology Professionals.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    A study was conducted to analyze current job announcements in the field of instructional design and technology and to produce descriptive information that portrays the required skills and areas of knowledge for instructional technology graduates. Content analysis, in its general terms, was used as the research method for this study. One hundred…

  13. A Comparative Analysis of Numbers and Biology Content Domains between Turkey and the USA

    ERIC Educational Resources Information Center

    Incikabi, Lutfi; Ozgelen, Sinan; Tjoe, Hartono

    2012-01-01

    This study aimed to compare Mathematics and Science programs focusing on TIMSS content domains of Numbers and Biology that produced the largest achievement gap among students from Turkey and the USA. Specifically, it utilized the content analysis method within Turkish and New York State (NYS) frameworks. The procedures of study included matching…

  14. Accuracy Evaluation of the Unified P-Value from Combining Correlated P-Values

    PubMed Central

    Alves, Gelio; Yu, Yi-Kuo

    2014-01-01

    Meta-analysis methods that combine p-values into a single unified p-value are frequently employed to improve confidence in hypothesis testing. An assumption made by most meta-analysis methods is that the p-values to be combined are independent, which may not always be true. To investigate the accuracy of the unified p-value from combining correlated p-values, we have evaluated a family of statistical methods that combine independent, weighted independent, correlated, and weighted correlated p-values. Statistical accuracy evaluation by combining simulated correlated p-values showed that correlation among p-values can have a significant effect on the accuracy of the combined p-value obtained. Among the statistical methods evaluated, those that weight p-values compute more accurate combined p-values than those that do not. Also, statistical methods that utilize the correlation information have the best performance, producing significantly more accurate combined p-values. In our study we have demonstrated that statistical methods that combine p-values based on the assumption of independence can produce inaccurate p-values when combining correlated p-values, even when the p-values are only weakly correlated. Therefore, to prevent drawing false conclusions during hypothesis testing, our study advises caution when interpreting the p-value obtained from combining p-values of unknown correlation. However, when the correlation information is available, the weighting-capable statistical method, first introduced by Brown and recently modified by Hou, seems to perform the best amongst the methods investigated. PMID:24663491
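    A minimal sketch (not the paper's code) of the two families compared above: Fisher's independent combination and a Brown-style correlated combination. The covariance matrix of the -2*ln(p) terms is assumed to be supplied; in practice it must be estimated from the data.

    import numpy as np
    from scipy import stats

    def fisher_combine(pvals):
        """Fisher's method; assumes independent p-values."""
        x = -2.0 * np.sum(np.log(pvals))
        return stats.chi2.sf(x, df=2 * len(pvals))

    def brown_combine(pvals, cov):
        """Brown-style correction: fit a scaled chi-square to the statistic
        using the covariance matrix `cov` of the -2*ln(p) terms."""
        k = len(pvals)
        mean, var = 2.0 * k, float(np.sum(cov))
        c = var / (2.0 * mean)             # scale factor
        df = 2.0 * mean**2 / var           # effective degrees of freedom
        x = -2.0 * np.sum(np.log(pvals))
        return stats.chi2.sf(x / c, df=df)

    # with cov = 4*I (independence), brown_combine reduces to fisher_combine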

  15. A versatile, stability-indicating and high-throughput ultra-fast liquid chromatography method for the determination of isoflavone aglycones in soybeans, topical formulations, and permeation assays.

    PubMed

    Nemitz, Marina C; Yatsu, Francini K J; Bidone, Juliana; Koester, Letícia S; Bassani, Valquiria L; Garcia, Cássia V; Mendez, Andreas S L; von Poser, Gilsane L; Teixeira, Helder F

    2015-03-01

    There is a growing interest in the pharmaceutical field concerning isoflavone topical delivery systems, especially with regard to their skin care properties and antiherpetic activity. In this context, the present work describes an ultra-fast liquid chromatography (UFLC) method for determining daidzein, glycitein, and genistein in different matrices during the development of topical systems containing isoflavone aglycones (IA) obtained from soybeans. The method was shown to be specific, precise, accurate, and linear (0.1 to 5 µg mL(-1)) for IA determination in soybean acid extract, the IA-rich fraction obtained after the purification process, IA-loaded nanoemulsions, and a topical hydrogel, as well as for permeation/retention assays in porcine skin and porcine esophageal mucosa. The matrix effect was determined for all complex matrices, demonstrating a low effect during the analysis. The stability-indicating capability of the UFLC method was verified by submitting IA to acidic, alkaline, oxidative, and thermal stress conditions, and no interference of degradation products was detected during analysis. Mass spectrometry was performed to identify the main compounds produced after acid hydrolysis of soybeans and to suggest the main degradation products formed after stress conditions. Besides the IA, hydroxymethylfurfural and ethoxymethylfurfural were produced and identified after acid hydrolysis of the soybean extract and were well separated by the UFLC method. The method's robustness was confirmed using the Plackett-Burman experimental design. Therefore, the new method affords fast IA analysis during routine processes, extract purification, product development, and bioanalytical assays. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Evaluating the impact of field-scale management strategies on sediment transport to the watershed outlet.

    PubMed

    Sommerlot, Andrew R; Pouyan Nejadhashemi, A; Woznicki, Sean A; Prohaska, Michael D

    2013-10-15

    Non-point source pollution from agricultural lands is a significant contributor of sediment pollution in United States lakes and streams. Therefore, quantifying the impact of individual field management strategies at the watershed scale provides valuable information to watershed managers and conservation agencies to enhance decision-making. In this study, four methods employing some of the most cited models in field- and watershed-scale analysis were compared to find a practical yet accurate method for evaluating field management strategies at the watershed outlet. The models used in this study included a field-scale model (the Revised Universal Soil Loss Equation 2, RUSLE2), a spatially explicit overland sediment delivery model (SEDMOD), and a watershed-scale model (the Soil and Water Assessment Tool, SWAT). These models were used to develop four modeling strategies (methods) for the River Raisin watershed: Method 1) predefined field-scale subbasin and reach layers were used in the SWAT model; Method 2) a subbasin-scale sediment delivery ratio was employed; Method 3) results obtained from the field-scale RUSLE2 model were incorporated as point-source inputs to the SWAT watershed model; and Method 4) a hybrid solution combining analyses from the RUSLE2, SEDMOD, and SWAT models. Method 4 was selected as the most accurate among the studied methods. In addition, the effectiveness of six best management practices (BMPs) in terms of water quality improvement and associated cost was assessed. Economic analysis was performed using Method 4, and producer-requested prices for BMPs were compared with prices defined by the Environmental Quality Incentives Program (EQIP). On a per-unit-area basis, producers requested higher prices than EQIP in four out of six BMP categories. Meanwhile, the true cost of sediment reduction at the field and watershed scales was greater than EQIP in five of six BMP categories according to producer-requested prices. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Global Gene Expression Analysis of Yeast Cells during Sake Brewing

    PubMed Central

    Wu, Hong; Zheng, Xiaohong; Araki, Yoshio; Sahara, Hiroshi; Takagi, Hiroshi; Shimoi, Hitoshi

    2006-01-01

    During the brewing of Japanese sake, Saccharomyces cerevisiae cells produce a high concentration of ethanol compared with other ethanol fermentation methods. We analyzed the gene expression profiles of yeast cells during sake brewing using DNA microarray analysis. This analysis revealed some characteristics of yeast gene expression during sake brewing and provided a scaffold for a molecular level understanding of the sake brewing process. PMID:16997994

  18. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction

    NASA Technical Reports Server (NTRS)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.
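    For readers unfamiliar with the reduction mentioned above, the standard tau-p mapping of travel-time data is, in the usual notation,

    \tau(p) = T(\Delta) - p\,\Delta, \qquad p = \frac{dT}{d\Delta},

    and \tau(p) is single-valued and monotonically decreasing even when T(\Delta) contains a triplication, which is why the constrained spline fit is performed in this domain.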

  19. Measuring toothbrush interproximal penetration using image analysis

    NASA Astrophysics Data System (ADS)

    Hayworth, Mark S.; Lyons, Elizabeth K.

    1994-09-01

    An image analysis method of measuring the effectiveness of a toothbrush in reaching the interproximal spaces of teeth is described. Artificial teeth are coated with a stain that approximates real plaque and then brushed with a toothbrush on a brushing machine. The teeth are then removed and turned sideways so that the interproximal surfaces can be imaged. The areas of stain that have been removed within masked regions that define the interproximal regions are measured and reported. These areas correspond to the interproximal areas of the tooth reached by the toothbrush bristles. The image analysis method produces more precise results (10-fold decrease in standard deviation) in a fraction (22%) of the time as compared to our prior visual grading method.
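    A minimal sketch, under assumptions (grayscale images, a fixed stain threshold, a boolean mask defining the interproximal region), of the masked area measurement described above; it is not the authors' implementation:

    import numpy as np

    def stain_removed_fraction(before, after, mask, threshold=128):
        """Fraction of stained area inside the interproximal mask removed by brushing."""
        stained_before = (before < threshold) & mask   # dark pixels taken as stain
        stained_after = (after < threshold) & mask
        removed = stained_before & ~stained_after
        return removed.sum() / max(stained_before.sum(), 1)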

  20. Sentiment Analysis of Health Care Tweets: Review of the Methods Used.

    PubMed

    Gohil, Sunir; Vuik, Sabine; Darzi, Ara

    2018-04-23

    Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." Many unstructured, free-text tweets relating to health care are shared on Twitter, which is becoming a popular area for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis in Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how the tools were produced, and how their accuracy was analyzed. A review of the literature was conducted pertaining to Twitter and health care research that used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used open-source tools available freely, and 2 used commercially available software. Moreover, 4 out of the 12 tools were trained using a smaller sample of the study's final data. The sentiment methods were trained against, on average, 0.45% (2816/627,024) of the total sample data. One of the 12 papers commented on the analysis of the accuracy of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic categorizations to more complex and expensive commercial software. The open-source and commercial methods are developed on product reviews and generic social media messages. None of these methods has been extensively tested against a corpus of health care messages to check their accuracy. This study suggests that there is a need for an accurate and tested tool for sentiment analysis of tweets, trained using a health care setting-specific corpus of manually annotated tweets. ©Sunir Gohil, Sabine Vuik, Ara Darzi. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 23.04.2018.

  1. Analysis of heparin oligosaccharides by capillary electrophoresis-negative-ion electrospray ionization mass spectrometry.

    PubMed

    Lin, Lei; Liu, Xinyue; Zhang, Fuming; Chi, Lianli; Amster, I Jonathan; Leach, Franklyn E; Xia, Qiangwei; Linhardt, Robert J

    2017-01-01

    Most hyphenated analytical approaches that rely on liquid chromatography-MS require relatively long separation times, produce incomplete resolution of oligosaccharide mixtures, use eluents that are incompatible with electrospray ionization, or require oligosaccharide derivatization. Here we demonstrate the analysis of heparin oligosaccharides, including disaccharides, ultralow molecular weight heparin, and a low molecular weight heparin, using a novel electrokinetic pump-based CE-MS coupling electrospray ion source. Reverse-polarity CE separation and negative-mode electrospray ionization were optimized using a volatile methanolic ammonium acetate electrolyte and sheath fluid. The online CE hyphenated negative-ion electrospray ionization MS on an LTQ Orbitrap mass spectrometer was useful in disaccharide compositional analysis and bottom-up and top-down analysis of low molecular weight heparin. The application of this CE-MS method to ultralow molecular weight heparin suggests that the charge-state distribution and the low level of sulfate group loss that are achieved make this method useful for online tandem MS analysis of heparins.

  2. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    DOE PAGES

    Angland, P.; Haberberger, D.; Ivancic, S. T.; ...

    2017-10-30

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical-uncertainty calculation are based on a minimization of the $\chi^2$ test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.

  3. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angland, P.; Haberberger, D.; Ivancic, S. T.

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical-uncertainty calculation are based on a minimization of the $\chi^2$ test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
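    A minimal sketch (not the released analysis code) of the annealing loop described above: perturb the eight profile parameters, always accept improvements, accept uphill moves with probability exp(-dchi2/T), and cool geometrically. The chi2 callable comparing the synthetic and measured AFR images is assumed to be supplied.

    import numpy as np

    def anneal(chi2, p0, steps=20000, t0=1.0, cooling=0.9995, scale=0.05, seed=0):
        """Minimize chi2(p) over the profile parameters by simulated annealing."""
        rng = np.random.default_rng(seed)
        p = np.asarray(p0, dtype=float)
        f = chi2(p)
        best, f_best, t = p.copy(), f, t0
        for _ in range(steps):
            trial = p + scale * rng.normal(size=p.size)   # random perturbation
            f_trial = chi2(trial)
            # accept improvements outright; uphill moves with Boltzmann probability
            if f_trial < f or rng.random() < np.exp((f - f_trial) / t):
                p, f = trial, f_trial
                if f < f_best:
                    best, f_best = p.copy(), f
            t *= cooling                                  # geometric cooling
        return best, f_best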

  4. On the release of cppxfel for processing X-ray free-electron laser images.

    PubMed

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian

    2016-06-01

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  5. On the release of cppxfel for processing X-ray free-electron laser images

    DOE PAGES

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K.; ...

    2016-05-11

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  6. Advanced composite elevator for Boeing 727 aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.

    1980-01-01

    Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.

  7. A new method of linkage analysis using LOD scores for quantitative traits supports linkage of monoamine oxidase activity to D17S250 in the Collaborative Study on the Genetics of Alcoholism pedigrees.

    PubMed

    Curtis, David; Knight, Jo; Sham, Pak C

    2005-09-01

    Although LOD score methods have been applied to diseases with complex modes of inheritance, linkage analysis of quantitative traits has tended to rely on non-parametric methods based on regression or variance components analysis. Here, we describe a new method for LOD score analysis of quantitative traits which does not require specification of a mode of inheritance. The technique is derived from the MFLINK method for dichotomous traits. A range of plausible transmission models is constructed, constrained to yield the correct population mean and variance for the trait but differing with respect to the contribution to the variance due to the locus under consideration. Maximized LOD scores under homogeneity and admixture are calculated, as is a model-free LOD score which compares the maximized likelihoods under admixture assuming linkage and no linkage. These LOD scores have known asymptotic distributions and hence can be used to provide a statistical test for linkage. The method has been implemented in a program called QMFLINK. It was applied to data sets simulated using a variety of transmission models and to a measure of monoamine oxidase activity in 105 pedigrees from the Collaborative Study on the Genetics of Alcoholism. With the simulated data, the results showed that the new method could detect linkage well if the true allele frequency for the trait was close to that specified. However, it performed poorly on models in which the true allele frequency was much rarer. For the Collaborative Study on the Genetics of Alcoholism data set only a modest overlap was observed between the results obtained from the new method and those obtained when the same data were analysed previously using regression and variance components analysis. Of interest is that D17S250 produced a maximized LOD score under homogeneity and admixture of 2.6 but did not indicate linkage using the previous methods. However, this region did produce evidence for linkage in a separate data set, suggesting that QMFLINK may have been able to detect a true linkage which was not picked up by the other methods. The application of model-free LOD score analysis to quantitative traits is novel and deserves further evaluation of its merits and disadvantages relative to other methods.

  8. Plane elasto-plastic analysis of v-notched plate under bending by boundary integral equation method. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Rzasnicki, W.

    1973-01-01

    A method of solution is presented which, when applied to the elasto-plastic analysis of plates having a v-notch on one edge and subjected to pure bending, will produce stress and strain fields in much greater detail than presently available. Application of the boundary integral equation method results in two coupled Fredholm-type integral equations, subject to prescribed boundary conditions. These equations are replaced by a system of simultaneous algebraic equations and solved by a successive approximation method employing Prandtl-Reuss incremental plasticity relations. The method is first applied to a number of elasto-static problems and the results compared with available solutions. Good agreement is obtained in all cases. The elasto-plastic analysis provides detailed stress and strain distributions for several cases of plates with various notch angles and notch depths. A strain hardening material is assumed and both plane strain and plane stress conditions are considered.
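
    As a minimal illustration of the successive-approximation strategy used for the coupled integral equations, the following Python sketch solves a single Fredholm integral equation of the second kind by fixed-point iteration on a quadrature grid; the kernel, forcing function and parameter values are placeholders, not those of the v-notch problem.

      import numpy as np

      # Solve u(x) = f(x) + lam * int_0^1 K(x,t) u(t) dt by successive approximation.
      n = 101
      x = np.linspace(0.0, 1.0, n)
      w = np.full(n, x[1] - x[0]); w[0] *= 0.5; w[-1] *= 0.5   # trapezoid weights
      K = np.exp(-np.abs(x[:, None] - x[None, :]))             # placeholder kernel
      f = np.sin(np.pi * x)                                    # placeholder forcing
      lam = 0.2                                                # contraction: |lam|*||K|| < 1

      u = f.copy()
      for it in range(200):
          u_new = f + lam * (K * w) @ u      # quadrature approximation of the integral
          if np.max(np.abs(u_new - u)) < 1e-10:
              break
          u = u_new
      print(it, u[:5])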

  9. Feature Selection for Classification of Polar Regions Using a Fuzzy Expert System

    NASA Technical Reports Server (NTRS)

    Penaloza, Mauel A.; Welch, Ronald M.

    1996-01-01

    Labeling, feature selection, and the choice of classifier are critical elements for classification of scenes and for image understanding. This study examines several methods for feature selection in polar regions, including the use of a fuzzy logic-based expert system for further refinement of a set of selected features. Six Advanced Very High Resolution Radiometer (AVHRR) Local Area Coverage (LAC) arctic scenes are classified into nine classes: water, snow/ice, ice cloud, land, thin stratus, stratus over water, cumulus over water, textured snow over water, and snow-covered mountains. Sixty-seven spectral and textural features are computed and analyzed by the feature selection algorithms. The divergence, histogram analysis, and discriminant analysis approaches are intercompared for their effectiveness in feature selection. The fuzzy expert system method is used not only to determine the effectiveness of each approach in classifying polar scenes, but also to further reduce the features to a more optimal set. For each selection method, features are ranked from best to worst, and the best half of the features are selected. Then, rules using these selected features are defined. The results of running the fuzzy expert system with these rules show that the divergence method produces the best feature set: it not only yields the highest classification accuracy but also has the lowest computational requirements. A reduction of the set of features produced by the divergence method using the fuzzy expert system results in an overall classification accuracy of over 95%. However, this increase in accuracy comes at a high computational cost.
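
    A minimal Python sketch of divergence-based feature ranking of the kind compared above, assuming one-dimensional Gaussian class-conditional statistics; it is illustrative only and not the authors' algorithm.

      import numpy as np

      def gauss_divergence(mu1, var1, mu2, var2):
          """Divergence between two 1-D Gaussian class-conditional densities."""
          return (0.5 * (var1 / var2 + var2 / var1 - 2.0)
                  + 0.5 * (mu1 - mu2) ** 2 * (1.0 / var1 + 1.0 / var2))

      def rank_features(X, y):
          """Rank features by mean pairwise class divergence (largest first).
          X: (samples, features) array; y: class labels."""
          X, y = np.asarray(X, float), np.asarray(y)
          classes = np.unique(y)
          score = np.zeros(X.shape[1])
          for k in range(X.shape[1]):
              d = [gauss_divergence(X[y == a, k].mean(), X[y == a, k].var() + 1e-9,
                                    X[y == b, k].mean(), X[y == b, k].var() + 1e-9)
                   for i, a in enumerate(classes) for b in classes[i + 1:]]
              score[k] = np.mean(d)
          return np.argsort(score)[::-1], score

    Selecting the best half of the features, as in the study, would then simply take the first half of the returned ranking.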

  10. Genetic diversity of Bacillus sp producers of amylase isolated from the soil.

    PubMed

    Xavier, A R E O; Lima, E R; Oliveira, A M E; Cardoso, L; Santos, J; Cangussu, C H C; Leite, L N; Quirino, M C L; Júnior, I G C; Oliveira, D A; Xavier, M A S

    2017-09-27

    Microorganisms are the best source of extracellular enzymes because they enable economical, low-resource production technologies compared to animals and plants. Amylases are among the most important of these enzymes, and the genus Bacillus is one of the most investigated for its ability to produce them. The objective of this study was to isolate amylase-producing bacteria of the genus Bacillus from soil and analyze their genetic diversity. To this end, soil samples were collected and subjected to extreme temperature. Serial dilution followed by seeding on solid medium containing starch was used to isolate amylase-producing strains. The isolated microorganisms were subjected to standard morphological methods for presumptive identification of the genus Bacillus. A PCR assay with the universal genetic marker 16S rDNA was used for confirmation of the bacterial strains. All 10 isolates presumptively identified as Bacillus amplified a 370-bp fragment corresponding to the 16S rDNA gene. Enzymatic activity was expressed as an enzymatic index (EI) after 24 h of incubation, and all amylase-producing isolates exhibited an EI ≥ 2.0. The genetic profile and the clonal relationship among the isolates were determined by ERIC-PCR polymorphism analysis, which divided the Bacillus spp isolates into two groups (I and II). This polymorphism analysis proved discriminatory enough to differentiate amylase-producing strains from non-producers.
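
    The enzymatic index used above is commonly computed as the ratio of the starch-hydrolysis halo diameter to the colony diameter on the plate; a minimal Python sketch with hypothetical measurements:

      def enzymatic_index(halo_diameter_mm, colony_diameter_mm):
          """EI = hydrolysis halo diameter / colony diameter (common convention)."""
          return halo_diameter_mm / colony_diameter_mm

      # Hypothetical plate measurements after 24 h of incubation:
      for halo, colony in [(18.0, 7.5), (22.0, 10.0), (12.0, 6.5)]:
          ei = enzymatic_index(halo, colony)
          print(f"EI = {ei:.2f} -> {'producer' if ei >= 2.0 else 'weak/non-producer'}")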

  11. A novel combined approach of diffuse reflectance UV-Vis-NIR spectroscopy and multivariate analysis for non-destructive examination of blue ballpoint pen inks in forensic application

    NASA Astrophysics Data System (ADS)

    Kumar, Raj; Sharma, Vishal

    2017-03-01

    The present research is focused on the analysis of writing inks using destructive UV-Vis spectroscopy (dissolution of ink by a solvent) and non-destructive diffuse reflectance UV-Vis-NIR spectroscopy, along with chemometrics. Fifty-seven samples of blue ballpoint pen inks were analyzed under optimum conditions to determine the differences in spectral features of inks within and between manufacturers. Normalization was performed on the spectroscopic data before chemometric analysis. Principal Component Analysis (PCA) and K-means cluster analysis were applied to the data to ascertain whether the blue ballpoint pen inks could be differentiated by their UV-Vis/UV-Vis-NIR spectra. Discriminating power was calculated qualitatively by visual comparison of the spectra (absorbance peaks) produced by the destructive and non-destructive methods, and chemometrically by pairwise comparison incorporating the clustering method. The chemometric method provides better discriminating power (98.72% and 99.46% for the destructive and non-destructive data, respectively) than the qualitative analysis (69.67%).
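
    Discriminating power in comparison studies of this kind is typically the fraction of sample pairs that the method can tell apart. A minimal Python sketch driven by hypothetical cluster assignments (not the authors' data):

      from itertools import combinations

      def discriminating_power(cluster_labels):
          """Percentage of sample pairs assigned to different clusters,
          i.e. pairs the method discriminates."""
          pairs = list(combinations(cluster_labels, 2))
          discriminated = sum(1 for a, b in pairs if a != b)
          return 100.0 * discriminated / len(pairs)

      # Hypothetical cluster assignments for ten ink samples:
      print(discriminating_power([1, 1, 2, 3, 4, 5, 5, 6, 7, 8]))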

  12. Anti-bacteria activity of carbon nanotubes grown on trimetallic catalyst

    NASA Astrophysics Data System (ADS)

    Ibrahim, S. O.; Abdulkareem, A. S.; Isah, K. U.; Ahmadu, U.; Bankole, M. T.; Kariim, I.

    2018-06-01

    A trimetallic catalyst was prepared using the wet impregnation method to produce carbon nanotubes (CNTs) by catalytic chemical vapor deposition (CCVD). Characterization of the developed catalyst and CNTs was carried out using thermogravimetric analysis (TGA), x-ray diffraction (XRD), Brunauer-Emmett-Teller (BET) specific surface area analysis, Fourier-transform infrared spectroscopy (FTIR), high-resolution scanning electron microscopy (HRSEM)/energy dispersive x-ray spectroscopy (EDS) and high-resolution transmission electron microscopy (HRTEM)/selected area electron diffraction (SAED). The BET and TGA analyses indicated that the catalyst has a high surface area and is thermally stable. FTIR of the developed catalyst showed notable functional groups with the presence of unbound water. HRSEM of the catalyst revealed agglomerated, homogeneous and porous particles, while HRSEM/HRTEM of the produced CNTs showed long strands of multiwalled carbon nanotubes (MWCNTs) with homogeneous, crystalline fringe-like structures of irregular diameter. EDS revealed the dominance of carbon in the elemental composition. The XRD/SAED patterns of the catalyst suggest high dispersion of the metallic particles in the catalyst mixture, while those of the CNTs confirmed that the produced MWCNTs were highly graphitized and crystalline in nature with few structural defects. The anti-bacterial activity of the produced MWCNTs against Klebsiella pneumoniae, Escherichia coli, and Pseudomonas aeruginosa was also assessed. The produced MWCNTs showed an inhibitory effect on Escherichia coli and Klebsiella pneumoniae from day zero through day twelve (nil count). They had no effect on Pseudomonas aeruginosa, which remained too numerous to count from day zero to day six, although its growth broke down between days nine and twelve. This study implies that MWCNTs with varying diameter and well-ordered nanostructure can be produced from such a catalyst via the CCVD method, and the MWCNTs can be recommended for treating infected media contaminated with Klebsiella pneumoniae, Escherichia coli, and Pseudomonas aeruginosa.

  13. FINDING THE CENTER: AN ANALYSIS OF THE TILTED RING MODEL FITS TO THE INNER AND OUTER PARTS OF SIX DWARF GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boisvert, John H.; Rhee, George

    2016-07-01

    We present a study of the H i emission of six dwarf galaxies. Profiles of dark matter halos of galaxies such as these have been the subject of much debate. In this paper we investigate the accuracy with which the dynamical center (the center of rotation) of each galaxy can be determined. We have used the tilted ring model. We find that the tilted ring method produces plausible centers that are consistent with other published works that used rings at radii larger than 1 kpc. At a radius of 1 kpc the method often converges on centers that do not make sense, producing, for example, radial velocities for the galaxies that are inconsistent with the data. The only way to get the method to work in the centers of galaxies is to use prior information about the redshifts to rule out implausible centers. This suggests that the tilted ring method may not be producing reliable rotational velocities in the central kiloparsecs of dwarf galaxies.

  14. WE-H-BRA-08: A Monte Carlo Cell Nucleus Model for Assessing Cell Survival Probability Based On Particle Track Structure Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, B; Georgia Institute of Technology, Atlanta, GA; Wang, C

    Purpose: To correlate the damage produced by particles of different types and qualities to cell survival on the basis of nanodosimetric analysis and advanced DNA structures in the cell nucleus. Methods: A Monte Carlo code was developed to simulate subnuclear DNA chromatin fibers (CFs) of 30 nm utilizing a mean-free-path approach common to radiation transport. The cell nucleus was modeled as a spherical region containing 6000 chromatin-dense domains (CDs) of 400 nm diameter, with additional CFs modeled in a sparser interchromatin region. The Geant4-DNA code was utilized to produce a particle track database representing various particles at different energies and dose quantities. These tracks were used to stochastically position the DNA structures based on their mean free path to interaction with CFs. Excitation and ionization events intersecting CFs were analyzed using the DBSCAN clustering algorithm to assess the likelihood of producing DSBs. Simulated DSBs were then assessed based on their proximity to one another for a probability of inducing cell death. Results: Variations in energy deposition to chromatin fibers match expectations based on differences in particle track structure. The quality of damage to CFs for different particle types indicates more severe damage by high-LET radiation than by low-LET radiation of identical particles. In addition, the model indicates more severe damage by protons than by alpha particles of the same LET, which is consistent with differences in their track structure. Cell survival curves have been produced showing the linear-quadratic behavior of sparsely ionizing radiation. Conclusion: Initial results indicate the feasibility of producing cell survival curves based on the Monte Carlo cell nucleus method. Accurate correlation of simulated DNA damage with cell survival on the basis of nanodosimetric analysis can provide insight into the biological responses to various radiation types. Current efforts are directed at producing cell survival curves for high-LET radiation.
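
    The DBSCAN step described in the Methods can be sketched as follows; the event coordinates, eps and min_samples values below are placeholders, not those of the study.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Hypothetical 3-D coordinates (nm) of ionization/excitation events that
      # intersect chromatin fibers, e.g. extracted from a track-structure code:
      rng = np.random.default_rng(0)
      events = np.vstack([rng.normal(loc=c, scale=3.0, size=(20, 3))
                          for c in ([0, 0, 0], [50, 40, 10])])

      # Cluster events; groups of nearby ionizations are candidate DSB sites.
      labels = DBSCAN(eps=10.0, min_samples=5).fit(events).labels_
      n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # -1 = noise
      print("candidate damage clusters:", n_clusters)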

  15. Investigating the Use of a Diffusion Flame to Produce Black Carbon Standards for Thermal- Optical Analysis of Carbonaceous Aerosols

    NASA Astrophysics Data System (ADS)

    Ortiz Montalvo, D. L.; Kirchstetter, T. W.; Soto-García, L. L.; Mayol-Bracero, O. L.

    2006-12-01

    Combustion-generated particles are a concern for both climate and public health due to their ability to scatter and absorb solar radiation and alter cloud properties, and because they are small enough to be inhaled and deposited in the lungs, where they may cause respiratory and other health problems. Specific concern is focused on particles that originate from the combustion of diesel fuel. Diesel particles are composed mainly of carbonaceous material, especially in locations where diesel fuel sulfur is low. These particles are black due to the strongly light-absorbing nature of the refractory carbon components, appropriately called black carbon (BC). This research project focuses on the uncertainty in the measurement of BC mass concentration, which is typically determined by analysis of particles collected on a filter using a thermal-optical analysis (TOA) method. Many studies have been conducted to examine the accuracy of the commonly used variations of the TOA method, which differ in their sample heating protocol, carrier gas, and optical measurement. These studies show that BC measurements are inaccurate due to the presence of organic carbon (OC) in the aerosols. OC may co-evolve with BC or char to form BC during analysis, both of which make it difficult to distinguish between the OC and BC in the sample. The goal of this study is to develop the capability of producing standard samples with known amounts of BC, either alone or mixed with other aerosol constituents, and then to evaluate which TOA methods accurately determine the BC amount. An inverted diffusion flame of methane and air was used to produce particle samples containing only BC as well as samples of BC mixed with humic acid (HA). Our study found that HA is light-absorbing and catalyzes the combustion of BC. Both of these attributes are expected to challenge the ability of TOA methods to distinguish between OC and BC, particularly the simple two-step TOA method, which relies solely on temperature. These samples were analyzed using two TOA methods to compare the estimates of BC concentration. Future work will focus on preparing a variety of BC standards and comparing measurements of the prepared samples using a range of other TOA methods.

  16. Isolation and characterization of mouse innate lymphoid cells.

    PubMed

    Halim, Timotheus Y F; Takei, Fumio

    2014-08-01

    Innate lymphoid cells (ILCs) are rare populations of cytokine-producing lymphocytes and are divided into three groups, namely ILC1, ILC2, and ILC3, based on the cytokines that they produce. They comprise less than 1% of lymphocytes in mucosal tissues and express no unique cell surface markers. Therefore, they can only be identified by combinations of multiple cell surface markers and further characterized by cytokine production in vitro. Thus, multicolor flow cytometry is the only reliable method to purify and characterize ILCs. Here we describe the methods for cell preparation, flow cytometric analysis, and purification of murine ILC2 and ILC3.

  17. Fabricating TiO2 nanocolloids by electric spark discharge method at normal temperature and pressure.

    PubMed

    Tseng, Kuo-Hsiung; Chang, Chaur-Yang; Chung, Meng-Yun; Cheng, Ting-Shou

    2017-11-17

    In this study, TiO2 nanocolloids were successfully fabricated in deionized water without suspending agents using the electric spark discharge method at room temperature and under normal atmospheric pressure. This method was exceptional in that it did not involve dispersing pre-made nanoparticles, and the produced colloids contained no derivatives. The proposed method requires only traditional electrical discharge machines (EDMs), self-made magnetic stirrers, and Ti wires (purity, 99.99%). The EDM pulse on time (Ton) and pulse off time (Toff) were respectively set at 50 and 100 μs, 100 and 100 μs, 150 and 100 μs, and 200 and 100 μs to produce four types of TiO2 nanocolloids. Zetasizer analysis of the nanocolloids showed that a decrease in Ton increased the suspension stability, but there were no significant correlations between Ton and particle size. Colloids produced from the four production configurations showed a minimum particle size between 29.39 and 52.85 nm and a zeta potential between -51.2 and -46.8 mV, confirming that the method introduced in this study can produce TiO2 nanocolloids with excellent suspension stability. Scanning electron microscopy with energy dispersive spectroscopy also indicated that the TiO2 colloids contained no elements other than Ti and oxygen.

  18. Characteristics and management of flowback/produced water from hydraulically fractured wells in California - findings from the California SB 4 assessment

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Cooley, H.; Heberger, M. G.; Stringfellow, W. T.; Domen, J. K.; Sandelin, W.; Camarillo, M. K.; Jordan, P. D.; Reagan, M. T.; Donnelly, K.; Birkholzer, J. T.; Long, J. C. S.

    2015-12-01

    As part of a recent assessment of well stimulation in California, we analyzed the hazards and potential impacts of hydraulic fracturing (the primary form of well stimulation in California) on water resources. The analysis covered the quantity and quality of flowback/produced water generated, current management and disposal practices, the associated release mechanisms and transport pathways that can lead to contaminants entering the environment, and practices to mitigate or avoid impacts of produced water on water resources. The wastewater returned after stimulation includes "recovered fluids" (flowback fluids collected into tanks following stimulation, but before the start of production) and "produced water" (water extracted with oil and gas during production). In contrast to hydraulic fracturing in regions with primarily gas production, the quantities of recovered fluids from hydraulically fractured wells in California are small in comparison to the fluids injected (typically <5%), and large quantities of produced water are generated. Our analysis indicates some fraction of returning fracturing fluids is likely present in produced water from wells that have been hydraulically fractured. Chemical measurements of recovered fluids show that some samples can contain high levels of some contaminants, including total carbohydrates (indicating the presence of guar, a component of fracturing fluid), total dissolved solids (TDS), trace elements and naturally occurring radioactive material (NORM). Data on produced water chemistry are more limited. In California, produced water is typically managed via pipelines and disposed of or reused in a variety of ways. A majority of produced water from hydraulically fractured wells in California is disposed in percolation pits, many of which may lie in areas with good groundwater quality. Some of the remaining produced water is injected into Class II wells, although a few of those wells are under review or have been shut down because they were injecting into aquifers. Other methods of managing produced water include reuse for irrigation and discharge into sewer systems. Each of these disposal and reuse methods presents its own unique set of concerns that need to be considered together in designing a produced water management plan.

  19. Medicines produced in Telsiai pharmacy (Vilnius governorate): analysis of prescription book from 1830.

    PubMed

    Gudiene, V; Simaitiene, Z

    2014-01-01

    The Museum of the History of Lithuanian Medicine and Pharmacy holds a prescription book kept by the pharmacist Teodor Geldner of Telsiai (then in tsarist Russia's Vilnius Governorate, Lithuania), dated 1830. Each medication produced at the pharmacy had to be registered in this book. The entries included the composition of the drug, its form, usage, price, the physician's name, and (sometimes) the method of production. This paper presents a content analysis of the book. The study revealed which medicines were then used for the treatment of patients, which principles of therapy predominated, and what the social status of the patients was.

  20. The UHPLC-DAD fingerprinting method for analysis of extracellular metabolites of fungi of the genus Geosmithia (Ascomycota: Hypocreales).

    PubMed

    Tylová, Tereza; Kolařík, Miroslav; Olšovská, Jana

    2011-07-01

    A new, simple ultra-high-performance liquid chromatography method with diode array detection (UHPLC-DAD) was developed for chemical fingerprinting analysis of extracellular metabolites in the fermentation broth of Geosmithia spp. An SPE method employing Oasis MCX strong cation-exchange mixed-mode polymeric sorbent was chosen for extraction of the metabolites. The analyses were performed on an Acquity UPLC BEH C18 column (100 × 2.1 mm i.d.; particle size, 1.7 μm; Waters) using a gradient elution program with an aqueous solution of trifluoroacetic acid and acetonitrile as the mobile phase. The applicability of the method was demonstrated by analysis of 38 strains of different species isolated from different sources (hosts). The results revealed a correlation between the obtained UHPLC-DAD fingerprints and taxonomic identity.

  1. Optimized and validated flow-injection spectrophotometric analysis of topiramate, piracetam and levetiracetam in pharmaceutical formulations.

    PubMed

    Hadad, Ghada M; Abdel-Salam, Randa A; Emara, Samy

    2011-12-01

    Application of a sensitive and rapid flow injection analysis (FIA) method for the determination of topiramate, piracetam, and levetiracetam in pharmaceutical formulations has been investigated. The method is based on the reaction with ortho-phthalaldehyde and 2-mercaptoethanol in a basic buffer and measurement of absorbance at 295 nm under flow conditions. Variables affecting the determination, such as sample injection volume, pH, ionic strength, reagent concentrations, reagent flow rate and other FIA parameters, were optimized to produce the most sensitive and reproducible results using a quarter-fraction factorial design for five factors at two levels. The method has also been fully validated in terms of linearity and range, limits of detection and quantitation, precision, selectivity and accuracy, and was successfully applied to the analysis of pharmaceutical preparations.
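
    A quarter-fraction two-level design for five factors (a 2^(5-2) design) requires only eight runs. The Python sketch below generates one such design using the common generators D = AB and E = AC; these generators are illustrative and not necessarily the ones the authors used.

      from itertools import product

      # 2^(5-2) fractional factorial: full factorial in A, B, C,
      # with generators D = AB and E = AC (coded levels -1/+1).
      runs = [(a, b, c, a * b, a * c) for a, b, c in product((-1, 1), repeat=3)]

      print(" A  B  C  D  E")
      for r in runs:
          print(" ".join(f"{v:+d}" for v in r))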

  2. Efficient Formation of Stratospheric Aerosol for Climate Engineering by Emission of Condensible Vapor from Aircraft

    NASA Technical Reports Server (NTRS)

    Pierce, Jeffrey R.; Weisenstein, Debra K.; Heckendorn, Patricia; Peter, Thomas; Keith, David W.

    2010-01-01

    Recent analysis suggests that the effectiveness of stratospheric aerosol climate engineering through emission of non-condensable vapors such as SO2 is limited because the slow conversion to H2SO4 tends to produce aerosol particles that are too large; SO2 injection may be so inefficient that it is difficult to counteract the radiative forcing due to a CO2 doubling. Here we describe an alternate method in which aerosol is formed rapidly in the plume following injection of H2SO4, a condensable vapor, from an aircraft. This method gives better control of particle size and can produce larger radiative forcing with lower sulfur loadings than SO2 injection. Relative to SO2 injection, it may reduce some of the adverse effects of geoengineering such as radiative heating of the lower stratosphere. This method does not, however, alter the fact that such a geoengineered radiative forcing can, at best, only partially compensate for the climate changes produced by CO2.

  3. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we review some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards in supporting the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and as a guide to the selection and use of different pipelines for quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  4. Optimization of mass spectrometric parameters improve the identification performance of capillary zone electrophoresis for single-shot bottom-up proteomics analysis.

    PubMed

    Zhang, Zhenbin; Dovichi, Norman J

    2018-02-25

    The effects of MS1 injection time, MS2 injection time, dynamic exclusion time, intensity threshold, and isolation width on the numbers of peptide and protein identifications were investigated for single-shot bottom-up proteomics using CZE-MS/MS analysis of a Xenopus laevis tryptic digest. An electrokinetically pumped nanospray interface was used to couple a linear-polyacrylamide-coated capillary to a Q Exactive HF mass spectrometer. A sensitive method that used a 1.4 Th isolation width, 60,000 MS2 resolution, 110 ms MS2 injection time, and top 7 fragmentation produced the largest number of identifications when the CZE loading amount was less than 100 ng. A programmable automatic gain control (pAGC) method that used a 1.4 Th isolation width, 15,000 MS2 resolution, 110 ms MS2 injection time, and top 10 fragmentation produced the largest number of identifications for CZE loading amounts greater than 100 ng; 7218 unique peptides and 1653 protein groups were identified from 200 ng using the pAGC method. The effect of mass spectrometer conditions on the performance of UPLC-MS/MS was also investigated. A fast method that used a 1.4 Th isolation width, 30,000 MS2 resolution, 45 ms MS2 injection time, and top 12 fragmentation produced the largest number of identifications for a 200 ng UPLC loading amount (6025 unique peptides and 1501 protein groups). This is the first report in which the number of identifications for CZE surpasses that of UPLC at the 200 ng loading level. However, more peptides (11,476) and protein groups (2378) were identified using UPLC-MS/MS when the sample loading amount was increased to 2 μg with the fast method. To exploit the fast scan speed of the Q Exactive HF mass spectrometer, higher sample loading amounts are required for single-shot bottom-up proteomics analysis using CZE-MS/MS.

  5. Validation and implementation of a novel high-throughput behavioral phenotyping instrument for mice

    PubMed Central

    Brodkin, Jesse; Frank, Dana; Grippo, Ryan; Hausfater, Michal; Gulinello, Maria; Achterholt, Nils; Gutzen, Christian

    2015-01-01

    Background: Behavioral assessment of mutant mouse models and novel candidate drugs is a slow and labor-intensive process, and this limitation is a significant impediment to CNS drug discovery. New method: By combining video and vibration analysis we created an automated system that provides the most detailed description of mouse behavior available. Our system (the Behavioral Spectrometer) allowed for the rapid assessment of behavioral abnormalities in the BTBR model of autism, the restraint model of stress and the irritant model of inflammatory pain. Results: We found that each model produced a unique alteration of the spectrum of behavior emitted by the mice. BTBR mice engaged in more grooming and less rearing. Prior restraint stress produced dramatic increases in grooming activity at the expense of locomotor behavior. Pain produced profound decreases in emitted behavior that were reversible with analgesic treatment. Comparison with existing method(s): We evaluated our system through a direct comparison on the same subjects with the current "gold standard" of human observation of video recordings. Using the same mice evaluated over the same range of behaviors, the Behavioral Spectrometer produced a quantitative categorization of behavior that was highly correlated with the scores produced by trained human observers (r=0.97). Conclusions: Our results show that this new system is a highly valid and sensitive method for characterizing behavioral effects in mice. As a fully automated and easily scalable instrument, the Behavioral Spectrometer represents a high-throughput behavioral tool that reduces the time and labor involved in behavioral research. PMID:24384067

  6. Comparison of risk assessment procedures used in OCRA and ULRA methods

    PubMed Central

    Roman-Liu, Danuta; Groborz, Anna; Tokarski, Tomasz

    2013-01-01

    The aim of this study was to analyse the convergence of two methods by comparing exposure and the assessed risk of developing musculoskeletal disorders at 18 repetitive-task workstations. The already established occupational repetitive actions (OCRA) method and the recently developed upper limb risk assessment (ULRA) method produce correlated results (R = 0.84, p = 0.0001). A discussion of the factors that influence the values of the OCRA index and ULRA's repetitive task indicator shows that both similarities and differences in the results produced by the two methods can arise from the concepts that underlie them. The assessment procedure and the mathematical calculations that the basic parameters are subjected to are crucial to the results of risk assessment; the way the basic parameters are defined influences the assessment of exposure and risk to a lesser degree. The analysis also showed that large differences in load indicator values do not always result in different risk zones. Practitioner Summary: We focused on comparing methods that, even though based on different concepts, serve the same purpose. The results proved that different methods with different assumptions can produce similar assessments of upper limb load; sharp criteria in risk assessment are not the best solution. PMID:24041375

  7. Evaluation of a Stirling Solar Dynamic System for Lunar Oxygen Production

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.; Wong, Wayne A.

    2006-01-01

    An evaluation of a solar concentrator-based system for producing oxygen from the lunar regolith was performed. The system utilizes a solar concentrator mirror to provide thermal energy for the oxygen production process as well as to power a Stirling heat engine for the production of electricity. The electricity produced is used to operate the equipment needed in the oxygen production process. The oxygen production method used in the analysis was the hydrogen reduction of ilmenite. Using this method, a baseline system design was produced with an oxygen production rate of 0.6 kg/hr and a concentrator mirror size of 5 m. Variations were performed on the baseline design to show how changes in the system size and process rate affected the oxygen production rate.

  8. Specific activity and isotope abundances of strontium in purified strontium-82

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzsimmons, J. M.; Medvedev, D. G.; Mausner, L. F.

    2015-11-12

    A linear accelerator was used to irradiate a rubidium chloride target with protons to produce strontium-82 (Sr-82), and the Sr-82 was purified by ion exchange chromatography. The amount of strontium associated with the purified Sr-82 was determined either by ICP-OES or by method B, which consisted of a summation of strontium quantified by gamma spectroscopy and ICP-MS. The summation method agreed within 10% with ICP-OES for the total mass of strontium, and the resulting specific activities were determined to be 0.25–0.52 TBq mg^-1. Method B was used to determine the isotope abundances by weight% of the purified Sr-82; the abundances were: Sr-82 (10–20.7%), Sr-83 (0–0.05%), Sr-84 (35–48.5%), Sr-85 (16–25%), Sr-86 (12.5–23%), Sr-87 (0%), and Sr-88 (0–10%). The purified strontium contained Sr-82, Sr-84, Sr-85, Sr-86, and Sr-88 in abundances not associated with natural abundance, and 90% of the strontium was produced by the proton irradiation. A comparison of ICP-OES and method B for the analysis of Sr-82 indicated that ICP-OES is the easier way to determine the total mass of strontium and comply with regulatory requirements. An ICP-OES analytical method for Sr-82 analysis was established and validated according to regulatory guidelines.

  9. Current state of purification, isolation and analysis of bacteriocins produced by lactic acid bacteria.

    PubMed

    Kaškonienė, Vilma; Stankevičius, Mantas; Bimbiraitė-Survilienė, Kristina; Naujokaitytė, Gintarė; Šernienė, Loreta; Mulkytė, Kristina; Malakauskas, Mindaugas; Maruška, Audrius

    2017-02-01

    Scientific interest in natural means of microbial inhibition has not faded for many years. The search for natural antibiotics, the so-called bacteriocins produced by lactic acid bacteria (LAB), has attracted considerable scientific attention over the last century as a way to reduce the usage of synthetic food additives. Pure bacteriocins with wide spectra of antibacterial activity are promising natural biopreservatives. The use of bacteriocin-producing LAB as starter cultures for the fermentation of some food products, in order to increase their shelf-life when synthetic preservatives are not permissible, is also possible. Many studies focus on the isolation of new bacteriocins from traditional fermented foods, dairy products and other foods, or sometimes even from unusual non-food matrices. Bacteriocin-producing bacteria with different antibacterial activities against food-borne microorganisms have been isolated from a variety of sources. This review covers the classification of bacteriocins, the diversity of sources of bacteriocin-producing LAB, the antibacterial spectra of isolated bacteriocins and analytical methods for bacteriocin purification and analysis within the last 15 years.

  10. Poor interoperability of the Adams-Harbertson method for analysis of anthocyanins: comparison with AOAC pH differential method.

    PubMed

    Brooks, Larry M; Kuhlman, Benjamin J; McKesson, Doug W; McCloskey, Leo

    2013-01-01

    The poor interoperability of anthocyanin glycoside measurements by two pH differential methods is documented. The Adams-Harbertson method, which was proposed for commercial winemaking, was compared to AOAC Official Method 2005.02 for wine. California bottled wines (Pinot Noir, Merlot, and Cabernet Sauvignon) were assayed in a collaborative study (n=105), which found the mean precision of Adams-Harbertson winery versus reference measurements to be 77 +/- 20%. Maximum error is expected to be 48% for Pinot Noir, 42% for Merlot, and 34% for Cabernet Sauvignon from reproducibility RSD; the range of measurements was actually 30 to 91% for Pinot Noir. An interoperability study (n=30) found that Adams-Harbertson produces measurements that are nominally 150% of the AOAC pH differential method. The large analytical chemistry differences are: the AOAC method uses the Beer-Lambert equation and measures absorbance at pH 1.0 and 4.5, as proposed a priori by Fuleki and Francis, whereas Adams-Harbertson uses a "universal" standard curve and measures absorbance ad hoc at pH 1.8 and 4.9 to reduce the effects of so-called co-pigmentation. Errors relative to AOAC arise from Adams-Harbertson's use of a standard curve instead of the Beer-Lambert equation and of pH 1.8 instead of pH 1.0. The study recommends using AOAC Official Method 2005.02 for the analysis of wine anthocyanin glycosides.
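
    For reference, the AOAC pH differential calculation is commonly implemented as below, expressing total monomeric anthocyanins as cyanidin-3-glucoside equivalents; the absorbance readings in the example are hypothetical.

      def anthocyanin_mg_per_l(a520_ph1, a700_ph1, a520_ph45, a700_ph45,
                               dilution_factor, pathlength_cm=1.0):
          """pH differential calculation (AOAC 2005.02 style), expressed as
          cyanidin-3-glucoside equivalents (MW 449.2 g/mol, eps 26900 L/mol/cm)."""
          A = (a520_ph1 - a700_ph1) - (a520_ph45 - a700_ph45)
          return A * 449.2 * dilution_factor * 1000.0 / (26900.0 * pathlength_cm)

      # Hypothetical absorbance readings for a diluted red wine (DF = 10):
      print(anthocyanin_mg_per_l(0.850, 0.015, 0.210, 0.012, 10))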

  11. Extinction coefficients and purity of single-walled carbon nanotubes.

    PubMed

    Zhao, B; Itkis, M E; Niyogi, S; Hu, H; Perea, D E; Haddon, R C

    2004-11-01

    Single-walled carbon nanotubes (SWNTs) hold great promise for advanced applications in aerospace, electronics and medicine, yet these industries require materials with rigorous quality control. There are currently no accepted standards for quality assurance or quality control among the commercial suppliers of SWNTs. We briefly discuss the applicability of various techniques to measure SWNT purity and review, in detail, the advantages of near infrared (NIR) spectroscopy for the quantitative assessment of the bulk carbonaceous purity of SWNTs. We review the use of solution phase NIR spectroscopy for the analysis and characterization of a variety of carbon materials, emphasizing SWNTs produced by the electric arc (EA), laser oven (LO) and HiPco (HC) methods. We consider the applicability of Beer's law to carbon materials dispersed in dimethylformamide (DMF) and the effective extinction coefficients that are obtained from such dispersions. Analysis of the areal absorptivities of the second interband transition of semiconducting EA-produced SWNTs for a number of samples of differing purities has led to an absolute molar extinction coefficient for the carbonaceous impurities in EA-produced SWNT samples. We conclude that NIR spectroscopy is the clear method of choice for the assessment of the bulk carbonaceous purity of EA-produced SWNTs, and we suggest that an absolute determination of the purity of SWNTs is within reach. Continued work in this area is expected to lead to a universal method for the assessment of the absolute bulk purity of SWNTs from all sources; such a development will be of great importance for nanotube science and for future customers for this product.

  12. Isolation and analysis of group 2 innate lymphoid cells in mice.

    PubMed

    Moro, Kazuyo; Ealey, Kafi N; Kabata, Hiroki; Koyasu, Shigeo

    2015-05-01

    Recent studies have identified distinct subsets of innate lymphocytes, collectively called innate lymphoid cells (ILCs), which lack antigen receptor expression but produce various effector cytokines. Group 2 ILCs (ILC2s) respond to epithelial cell-derived cytokines such as interleukin (IL)-25, IL-33 and thymic stromal lymphopoietin (TSLP), produce large amounts of type 2 cytokines, and have a key role in anti-helminth innate immunity and in the pathophysiology of allergic inflammation. The reported phenotypic characteristics of mouse ILC2s vary, depending on the tissue source and preparation method. This protocol describes improved methods for tissue-specific isolation and analysis of mouse ILC2s of high purity and yield from fat tissue, lung, bronchoalveolar lavage fluid (BALF) and small intestine. These improved methods are the result of our thorough investigation of enzymes used for tissue digestion, methods for the elimination of undesired cells, and a combination of antibodies for the detection and isolation of ILC2s. In addition, this new protocol now enables the isolation of ILC2s of high yield, even from inflamed tissues. Depending on the tissue being analyzed, it takes ∼2-4 h for isolation and flow cytometric analysis of ILC2s from the various tissues of a single mouse and ∼4-8 h to sort purified ILC2s from pooled tissues of multiple mice.

  13. Novel image cytometric method for detection of physiological and metabolic changes in Saccharomyces cerevisiae.

    PubMed

    Chan, Leo L; Kury, Alexandria; Wilkinson, Alisha; Berkes, Charlotte; Pirani, Alnoor

    2012-11-01

    The study and monitoring of physiological and metabolic changes in Saccharomyces cerevisiae (S. cerevisiae) have been a key research area for the brewing, baking, and biofuels industries, which rely on these economically important yeasts to produce their products. For breweries specifically, physiological and metabolic parameters such as viability, vitality, glycogen, neutral lipid, and trehalose content can be measured to better understand the status of S. cerevisiae during fermentation. Traditionally, these physiological and metabolic changes are observed qualitatively using fluorescence microscopy, or quantitatively by flow cytometric analysis of fluorescently labeled cellular components associated with each parameter. However, both methods pose known challenges to end-users. Conventional fluorescence microscopes lack the automation and fluorescence analysis capabilities needed to quantitatively analyze large numbers of cells. Although flow cytometry is suitable for quantitative analysis of tens of thousands of fluorescently labeled cells, the instruments require a considerable amount of maintenance and highly trained technicians, and the systems are relatively expensive to both purchase and maintain. In this work, we demonstrate the first use of the Cellometer Vision for the kinetic detection and analysis of vitality, glycogen, neutral lipid, and trehalose content of S. cerevisiae. This method provides an important research tool for large and small breweries to study and monitor these physiological behaviors during production, which can improve fermentation conditions to produce consistent and higher-quality products.

  14. The synthesis method for design of electron flow sources

    NASA Astrophysics Data System (ADS)

    Alexahin, Yu I.; Molodozhenzev, A. Yu

    1997-01-01

    The synthesis method for designing a relativistic magnetically focused beam source is described in this paper. It allows one to find the electrode shapes necessary to produce laminar space-charge flows. Electron guns with shielded cathodes designed with this method were analyzed using the EGUN code, and the results showed agreement between the synthesis and analysis calculations [1]. This method of electron gun calculation may be applied to immersed electron flows, which are of interest for EBIS electron gun design.

  15. Quasi-modal vibration control by means of active control bearings

    NASA Technical Reports Server (NTRS)

    Nonami, K.; Fleming, D. P.

    1986-01-01

    This paper investigates a design method for an active control bearing system using only velocity feedback. The study provides a new quasi-modal control method for the design of an active control bearing system in which feedback coefficients are determined on the basis of a modal analysis. Although the number of sensors and actuators is small, this quasi-modal control method produces a control effect close to that of ideal modal control.

  16. Phylogenetic Analysis of Polygalacturonase-Producing Bacillus and Pseudomonas Isolated From Plant Waste Material

    PubMed Central

    Sohail, Muhammad; Latif, Zakia

    2016-01-01

    Background: Keeping in mind the commercial application of polygalacturonase (PG) in the juice and beverage industry, bacterial strains were isolated from rotten fruits and vegetables to screen for competent producers of PG. Objectives: In this study, the plate method was used for preliminary screening of polygalacturonase-producing bacteria, while the dinitrosalicylic acid (DNS) method was used for quantification of PG. Materials and Methods: Biochemically identified polygalacturonase-producing Bacillus and Pseudomonas species were further characterized by molecular markers. The genetic diversity among these selected strains was analyzed by investigating microsatellite distribution in their genomes. Out of 110 strains, 17 competent strains of Bacillus and eight strains of Pseudomonas were selected, identified and confirmed biochemically. Selected strains were characterized by 16S rRNA sequencing and the data were submitted to the National Center for Biotechnology Information (NCBI) website for accession numbers. Results: Among the Bacillus, Bacillus vallismortis (JQ990307) isolated from mango was the most competent producer of PG, producing up to 4.4 U/µL. Among the Pseudomonas, Pseudomonas aeruginosa (JQ990314) isolated from oranges was the most competent PG producer, equivalent to B. vallismortis (JQ990307). To determine the genetic diversity of Pseudomonas and Bacillus strains varying in PG production, fingerprinting was done on the basis of Simple Sequence Repeats (SSR), or microsatellites. The data were analyzed and a phylogenetic tree was constructed using the Minitab 3 software to compare bacterial isolates producing different concentrations of PG. Fingerprinting showed that the presence or absence of certain microsatellites correlated with the ability to produce PG. Conclusions: Bacteria from biological waste were competent producers of PG and could be used on an industrial scale to meet the demand for PG in the food industry. PMID:27099686

  17. Statistical analysis of hydrodynamic cavitation events

    NASA Astrophysics Data System (ADS)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law whose parameters do not evolve in time.
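
    A test of the reported normality can be sketched as follows, assuming pulse counts tallied over fixed time windows; the data below are synthetic.

      import numpy as np
      from scipy import stats

      # Synthetic pulse counts per successive 1-s windows:
      rng = np.random.default_rng(1)
      counts = rng.normal(loc=120.0, scale=11.0, size=300)

      stat, p = stats.normaltest(counts)      # D'Agostino-Pearson test
      print(f"K2 = {stat:.2f}, p = {p:.3f}")  # large p: no evidence against normality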

  18. A Multi-Method Analysis of Evaluation Criteria Used to Assess the Speaking Proficiency of Graduate Student Instructors

    ERIC Educational Resources Information Center

    Plough, India C.; Briggs, Sarah L.; Van Bonn, Sarah

    2010-01-01

    The study reported here examined the evaluation criteria used to assess the proficiency and effectiveness of the language produced in an oral performance test of English conducted in an American university context. Empirical methods were used to analyze, both qualitatively and quantitatively, transcriptions of the Oral English Tests (OET) of 44…

  19. A laboratory study of nonlinear changes in the directionality of extreme seas

    NASA Astrophysics Data System (ADS)

    Latheef, M.; Swan, C.; Spinneken, J.

    2017-03-01

    This paper concerns the description of surface water waves, specifically nonlinear changes in the directionality. Supporting calculations are provided to establish the best method of directional wave generation, the preferred method of directional analysis and the inputs on which such a method should be based. These calculations show that a random directional method, in which the phasing, amplitude and direction of propagation of individual wave components are chosen randomly, has benefits in achieving the required ergodicity. In terms of analysis procedures, the extended maximum entropy principle, with inputs based upon vector quantities, produces the best description of directionality. With laboratory data describing the water surface elevation and the two horizontal velocity components at a single point, several steep sea states are considered. The results confirm that, as the steepness of a sea state increases, the overall directionality of the sea state reduces. More importantly, it is also shown that the largest waves become less spread or more unidirectional than the sea state as a whole. This provides an important link to earlier descriptions of deterministic wave groups produced by frequency focusing, helps to explain recent field observations and has important practical implications for the design of marine structures and vessels.
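
    The random directional method of wave generation mentioned above superposes components with randomly chosen phases and propagation directions. A minimal deep-water Python sketch, with placeholder amplitudes and spreading parameters:

      import numpy as np

      g = 9.81
      rng = np.random.default_rng(2)
      n = 200

      # Component frequencies with amplitudes from a target spectrum
      # (placeholder decay), random phases, and a random direction per component:
      f = np.linspace(0.5, 2.0, n)                    # Hz
      amp = 0.01 * f ** -2.5                          # placeholder amplitudes (m)
      phase = rng.uniform(0.0, 2.0 * np.pi, n)
      theta = rng.normal(0.0, np.deg2rad(15.0), n)    # random directions (rad)
      omega = 2.0 * np.pi * f
      k = omega ** 2 / g                              # deep-water dispersion

      def eta(x, y, t):
          """Surface elevation at (x, y, t) from the component superposition."""
          arg = k * (x * np.cos(theta) + y * np.sin(theta)) - omega * t + phase
          return np.sum(amp * np.cos(arg))

      print(eta(0.0, 0.0, 0.0))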

  20. Evaluation of Three Protein-Extraction Methods for Proteome Analysis of Maize Leaf Midrib, a Compound Tissue Rich in Sclerenchyma Cells.

    PubMed

    Wang, Ning; Wu, Xiaolin; Ku, Lixia; Chen, Yanhui; Wang, Wei

    2016-01-01

    Leaf morphology is closely related to the growth and development of maize (Zea mays L.) plants and final kernel production. As an important part of the maize leaf, the midrib holds leaf blades in the aerial position for maximum sunlight capture. Leaf midribs of adult plants contain substantial sclerenchyma cells with heavily thickened and lignified secondary walls and have a high amount of phenolics, making protein extraction and proteome analysis difficult in leaf midrib tissue. In the present study, three protein-extraction methods that are commonly used in plant proteomics, i.e., phenol extraction, TCA/acetone extraction, and TCA/acetone/phenol extraction, were qualitatively and quantitatively evaluated based on 2DE maps and MS/MS analysis using the midribs of the 10th newly expanded leaves of maize plants. Microscopy revealed the existence of substantial amounts of sclerenchyma underneath maize midrib epidermises (particularly abaxial epidermises). The spot-number order obtained via 2DE mapping was as follows: phenol extraction (655) > TCA/acetone extraction (589) > TCA/acetone/phenol extraction (545). MS/MS analysis identified a total of 17 spots that exhibited 2-fold changes in abundance among the three methods (using phenol extraction as a control). Sixteen of the proteins identified were hydrophilic, with GRAVY values ranging from -0.026 to -0.487. For all three methods, we were able to obtain high-quality protein samples and good 2DE maps for the maize leaf midrib. However, phenol extraction produced a better 2DE map with greater resolution between spots, and TCA/acetone extraction produced higher protein yields. Thus, possible reasons for the differential protein extraction among the three methods are discussed. This study provides useful information for selecting suitable protein extraction methods for the proteome analysis of recalcitrant plant tissues that are rich in sclerenchyma cells.
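
    The GRAVY values cited above are mean Kyte-Doolittle hydropathies; negative values indicate hydrophilic proteins. A minimal sketch using Biopython's ProtParam module (the sequence is a placeholder):

      from Bio.SeqUtils.ProtParam import ProteinAnalysis

      # GRAVY = sum of Kyte-Doolittle hydropathy values / sequence length.
      seq = "MKVLAAGLLALSSAQAQDTTE"    # placeholder protein sequence
      gravy = ProteinAnalysis(seq).gravy()
      print(f"GRAVY = {gravy:.3f}")     # negative -> hydrophilic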

  1. DNA analysis of hair and scat collected along snow tracks to document the presence of Canada Lynx.

    Treesearch

    Kevin S. McKelvey; Jeffrey von Kienast; Keith B. Aubry; Gary M. Koehler; Bejamin T. Maletzke; John R. Squires; Edward L. Lindquist; Steve Loch; Michael K. Schwartz

    2006-01-01

    Snow tracking is often used to inventory carnivore communities, but species identification using this method can produce ambiguous and misleading results. DNA can be extracted from hair and scat samples collected from tracks made in snow. Using DNA analysis could allow positive track identification across a broad range of snow conditions, thus increasing survey...

  2. Analysis of Covariance: Is It the Appropriate Model to Study Change?

    ERIC Educational Resources Information Center

    Marston, Paul T.; Borich, Gary D.

    The four main approaches to measuring treatment effects in schools (raw gain, residual gain, covariance, and true scores) were compared. A simulation study showed that true score analysis produced a large number of Type-I errors. When corrected for this error, the method showed the least power of the four. This outcome was clearly the result of the…

  3. Trial-by-Trial Analysis or Averaging: Implications for Electromyographic Models of Rapid Limb Control

    ERIC Educational Resources Information Center

    Sherwood, David E.; Enebo, Brian A.

    2007-01-01

    The control of human limb movement has been the focus of research for more than a century. A major issue to emerge from this work is the manner in which the central nervous system regulates electromyographic (EMG) activity to produce movements that differ in distance, velocity, and movement time. However, the different methods of analysis often…

  4. Quantitative Analysis of Cancer Cell Migration in Gradients Of EGF, HGF, and SDF-alpha Using a Microfluidic Chemotaxis Device

    DTIC Science & Technology

    2005-01-01

    Quantitative Analysis of Cancer Cell Migration in Gradients of EGF, HGF, and SDF-alpha Using a Microfluidic Chemotaxis Device. The University of California... allowing for parallel analysis. Additionally, simple methods of localizing gels into microdevices are demonstrated. The device was characterized by... To overcome some of these drawbacks, several approaches have utilized free diffusion to produce gradients in static environments [5-9]. However…

  5. Training Plan. Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-01-29

    Modeling Software Reuse Technology: Feature Oriented Domain Analysis (FODA). SEI, Carnegie Mellon University, May 1992. ... Component Provider's... events to the services of the domain. ... Feature Oriented Domain Analysis (FODA) [COHEN92]: the FODA method produces feature models. Feature models provide…

  6. Process model and economic analysis of ethanol production from sugar beet raw juice as part of the cleaner production concept.

    PubMed

    Vučurović, Damjan G; Dodić, Siniša N; Popov, Stevan D; Dodić, Jelena M; Grahovac, Jovana A

    2012-01-01

    The batch fermentation of sugar beet processing intermediates by free yeast cells is the most widely used method in the Autonomous Province of Vojvodina for producing ethanol as fuel. In this study a process and cost model was developed for producing ethanol from raw juice. The model can be used to calculate capital investment costs, unit production costs and operating costs for a plant producing 44 million l of 99.6% pure ethanol annually. In the sensitivity analysis, the influences of sugar beet and yeast prices, as well as of recycled biomass, on process economics, ethanol production costs and project feasibility were examined. The results of this study clearly demonstrate that raw material costs have a significant influence on the cost of producing ethanol. The optimal percentage of recycled biomass was found to be in the range of 50% to 70%.

  7. Improving the complementary methods to estimate evapotranspiration under diverse climatic and physical conditions

    NASA Astrophysics Data System (ADS)

    Anayah, F. M.; Kaluarachchi, J. J.

    2014-06-01

    Reliable estimation of evapotranspiration (ET) is important for the purpose of water resources planning and management. Complementary methods, including complementary relationship areal evapotranspiration (CRAE), advection aridity (AA) and Granger and Gray (GG), have been used to estimate ET because these methods are simple and practical in estimating regional ET using meteorological data only. However, prior studies have found limitations in these methods, especially in contrasting climates. This study aims to develop a calibration-free universal method using the complementary relationships to compute regional ET in contrasting climatic and physical conditions with meteorological data only. The proposed methodology consists of a systematic sensitivity analysis using the existing complementary methods. This work used 34 global FLUXNET sites where eddy covariance (EC) fluxes of ET are available for validation. A total of 33 alternative model variations from the original complementary methods were proposed. Further analysis using statistical methods and simplified climatic class definitions produced one distinctly improved GG-model-based alternative. The proposed model produced a single-step ET formulation with results equal to or better than the recent studies using data-intensive, classical methods. Average root mean square error (RMSE), mean absolute bias (BIAS) and R2 (coefficient of determination) across 34 global sites were 20.57 mm month^-1, 10.55 mm month^-1 and 0.64, respectively. The proposed model showed a step forward toward predicting ET in large river basins with limited data and requiring no calibration.
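
    The skill metrics reported above can be computed as sketched below; "mean absolute bias" is read here as the absolute value of the mean error, which is an assumption, and the sample values are hypothetical.

      import numpy as np

      def evaluate(et_model, et_obs):
          """Skill metrics between modeled and observed monthly ET (mm/month).
          BIAS here is |mean error| (one reading of 'mean absolute bias');
          R2 is the coefficient of determination."""
          et_model, et_obs = np.asarray(et_model, float), np.asarray(et_obs, float)
          err = et_model - et_obs
          rmse = np.sqrt(np.mean(err ** 2))
          bias = abs(np.mean(err))
          r2 = 1.0 - np.sum(err ** 2) / np.sum((et_obs - et_obs.mean()) ** 2)
          return rmse, bias, r2

      # Hypothetical monthly values (mm/month):
      print(evaluate([55, 70, 90, 110], [50, 75, 95, 100]))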

  8. Galaxy two-point covariance matrix estimation for next generation surveys

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Percival, Will J.

    2017-12-01

    We perform a detailed analysis of the covariance matrix of the spherically averaged galaxy power spectrum and present a new, practical method for estimating this within an arbitrary survey without the need for running mock galaxy simulations that cover the full survey volume. The method uses theoretical arguments to modify the covariance matrix measured from a set of small-volume cubic galaxy simulations, which are computationally cheap to produce compared to larger simulations and match the measured small-scale galaxy clustering more accurately than is possible using theoretical modelling. We include prescriptions to analytically account for the window function of the survey, which convolves the measured covariance matrix in a non-trivial way. We also present a new method to include the effects of super-sample covariance and modes outside the small simulation volume which requires no additional simulations and still allows us to scale the covariance matrix. As validation, we compare the covariance matrix estimated using our new method to that from a brute-force calculation using 500 simulations originally created for analysis of the Sloan Digital Sky Survey Main Galaxy Sample. We find excellent agreement on all scales of interest for large-scale structure analysis, including those dominated by the effects of the survey window, and on scales where theoretical models of the clustering normally break down, but the new method produces a covariance matrix with significantly better signal-to-noise ratio. Although only formally correct in real space, we also discuss how our method can be extended to incorporate the effects of redshift space distortions.
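
    The brute-force covariance estimate used for validation, together with the schematic inverse-volume rescaling that motivates the new method, can be sketched as follows; window-function and super-sample terms are omitted, and all numbers are placeholders.

      import numpy as np

      # P: (n_mocks, n_bins) power spectrum measurements from small-volume mocks.
      rng = np.random.default_rng(3)
      P = rng.normal(1000.0, 50.0, size=(500, 20))   # placeholder measurements

      mean = P.mean(axis=0)
      diff = P - mean
      C = diff.T @ diff / (P.shape[0] - 1)           # sample covariance matrix

      # Schematic volume scaling: in the Gaussian limit the covariance scales
      # roughly inversely with volume, so a matrix measured in volume V_sim
      # can be rescaled to a larger survey volume:
      V_sim, V_survey = 1.0e9, 4.0e9                 # placeholder volumes
      C_survey = C * (V_sim / V_survey)
      print(C_survey.shape)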

  9. Coloc-stats: a unified web interface to perform colocalization analysis of genomic features.

    PubMed

    Simovski, Boris; Kanduri, Chakravarthi; Gundersen, Sveinung; Titov, Dmytro; Domanska, Diana; Bock, Christoph; Bossini-Castillo, Lara; Chikina, Maria; Favorov, Alexander; Layer, Ryan M; Mironov, Andrey A; Quinlan, Aaron R; Sheffield, Nathan C; Trynka, Gosia; Sandve, Geir K

    2018-06-05

    Functional genomics assays produce sets of genomic regions as one of their main outputs. To biologically interpret such region-sets, researchers often use colocalization analysis, where the statistical significance of colocalization (overlap, spatial proximity) between two or more region-sets is tested. Existing colocalization analysis tools vary in the statistical methodology and analysis approaches, thus potentially providing different conclusions for the same research question. As the findings of colocalization analysis are often the basis for follow-up experiments, it is helpful to use several tools in parallel and to compare the results. We developed the Coloc-stats web service to facilitate such analyses. Coloc-stats provides a unified interface to perform colocalization analysis across various analytical methods and method-specific options (e.g. colocalization measures, resolution, null models). Coloc-stats helps the user to find a method that supports their experimental requirements and allows for a straightforward comparison across methods. Coloc-stats is implemented as a web server with a graphical user interface that assists users with configuring their colocalization analyses. Coloc-stats is freely available at https://hyperbrowser.uio.no/coloc-stats/.

  10. Novel hyperspectral prediction method and apparatus

    NASA Astrophysics Data System (ADS)

    Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf

    2009-05-01

    Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science-based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
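
    The SBC combination of a measured spectral signal with a statistically estimated noise covariance has the algebraic form of a classical matched filter. The sketch below shows only that generic form, not the proprietary Hyper-Cal implementation; g and sigma are hypothetical inputs.

        import numpy as np

        def matched_filter_regression(g, sigma):
            """Regression vector b such that prediction = spectrum @ b.

            g:     (n_channels,) measured spectral signal of the analyte
            sigma: (n_channels, n_channels) estimated spectral noise covariance"""
            sigma_inv_g = np.linalg.solve(sigma, g)
            return sigma_inv_g / (g @ sigma_inv_g)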

  11. Building a composite score of general practitioners' intrinsic motivation: a comparison of methods.

    PubMed

    Sicsic, Jonathan; Le Vaillant, Marc; Franc, Carine

    2014-04-01

    Pay-for-performance programmes have been widely implemented in primary care, but few studies have investigated their potential adverse effects on the intrinsic motivation of general practitioners (GPs), even though intrinsic motivation may be a key determinant of quality in health care. Our aim was to compare methods for developing a composite score of GPs' intrinsic motivation and to select the one most consistent with self-reported data. Data were collected by postal survey of French GPs working in private practice. Using a set of variables selected to characterize the dimensions of intrinsic motivation, three alternative composite scores were calculated based on a multiple correspondence analysis (MCA), a confirmatory factor analysis (CFA) and a two-parameter logistic model (2-PLM). Weighted kappa coefficients were used to evaluate variation in GPs' ranks according to each method. The three methods produced similar results for both the estimation of the indicators' weights and the order of the GP rank lists. All weighted kappa coefficients were >0.80. The CFA and 2-PLM produced the most similar results. There was little difference among the three methods' results, validating our measure of GPs' intrinsic motivation. The 2-PLM appeared theoretically and empirically more robust for establishing the intrinsic motivation score. JEL codes: C38, C43, I18.
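
    Agreement between the rank lists produced by two scoring methods can be checked with a weighted kappa in a few lines; this sketch assumes quadratic weights, which the paper does not specify, and uses hypothetical decile ranks for the same GPs.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical decile ranks of the same GPs under two scoring methods
        ranks_mca = [1, 2, 2, 3, 5, 7, 8, 9, 9, 10]
        ranks_cfa = [1, 2, 3, 3, 5, 6, 8, 9, 10, 10]

        kappa = cohen_kappa_score(ranks_mca, ranks_cfa, weights="quadratic")
        print(f"weighted kappa: {kappa:.2f}")   # values > 0.80 indicate strong agreement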

  12. Assessing the accuracy of cranial and pelvic ageing methods on human skeletal remains from a modern Greek assemblage.

    PubMed

    Xanthopoulou, Panagiota; Valakos, Efstratios; Youlatos, Dionisios; Nikita, Efthymia

    2018-05-01

    The present study tests the accuracy of commonly adopted ageing methods based on the morphology of the pubic symphysis, auricular surface and cranial sutures. These methods are examined both in their traditional form as well as in the context of transition analysis using the ADBOU software in a modern Greek documented collection consisting of 140 individuals who lived mainly in the second half of the twentieth century and come from cemeteries in the area of Athens. The auricular surface overall produced the most accurate age estimates in our material, with different methods based on this anatomical area showing varying degrees of success for different age groups. The pubic symphysis produced accurate results primarily for young adults and the same applied to cranial sutures but the latter appeared completely inappropriate for older individuals. The use of transition analysis through the ADBOU software provided less accurate results than the corresponding traditional ageing methods in our sample. Our results are in agreement with those obtained from validation studies based on material from across the world, but certain differences identified with other studies on Greek material highlight the importance of taking into account intra- and inter-population variability in age estimation. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. The longevity in the yeast Saccharomyces cerevisiae: A comparison of two approaches for assessing the lifespan.

    PubMed

    Molon, Mateusz; Zadrag-Tecza, Renata; Bilinski, Tomasz

    2015-05-08

    Longevity of the selected "longevity mutants" of yeast was studied using two methods. The standard method was based on counting the number of daughter cells produced. Modification of that method allowed for establishing the length of life expressed in units of time. It appeared that all the studied "deletion longevity mutants" showed a statistically meaningful increase in the number of daughters produced (replicative lifespan), whereas only one of the mutants, previously regarded as "short lived", showed a meaningful increase in the time of life. The analysis of the available data shows that the time of life of most yeast strains is similar irrespective of their genetic background and mutations, which suggests a quasi-programmed nature of yeast death. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Determination of pKa of felodipine using UV-Visible spectroscopy.

    PubMed

    Pandey, M M; Jaipal, A; Kumar, A; Malik, R; Charde, S Y

    2013-11-01

    In the present study, the experimental pKa value of felodipine is reported for the first time. The dissociation constant, pKa, is one of the most important physicochemical properties of a drug and is of paramount significance for pharmaceutical analysis and dosage form design. The pKa of felodipine was determined by a UV-Visible spectrophotometric method, chosen because spectrophotometric determination of pKa is established to produce highly precise values. The pKa of felodipine was found to be 5.07. Furthermore, the ruggedness of the determined value was validated in this study to ensure an accurate pKa for felodipine. Copyright © 2013 Elsevier B.V. All rights reserved.
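
    Spectrophotometric pKa estimation commonly rests on the Henderson-Hasselbalch relation applied to absorbances measured over a pH series; the sketch below follows that standard approach under the usual linear-mixing assumption (the paper's exact protocol is not reproduced here).

        import numpy as np

        def pka_from_absorbance(ph, a_mix, a_acid, a_base):
            """Henderson-Hasselbalch: pKa = pH - log10([A-]/[HA]).

            ph, a_mix: arrays over a buffer series; a_acid, a_base are the
            limiting absorbances of the fully protonated/deprotonated forms."""
            ratio = (a_mix - a_acid) / (a_base - a_mix)   # [A-]/[HA] at each pH
            return np.mean(ph - np.log10(ratio))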

  15. Automation of POST Cases via External Optimizer and "Artificial p2" Calculation

    NASA Technical Reports Server (NTRS)

    Dees, Patrick D.; Zwack, Mathew R.; Michelson, Diane K.

    2017-01-01

    During conceptual design, speed and accuracy are often at odds. In the realm of launch vehicles specifically, optimizing the ascent trajectory requires considerable analytical power and expertise. Experienced analysts working on familiar vehicles can produce optimal trajectories in a short time frame; however, whenever either "experienced" or "familiar" does not apply, the optimization process can become quite lengthy. In order to construct a vehicle-agnostic method, an established global optimization algorithm is needed. In this work the authors develop an "artificial" error term that maps arbitrary control vectors to non-zero error, giving a global method an objective on which to operate. Two global methods are compared alongside Design of Experiments and random sampling and are shown to produce results comparable to analysis done by a human expert.
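
    The "artificial" error idea (giving a global optimizer a smooth, non-zero objective even for infeasible control vectors) can be caricatured as a weighted penalty on constraint violations; everything below, including the names and weighting, is a hypothetical illustration rather than the POST-specific formulation.

        import numpy as np

        def artificial_error(constraint_violations, weights=None):
            """Map a vector of trajectory constraint violations to a scalar error.

            A weighted sum of squares stays positive away from feasibility, giving
            a global optimizer a usable objective for arbitrary control vectors."""
            v = np.asarray(constraint_violations, dtype=float)
            w = np.ones_like(v) if weights is None else np.asarray(weights, dtype=float)
            return float(np.sum(w * v ** 2))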

  16. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, Tuan; Panjehpour, Masoud; Overholt, Bergein F.

    1996-01-01

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample.
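
    The normalization and differencing steps recited above translate directly into array operations; a minimal sketch with hypothetical wavelengths, spectrum, and reference_set inputs.

        import numpy as np

        def differential_normalized_spectrum(wavelengths, spectrum, reference_set):
            """Divide each intensity by the integrated area, then subtract the
            average of area-normalized reference (normal-tissue) spectra."""
            def normalize(s):
                return s / np.trapz(s, wavelengths)   # unit-area spectrum
            reference_mean = np.mean([normalize(r) for r in reference_set], axis=0)
            return normalize(spectrum) - reference_mean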

  17. Laser-induced differential normalized fluorescence method for cancer diagnosis

    DOEpatents

    Vo-Dinh, T.; Panjehpour, M.; Overholt, B.F.

    1996-12-03

    An apparatus and method for cancer diagnosis are disclosed. The diagnostic method includes the steps of irradiating a tissue sample with monochromatic excitation light, producing a laser-induced fluorescence spectrum from emission radiation generated by interaction of the excitation light with the tissue sample, and dividing the intensity at each wavelength of the laser-induced fluorescence spectrum by the integrated area under the laser-induced fluorescence spectrum to produce a normalized spectrum. A mathematical difference between the normalized spectrum and an average value of a reference set of normalized spectra which correspond to normal tissues is calculated, which provides for amplifying small changes in weak signals from malignant tissues for improved analysis. The calculated differential normalized spectrum is correlated to a specific condition of a tissue sample. 5 figs.

  18. Synthesis and Characterization of LaTiO2N

    NASA Astrophysics Data System (ADS)

    Rugen, Evan E.

    Photocatalysts offer an excellent opportunity to shift the global energy landscape from a fossil fuel-dependent paradigm to sustainable and carbon-neutral solar fuels. Oxynitride materials such as LaTiO2N are potential photocatalysts for the water splitting reaction due to their high oxidative stability and their narrow band gaps, which are suitable for visible light absorption. However, facile synthetic routes to metal oxynitrides with controlled morphologies are rare, and the local structures of these materials are under-characterized. Ultrasonic spray synthesis (USS) offers a facile method toward complex metal oxides which can potentially be converted to oxynitrides with preservation of the microsphere structures that typify the products from such aerosol routes. Here, La-Ti-O microspheres were facilely produced by USS and converted by ammonolysis to LaTiO2N microspheres with porous shells and hollow interiors. This particle architecture is accounted for by coupling suitable combustion chemistry with the aerosol technique, producing precursor particles where the La3+ and Ti4+ are well-mixed at small length scales; this feature enables preservation of the microsphere morphology during nitridation despite the crystallographic changes that occur. The LaTiO2N microspheres are oxygen-evolving photocatalysts comparable to samples produced by conventional solid-state methods. Pair distribution function (PDF) analysis is a local probe designed to examine the structure of disordered crystalline materials, and is an ideal technique for characterizing the ordering of anions in oxynitrides. Preliminary studies using PDF analysis to determine the presence of anion ordering and local structure in LaTiO2N produced by solid-state methods are presented here. Future experiments are proposed that will grant detailed insight into the factors driving the degree of anion ordering in these types of materials. These results demonstrate the utility of USS as a facile, potentially scalable route to complex photocatalytic materials and their precursors, and the feasibility of PDF analysis for the determination of local structures in complex oxynitrides.

  19. Influence of stage of lactation and year season on composition of mares' colostrum and milk and method and time of storage on vitamin C content in mares' milk.

    PubMed

    Markiewicz-Kęszycka, Maria; Czyżak-Runowska, Grażyna; Wójtowski, Jacek; Jóźwik, Artur; Pankiewicz, Radosław; Łęska, Bogusława; Krzyżewski, Józef; Strzałkowska, Nina; Marchewka, Joanna; Bagnicka, Emilia

    2015-08-30

    Mares' milk is becoming increasingly popular in Western Europe. This study was thus aimed at investigating the impact of stage of lactation and season on chemical composition, somatic cell count and some physicochemical parameters of mares' colostrum and milk, and at developing a method for the determination of vitamin C (ascorbic acid) in mares' milk and to determine its content in fresh and stored milk. The analysis conducted showed an effect of the stage of lactation on contents of selected chemical components and physicochemical parameters of mares' milk. In successive lactation periods levels of fat, cholesterol, energy value, citric acid and titratable acidity decreased, whereas levels of lactose and vitamin C, as well as the freezing point, increased. Analysis showed that milk produced in autumn (September, October, November) had a higher freezing point and lower concentrations of total solids, protein, fat, cholesterol, citric acid and energy value in comparison to milk produced in summer (June, July, August). Mares' milk was characterised by low somatic cell count throughout lactation. In terms of vitamin C stability the most advantageous method of milk storage was 6-month storage of lyophilised milk. In general, the results confirmed that mares' milk is a raw material with a unique chemical composition different from that produced by other farm animals. © 2014 Society of Chemical Industry.

  20. The extraction of essential oil from patchouli leaves (Pogostemon cablin Benth) using microwave hydrodistillation and solvent-free microwave extraction methods

    NASA Astrophysics Data System (ADS)

    Putri, D. K. Y.; Kusuma, H. S.; Syahputra, M. E.; Parasandi, D.; Mahfud, M.

    2017-12-01

    Patchouli (Pogostemon cablin Benth) is an important essential oil-producing plant, contributing more than 50% of Indonesia's total essential oil exports. However, patchouli oil extraction in Indonesia still generally uses conventional methods, which require an enormous amount of energy, high solvent usage, and long extraction times. Therefore, in this study, patchouli oil extraction was carried out using microwave hydrodistillation and solvent-free microwave extraction methods. The results show that microwave hydrodistillation, with its longer extraction time (240 min), produced a patchouli oil yield only 1.2 times greater than solvent-free microwave extraction, which requires a shorter extraction time (120 min). Moreover, analysis of electricity consumption and environmental impact showed smaller values for the solvent-free microwave extraction method than for microwave hydrodistillation. It is concluded that solvent-free microwave extraction is a suitable new green technique for patchouli oil extraction.

  1. The effects of physical activity on impulsive choice: Influence of sensitivity to reinforcement amount and delay

    PubMed Central

    Strickland, Justin C.; Feinstein, Max A.; Lacy, Ryan T.; Smith, Mark A.

    2016-01-01

    Impulsive choice is a diagnostic feature and/or complicating factor for several psychological disorders and may be examined in the laboratory using delay-discounting procedures. Recent investigators have proposed using quantitative methods of analysis to examine the behavioral processes contributing to impulsive choice. The purpose of this study was to examine the effects of physical activity (i.e., wheel running) on impulsive choice in a single-response, discrete-trial procedure using two quantitative methods of analysis. To this end, rats were assigned to physical activity or sedentary groups and trained to respond in a delay-discounting procedure. In this procedure, one lever always produced one food pellet immediately, whereas a second lever produced three food pellets after a 0-, 10-, 20-, 40-, or 80-second delay. Estimates of sensitivity to reinforcement amount and sensitivity to reinforcement delay were determined using (1) a simple linear analysis and (2) an analysis of logarithmically transformed response ratios. Both analyses revealed that physical activity decreased sensitivity to reinforcement amount and sensitivity to reinforcement delay. These findings indicate that (1) physical activity has significant but functionally opposing effects on the behavioral processes that contribute to impulsive choice and (2) both quantitative methods of analysis are appropriate for use in single-response, discrete-trial procedures. PMID:26964905
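
    The second analysis, regressing logarithmically transformed response ratios on delay, reduces to a straight-line fit whose slope and intercept index sensitivity to delay and amount; a hedged sketch with hypothetical choice data (the paper's exact parameterization may differ).

        import numpy as np

        delays = np.array([0.0, 10.0, 20.0, 40.0, 80.0])       # s, from the procedure
        choice_ratio = np.array([9.0, 4.0, 1.5, 0.6, 0.25])    # hypothetical large/small response ratios

        slope, intercept = np.polyfit(delays, np.log10(choice_ratio), 1)
        print(f"delay sensitivity ~ {-slope:.3f} per s; amount sensitivity ~ {intercept:.2f}")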

  2. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.

  3. Analysis-Preserving Video Microscopy Compression via Correlation and Mathematical Morphology

    PubMed Central

    Shao, Chong; Zhong, Alfred; Cribb, Jeremy; Osborne, Lukas D.; O’Brien, E. Timothy; Superfine, Richard; Mayer-Patel, Ketan; Taylor, Russell M.

    2015-01-01

    The large amount of video data produced by multi-channel, high-resolution microscopy systems drives the need for a new high-performance domain-specific video compression technique. We describe a novel compression method for video microscopy data. The method is based on Pearson's correlation and mathematical morphology, and makes use of the point-spread function (PSF) in the microscopy video acquisition phase. We compare our method to other lossless compression methods and to lossy JPEG, JPEG2000 and H.264 compression for various kinds of video microscopy data, including fluorescence video and brightfield video. We find that for certain data sets the new method compresses much better than lossless compression, with no impact on analysis results. It achieved a best compressed size of 0.77% of the original size, 25× smaller than the best lossless technique (which yields 20% for the same video). The compressed size scales with the video's scientific data content. Further testing showed that existing lossy algorithms greatly impacted data analysis at similar compression sizes. PMID:26435032
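
    One building block of the method, Pearson's correlation between frames, is compact enough to show; this sketch covers only the correlation step, not the PSF-aware morphology pipeline.

        import numpy as np

        def frame_correlation(frame_a, frame_b):
            """Pearson correlation between two grayscale frames of equal shape."""
            a = frame_a.ravel().astype(float)
            b = frame_b.ravel().astype(float)
            a -= a.mean()
            b -= b.mean()
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))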

  4. Comparison of baseline removal methods for laser-induced breakdown spectroscopy of geological samples

    NASA Astrophysics Data System (ADS)

    Dyar, M. Darby; Giguere, Stephen; Carey, CJ; Boucher, Thomas

    2016-12-01

    This project examines the causes, effects, and optimization of continuum removal in laser-induced breakdown spectroscopy (LIBS) to produce the best possible prediction accuracy of elemental composition in geological samples. We compare prediction accuracy resulting from several different techniques for baseline removal, including asymmetric least squares (ALS), adaptive iteratively reweighted penalized least squares (Air-PLS), fully automatic baseline correction (FABC), continuous wavelet transformation, median filtering, polynomial fitting, the iterative thresholding Dietrich method, convex hull/rubber band techniques, and a newly developed custom baseline removal (custom BLR) technique. We assess the predictive performance of these methods using partial least-squares analysis for 13 elements of geological interest, expressed as the weight percentages of SiO2, Al2O3, TiO2, FeO, MgO, CaO, Na2O, K2O, and the parts per million concentrations of Ni, Cr, Zn, Mn, and Co. We find that previously published methods for baseline subtraction generally produce equivalent prediction accuracies for major elements. When those pre-existing methods are used, automated optimization of their adjustable parameters is always necessary to wring the best predictive accuracy out of a data set; ideally, it should be done for each individual variable. The new custom BLR technique produces significant improvements in prediction accuracy over existing methods across varying geological data sets, instruments, and analytical conditions. These results also demonstrate the dual objectives of the continuum removal problem: removing a smooth underlying signal to fit individual peaks (univariate analysis) versus using feature selection to select only those channels that contribute to best prediction accuracy for multivariate analyses. Overall, the current practice of using generalized, one-method-fits-all-spectra baseline removal results in poorer predictive performance for all methods. The extra steps needed to optimize baseline removal for each predicted variable and empower multivariate techniques with the best possible input data for optimal prediction accuracy are shown to be well worth the slight increase in necessary computations and complexity.
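
    Among the compared techniques, asymmetric least squares is compact enough to sketch; the following version follows the widely used Eilers-Boelens formulation, in which the smoothness lam and asymmetry p are exactly the kind of adjustable parameters the study says must be optimized per predicted variable.

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import spsolve

        def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
            """Asymmetric least squares baseline (Eilers & Boelens formulation)."""
            n = len(y)
            d = sparse.diags([1, -2, 1], [0, -1, -2], shape=(n, n - 2))
            penalty = lam * d @ d.T                 # second-difference smoothness term
            w = np.ones(n)
            for _ in range(n_iter):
                W = sparse.diags(w)
                z = spsolve((W + penalty).tocsc(), w * y)
                w = np.where(y > z, p, 1 - p)       # down-weight points above the baseline
            return z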

  5. Comparative analysis of a modified Ecolite method, the Colicomplete method, and a most-probable-number method for detecting Escherichia coli in orange juice.

    PubMed

    Durbin, Gregory W; Salter, Robert

    2006-01-01

    The Ecolite High Volume Juice (HVJ) presence-absence method for a 10-ml juice sample was compared with the U.S. Food and Drug Administration Bacteriological Analytical Manual most-probable-number (MPN) method for analysis of artificially contaminated orange juices. Samples were added to Ecolite-HVJ medium and incubated at 35°C for 24 to 48 h. Fluorescent blue results were positive for glucuronidase- and galactosidase-producing microorganisms, specifically indicative of about 94% of Escherichia coli strains. Four strains of E. coli were added to juices at concentrations of 0.21 to 6.8 CFU/ml. Mixtures of enteric bacteria (Enterobacter plus Klebsiella, Citrobacter plus Proteus, or Hafnia plus Citrobacter plus Enterobacter) were added to simulate background flora. Three orange juice types were evaluated (n = 10) with and without the addition of the E. coli strains. Ecolite-HVJ produced 90 of 90 (10 of 10 samples of three juice types, each inoculated with three different E. coli strains) positive (blue-fluorescent) results for samples artificially contaminated with E. coli at MPN concentrations of <0.3 to 9.3 CFU/ml. Ten of 30 E. coli ATCC 11229 samples with MPN concentrations of <0.3 CFU/ml were identified as positive with Ecolite-HVJ. Isolated colonies recovered from positive Ecolite-HVJ samples were confirmed biochemically as E. coli. Thirty (10 samples each of three juice types) negative (not fluorescent) results were obtained for samples contaminated with only enteric bacteria and for uninoculated control samples. A juice manufacturer evaluated citrus juice production with both the Ecolite-HVJ and Colicomplete methods and recorded identical negative results for 95 20-ml samples and identical positive results for 5 20-ml samples artificially contaminated with E. coli. The Ecolite-HVJ method requires no preenrichment and subsequent transfer steps, which makes it a simple and easy method for use by juice producers.

  6. Process for structural geologic analysis of topography and point data

    DOEpatents

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes that represent underlying geologic structure. Point data such as fracture phenomena which can be related to fracture planes in 3-dimensional space can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.

  7. A full potential inverse method based on a density linearization scheme for wing design

    NASA Technical Reports Server (NTRS)

    Shankar, V.

    1982-01-01

    A mixed analysis-inverse procedure based on the full potential equation in conservation form was developed to recontour a given base wing to produce a prescribed pressure distribution, using a density linearization scheme in applying the pressure boundary condition in terms of the velocity potential. The FLO30 finite volume analysis code was modified to include the inverse option. The new surface shape information, associated with the modified pressure boundary condition, is calculated at a constant span station based on a mass flux integration. The inverse method is shown to recover the original shape when the analysis pressure is not altered. Inverse calculations for weakening a strong shock system and for a laminar flow control (LFC) pressure distribution are presented. Two methods for a trailing edge closure model are proposed for further study.

  8. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  9. Velopharyngeal Status of Stop Consonants and Vowels Produced by Young Children with and without Repaired Cleft Palate at 12, 14, and 18 Months of Age: A Preliminary Analysis

    ERIC Educational Resources Information Center

    Eshghi, Marziye; Vallino, Linda D.; Baylis, Adriane L.; Preisser, John S.; Zajac, David J.

    2017-01-01

    Purpose: The objective was to determine velopharyngeal (VP) status of stop consonants and vowels produced by young children with repaired cleft palate (CP) and typically developing (TD) children from 12 to 18 months of age. Method: Nasal ram pressure (NRP) was monitored in 9 children (5 boys, 4 girls) with repaired CP with or without cleft lip and…

  10. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  11. A general panel method for the analysis and design of arbitrary configurations in incompressible flows. [boundary value problem

    NASA Technical Reports Server (NTRS)

    Johnson, F. T.

    1980-01-01

    A method for solving the linear integral equations of incompressible potential flow in three dimensions is presented. Both analysis (Neumann) and design (Dirichlet) boundary conditions are treated in a unified approach to the general flow problem. The method is an influence coefficient scheme which employs source and doublet panels as boundary surfaces. Curved panels possessing singularity strengths that vary as polynomials are used, and all influence coefficients are derived in closed form. These and other features combine to produce an efficient scheme which is not only versatile but eminently suited to the practical realities of a user-oriented environment. A wide variety of numerical results demonstrating the method is presented.

  12. Semi-automatic mapping of geological structures using UAV-based photogrammetric data: An image analysis approach

    NASA Astrophysics Data System (ADS)

    Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven

    2014-08-01

    Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, due to the large volumes of data that can be captured in a short flight, efficient analysis of this data brings new challenges, especially the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, which was generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dip and dip directions. Geological structures (faults, joints and fractures) are first detected from the primary photographic dataset and the equivalent three-dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of the geological structures are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculations using our automated method show a mean ± standard error of 1.9° ± 2.2° and 4.4° ± 2.6°, respectively, relative to field measurements. This shows the potential of using our semi-automated method for accurate and efficient mapping of geological structures, particularly from remote, inaccessible or hazardous sites.
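
    Once a plane has been fitted to a detected structure, its dip and dip direction follow from the plane's unit normal; a minimal sketch under the convention x = east, y = north, z = up (the paper's implementation details are not reproduced here).

        import numpy as np

        def dip_and_dip_direction(normal):
            """Dip (degrees from horizontal) and dip direction (azimuth, degrees
            clockwise from north) of a plane with normal in (x=E, y=N, z=up)."""
            nx, ny, nz = normal / np.linalg.norm(normal)
            if nz < 0:                       # force an upward-pointing normal
                nx, ny, nz = -nx, -ny, -nz
            dip = np.degrees(np.arccos(nz))
            dip_dir = np.degrees(np.arctan2(nx, ny)) % 360.0   # azimuth of the normal's horizontal projection
            return dip, dip_dir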

  13. Dynamics of intracellular processes in live-cell systems unveiled by fluorescence correlation microscopy.

    PubMed

    González Bardeci, Nicolás; Angiolini, Juan Francisco; De Rossi, María Cecilia; Bruno, Luciana; Levi, Valeria

    2017-01-01

    Fluorescence fluctuation-based methods are non-invasive microscopy tools especially suited for the study of dynamical aspects of biological processes. These methods examine spontaneous intensity fluctuations produced by fluorescent molecules moving through the small, femtoliter-sized observation volume defined in confocal and multiphoton microscopes. The quantitative analysis of the intensity trace provides information on the processes producing the fluctuations that include diffusion, binding interactions, chemical reactions and photophysical phenomena. In this review, we present the basic principles of the most widespread fluctuation-based methods, discuss their implementation in standard confocal microscopes and briefly revise some examples of their applications to address relevant questions in living cells. The ultimate goal of these methods in the Cell Biology field is to observe biomolecules as they move, interact with targets and perform their biological action in the natural context. © 2016 IUBMB Life, 69(1):8-15, 2017. © 2016 International Union of Biochemistry and Molecular Biology.
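
    At the core of these methods is the normalized autocorrelation of the intensity trace, G(tau) = <dF(t) dF(t+tau)> / <F>^2, whose decay encodes diffusion and reaction timescales; a direct (non-FFT) numpy sketch:

        import numpy as np

        def fcs_autocorrelation(intensity, max_lag):
            """G(tau) = <dF(t) dF(t+tau)> / <F>^2 for lags 1..max_lag."""
            f = np.asarray(intensity, dtype=float)
            df = f - f.mean()
            denom = f.mean() ** 2
            return np.array([np.mean(df[:-lag] * df[lag:]) / denom
                             for lag in range(1, max_lag + 1)])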

  14. A novel power harmonic analysis method based on Nuttall-Kaiser combination window double spectrum interpolated FFT algorithm

    NASA Astrophysics Data System (ADS)

    Jin, Tao; Chen, Yiyang; Flesch, Rodolfo C. C.

    2017-11-01

    Harmonics pose a great threat to the safe and economical operation of power grids. Therefore, it is critical to detect harmonic parameters accurately in order to design harmonic compensation equipment. The fast Fourier transform (FFT) is widely used for power harmonic analysis. However, the picket-fence (barrier) effect produced by the algorithm itself and the spectrum leakage caused by asynchronous sampling often degrade harmonic analysis accuracy. This paper examines a new approach to harmonic analysis based on deriving modifier formulas for frequency, phase angle, and amplitude, utilizing the Nuttall-Kaiser window double spectrum line interpolation method, which overcomes the shortcomings of traditional FFT harmonic calculations. The proposed approach is verified numerically and experimentally to be accurate and reliable.
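
    The window-plus-interpolation idea can be illustrated with the simpler Hann window, for which the ratio of the two largest bins gives a closed-form frequency correction; the Nuttall-Kaiser combination window of the paper uses its own fitted correction formulas, which are not reproduced here.

        import numpy as np

        def hann_interpolated_frequency(signal, fs):
            """Estimate a tone's frequency via Hann-windowed FFT and two-bin interpolation."""
            n = len(signal)
            spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
            k = int(np.argmax(spectrum[1:-1])) + 1          # peak bin (skip DC and Nyquist)
            side = k + 1 if spectrum[k + 1] >= spectrum[k - 1] else k - 1
            alpha = spectrum[side] / spectrum[k]
            delta = (2 * alpha - 1) / (alpha + 1)           # Hann: fractional-bin offset
            delta = delta if side > k else -delta
            return (k + delta) * fs / n

    For an asynchronously sampled tone, this removes most of the picket-fence bias incurred by reading off the peak bin alone.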

  15. Multivariable confounding adjustment in distributed data networks without sharing of patient-level data.

    PubMed

    Toh, Sengwee; Reichman, Marsha E; Houstoun, Monika; Ding, Xiao; Fireman, Bruce H; Gravel, Eric; Levenson, Mark; Li, Lingling; Moyneur, Erick; Shoaibi, Azadeh; Zornberg, Gwen; Hennessy, Sean

    2013-11-01

    It is increasingly necessary to analyze data from multiple sources when conducting public health safety surveillance or comparative effectiveness research. However, security, privacy, proprietary, and legal concerns often reduce data holders' willingness to share highly granular information. We describe and compare two approaches that do not require sharing of patient-level information to adjust for confounding in multi-site studies. We estimated the risks of angioedema associated with angiotensin-converting enzyme inhibitors (ACEIs), angiotensin receptor blockers (ARBs), and aliskiren in comparison with beta-blockers within Mini-Sentinel, which has created a distributed data system of 18 health plans. To obtain the adjusted hazard ratios (HRs) and 95% confidence intervals (CIs), we performed (i) a propensity score-stratified case-centered logistic regression analysis, a method identical to a stratified Cox regression analysis but needing only aggregated risk set data, and (ii) an inverse variance-weighted meta-analysis, which requires only the site-specific HR and variance. We also performed simulations to further compare the two methods. Compared with beta-blockers, the adjusted HR was 3.04 (95% CI: 2.81, 3.27) for ACEIs, 1.16 (1.00, 1.34) for ARBs, and 2.85 (1.34, 6.04) for aliskiren in the case-centered analysis. The corresponding HRs were 2.98 (2.76, 3.21), 1.15 (1.00, 1.33), and 2.86 (1.35, 6.04) in the meta-analysis. Simulations suggested that the two methods may produce different results under certain analytic scenarios. The case-centered analysis and the meta-analysis produced similar results without the need to share patient-level data across sites in our empirical study, but may provide different results in other study settings. Copyright © 2013 John Wiley & Sons, Ltd.
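
    The meta-analytic arm of the comparison needs only each site's log hazard ratio and its variance; a minimal fixed-effect, inverse-variance sketch:

        import numpy as np

        def inverse_variance_meta(log_hrs, variances):
            """Fixed-effect pooled HR and 95% CI from site-level log HRs and variances."""
            lhr = np.asarray(log_hrs, dtype=float)
            w = 1.0 / np.asarray(variances, dtype=float)
            pooled = np.sum(w * lhr) / np.sum(w)
            se = np.sqrt(1.0 / np.sum(w))
            ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
            return np.exp(pooled), ci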

  16. Analysis of simple 2-D and 3-D metal structures subjected to fragment impact

    NASA Technical Reports Server (NTRS)

    Witmer, E. A.; Stagliano, T. R.; Spilker, R. L.; Rodal, J. J. A.

    1977-01-01

    Theoretical methods were developed for predicting the large-deflection elastic-plastic transient structural responses of metal containment or deflector (C/D) structures intended to cope with rotor burst fragment impact attack. For two-dimensional C/D structures, both finite element and finite difference analysis methods were employed to analyze structural response produced by either prescribed transient loads or fragment impact. For the latter category, two time-wise step-by-step analysis procedures were devised to predict the structural responses resulting from a succession of fragment impacts: the collision force method (CFM), which utilizes an approximate prediction of the force applied to the attacked structure during fragment impact, and the collision imparted velocity method (CIVM), in which the impact-induced velocity increment acquired by a region of the impacted structure near the impact point is computed. The merits and limitations of these approaches are discussed. For the analysis of 3-D responses of C/D structures, only the CIVM approach was investigated.

  17. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
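
    The two internal-validity indices used to choose among cluster solutions are straightforward to compute once labels are in hand; Davies-Bouldin ships with scikit-learn, and a basic Dunn index can be written by hand (the R implementations used in the study may differ in detail).

        import numpy as np
        from scipy.spatial.distance import cdist
        from sklearn.metrics import davies_bouldin_score

        def dunn_index(points, labels):
            """Min inter-cluster distance / max intra-cluster diameter (higher is better)."""
            clusters = [points[labels == c] for c in np.unique(labels)]
            diam = max(cdist(c, c).max() for c in clusters)
            sep = min(cdist(a, b).min()
                      for i, a in enumerate(clusters)
                      for b in clusters[i + 1:])
            return sep / diam

        # db = davies_bouldin_score(points, labels)   # lower is better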

  18. A novel method for qualitative analysis of edible oil oxidation using an electronic nose.

    PubMed

    Xu, Lirong; Yu, Xiuzhu; Liu, Lei; Zhang, Rui

    2016-07-01

    An electronic nose (E-nose) was used for rapid assessment of the degree of oxidation in edible oils. Peroxide and acid values of edible oil samples were analyzed using data obtained by the American Oil Chemists' Society (AOCS) Official Method for reference. Qualitative discrimination between non-oxidized and oxidized oils was conducted using the E-nose technique developed in combination with cluster analysis (CA), principal component analysis (PCA), and linear discriminant analysis (LDA). The results from CA, PCA and LDA indicated that the E-nose technique could be used for differentiation of non-oxidized and oxidized oils. LDA produced slightly better results than CA and PCA. The proposed approach can be used as an alternative to AOCS Official Method as an innovative tool for rapid detection of edible oil oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.
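
    The chemometric pipeline described (PCA for an unsupervised view of the data, LDA for supervised discrimination) maps onto a few scikit-learn calls; the data below are synthetic stand-ins for E-nose sensor responses, not the study's measurements.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        sensor_readings = rng.normal(size=(60, 10))        # hypothetical E-nose responses
        labels = (sensor_readings[:, 0] > 0).astype(int)   # stand-in oxidation labels

        X = StandardScaler().fit_transform(sensor_readings)
        scores = PCA(n_components=2).fit_transform(X)      # unsupervised projection
        lda = LinearDiscriminantAnalysis().fit(X, labels)  # supervised discrimination
        print("LDA training accuracy:", lda.score(X, labels))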

  19. A case study of the sensitivity of forecast skill to data and data analysis techniques

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.

    1983-01-01

    A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.

  20. A complementation assay for in vivo protein structure/function analysis in Physcomitrella patens (Funariaceae)

    DOE PAGES

    Scavuzzo-Duggan, Tess R.; Chaves, Arielle M.; Roberts, Alison W.

    2015-07-14

    Here, a method for rapid in vivo functional analysis of engineered proteins was developed using Physcomitrella patens. A complementation assay was designed for testing structure/function relationships in cellulose synthase (CESA) proteins. The components of the assay include (1) construction of test vectors that drive expression of epitope-tagged PpCESA5 carrying engineered mutations, (2) transformation of a ppcesa5 knockout line that fails to produce gametophores with test and control vectors, (3) scoring the stable transformants for gametophore production, (4) statistical analysis comparing complementation rates for test vectors to positive and negative control vectors, and (5) analysis of transgenic protein expression by Western blotting. The assay distinguished mutations that generate fully functional, nonfunctional, and partially functional proteins. In conclusion, compared with existing methods for in vivo testing of protein function, this complementation assay provides a rapid method for investigating protein structure/function relationships in plants.

  1. SURVEY AND ANALYSIS OF CRACKS ON NATM CONCRETE LINING, AND A STUDY OF THE METHOD TO CONTROL CRACKS GENERATION

    NASA Astrophysics Data System (ADS)

    Takayama, Hirofumi; Masuda, Yasuo; Nakayama, Takashi; Shigeta, Yoshiyuki; Yingyograttanakul, Narentorn; Asakura, Toshihiro

    Concrete linings constructed by NATM often develop longitudinal cracks near the tunnel crown. Based on the results of 1/4-scale model tests, the authors have shown that in order to simulate the mechanism of crack generation correctly, numerical analysis procedures must include not only a coupled stress-thermal analysis but also a coupled stress-moisture analysis. We surveyed the strain produced inside the secondary lining concrete and the progression of cracks in a real tunnel in service on the Shinkansen, and we show that the combination of coupled stress-thermal and stress-moisture analyses can represent them. Further, we propose a method to control crack generation by adjusting temperature and humidity.

  2. Development of Advanced Materials for Electro-Ceramic Application Final Report CRADA No. TC-1331-96

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caplan, M.; Olstad, R.; McMillan, L.

    The goal of this project was to further develop and characterize the electrochemical methods originating in Russia for producing ultra-high-purity organometallic compounds utilized as precursors in the production of high-quality electro-ceramic materials. Symetrix planned to use electro-ceramic materials with high dielectric constant for microelectronic memory circuit applications. General Atomics planned to use the barium titanate type ceramics with low loss tangent for producing a high-power ferroelectric tuner used to match radio frequency power into their DIII-D fusion machine. Phase I of the project was scheduled to have a large number of organometallic (alkoxide) chemical samples produced using various methods. These would be analyzed by LLNL, Soliton and Symetrix independently to determine the level of chemical impurities, thus verifying each other's analysis. The goal was to demonstrate a cost-effective production method which could be implemented in a large commercial facility to produce high-purity organometallic compounds. In addition, various compositions of barium-strontium-titanate ceramics were to be produced and analyzed in order to develop an electro-ceramic capacitor material having the desired characteristics with respect to dielectric constant, loss tangent, temperature characteristics and non-linear behavior under applied voltage. Upon optimizing the barium titanate material, 50 capacitor preforms would be produced from this material, demonstrating the ability to produce, in quantity, the pills ultimately required for the ferroelectric tuner (approx. 2000-3000 ceramic pills).

  3. Review and statistical analysis of the use of ultrasonic velocity for estimating the porosity fraction in polycrystalline materials

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Swickard, S. M.; Stang, D. B.; Deguire, M. R.

    1991-01-01

    A review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials is presented. Initially, a semiempirical model is developed showing the origin of the linear relationship between ultrasonic velocity and porosity fraction. Then, from a compilation of data produced by many researchers, scatter plots of velocity versus percent porosity data are shown for Al2O3, MgO, porcelain-based ceramics, PZT, SiC, Si3N4, steel, tungsten, UO2,(U0.30Pu0.70)C, and YBa2Cu3O(7-x). Linear regression analysis produces predicted slope, intercept, correlation coefficient, level of significance, and confidence interval statistics for the data. Velocity values predicted from regression analysis of fully-dense materials are in good agreement with those calculated from elastic properties.

  4. Comparison of GEOS-5 AGCM planetary boundary layer depths computed with various definitions

    NASA Astrophysics Data System (ADS)

    McGrath-Spangler, E. L.; Molod, A.

    2014-07-01

    Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Köppen-Geiger climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number methods are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.
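
    The bulk Richardson number definition recommended in the conclusion diagnoses the PBL top as the lowest level where Ri_b exceeds a critical value; the sketch below assumes the common threshold of 0.25, which may differ from the GEOS-5 setting.

        import numpy as np

        def bulk_richardson_pbl_height(z, theta_v, u, v, ri_crit=0.25):
            """Lowest height where the bulk Richardson number exceeds ri_crit.

            z: heights above ground (m); theta_v: virtual potential temperature (K);
            u, v: wind components (m/s). Index 0 is the lowest model level."""
            g = 9.81
            wind_sq = u ** 2 + v ** 2
            wind_sq = np.where(wind_sq < 1e-6, 1e-6, wind_sq)   # avoid divide-by-zero
            ri_b = g * (theta_v - theta_v[0]) * (z - z[0]) / (theta_v[0] * wind_sq)
            above = np.where(ri_b > ri_crit)[0]
            return z[above[0]] if above.size else z[-1]

    The same function applies directly to radiosonde profiles, which is part of what makes this definition convenient for model-observation comparison.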

  5. Computing and Applying Atomic Regulons to Understand Gene Expression and Regulation

    DOE PAGES

    Faria, José P.; Davis, James J.; Edirisinghe, Janaka N.; ...

    2016-11-24

    Understanding gene function and regulation is essential for the interpretation, prediction, and ultimate design of cell responses to changes in the environment. A multitude of technologies, abstractions, and interpretive frameworks have emerged to answer the challenges presented by genome function and regulatory network inference. Here, we propose a new approach for producing biologically meaningful clusters of coexpressed genes, called Atomic Regulons (ARs), based on expression data, gene context, and functional relationships. We demonstrate this new approach by computing ARs for Escherichia coli, which we compare with the coexpressed gene clusters predicted by two prevalent existing methods: hierarchical clustering and k-means clustering. We test the consistency of ARs predicted by all methods against expected interactions predicted by the Context Likelihood of Relatedness (CLR) mutual information based method, finding that the ARs produced by our approach show better agreement with CLR interactions. We then apply our method to compute ARs for four other genomes: Shewanella oneidensis, Pseudomonas aeruginosa, Thermus thermophilus, and Staphylococcus aureus. We compare the AR clusters from all genomes to study the similarity of coexpression among a phylogenetically diverse set of species, identifying subsystems that show remarkable similarity over wide phylogenetic distances. We also study the sensitivity of our method for computing ARs to the expression data used in the computation, showing that our new approach requires less data than competing approaches to converge to a near final configuration of ARs. We go on to use our sensitivity analysis to identify the specific experiments that lead most rapidly to the final set of ARs for E. coli. As a result, this analysis produces insights into improving the design of gene expression experiments.

  6. Computing and Applying Atomic Regulons to Understand Gene Expression and Regulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faria, José P.; Davis, James J.; Edirisinghe, Janaka N.

    Understanding gene function and regulation is essential for the interpretation, prediction, and ultimate design of cell responses to changes in the environment. A multitude of technologies, abstractions, and interpretive frameworks have emerged to answer the challenges presented by genome function and regulatory network inference. Here, we propose a new approach for producing biologically meaningful clusters of coexpressed genes, called Atomic Regulons (ARs), based on expression data, gene context, and functional relationships. We demonstrate this new approach by computing ARs for Escherichia coli, which we compare with the coexpressed gene clusters predicted by two prevalent existing methods: hierarchical clustering and k-means clustering. We test the consistency of ARs predicted by all methods against expected interactions predicted by the Context Likelihood of Relatedness (CLR) mutual information based method, finding that the ARs produced by our approach show better agreement with CLR interactions. We then apply our method to compute ARs for four other genomes: Shewanella oneidensis, Pseudomonas aeruginosa, Thermus thermophilus, and Staphylococcus aureus. We compare the AR clusters from all genomes to study the similarity of coexpression among a phylogenetically diverse set of species, identifying subsystems that show remarkable similarity over wide phylogenetic distances. We also study the sensitivity of our method for computing ARs to the expression data used in the computation, showing that our new approach requires less data than competing approaches to converge to a near final configuration of ARs. We go on to use our sensitivity analysis to identify the specific experiments that lead most rapidly to the final set of ARs for E. coli. As a result, this analysis produces insights into improving the design of gene expression experiments.

  7. The relationship between the structures of four beta-lactamases obtained from Bacillus cereus.

    PubMed

    Cid, H; Carrillo, O; Bunster, M; Martínez, J; Vargas, V

    1988-06-01

    Bacillus cereus has proved to be one of the most interesting microorganisms in the study of beta-lactamases. It secretes these enzymes very efficiently and, frequently, in multiple forms. Three different forms are produced by strain 569/H; mutant 5/B of the same microorganism is constitutive for the secretion of beta-lactamases I and II. The present study, based on secondary structure prediction by two independent methods, establishes the relationship among the structures of beta-lactamases I, II and III produced by B. cereus 569/H and beta-lactamase I from strain 5/B of this microorganism. A strong similarity is also established between the type III enzyme of B. cereus and the type I enzyme produced by B. licheniformis, which could have an evolutionary explanation. A structural analysis of the leader peptide regions of these enzymes by the method of Mohana and Argos is also reported.

  8. Overview of selected surrogate technologies for high-temporal resolution suspended-sediment monitoring

    USGS Publications Warehouse

    Gray, John R.; Gartner, Jeffrey W.

    2010-01-01

    Traditional methods for characterizing selected properties of suspended sediments in rivers are being augmented and in some cases replaced by cost-effective surrogate instruments and methods that produce a temporally dense time series of quantifiably accurate data for use primarily in sediment-flux computations. Turbidity is the most common such surrogate technology, and the first to be sanctioned by the U.S. Geological Survey for use in producing data used in concert with water-discharge data to compute sediment concentrations and fluxes for storage in the National Water Information System. Other technologies, including laser-diffraction, digital photo-optic, acoustic-attenuation and backscatter, and pressure-difference techniques are being evaluated for producing reliable sediment concentration and, in some cases, particle-size distribution data. Each technology addresses a niche for sediment monitoring. Their performances range from compelling to disappointing. Some of these technologies have the potential to revolutionize fluvial-sediment data collection, analysis, and availability.

  9. Method and Apparatus for the Portable Identification of Material Thickness and Defects Using Spatially Controlled Heat Application

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott (Inventor); Winfree, William P. (Inventor)

    1999-01-01

    A method and a portable apparatus for the nondestructive identification of defects in structures. The apparatus comprises a heat source and a thermal imager that move at a constant speed past a test surface of a structure. The thermal imager is offset at a predetermined distance from the heat source. The heat source induces a constant surface temperature. The imager follows the heat source and produces a video image of the thermal characteristics of the test surface. Material defects produce deviations from the constant surface temperature that move at the inverse of the constant speed. Thermal noise produces deviations that move at random speed. Computer averaging of the digitized thermal image data with respect to the constant speed minimizes noise and improves the signal of valid defects. The motion of the thermographic equipment coupled with the high signal-to-noise ratio renders it suitable for portable, on-site analysis.

  10. Freud: a software suite for high-throughput simulation analysis

    NASA Astrophysics Data System (ADS)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  11. Analysis and testing of a new method for drop size measurement using laser scatter interferometry

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Research was conducted on a laser light scatter detection method for measuring the size and velocity of spherical particles. The method is based upon the measurement of the interference fringe pattern produced by spheres passing through the intersection of two laser beams. A theoretical analysis of the method was carried out using the geometrical optics theory. Experimental verification of the theory was obtained by using monodisperse droplet streams. Several optical configurations were tested to identify all of the parametric effects upon the size measurements. Both off-axis forward and backscatter light detection were utilized. Simulated spray environments and fuel spray nozzles were used in the evaluation of the method. The measurements of the monodisperse drops showed complete agreement with the theoretical predictions. The method was demonstrated to be independent of the beam intensity and extinction resulting from the surrounding drops. Signal processing concepts were considered and a method was selected for development.

  12. Description of recovery method used for curdlan produced by Agrobacterium sp. IFO 13140 and its relation to the morphology and physicochemical and technological properties of the polysaccharide

    PubMed Central

    Mangolim, Camila Sampaio; da Silva, Thamara Thaiane; Fenelon, Vanderson Carvalho; Koga, Luciana Numata; Ferreira, Sabrina Barbosa de Souza; Bruschi, Marcos Luciano; Matioli, Graciette

    2017-01-01

    Curdlan is a linear polysaccharide considered a dietary fiber, with gelation properties. This study evaluated the structure, morphology and the physicochemical and technological properties of curdlan produced by Agrobacterium sp. IFO 13140 recovered by pre-gelation and precipitation methods. Commercial curdlan, either submitted to the pre-gelation process or not, was also evaluated. The data obtained from structural analysis revealed a similarity between the curdlan produced by Agrobacterium sp. IFO 13140 (recovered by both methods) and the commercial curdlans. The results showed that the curdlans evaluated differed significantly in terms of dispersibility and gelation, and only the pre-gelled ones had significant potential for food application, because this method influences the size of the particles and their behaviour in the presence of NaCl. In terms of technological properties, the curdlan produced by Agrobacterium sp. IFO 13140 (pre-gelation method) had a greater water and oil holding capacity (64% and 98% greater, respectively) and a greater thickening capacity than the pre-gelled commercial curdlan. The pre-gelled commercial curdlan displayed a greater gelling capacity at 95°C than the others. When applied to food, only the pre-gelled curdlans improved the texture parameters of yogurts and reduced syneresis. The curdlan gels, which are rigid and stable in structure, demonstrated potential for improving the texture of food products, with potential industrial use. PMID:28245244

  13. Description of recovery method used for curdlan produced by Agrobacterium sp. IFO 13140 and its relation to the morphology and physicochemical and technological properties of the polysaccharide.

    PubMed

    Mangolim, Camila Sampaio; Silva, Thamara Thaiane da; Fenelon, Vanderson Carvalho; Koga, Luciana Numata; Ferreira, Sabrina Barbosa de Souza; Bruschi, Marcos Luciano; Matioli, Graciette

    2017-01-01

    Curdlan is a linear polysaccharide considered a dietary fiber, with gelation properties. This study evaluated the structure, morphology and the physicochemical and technological properties of curdlan produced by Agrobacterium sp. IFO 13140 recovered by pre-gelation and precipitation methods. Commercial curdlan, either submitted to the pre-gelation process or not, was also evaluated. The data obtained from structural analysis revealed a similarity between the curdlan produced by Agrobacterium sp. IFO 13140 (recovered by both methods) and the commercial curdlans. The results showed that the curdlans evaluated differed significantly in terms of dispersibility and gelation, and only the pre-gelled ones had significant potential for food application, because this method influences the size of the particles and their behaviour in the presence of NaCl. In terms of technological properties, the curdlan produced by Agrobacterium sp. IFO 13140 (pre-gelation method) had a greater water and oil holding capacity (64% and 98% greater, respectively) and a greater thickening capacity than the pre-gelled commercial curdlan. The pre-gelled commercial curdlan displayed a greater gelling capacity at 95°C than the others. When applied to food, only the pre-gelled curdlans improved the texture parameters of yogurts and reduced syneresis. The curdlan gels, which are rigid and stable in structure, demonstrated potential for improving the texture of food products, with potential industrial use.

  14. Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number.

    PubMed

    Fragkos, Konstantinos C; Tsagris, Michail; Frangos, Christos C

    2014-01-01

    The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is widely used by researchers, its statistical properties are largely unexplored. First, we developed statistical theory that allowed us to produce confidence intervals for Rosenthal's fail-safe number, discerning whether the number of studies analysed in a meta-analysis is fixed or random; each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator.
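
    For context, Rosenthal's fail-safe number itself is straightforward to compute under the usual one-tailed formulation; a minimal sketch follows (the study z-values are made up for illustration).

```python
# Rosenthal's fail-safe number: how many unpublished null studies would be
# needed to raise the combined one-tailed p-value above alpha.
import numpy as np
from scipy.stats import norm

def fail_safe_n(z_values, alpha=0.05):
    """N_fs = (sum of z)^2 / z_alpha^2 - k (Rosenthal, 1979)."""
    k = len(z_values)
    z_alpha = norm.ppf(1 - alpha)          # 1.645 for alpha = 0.05
    return (np.sum(z_values) ** 2) / z_alpha ** 2 - k

z = np.array([2.1, 1.8, 2.5, 1.6, 2.9])   # hypothetical study z-scores
print(round(fail_safe_n(z), 1))           # ~ 38.9 additional null studies
```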

  15. Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number

    PubMed Central

    Fragkos, Konstantinos C.; Tsagris, Michail; Frangos, Christos C.

    2014-01-01

    The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is widely used by researchers, its statistical properties are largely unexplored. First, we developed statistical theory that allowed us to produce confidence intervals for Rosenthal's fail-safe number, discerning whether the number of studies analysed in a meta-analysis is fixed or random; each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator. PMID:27437470

  16. A techno-economic analysis of polyhydroxyalkanoate and hydrogen production from syngas fermentation of gasified biomass.

    PubMed

    Choi, DongWon; Chipman, David C; Bents, Scott C; Brown, Robert C

    2010-02-01

    A techno-economic analysis was conducted to investigate the feasibility of a gasification-based hybrid biorefinery producing both hydrogen gas and polyhydroxyalkanoates (PHA), biodegradable polymer materials that can be an attractive substitute for conventional petrochemical plastics. The biorefinery considered used switchgrass as a feedstock and converted that raw material through thermochemical methods into syngas, a gaseous mixture composed mainly of hydrogen and carbon monoxide. The syngas was then fermented using Rhodospirillum rubrum, a purple non-sulfur bacterium, to produce PHA and to enrich hydrogen in the syngas. Total daily production of the biorefinery was assumed to be 12 Mg of PHA and 50 Mg of hydrogen gas. Grassroots capital for the biorefinery was estimated to be $55 million, with annual operating costs at $6.7 million. With a market value of $2.00/kg assumed for the hydrogen, the cost of producing PHA was determined to be $1.65/kg.

  17. A technique for estimating the absolute gain of a photomultiplier tube

    NASA Astrophysics Data System (ADS)

    Takahashi, M.; Inome, Y.; Yoshii, S.; Bamba, A.; Gunji, S.; Hadasch, D.; Hayashida, M.; Katagiri, H.; Konno, Y.; Kubo, H.; Kushida, J.; Nakajima, D.; Nakamori, T.; Nagayoshi, T.; Nishijima, K.; Nozaki, S.; Mazin, D.; Mashuda, S.; Mirzoyan, R.; Ohoka, H.; Orito, R.; Saito, T.; Sakurai, S.; Takeda, J.; Teshima, M.; Terada, Y.; Tokanai, F.; Yamamoto, T.; Yoshida, T.

    2018-06-01

    Detection of low-intensity light relies on the conversion of photons to photoelectrons, which are then multiplied and detected as an electrical signal. To measure the actual intensity of the light, one must know the factor by which the photoelectrons have been multiplied. To obtain this amplification factor, we have developed a procedure for estimating precisely the signal caused by a single photoelectron. The method utilizes the fact that the photoelectrons conform to a Poisson distribution. The average signal produced by a single photoelectron can then be estimated from the number of noise events, without requiring analysis of the distribution of the signal produced by a single photoelectron. The signal produced by one or more photoelectrons can be estimated experimentally without any assumptions. This technique, and an example of the analysis of a signal from a photomultiplier tube, are described in this study.
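
    A minimal sketch of the Poisson argument described above, assuming pedestal (noise-only) events can be counted with a simple charge threshold; the gain, light level, and noise width are illustrative values.

```python
# For a Poisson light source, P(0 photoelectrons) = exp(-mu), so the mean
# photoelectron count mu follows from the fraction of pedestal events alone,
# and the single-photoelectron signal is mean(charge) / mu.
import numpy as np

rng = np.random.default_rng(1)
gain_true, mu_true, noise_sd = 4.0e6, 1.5, 1.0e5    # illustrative values
n_pe = rng.poisson(mu_true, size=100_000)           # photoelectrons per flash
charge = n_pe * gain_true + rng.normal(0, noise_sd, n_pe.size)

pedestal_fraction = np.mean(charge < 5 * noise_sd)  # noise-only events
mu_est = -np.log(pedestal_fraction)
gain_est = charge.mean() / mu_est                   # mean charge per p.e.
print(mu_est, gain_est)                             # ~1.5 and ~4e6
```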

  18. Evaluation of a global algorithm for wavefront reconstruction for Shack-Hartmann wave-front sensors and thick fundus reflectors.

    PubMed

    Liu, Tao; Thibos, Larry; Marin, Gildas; Hernandez, Martha

    2014-01-01

    Conventional aberration analysis by a Shack-Hartmann aberrometer is based on the implicit assumption that an injected probe beam reflects from a single fundus layer. In fact, the biological fundus is a thick reflector and therefore conventional analysis may produce errors of unknown magnitude. We developed a novel computational method to investigate this potential failure of conventional analysis. The Shack-Hartmann wavefront sensor was simulated by computer software and used to recover by two methods the known wavefront aberrations expected from a population of normally aberrated human eyes and bi-layer fundus reflection. The conventional method determines the centroid of each spot in the SH data image, from which wavefront slopes are computed for least-squares fitting with derivatives of Zernike polynomials. The novel 'global' method iteratively adjusted the aberration coefficients derived from conventional centroid analysis until the SH image, when treated as a unitary picture, optimally matched the original data image. Both methods recovered higher order aberrations accurately and precisely, but only the global algorithm correctly recovered the defocus coefficients associated with each layer of fundus reflection. The global algorithm accurately recovered Zernike coefficients for mean defocus and bi-layer separation with maximum error <0.1%. The global algorithm was robust for bi-layer separation up to 2 dioptres for a typical SH wavefront sensor design. For 100 randomly generated test wavefronts with 0.7 D axial separation, the retrieved mean axial separation was 0.70 D with a standard deviation (S.D.) of 0.002 D. Sufficient information is contained in SH data images to measure the dioptric thickness of dual-layer fundus reflection. The global algorithm is superior since it successfully recovered the focus value associated with both fundus layers even when their separation was too small to produce clearly separated spots, while the conventional analysis misrepresents the defocus component of the wavefront aberration as the mean defocus for the two reflectors. Our novel global algorithm is a promising method for SH data image analysis in clinical and visual optics research for human and animal eyes. © 2013 The Authors Ophthalmic & Physiological Optics © 2013 The College of Optometrists.
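
    For comparison with the conventional analysis described above, here is a minimal sketch of the centroid-slope-to-Zernike least-squares step only (not the authors' global algorithm); the pupil sampling, the two-mode basis, and the noise level are illustrative.

```python
# Least-squares fit of Zernike coefficients to Shack-Hartmann spot slopes.
import numpy as np

rng = np.random.default_rng(6)
x, y = rng.uniform(-1, 1, (2, 200))
keep = x**2 + y**2 <= 1; x, y = x[keep], y[keep]     # unit-pupil lenslets

# Derivatives of two Zernike modes (Noll indexing): defocus
# Z4 = sqrt(3)(2r^2 - 1) and astigmatism Z6 = sqrt(6)(x^2 - y^2).
dZ = np.array([[4*np.sqrt(3)*x, 4*np.sqrt(3)*y],
               [2*np.sqrt(6)*x, -2*np.sqrt(6)*y]])   # (mode, [d/dx, d/dy], pts)

c_true = np.array([0.5, -0.2])                       # coefficients, illustrative
slopes = np.concatenate([c_true @ dZ[:, 0], c_true @ dZ[:, 1]])
slopes += rng.normal(0, 0.01, slopes.size)           # centroiding noise

A = np.concatenate([dZ[:, 0].T, dZ[:, 1].T])         # design matrix
c_est, *_ = np.linalg.lstsq(A, slopes, rcond=None)
print(c_est)                                         # ~ [0.5, -0.2]
```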

  19. Psychological Flexibility, ACT, and Organizational Behavior

    ERIC Educational Resources Information Center

    Bond, Frank W.; Hayes, Steven C.; Barnes-Holmes, Dermot

    2006-01-01

    This paper offers organizational behavior management (OBM) a behavior analytically consistent way to expand its analysis of, and methods for changing, organizational behavior. It shows how Relational Frame Theory (RFT) suggests that common, problematic, psychological processes emerge from language itself, and they produce psychological…

  20. Experimental field test of proposed pedestrian safety messages. Volume 1, Methods and materials development

    DOT National Transportation Integrated Search

    1983-11-01

    Author's abstract: A detailed re-analysis of available pedestrian accident data was utilized to define three sets of pedestrian safety public information and education (PI&E) messages. These messages were then produced and field tested. The objective...

  1. Project for the analysis of technology transfer

    NASA Technical Reports Server (NTRS)

    Kottenstette, J. P.; Freeman, J. E.; Staskin, E. R.

    1971-01-01

    The special task of preparing technology transfer profiles during the first six months of 1971 produced two major results: refining a new method for identifying and describing technology transfer activities, and generating practical insights into a number of issues associated with transfer programs.

  2. A Primer On Consumer Marketing Research, Procedures, Methods, And Tools

    DOT National Transportation Integrated Search

    1994-03-01

    The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding user response and market ...

  3. A novel combined approach of diffuse reflectance UV-Vis-NIR spectroscopy and multivariate analysis for non-destructive examination of blue ballpoint pen inks in forensic application.

    PubMed

    Kumar, Raj; Sharma, Vishal

    2017-03-15

    The present research is focused on the analysis of writing inks using destructive UV-Vis spectroscopy (dissolution of ink by the solvent) and non-destructive diffuse reflectance UV-Vis-NIR spectroscopy, along with chemometrics. Fifty-seven samples of blue ballpoint pen inks were analyzed under optimum conditions to determine the differences in spectral features of inks from the same and different manufacturers. Normalization was performed on the spectroscopic data before chemometric analysis. Principal Component Analysis (PCA) and K-means cluster analysis were used on the data to ascertain whether the blue ballpoint pen inks could be differentiated by their UV-Vis/UV-Vis-NIR spectra. The discriminating power is calculated qualitatively by visual comparison of the spectra (absorbance peaks) produced by the destructive and non-destructive methods; in the latter two methods, the pairwise comparison is made by incorporating the clustering method. It is found that the chemometric methods provide better discriminating power (98.72% and 99.46% in the destructive and non-destructive cases, respectively) in comparison to the qualitative analysis (69.67%). Copyright © 2016 Elsevier B.V. All rights reserved.
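
    A minimal sketch of the chemometric pipeline described above (normalisation, PCA, then K-means clustering), with the discriminating power taken as the fraction of sample pairs assigned to different clusters; the spectra and the cluster count are random stand-ins.

```python
# Normalise spectra, reduce with PCA, cluster with K-means, then compute a
# pairwise discriminating power over all ink pairs.
import numpy as np
from sklearn.preprocessing import normalize
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
spectra = rng.random((57, 600))            # 57 inks x 600 wavelength points

X = normalize(spectra)                      # row-wise normalisation
scores = PCA(n_components=5).fit_transform(X)
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(scores)

# Discriminating power: fraction of ink pairs placed in different clusters.
n = len(labels)
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
dp = np.mean([labels[i] != labels[j] for i, j in pairs])
print(f"discriminating power: {dp:.2%}")
```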

  4. Producing intricate IPMC shapes by means of spray-painting and printing (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Trabia, Sarah; Olsen, Zakai; Hwang, Taeseon; Kim, Kwang Jin

    2017-04-01

    Ionic Polymer-Metal Composites (IPMC) are common soft actuators that are Nafion® based and plated with a conductive metal, such as platinum, gold, or palladium. Nafion® is available in three forms: sheets, pellets, and water dispersion. Nafion® sheets can be cut to the desired dimensions and are best for rectangular IPMCs. However, the user is not able to change the thickness of these sheets by stacking and melting because Nafion® does not melt. A solution to this is Nafion® pellets, which can melt. These can be used for extrusion and injection molding. Though Nafion® pellets can be melted, they are difficult to work with, making the process quite challenging to master. The last form is Nafion® Water Dispersion, which can be used for casting. Casting can produce the desired thickness, but it does not solve the problem of achieving complex contours. The current methods of fabrication do not allow for complex shapes and structures. To solve this problem, two methods are presented: painting and printing. The painting method uses Nafion® Water Dispersion, an airbrush, and vinyl stencils. The stencils can be made into any shape with detailed edges. The printing method uses Nafion® pellets that are extruded into filaments and a commercially available 3D printer. The models are drawn in a Computer-Aided Drawing (CAD) program, such as SolidWorks. The produced Nafion® membranes will be compared with a commercial Nafion® membrane through a variety of tests, including Fourier Transform Infrared Spectroscopy, Scanning Electron Microscope, Thermogravimetric Analysis, Dynamic Mechanical Analysis, and Optical Microscope.

  5. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

    This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce of C_l = -1.377 in comparison to C_l = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag of C_d = 0.115 in comparison to C_d = 0.143 for the original wing.

  6. Analysis of the heavy oil production technology effectiveness using natural thermal convection with heat agent recirculation method in reservoirs with varying initial water saturation

    NASA Astrophysics Data System (ADS)

    Osnos, V. B.; Kuneevsky, V. V.; Larionov, V. M.; Saifullin, E. R.; Gainetdinov, A. V.; Vankov, Yu V.; Larionova, I. V.

    2017-01-01

    The method of natural thermal convection with heat agent recirculation (NTC HAR) in oil reservoirs is described. The effectiveness of this method for heating oil reservoirs with initial water saturation values from 0 to 0.5 is analysed. The Ashalchinskoye oil field is taken as the test case, and CMG STARS software was used for the calculations. The dynamics of cumulative production, recovery factor, and specific energy consumption per 1 m3 of crude oil produced when the heat exchanger with heat agent is applied are determined for the different initial water saturations and presented as graphs.

  7. Time-series RNA-seq analysis package (TRAP) and its application to the analysis of rice, Oryza sativa L. ssp. Japonica, upon drought stress.

    PubMed

    Jo, Kyuri; Kwon, Hawk-Bin; Kim, Sun

    2014-06-01

    Measuring expression levels of genes at the whole genome level can be useful for many purposes, especially for revealing biological pathways underlying specific phenotype conditions. When gene expression is measured over a time period, we have opportunities to understand how organisms react to stress conditions over time. Thus many biologists routinely measure whole genome level gene expressions at multiple time points. However, there are several technical difficulties in analyzing such whole genome expression data. In addition, gene expression is now often measured by RNA-sequencing rather than microarray technologies, which makes the analysis considerably more complicated: the process must start with mapping short reads and ultimately produce differentially activated pathways, and possibly interactions among pathways. Moreover, many useful tools for analyzing microarray gene expression data are not applicable to RNA-seq data. Thus a comprehensive package for analyzing time series transcriptome data is much needed. In this article, we present a comprehensive package, Time-series RNA-seq Analysis Package (TRAP), integrating all necessary tasks such as mapping short reads, measuring gene expression levels, finding differentially expressed genes (DEGs), clustering and pathway analysis for time-series data in a single environment. In addition to implementing useful algorithms that are not available for RNA-seq data, we extended the existing pathway analysis methods ORA and SPIA for time series analysis and estimate statistical values for the combined dataset by an advanced metric. TRAP also produces a visual summary of pathway interactions. Gene expression change labeling, a practical clustering method used in TRAP, enables more accurate interpretation of the data when combined with pathway analysis. We applied our methods to a real dataset for the analysis of rice (Oryza sativa L. Japonica nipponbare) upon drought stress. The result showed that TRAP was able to detect pathways more accurately than several existing methods. TRAP is available at http://biohealth.snu.ac.kr/software/TRAP/. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Establishing Substantial Equivalence: Proteomics

    NASA Astrophysics Data System (ADS)

    Lovegrove, Alison; Salt, Louise; Shewry, Peter R.

    Wheat is a major crop in world agriculture and is consumed after processing into a range of food products. It is therefore of great importance to determine the consequences (intended and unintended) of transgenesis in wheat and whether genetically modified lines are substantially equivalent to those produced by conventional plant breeding. Proteomic analysis is one of several approaches which can be used to address these questions. Two-dimensional PAGE (2D PAGE) remains the most widely available method for proteomic analysis, but is notoriously difficult to reproduce between laboratories. We therefore describe methods which have been developed as standard operating procedures in our laboratory to ensure the reproducibility of proteomic analyses of wheat using 2D PAGE analysis of grain proteins.

  9. [Survival analysis with competing risks: estimating failure probability].

    PubMed

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
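
    A minimal sketch of the comparison described above, using the lifelines library (assuming its KaplanMeierFitter and AalenJohansenFitter interfaces); the transplant-like event times are simulated, not the paper's data.

```python
# Naive Kaplan-Meier treats competing deaths as censoring and overstates the
# rejection risk; a competing-risks (Aalen-Johansen) estimator does not.
import numpy as np
from lifelines import KaplanMeierFitter, AalenJohansenFitter

rng = np.random.default_rng(3)
n = 500
t_reject = rng.exponential(8.0, n)          # time to chronic rejection
t_death = rng.exponential(10.0, n)          # time to death (competing risk)
time = np.minimum(t_reject, t_death)
event = np.where(t_reject < t_death, 1, 2)  # 1 = rejection, 2 = death first

km = KaplanMeierFitter().fit(time, event_observed=(event == 1))
aj = AalenJohansenFitter().fit(time, event, event_of_interest=1)

print(1 - km.survival_function_.iloc[-1, 0])  # KM "risk": overestimated
print(aj.cumulative_density_.iloc[-1, 0])     # cumulative incidence: lower
```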

  10. Science on TeacherTube: A Mixed Methods Analysis of Teacher Produced Video

    NASA Astrophysics Data System (ADS)

    Chmiel, Margaret (Marjee)

    Increased bandwidth, inexpensive video cameras and easy-to-use video editing software have made social media sites featuring user generated video (UGV) an increasingly popular vehicle for online communication. As such, UGV have come to play a role in education, both formal and informal, but there has been little research on this topic in scholarly literature. In this mixed-methods study, a content and discourse analysis are used to describe the most successful UGV in the science channel of an education-focused site called TeacherTube. The analysis finds that state achievement tests, and their focus on vocabulary and recall-level knowledge, drive much of the content found on TeacherTube.

  11. Methods for producing complex films, and films produced thereby

    DOEpatents

    Duty, Chad E.; Bennett, Charlee J. C.; Moon, Ji -Won; Phelps, Tommy J.; Blue, Craig A.; Dai, Quanqin; Hu, Michael Z.; Ivanov, Ilia N.; Jellison, Jr., Gerald E.; Love, Lonnie J.; Ott, Ronald D.; Parish, Chad M.; Walker, Steven

    2015-11-24

    A method for producing a film, the method comprising melting a layer of precursor particles on a substrate until at least a portion of the melted particles are planarized and merged to produce the film. The invention is also directed to a method for producing a photovoltaic film, the method comprising depositing particles having a photovoltaic or other property onto a substrate, and affixing the particles to the substrate, wherein the particles may or may not be subsequently melted. Also described herein are films produced by these methods, methods for producing a patterned film on a substrate, and methods for producing a multilayer structure.

  12. Static Aeroelastic Analysis with an Inviscid Cartesian Method

    NASA Technical Reports Server (NTRS)

    Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.

    2014-01-01

    An embedded-boundary, Cartesian-mesh flow solver is coupled with a three degree-of-freedom structural model to perform static, aeroelastic analysis of complex aircraft geometries. The approach solves a nonlinear, aerostructural system of equations using a loosely-coupled strategy. An open-source, 3-D discrete-geometry engine is utilized to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The coupling interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. After verifying the structural model with comparisons to Euler beam theory, two applications of the analysis method are presented as validation. The first is a relatively stiff, transport wing model which was a subject of a recent workshop on aeroelasticity. The second is a very flexible model recently tested in a low speed wind tunnel. Both cases show that the aeroelastic analysis method produces results in excellent agreement with experimental data.

  13. Inhibitory effect of burdock leaves on elastase and tyrosinase activity.

    PubMed

    Horng, Chi-Ting; Wu, Hsing-Chen; Chiang, Ni-Na; Lee, Chiu-Fang; Huang, Yu-Syuan; Wang, Hui-Yun; Yang, Jai-Sing; Chen, Fu-An

    2017-10-01

    Burdock (Arctium lappa L.) leaves generate a considerable amount of waste following burdock root harvest in Taiwan. To increase the use of burdock leaves, the present study investigated the optimal methods for producing burdock leaf extract (BLE) with high antioxidant polyphenolic content, including drying methods and solvent extraction concentration. In addition, the elastase and tyrosinase inhibitory activity of BLE was examined. Burdock leaves were dried by four methods: shadow drying, oven drying, sun drying and freeze-drying. The extract solution was then subjected to total polyphenol content analysis, and the method that produced BLE with the highest amount of total antioxidant components was taken forward for further analysis. The 1,1-diphenyl-2-pycrylhydrazyl scavenging, antielastase and antityrosinase activities of the BLE were measured to enable the evaluation of the antioxidant and skin aging-associated enzyme inhibitory activities of BLE. The results indicated that the total polyphenolic content following extraction with ethanol (EtOH) was highest using the freeze-drying method, followed by the oven drying, shadow drying and sun drying methods. BLE yielded a higher polyphenol content and stronger antioxidant activity as the ratio of the aqueous content of the extraction solvent used increased. BLE possesses marked tyrosinase and elastase inhibitory activities, with its antielastase activity notably stronger compared with its antityrosinase activity. These results indicate that the concentration of the extraction solvent was associated with the antioxidant and skin aging-associated enzyme inhibitory activity of BLE. The reactive oxygen species scavenging theory of skin aging may explain the tyrosinase and elastase inhibitory activity of BLE. In conclusion, the optimal method for obtaining BLE with a high antioxidant polyphenolic content was freeze-drying followed by 30-50% EtOH extraction. In addition, the antielastase and antityrosinase activities of the BLE produced may aid in the development of skincare products with antiwrinkle and skin-evening properties.

  14. Inhibitory effect of burdock leaves on elastase and tyrosinase activity

    PubMed Central

    Horng, Chi-Ting; Wu, Hsing-Chen; Chiang, Ni-Na; Lee, Chiu-Fang; Huang, Yu-Syuan; Wang, Hui-Yun; Yang, Jai-Sing; Chen, Fu-An

    2017-01-01

    Burdock (Arctium lappa L.) leaves generate a considerable amount of waste following burdock root harvest in Taiwan. To increase the use of burdock leaves, the present study investigated the optimal methods for producing burdock leaf extract (BLE) with high antioxidant polyphenolic content, including drying methods and solvent extraction concentration. In addition, the elastase and tyrosinase inhibitory activity of BLE was examined. Burdock leaves were dried by four methods: shadow drying, oven drying, sun drying and freeze-drying. The extract solution was then subjected to total polyphenol content analysis, and the method that produced BLE with the highest amount of total antioxidant components was taken forward for further analysis. The 1,1-diphenyl-2-pycrylhydrazyl scavenging, antielastase and antityrosinase activities of the BLE were measured to enable the evaluation of the antioxidant and skin aging-associated enzyme inhibitory activities of BLE. The results indicated that the total polyphenolic content following extraction with ethanol (EtOH) was highest using the freeze-drying method, followed by the oven drying, shadow drying and sun drying methods. BLE yielded a higher polyphenol content and stronger antioxidant activity as the ratio of the aqueous content of the extraction solvent used increased. BLE possesses marked tyrosinase and elastase inhibitory activities, with its antielastase activity notably stronger compared with its antityrosinase activity. These results indicate that the concentration of the extraction solvent was associated with the antioxidant and skin aging-associated enzyme inhibitory activity of BLE. The reactive oxygen species scavenging theory of skin aging may explain the tyrosinase and elastase inhibitory activity of BLE. In conclusion, the optimal method for obtaining BLE with a high antioxidant polyphenolic content was freeze-drying followed by 30–50% EtOH extraction. In addition, the antielastase and antityrosinase activities of the BLE produced may aid in the development of skincare products with antiwrinkle and skin-evening properties. PMID:28912875

  15. Measuring signal-to-noise ratio in partially parallel imaging MRI

    PubMed Central

    Goerner, Frank L.; Clarke, Geoffrey D.

    2011-01-01

    Purpose: To assess five different methods of signal-to-noise ratio (SNR) measurement for partially parallel imaging (PPI) acquisitions. Methods: Measurements were performed on a spherical phantom and three volunteers using a multichannel head coil on a clinical 3T MRI system to produce echo planar, fast spin echo, gradient echo, and balanced steady state free precession image acquisitions. Two different PPI acquisitions, generalized autocalibrating partially parallel acquisition algorithm and modified sensitivity encoding with acceleration factors (R) of 2–4, were evaluated and compared to nonaccelerated acquisitions. Five standard SNR measurement techniques were investigated and Bland–Altman analysis was used to determine agreement between the various SNR methods. The estimated g-factor values, associated with each method of SNR calculation and PPI reconstruction method, were also subjected to assessments that considered the effects on SNR due to reconstruction method, phase encoding direction, and R-value. Results: Only two SNR measurement methods produced g-factors in agreement with theoretical expectations (g ≥ 1). Bland–Altman tests demonstrated that these two methods also gave the most similar results relative to the other three measurements. R-value was the only factor of the three we considered that showed significant influence on SNR changes. Conclusions: Non-signal methods used in SNR evaluation do not produce results consistent with expectations in the investigated PPI protocols. Two of the methods studied provided the most accurate and useful results. Of these two methods, it is recommended, when evaluating PPI protocols, that the image subtraction method be used for SNR calculations due to its relative accuracy and ease of implementation. PMID:21978049
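
    A minimal sketch of the recommended image-subtraction SNR method, in which two repeat acquisitions are averaged for signal and differenced for noise; the phantom values and noise level are illustrative.

```python
# Subtraction SNR: the difference image carries doubled noise variance,
# hence the sqrt(2) correction to the measured standard deviation.
import numpy as np

def snr_subtraction(img1, img2, roi):
    """SNR from two repeat images; roi is a boolean mask of a uniform region."""
    signal = 0.5 * (img1[roi] + img2[roi]).mean()
    noise = (img1[roi] - img2[roi]).std(ddof=1) / np.sqrt(2)
    return signal / noise

rng = np.random.default_rng(4)
truth = np.full((128, 128), 100.0)          # uniform phantom, illustrative
img1 = truth + rng.normal(0, 5, truth.shape)
img2 = truth + rng.normal(0, 5, truth.shape)
roi = np.zeros_like(truth, dtype=bool); roi[32:96, 32:96] = True
print(snr_subtraction(img1, img2, roi))     # ~ 100 / 5 = 20
```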

  16. A new state space model for the NASA/JPL 70-meter antenna servo controls

    NASA Technical Reports Server (NTRS)

    Hill, R. E.

    1987-01-01

    A control axis referenced model of the NASA/JPL 70-m antenna structure is combined with the dynamic equations of servo components to produce a comprehensive state variable (matrix) model of the coupled system. An interactive Fortran program for generating the linear system model and computing its salient parameters is described. Results are produced in state variable, block diagram, and factored transfer function forms to facilitate design and analysis by classical as well as modern control methods.

  17. Automated classification of brain tumor type in whole-slide digital pathology images using local representative tiles.

    PubMed

    Barker, Jocelyn; Hoogi, Assaf; Depeursinge, Adrien; Rubin, Daniel L

    2016-05-01

    Computerized analysis of digital pathology images offers the potential of improving clinical care (e.g. automated diagnosis) and catalyzing research (e.g. discovering disease subtypes). There are two key challenges thwarting computerized analysis of digital pathology images: first, whole slide pathology images are massive, making computerized analysis inefficient, and second, diverse tissue regions in whole slide images that are not directly relevant to the disease may mislead computerized diagnosis algorithms. We propose a method to overcome both of these challenges that utilizes a coarse-to-fine analysis of the localized characteristics in pathology images. An initial surveying stage analyzes the diversity of coarse regions in the whole slide image. This includes extraction of spatially localized features of shape, color and texture from tiled regions covering the slide. Dimensionality reduction of the features assesses the image diversity in the tiled regions and clustering creates representative groups. A second stage provides a detailed analysis of a single representative tile from each group. An Elastic Net classifier produces a diagnostic decision value for each representative tile. A weighted voting scheme aggregates the decision values from these tiles to obtain a diagnosis at the whole slide level. We evaluated our method by automatically classifying 302 brain cancer cases into two possible diagnoses (glioblastoma multiforme (N = 182) versus lower grade glioma (N = 120)) with an accuracy of 93.1% (p < 0.001). We also evaluated our method in the dataset provided for the 2014 MICCAI Pathology Classification Challenge, in which our method, trained and tested using 5-fold cross validation, produced a classification accuracy of 100% (p < 0.001). Our method showed high stability and robustness to parameter variation, with accuracy varying between 95.5% and 100% when evaluated for a wide range of parameters. Our approach may be useful to automatically differentiate between the two cancer subtypes. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. A statistical analysis of the association between tropical cyclone intensity change and tornado frequency

    NASA Astrophysics Data System (ADS)

    Moore, Todd W.

    2016-07-01

    Tropical cyclones often produce tornadoes that have the potential to compound the injury and fatality counts and the economic losses associated with tropical cyclones. These tornadoes do not occur uniformly through time or across space. Multiple statistical methods were used in this study to analyze the association between tropical cyclone intensity change and tornado frequency. Results indicate that there is an association between the two and that tropical cyclones tend to produce more tornadoes when they are weakening, but the association is weak. Tropical cyclones can also produce a substantial number of tornadoes when they are relatively stable or strengthening.

  19. Phyto-reduction of graphene oxide using the aqueous extract of Eichhornia crassipes (Mart.) Solms

    NASA Astrophysics Data System (ADS)

    Firdhouse, M. Jannathul; Lalitha, P.

    2014-10-01

    The aqueous extract of Eichhornia crassipes was used as a reductant to produce graphene from graphene oxide by the refluxing method. The complete reduction of graphene oxide was monitored using a UV-Vis spectrophotometer. Characterization of the graphene was carried out through FTIR, XRD, and Raman spectroscopy analysis. The stability of the graphene was studied by thermal gravimetric analysis and zeta potential measurements. The nature and surface morphology of the synthesized graphene were analyzed by transmission electron microscopy. The production of graphene using a phytoextract as reductant highlights the facile method of synthesis and greener nanotechnology.

  20. Accuracy of remotely sensed data: Sampling and analysis procedures

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Oderwald, R. G.; Mead, R. A.

    1982-01-01

    A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, together with a listing of the computer program written to implement these techniques. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, along with the results of matrices from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, and a method is proposed for determining the reliability of change detection between two maps of the same area produced at different times.
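
    As one example of the discrete multivariate techniques reviewed above, a minimal sketch of Cohen's kappa computed from a classification error matrix; the matrix values are illustrative.

```python
# Kappa compares observed agreement (matrix trace) to the agreement expected
# by chance from the row and column marginals.
import numpy as np

error_matrix = np.array([[65,  4,  2],      # rows: map classes,
                         [ 6, 81,  5],      # columns: reference classes
                         [ 3,  7, 60]])
n = error_matrix.sum()
p_o = np.trace(error_matrix) / n                          # observed agreement
p_e = (error_matrix.sum(0) @ error_matrix.sum(1)) / n**2  # chance agreement
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 3))                                    # ~ 0.825
```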

  1. Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    NASA Technical Reports Server (NTRS)

    Gaebler, John A.; Tolson, Robert H.

    2010-01-01

    In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.

  2. Comparison of GEOS-5 AGCM Planetary Boundary Layer Depths Computed with Various Definitions

    NASA Technical Reports Server (NTRS)

    Mcgrath-Spangler, E. L.; Molod, A.

    2014-01-01

    Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Köppen climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes, the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.
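
    A minimal sketch of a bulk-Richardson-number PBL height estimate of the kind recommended above; the profiles and the 0.25 critical value are illustrative stand-ins, not the GEOS-5 implementation.

```python
# PBL top taken as the first level where the bulk Richardson number,
# computed relative to the surface, exceeds a critical value.
import numpy as np

def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25, g=9.81):
    """z: heights AGL (m); theta_v: virtual potential temperature (K);
    u, v: wind components (m/s). Surface values are taken at index 0."""
    du2 = (u - u[0]) ** 2 + (v - v[0]) ** 2
    ri_b = g * z * (theta_v - theta_v[0]) / (theta_v[0] * np.maximum(du2, 1e-6))
    above = np.where(ri_b > ri_crit)[0]
    return z[above[0]] if above.size else z[-1]

z = np.arange(10, 3000, 50.0)
theta_v = 300 + 0.002 * np.maximum(z - 800, 0)   # mixed layer to ~800 m
u = 2 + 0.003 * z
v = np.zeros_like(z)
print(pbl_height_bulk_ri(z, theta_v, u, v))      # a bit above 800 m
```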

  3. Comparison of GEOS-5 AGCM planetary boundary layer depths computed with various definitions

    NASA Astrophysics Data System (ADS)

    McGrath-Spangler, E. L.; Molod, A.

    2014-03-01

    Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Köppen climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes, the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.

  4. Development of parallel algorithms for electrical power management in space applications

    NASA Technical Reports Server (NTRS)

    Berry, Frederick C.

    1989-01-01

    The application of parallel techniques for electrical power system analysis is discussed. The Newton-Raphson method of load flow analysis was used along with the decomposition-coordination technique to perform load flow analysis. The decomposition-coordination technique enables tasks to be performed in parallel by partitioning the electrical power system into independent local problems. Each independent local problem represents a portion of the total electrical power system on which a load flow analysis can be performed. The load flow analysis is performed on these partitioned elements by using the Newton-Raphson load flow method. These independent local problems will produce results for voltage and power which can then be passed to the coordinator portion of the solution procedure. The coordinator problem uses the results of the local problems to determine if any correction is needed on the local problems. The coordinator problem is also solved by an iterative method much like the local problem. The iterative method for the coordination problem will also be the Newton-Raphson method. Therefore, each iteration at the coordination level will result in new values for the local problems. The local problems will have to be solved again along with the coordinator problem until some convergence conditions are met.
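
    A minimal sketch of a Newton-Raphson load-flow solve for a single two-bus "local problem" of the kind described above; the line impedance and the load are illustrative, and the Jacobian is formed numerically rather than analytically.

```python
# Solve for the voltage angle and magnitude at a PQ bus fed from a slack bus.
import numpy as np

y = 1.0 / (0.02 + 0.10j)                    # series admittance of the line
Ybus = np.array([[y, -y], [-y, y]])
P2, Q2 = -0.8, -0.4                         # per-unit load at bus 2

def mismatch(x):
    theta, vmag = x
    V = np.array([1.0 + 0j, vmag * np.exp(1j * theta)])  # bus 1 is slack
    S = V * np.conj(Ybus @ V)               # complex power injections
    return np.array([S[1].real - P2, S[1].imag - Q2])

x = np.array([0.0, 1.0])                    # flat start
for _ in range(10):
    F = mismatch(x)
    if np.max(np.abs(F)) < 1e-10:
        break
    # Finite-difference Jacobian of the mismatch equations.
    J = np.empty((2, 2))
    for k in range(2):
        dx = np.zeros(2); dx[k] = 1e-7
        J[:, k] = (mismatch(x + dx) - F) / 1e-7
    x -= np.linalg.solve(J, F)              # Newton-Raphson update

print(x)   # converged voltage angle (rad) and magnitude at bus 2
```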

  5. Verification and validation of a Work Domain Analysis with turing machine task analysis.

    PubMed

    Rechard, J; Bignon, A; Berruet, P; Morineau, T

    2015-03-01

    While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis", to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, which is an 'intentional' domain, and the other of a ship water system, which is a 'causal' domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. An extended GS method for dense linear systems

    NASA Astrophysics Data System (ADS)

    Niki, Hiroshi; Kohno, Toshiyuki; Abe, Kuniyoshi

    2009-09-01

    Davey and Rosindale [K. Davey, I. Rosindale, An iterative solution scheme for systems of boundary element equations, Internat. J. Numer. Methods Engrg. 37 (1994) 1399-1411] derived the GSOR method, which uses an upper triangular matrix Ω in order to solve dense linear systems. By applying functional analysis, the authors presented an expression for the optimum Ω. Moreover, Davey and Bounds [K. Davey, S. Bounds, A generalized SOR method for dense linear systems of boundary element equations, SIAM J. Comput. 19 (1998) 953-967] also introduced further interesting results. In this note, we employ a matrix analysis approach to investigate these schemes, and derive theorems that compare these schemes with existing preconditioners for dense linear systems. We show that the convergence rate of the Gauss-Seidel method with preconditioner P_G is superior to that of the GSOR method. Moreover, we define some splittings associated with the iterative schemes. Some numerical examples are reported to confirm the theoretical analysis. We show that the EGS method with preconditioner produces an extremely small spectral radius in comparison with the other schemes considered.
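
    A minimal sketch of the kind of comparison made above: the spectral radii of the Jacobi and Gauss-Seidel iteration matrices for a dense, diagonally dominant test matrix. The matrix is illustrative, and these are the classical splittings, not the paper's preconditioned schemes.

```python
# The scheme whose iteration matrix has the smaller spectral radius
# converges faster; here Gauss-Seidel beats Jacobi.
import numpy as np

def spectral_radius(T):
    return np.max(np.abs(np.linalg.eigvals(T)))

def jacobi_matrix(A):
    D = np.diag(np.diag(A))
    return np.linalg.solve(D, D - A)        # I - D^-1 A

def gauss_seidel_matrix(A):
    # (D - L)^-1 U, with D - L the lower triangle of A including the diagonal.
    return np.linalg.solve(np.tril(A), -np.triu(A, k=1))

n = 50
A = 2 * np.eye(n) + 0.02 * np.ones((n, n))  # dense, diagonally dominant
print(spectral_radius(jacobi_matrix(A)))        # ~ 0.49
print(spectral_radius(gauss_seidel_matrix(A)))  # smaller: faster convergence
```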

  7. Electrical Engineering in Los Alamos Neutron Science Center Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, Michael James

    The field of electrical engineering plays a significant role in particle accelerator design and operations. Los Alamos National Laboratory's LANSCE facility utilizes the electrical energy concepts of power distribution, plasma generation, radio frequency energy, electrostatic acceleration, signals and diagnostics. The culmination of these fields produces a machine of incredible potential with uses such as isotope production, neutron spallation, neutron imaging and particle analysis. The key isotope produced in the LANSCE isotope production facility is Strontium-82, which is utilized for medical uses such as cancer treatment and positron emission tomography, also known as PET scans. Neutron spallation is one of the very few methods used to produce neutrons for scientific research; the other methods rely on the natural decay of transuranic elements produced in nuclear reactors. Accelerators produce neutrons by accelerating charged particles into neutron-dense elements such as tungsten, imparting kinetic energy to a neutral particle; this has the benefit of producing a large number of neutrons as well as minimizing the waste generated. Utilizing the accelerator, scientists can gain an understanding of how various particles behave and interact with matter to better understand the natural laws of physics and the universe around us.

  8. Diagnostics of Polymer Composite Materials and Analysis of Their Production Technology by Using the Method of Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Bashkov, O. V.; Protsenko, A. E.; Bryanskii, A. A.; Romashko, R. V.

    2017-09-01

    The strength properties of glass-fiber-reinforced plastics produced by vacuum and vacuum autoclave molding techniques are studied. Based on acoustic emission data, a method for diagnosing and predicting the bearing capacity of polymer composite materials using data from three-point bending tests is developed. The method is based on evaluating changes in the exponent of a power function relating the total acoustic emission to the test stress.
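
    A minimal sketch of the exponent estimate underlying the method described above: fitting a power function N = a·σ^m between cumulative acoustic-emission counts and test stress on log-log axes; the data are synthetic stand-ins.

```python
# A power law is linear in log-log coordinates, so the exponent m is the
# slope of an ordinary least-squares fit to the logged data.
import numpy as np

rng = np.random.default_rng(5)
stress = np.linspace(50, 400, 20)            # MPa, three-point bending
m_true = 3.2
counts = 1e-4 * stress ** m_true * np.exp(rng.normal(0, 0.05, 20))

m_est, log_a = np.polyfit(np.log(stress), np.log(counts), 1)
print(m_est)   # exponent m, the damage indicator tracked by the method
```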

  9. Design, ancillary testing, analysis and fabrication data for the advanced composite stabilizer for Boeing 737 aircraft. Volume 1: Technical summary

    NASA Technical Reports Server (NTRS)

    Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.

    1983-01-01

    The horizontal stabilizer of the 737 transport was redesigned. Five shipsets were fabricated using composite materials. Weight reduction greater than the 20% goal was achieved. Parts and assemblies were readily produced on production-type tooling. Quality assurance methods were demonstrated. Repair methods were developed and demonstrated. Strength and stiffness analytical methods were substantiated by comparison with test results. Cost data was accumulated in a semiproduction environment. FAA certification was obtained.

  10. Authentication of animal origin of heparin and low molecular weight heparin including ovine, porcine and bovine species using 1D NMR spectroscopy and chemometric tools.

    PubMed

    Monakhova, Yulia B; Diehl, Bernd W K; Fareed, Jawed

    2018-02-05

    High resolution (600 MHz) nuclear magnetic resonance (NMR) spectroscopy is used to distinguish heparin and low-molecular-weight heparins (LMWHs) produced from porcine, bovine and ovine mucosal tissues, as well as their blends. For multivariate analysis, several statistical methods, such as principal component analysis (PCA), factor discriminant analysis (FDA), partial least squares-discriminant analysis (PLS-DA) and linear discriminant analysis (LDA), were utilized for the modeling of NMR data from more than 100 authentic samples. Heparin and LMWH samples from the independent test set (n=15) were 100% correctly classified according to their animal origin. Moreover, using 1H NMR coupled with chemometrics, several batches of bovine heparins from two producers were differentiated. Thus, NMR spectroscopy combined with chemometrics is an efficient tool for simultaneous identification of animal origin and of process-based manufacturing differences in heparin products. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Wavefield iterative deconvolution to remove multiples and produce phase specific Ps receiver functions

    NASA Astrophysics Data System (ADS)

    Ainiwaer, A.; Gurrola, H.

    2018-03-01

    Common conversion point stacking or migration of receiver functions (RFs) and H-k (H is depth and k is Vp/Vs) stacking of RFs has become a common method to study the crust and upper mantle beneath broad-band three-component seismic stations. However, it can be difficult to interpret Pds RFs due to interference between the Pds, PPds and PSds phases, especially in the mantle portion of the lithosphere. We propose a phase separation method to isolate the prominent phases of the RFs and produce separate Pds, PPds and PSds 'phase specific' receiver functions (referred to as PdsRFs, PPdsRFs and PSdsRFs, respectively) by deconvolution of the wavefield rather than single seismograms. One of the most important products of this deconvolution method is to produce Ps receiver functions (PdsRFs) that are free of crustal multiples. This is accomplished by using H-k analysis to identify specific phases in the wavefield from all seismograms recorded at a station, which enables development of an iterative deconvolution procedure to produce the above-mentioned phase specific RFs. We refer to this method as wavefield iterative deconvolution (WID). The WID method differentiates and isolates different RF phases by exploiting their differences in moveout curves across the entire wave front. We tested the WID by applying it to synthetic seismograms produced using a modified version of the PREM velocity model. The WID effectively separates phases from each stacked RF in synthetic data. We also applied this technique to produce RFs from seismograms recorded at ARU (a broad-band station in Arti, Russia). The phase specific RFs produced using WID are easier to interpret than traditional RFs. The PdsRFs computed using WID are the most improved, owing to the distinct shape of their moveout curves as compared to the moveout curves for the PPds and PSds phases. The WID method is most significant in reducing interference between phases at depths of less than 300 km. Phases from deeper layers (i.e. P660s as compared to PP220s) are less likely to be misinterpreted because the large amount of moveout causes the appropriate phases to stack coherently if there is sufficient distribution in ray parameter. WID is most effective in producing clean PdsRFs that are relatively free of reverberations, whereas PPdsRFs and PSdsRFs retain contamination from reverberations.

  12. Simulation of anisoplanatic imaging through optical turbulence using numerical wave propagation with new validation analysis

    NASA Astrophysics Data System (ADS)

    Hardie, Russell C.; Power, Jonathan D.; LeMaster, Daniel A.; Droege, Douglas R.; Gladysz, Szymon; Bose-Pillai, Santasri

    2017-07-01

    We present a numerical wave propagation method for simulating the imaging of an extended scene under anisoplanatic conditions. While isoplanatic simulation is relatively common, few tools are specifically designed for simulating the imaging of extended scenes under anisoplanatic conditions. We provide a complete description of the proposed simulation tool, including the wave propagation method used. Our approach computes an array of point spread functions (PSFs) for a two-dimensional grid on the object plane. The PSFs are then used in a spatially varying weighted-sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. The degradation includes spatially varying warping and blurring. To produce the PSF array, we generate a series of extended phase screens. Simulated point sources are numerically propagated from an array of positions on the object plane, through the phase screens, and ultimately to the focal plane of the simulated camera. Note that the optical path for each PSF is different and thus passes through a different portion of the extended phase screens. These different paths give rise to a spatially varying PSF and produce anisoplanatic effects. We use a method for defining the individual phase screen statistics that we have not seen used in previous anisoplanatic simulations. We also present a validation analysis. In particular, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and a derived differential tilt variance statistic, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. We believe this analysis represents the most thorough validation of an anisoplanatic simulation to date. The current work is also unique in that we simulate and validate both constant and varying Cn²(z) profiles. Furthermore, we simulate sequences with both temporally independent and temporally correlated turbulence effects. Temporal correlation is introduced by generating even larger extended phase screens and translating this block of screens in front of the propagation area. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. Thus, we think this tool can be used effectively to study anisoplanatic optical turbulence and to aid in the development of image restoration methods.
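
    A minimal sketch of the spatially varying weighted-sum operation described above: blur an ideal image with a small grid of local PSFs and blend the results with bilinear spatial weights. The Gaussian PSFs stand in for the numerically propagated ones; everything here is synthetic and illustrative only:

        import numpy as np
        from scipy.ndimage import convolve

        rng = np.random.default_rng(1)
        ideal = rng.random((128, 128))

        def gaussian_psf(sigma, size=9):
            ax = np.arange(size) - size // 2
            xx, yy = np.meshgrid(ax, ax)
            psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
            return psf / psf.sum()

        # a 2x2 grid of local PSFs (widths differ to mimic anisoplanatism)
        psfs = [[gaussian_psf(1.0), gaussian_psf(2.0)],
                [gaussian_psf(1.5), gaussian_psf(2.5)]]

        # bilinear blending weights; they sum to 1 at every pixel
        rows = np.linspace(0, 1, ideal.shape[0])[:, None]
        cols = np.linspace(0, 1, ideal.shape[1])[None, :]
        weights = [[(1 - rows) * (1 - cols), (1 - rows) * cols],
                   [rows * (1 - cols), rows * cols]]

        out = np.zeros_like(ideal)
        for i in range(2):
            for j in range(2):
                out += weights[i][j] * convolve(ideal, psfs[i][j], mode="reflect")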

  13. Plasmid Profile Analysis and bla VIM Gene Detection of Metalo β-lactamase (MBL) Producing Pseudomonas aeruginosa Isolates from Clinical Samples

    PubMed Central

    M, Jeya

    2014-01-01

    Introduction: Pseudomonas aeruginosa is a frequent colonizer of hospitalized patients. It is responsible for serious infections such as meningitis, urological infections, septicemia and pneumonia. Carbapenem resistance in Pseudomonas aeruginosa is increasingly reported and is often mediated by production of metallo-β-lactamase (MBL). Multidrug resistance in Pseudomonas aeruginosa may involve reduced cell wall permeability, production of chromosomal and plasmid-mediated β-lactamases, aminoglycoside-modifying enzymes and an active multidrug efflux mechanism. Objective: This study aimed to detect the presence and nature of plasmids among metallo-β-lactamase-producing Pseudomonas aeruginosa isolates, and to detect the presence of the blaVIM gene in these isolates. Materials and Methods: Clinical isolates of Pseudomonas aeruginosa producing the metallo-β-lactamase (MBL) enzyme were collected, and MBL production was confirmed by three different methods. Plasmid extraction from the MBL-producing isolates was performed by the alkaline lysis method, and plasmid-positive isolates were screened for the blaVIM gene by PCR. Results: Two thousand and seventy-six clinical samples yielded 316 (15.22%) Pseudomonas aeruginosa isolates, of which 141 (44.62%) were multidrug resistant. Among these, 25 (17.73%) were metallo-β-lactamase producers. Plasmids were extracted from 18 of the 25 isolates tested. Five of the 18 isolates were positive for the blaVIM gene by PCR amplification. Conclusion: The MBL producers were susceptible to polymyxin/colistin with MICs ranging from 0.5–2 μg/mL. Molecular detection of the blaVIM gene was positive among the carbapenem-resistant isolates. PMID:25120980

  14. Simulation of 7050 Wrought Aluminum Alloy Wheel Die Forging and its Defects Analysis based on DEFORM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang Shiquan; Yi Youping; Zhang Yuxun

    2010-06-15

    Defects such as folding, intercrystalline cracking and flow-line outcrops are very likely to occur in the forging of aluminum alloys. Moreover, it is difficult to achieve the optimal set of process parameters just by trial and error within an industrial environment. In producing a 7050 wrought aluminum alloy wheel, a rigid-plastic finite element method (FEM) analysis was performed to optimize the die forging process. Processing parameters were analyzed, focusing on the effects of punch speed, friction factor and temperature. Meanwhile, the mechanisms and evolution of the defects in the wrought wheel were studied in detail. From an analysis of the results, isothermal die forging was proposed for producing a 7050 aluminum alloy wheel with good mechanical properties. Finally, a verification experiment was carried out on a hydropress.

  15. Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants

    DTIC Science & Technology

    Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by...micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.

  16. Automated Assessment of Child Vocalization Development Using LENA

    ERIC Educational Resources Information Center

    Richards, Jeffrey A.; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-01-01

    Purpose: To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Method: Assessment was based on full-day audio…

  17. METHODS TO CLASSIFY ENVIRONMENTAL SAMPLES BASED ON MOLD ANALYSES BY QPCR

    EPA Science Inventory

    Quantitative PCR (QPCR) analysis of molds in indoor environmental samples produces highly accurate speciation and enumeration data. In a number of studies, eighty of the most common or potentially problematic indoor molds were identified and quantified in dust samples from homes...

  18. AN HPLC METHOD WITH UV DETECTION, PH CONTROL, AND REDUCTIVE ASCORBIC ACID FOR CYANURIC ACID ANALYSIS IN WATER

    EPA Science Inventory

    Every year over 250 million pounds of cyanuric acid (CA) and chloroisocyanurates are produced industrially. These compounds are standard ingredients in formulations for household bleaches, industrial cleansers, dishwasher compounds, general sanitizers, and chlorine stabilizers. ...

  19. AN HPLC METHOD WITH UV DETECTION, PH CONTROL, AND REDUCTIVE ASCORBIC ACID FOR CYANURIC ACID ANALYSIS IN WATER

    EPA Science Inventory

    Every year over 250 million pounds of cyanuric acid (CA) and chlorinated isocyanurates are produced industrially. These compounds are standard ingredients in formulations for household bleaches, industrial cleansers, dishwasher compounds, general sanitizers, and chlorine stabiliz...

  20. Classification of corn kernels contaminated with aflatoxins using fluorescence and reflectance hyperspectral image analysis

    USDA-ARS?s Scientific Manuscript database

    Aflatoxins are secondary metabolites produced by certain fungal species of the Aspergillus genus. Aflatoxin contamination remains a problem in agricultural products due to its toxic and carcinogenic properties. Conventional chemical methods for aflatoxin detection are time-consuming and destructive....

  1. Comparison of prosthetic models produced by traditional and additive manufacturing methods.

    PubMed

    Park, Jin-Young; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Jae-Hong; Kim, Woong-Chul

    2015-08-01

    The purpose of this study was to verify the clinical feasibility of additive manufacturing by comparing the accuracy of four different manufacturing methods for metal copings: the conventional lost wax technique (CLWT); a subtractive method, wax blank milling (WBM); and two additive methods, multi jet modeling (MJM) and micro-stereolithography (Micro-SLA). Thirty study models were created using an acrylic model with the maxillary upper right canine, first premolar, and first molar teeth. Based on the scan files from a non-contact blue light scanner (Identica; Medit Co. Ltd., Seoul, Korea), thirty copings each were produced using the WBM, MJM, and Micro-SLA methods, and another thirty frameworks were produced using the CLWT method. To measure the marginal and internal gap, the silicone replica method was adopted, and the silicone images obtained were evaluated using a digital microscope (KH-7700; Hirox, Tokyo, Japan) at 140X magnification. Analyses were performed using two-way analysis of variance (ANOVA) and the Tukey post hoc test (α=.05). The mean marginal and internal gaps showed significant differences according to tooth type (P<.001 and P<.001, respectively) and manufacturing method (P<.037 and P<.001, respectively). Unlike the WBM and MJM methods, Micro-SLA showed no significant difference from CLWT in mean marginal gap. The mean gaps resulting from the four different manufacturing methods were within a clinically allowable range, and, thus, the clinical use of additive manufacturing methods is acceptable as an alternative to the traditional lost wax technique and subtractive manufacturing.
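
    A minimal sketch of the two-way ANOVA described above, run on synthetic gap measurements. The column names and group sizes are hypothetical, not the study's:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        rng = np.random.default_rng(2)
        df = pd.DataFrame({
            "gap": rng.normal(80, 10, size=120),   # gap width in micrometers (synthetic)
            "tooth": np.tile(np.repeat(["canine", "premolar", "molar"], 10), 4),
            "method": np.repeat(["CLWT", "WBM", "MJM", "MicroSLA"], 30),
        })

        # two-way ANOVA with interaction, followed by the Type II table
        model = ols("gap ~ C(tooth) + C(method) + C(tooth):C(method)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))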

  2. Ecology and characterization of polyhydroxyalkanoate-producing microorganisms on and in plants.

    PubMed

    Gasser, Ilona; Müller, Henry; Berg, Gabriele

    2009-10-01

    Polyhydroxyalkanoates are energy reserve polymers produced by bacteria to survive periods of starvation in natural habitats. Little is known about the ecology of polyhydroxyalkanoate-producing bacteria. To analyse the occurrence of this specific group on/in seven different plant species, a combined strategy of culture-dependent and -independent methods was applied. Using microbial fingerprint techniques (single-strand conformation polymorphism analysis with specific primers for the phaC gene, which encodes the key enzyme of polyhydroxyalkanoate synthesis), a high number of bands was found, especially for the rhizosphere. Furthermore, cluster analysis revealed plant species-specific communities. Isolation of bacteria, recognition of brightly refractile cytoplasmic inclusions, lipophilic staining and a PCR strategy targeting the phaC gene were used as the culture-dependent strategy for the detection of polyhydroxyalkanoate-producing bacteria. The results again showed a high degree of plant specificity: the rhizosphere of sugar beet contained the highest number of positive strains. This was confirmed by quantitative PCR: the relative copy number of phaC was significantly enhanced in all rhizospheres in comparison with bulk soil. New polyhydroxyalkanoate-producing bacterial species were detected, for example Burkholderia terricola, Lysobacter gummosus, Pseudomonas extremaustralis, Pseudomonas brassicacearum and Pseudomonas orientalis. Our results confirm the hypothesis that the rhizosphere is an interesting hidden reservoir for polyhydroxyalkanoate producers.

  3. Skeletal Mechanism Generation of Surrogate Jet Fuels for Aeropropulsion Modeling

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Jen; Niemeyer, Kyle E.

    2010-05-01

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented with skeletal reductions of two important hydrocarbon components, n-heptane and n-decane, relevant to surrogate jet fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP), by first applying DRGEP to efficiently remove many unimportant species prior to sensitivity analysis to further remove unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each previous method, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal.
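
    A minimal sketch of the error-propagation step at the heart of DRGEP, as this reviewer understands it: the overall interaction coefficient between a target and any other species is the maximum, over all paths in the species graph, of the product of direct interaction coefficients along the path. The direct coefficients below are toy values, and this is an illustrative reconstruction rather than the DRGEPSA implementation:

        # direct interaction coefficients r[A][B] between species (toy values)
        direct = {
            "fuel": {"A": 0.9, "B": 0.3},
            "A": {"C": 0.8},
            "B": {"C": 0.2},
            "C": {},
        }

        def overall_coefficients(direct, target):
            # breadth-first propagation maximizing the product of edge coefficients
            best = {s: 0.0 for s in direct}
            best[target] = 1.0
            frontier = {target}
            while frontier:
                nxt = set()
                for s in frontier:
                    for t, r in direct[s].items():
                        cand = best[s] * r
                        if cand > best[t]:
                            best[t] = cand
                            nxt.add(t)
                frontier = nxt
            return best

        # species whose coefficient falls below a threshold become removal candidates
        print(overall_coefficients(direct, target="fuel"))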

  4. Production of beta-gamma coincidence spectra of individual radioxenon isotopes for improved analysis of nuclear explosion monitoring data

    NASA Astrophysics Data System (ADS)

    Haas, Derek Anderson

    Radioactive xenon gas is a fission product released in the detonation of nuclear devices that can be detected in atmospheric samples far from the detonation site. In order to improve the capabilities of radioxenon detection systems, this work produces beta-gamma coincidence spectra of individual isotopes of radioxenon. Previous methods of radioxenon production consisted of the removal of mixed isotope samples of radioxenon gas released from fission of contained fissile materials such as 235U. In order to produce individual samples of the gas, isotopically enriched stable xenon gas is irradiated with neutrons. The detection of the individual isotopes is also modeled using Monte Carlo simulations to produce spectra. The experiment shows that samples of 131mXe, 133Xe, and 135Xe with a purity greater than 99% can be produced, and that a sample of 133mXe can be produced with a relatively low amount of 133Xe background. These spectra are compared to models and used as essential library data for the Spectral Deconvolution Analysis Tool (SDAT) to analyze atmospheric samples of radioxenon for evidence of nuclear events.

  5. Controlling protected designation of origin of wine by Raman spectroscopy.

    PubMed

    Mandrile, Luisa; Zeppa, Giuseppe; Giovannozzi, Andrea Mario; Rossi, Andrea Mario

    2016-11-15

    In this paper, a Fourier transform Raman spectroscopy method to authenticate the provenance of wine for food traceability applications was developed. In particular, owing to the specific chemical fingerprint of the Raman spectrum, it was possible to discriminate different wines produced in the Piedmont area (North West Italy) according to i) grape variety, ii) production area and iii) ageing time. In order to create a consistent training set, more than 300 samples from tens of different producers were analyzed, and a chemometric treatment of the raw spectra was applied. A discriminant analysis method was employed in the classification procedures, providing a classification capability (percentage of correct answers) of 90% in validation for grape variety and geographical provenance, and of 84% for ageing time. The present methodology was applied successfully to raw materials without any preliminary treatment of the sample, providing a response in a very short time. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Computer-based route-definition system for peripheral bronchoscopy.

    PubMed

    Graham, Michael W; Gibbs, Jason D; Higgins, William E

    2012-04-01

    Multi-detector computed tomography (MDCT) scanners produce high-resolution images of the chest. Given a patient's MDCT scan, a physician can use an image-guided intervention system to first plan and later perform bronchoscopy to diagnostic sites situated deep in the lung periphery. An accurate definition of complete routes through the airway tree leading to the diagnostic sites, however, is vital for avoiding navigation errors during image-guided bronchoscopy. We present a system for the robust definition of complete airway routes suitable for image-guided bronchoscopy. The system incorporates both automatic and semiautomatic MDCT analysis methods for this purpose. Using an intuitive graphical user interface, the user invokes automatic analysis on a patient's MDCT scan to produce a series of preliminary routes. Next, the user visually inspects each route and quickly corrects the observed route defects using the built-in semiautomatic methods. Application of the system to a human study for the planning and guidance of peripheral bronchoscopy demonstrates the efficacy of the system.

  7. Authenticity analysis of pear juice employing chromatographic fingerprinting.

    PubMed

    Willems, Jamie L; Low, Nicholas H

    2014-12-03

    Pear juice is predominantly composed of carbohydrates/polyols (>95% of the total soluble solids), making it susceptible to adulteration by the addition of less expensive commercial sweeteners. In this research, the major carbohydrate and polyol (fructose, glucose, sucrose, and sorbitol) content of 32 pure pear juices representing five world producing regions and three years of production was determined. Additionally, methods employing oligosaccharide profiling to detect the debasing of these samples with four commercial sweeteners (HFCS 55 and 90, TIS, and HIS) were developed using capillary gas chromatography with flame ionization detection (CGC-FID) and high-performance anion-exchange chromatography with pulsed amperometric detection (HPAE-PAD). Detection limits for the four commercial sweeteners ranged from 0.5 to 5.0% (v/v). In addition, the developed CGC-FID method could be used to (a) detect the addition of pear to apple juice via arbutin detection and (b) determine whether a pear juice was produced using enzymatic liquefaction via the presence of O-β-d-glucopyranosyl-(1→4)-d-glucopyranose (cellobiose), all within a single chromatographic analysis.

  8. An exploratory survey of methods used to develop measures of performance

    NASA Astrophysics Data System (ADS)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was the most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that is based on the OMB Generic Method and should be more likely to produce high-quality metrics that result in continuous process improvement.

  9. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.

  10. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, Steve L.; Chen, Chung-Hsuan; Chen, Fang C.

    1993-01-01

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.

  11. Controlling microbial contamination during hydrolysis of AFEX-pretreated corn stover and switchgrass: Effects on hydrolysate composition, microbial response and fermentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serate, Jose; Xie, Dan; Pohlmann, Edward

    Microbial conversion of lignocellulosic feedstocks into biofuels remains an attractive means to produce sustainable energy. It is essential to produce lignocellulosic hydrolysates in a consistent manner in order to study microbial performance in different feedstock hydrolysates. Because of the potential to introduce microbial contamination from the untreated biomass or at various points during the process, it can be difficult to control sterility during hydrolysate production. In this study, we compared hydrolysates produced from AFEX-pretreated corn stover and switchgrass using two different methods to control contamination: either by autoclaving the pretreated feedstocks prior to enzymatic hydrolysis, or by introducing antibiotics during the hydrolysis of non-autoclaved feedstocks. We then performed extensive chemical analysis, chemical genomics, and comparative fermentations to evaluate any differences between these two different methods used for producing corn stover and switchgrass hydrolysates. Autoclaving the pretreated feedstocks could eliminate the contamination for a variety of feedstocks, whereas the antibiotic gentamicin was unable to control contamination consistently during hydrolysis. Compared to the addition of gentamicin, autoclaving of biomass before hydrolysis had a minimal effect on mineral concentrations, and showed no significant effect on the two major sugars (glucose and xylose) found in these hydrolysates. However, autoclaving elevated the concentration of some furanic and phenolic compounds. Chemical genomics analyses using Saccharomyces cerevisiae strains indicated a high correlation between the AFEX-pretreated hydrolysates produced using these two methods within the same feedstock, indicating minimal differences between the autoclaving and antibiotic methods. Comparative fermentations with S. cerevisiae and Zymomonas mobilis also showed that autoclaving the AFEX-pretreated feedstocks had no significant effects on microbial performance in these hydrolysates. In conclusion, our results showed that autoclaving the pretreated feedstocks offered advantages over the addition of antibiotics for hydrolysate production. The autoclaving method produced a more consistent quality of hydrolysate.

  12. Controlling microbial contamination during hydrolysis of AFEX-pretreated corn stover and switchgrass: Effects on hydrolysate composition, microbial response and fermentation

    DOE PAGES

    Serate, Jose; Xie, Dan; Pohlmann, Edward; ...

    2015-11-14

    Microbial conversion of lignocellulosic feedstocks into biofuels remains an attractive means to produce sustainable energy. It is essential to produce lignocellulosic hydrolysates in a consistent manner in order to study microbial performance in different feedstock hydrolysates. Because of the potential to introduce microbial contamination from the untreated biomass or at various points during the process, it can be difficult to control sterility during hydrolysate production. In this study, we compared hydrolysates produced from AFEX-pretreated corn stover and switchgrass using two different methods to control contamination: either by autoclaving the pretreated feedstocks prior to enzymatic hydrolysis, or by introducing antibiotics during the hydrolysis of non-autoclaved feedstocks. We then performed extensive chemical analysis, chemical genomics, and comparative fermentations to evaluate any differences between these two different methods used for producing corn stover and switchgrass hydrolysates. Autoclaving the pretreated feedstocks could eliminate the contamination for a variety of feedstocks, whereas the antibiotic gentamicin was unable to control contamination consistently during hydrolysis. Compared to the addition of gentamicin, autoclaving of biomass before hydrolysis had a minimal effect on mineral concentrations, and showed no significant effect on the two major sugars (glucose and xylose) found in these hydrolysates. However, autoclaving elevated the concentration of some furanic and phenolic compounds. Chemical genomics analyses using Saccharomyces cerevisiae strains indicated a high correlation between the AFEX-pretreated hydrolysates produced using these two methods within the same feedstock, indicating minimal differences between the autoclaving and antibiotic methods. Comparative fermentations with S. cerevisiae and Zymomonas mobilis also showed that autoclaving the AFEX-pretreated feedstocks had no significant effects on microbial performance in these hydrolysates. In conclusion, our results showed that autoclaving the pretreated feedstocks offered advantages over the addition of antibiotics for hydrolysate production. The autoclaving method produced a more consistent quality of hydrolysate.

  13. Matrix effects in pesticide multi-residue analysis by liquid chromatography-mass spectrometry.

    PubMed

    Kruve, Anneli; Künnapas, Allan; Herodes, Koit; Leito, Ivo

    2008-04-11

    Three sample preparation methods, the Luke method (AOAC 985.22), QuEChERS (quick, easy, cheap, effective, rugged and safe) and matrix solid-phase dispersion (MSPD), were applied to different fruits and vegetables for the analysis of 14 pesticide residues by high-performance liquid chromatography with electrospray ionization-mass spectrometry (HPLC/ESI/MS). The matrix effect, recovery and process efficiency of the sample preparation methods applied to different fruits and vegetables were compared. The Luke method was found to produce the least matrix effect. On average, the best recoveries were obtained with the QuEChERS method. MSPD gave unsatisfactory recoveries for some basic pesticide residues. Comparison of matrix effects for different apple varieties showed high variability for some residues. It was demonstrated that the amount of co-extracting compounds that cause ionization suppression of aldicarb depends on the apple variety as well as on the sample preparation method employed.
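
    A minimal sketch of the usual peak-area scheme for quantifying these three figures of merit, where A is the analyte peak area in neat solvent, B is the area in blank extract spiked after extraction, and C is the area in sample spiked before extraction. The areas below are hypothetical:

        def figures_of_merit(area_neat, area_post_spike, area_pre_spike):
            me = 100.0 * area_post_spike / area_neat       # matrix effect (<100 = suppression)
            re = 100.0 * area_pre_spike / area_post_spike  # extraction recovery
            pe = 100.0 * area_pre_spike / area_neat        # overall process efficiency
            return me, re, pe

        print(figures_of_merit(area_neat=1.00e6, area_post_spike=7.2e5, area_pre_spike=6.5e5))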

  14. Extracting insights from the shape of complex data using topology

    PubMed Central

    Lum, P. Y.; Singh, G.; Lehman, A.; Ishkanov, T.; Vejdemo-Johansson, M.; Alagappan, M.; Carlsson, J.; Carlsson, G.

    2013-01-01

    This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods. PMID:23393618

  15. Extracting insights from the shape of complex data using topology.

    PubMed

    Lum, P Y; Singh, G; Lehman, A; Ishkanov, T; Vejdemo-Johansson, M; Alagappan, M; Carlsson, J; Carlsson, G

    2013-01-01

    This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods.

  16. Structural determination of intact proteins using mass spectrometry

    DOEpatents

    Kruppa, Gary [San Francisco, CA; Schoeniger, Joseph S [Oakland, CA; Young, Malin M [Livermore, CA

    2008-05-06

    The present invention relates to novel methods of determining the sequence and structure of proteins. Specifically, the present invention allows for the analysis of intact proteins within a mass spectrometer. Therefore, preparatory separations need not be performed prior to introducing a protein sample into the mass spectrometer. Also disclosed herein are new instrumental developments for enhancing the signal from the desired modified proteins, methods for producing controlled protein fragments in the mass spectrometer, and means of eliminating the complex microseparations and protein preparatory chemical steps necessary for cross-linking-based protein structure determination. Additionally, the preferred method of the present invention involves the determination of protein structures utilizing a top-down analysis of protein structures to search for covalent modifications. In the preferred method, intact proteins are ionized and fragmented within the mass spectrometer.

  17. A method of hidden Markov model optimization for use with geophysical data sets

    NASA Technical Reports Server (NTRS)

    Granat, R. A.

    2003-01-01

    Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
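
    A minimal sketch of HMM-based segmentation of a geophysical time series, using the third-party hmmlearn package rather than the authors' own optimization method. The data are synthetic and the choice of three hidden states is an assumption:

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(3)
        # synthetic signal that switches between regimes of different mean levels
        states = np.repeat([0, 1, 2, 1], 250)
        signal = rng.normal(loc=states * 2.0, scale=0.5).reshape(-1, 1)

        model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100)
        model.fit(signal)
        hidden = model.predict(signal)         # most likely state sequence (Viterbi)
        print("recovered state means:", model.means_.ravel())
        print("state occupancy:", np.bincount(hidden))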

  18. Correlation analysis of 1 to 30 MeV celestial gamma rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, J.L.

    1984-01-01

    This paper outlines the development of a method of producing celestial sky maps from the data generated by the University of California, Riverside's double Compton scatter gamma ray telescope. The method makes use of a correlation between the telescope's data and theoretical calculated response functions. The results of applying this technique to northern hemisphere data obtained from a 1978 balloon flight from Palestine, Texas are included.

  19. Treatment of batik waste using distillation method

    NASA Astrophysics Data System (ADS)

    Riyanto, Sidiq, Nurma Yunita; Hidayah, Nailil

    2017-12-01

    In this study, batik waste was treated using a distillation method. Batik is a world heritage craft that has contributed to economic improvement but also to environmental damage, as batik waste is a hazardous and toxic material. The batik waste in this research was taken from a batik industry in Yogyakarta, Indonesia. Five litres of batik waste were placed in the distillation apparatus, and the distillation was run for 4 hours. The solid and liquid distillation products were collected and analyzed: the solid produced in the distillation boiler was analyzed by FTIR, and the distilled liquid was analyzed for ammonia and COD concentration using a UV-Vis spectrophotometer. The FTIR spectra showed that a dye of high purity was obtained. The measured ammonia, COD and pH values were 0.652 mg/L, 238.31 mg/L, and 7.306, respectively. The compounds recovered from the boiler are azo dyes, based on the spectral band at wave number 1554.07 cm⁻¹. The conclusion of this research is that the distillation method is very suitable for treating batik waste in small batik industries. An advantage of the distillation technique is that two products, water and dye, are obtained that can be reused in the batik industry.

  20. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome.

    PubMed

    Lalonde, Michel; Wells, R Glenn; Birnie, David; Ruddy, Terrence D; Wassenaar, Richard

    2014-07-01

    Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential at predicting CRT outcome, as valuable information may be lost by assuming that time-activity curves (TACs) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Forty-nine patients (27 of ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricular wall to produce 568 TACs from the SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT response in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50), with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analysis.
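
    A minimal sketch of clustering time-activity curves with K-means, as a stand-in for the dyssynchrony scoring described above. The TACs are synthetic sinusoids with varying phase delays, and the cluster count is an assumption, not the study's optimized setting:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        t = np.linspace(0, 2 * np.pi, 32)                  # 32 frames per cardiac cycle
        # 568 TACs: sinusoids with varying phase delays plus noise
        phases = rng.uniform(0, np.pi / 2, size=568)
        tacs = np.cos(t[None, :] - phases[:, None]) + rng.normal(0, 0.1, (568, 32))

        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(tacs)
        # cluster sizes can then serve as global dyssynchrony measures
        print("cluster sizes:", np.bincount(km.labels_))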

  1. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole-genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high-resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates the clear benefits and necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
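
    A minimal sketch of comparing methylation-level distributions between a test and a reference sample with the Jensen-Shannon distance; the histograms below are synthetic stand-ins for the model-derived distributions used in the paper:

        import numpy as np
        from scipy.spatial.distance import jensenshannon

        rng = np.random.default_rng(5)
        ref = np.histogram(rng.beta(8, 2, 10000), bins=20, range=(0, 1))[0].astype(float)
        test = np.histogram(rng.beta(4, 4, 10000), bins=20, range=(0, 1))[0].astype(float)

        ref /= ref.sum()                       # normalize to probability distributions
        test /= test.sum()
        print("JS distance:", jensenshannon(ref, test, base=2))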

  2. Inter-hospital outbreak of Klebsiella pneumoniae producing KPC-2 carbapenemase in Ireland.

    PubMed

    Morris, Dearbháile; Boyle, Fiona; Morris, Carol; Condon, Iris; Delannoy-Vieillard, Anne-Sophie; Power, Lorraine; Khan, Aliya; Morris-Downes, Margaret; Finnegan, Cathriona; Powell, James; Monahan, Regina; Burns, Karen; O'Connell, Nuala; Boyle, Liz; O'Gorman, Alan; Humphreys, Hilary; Brisse, Sylvain; Turton, Jane; Woodford, Neil; Cormican, Martin

    2012-10-01

    To describe an outbreak of KPC-2-producing Klebsiella pneumoniae with inter-hospital spread and measures taken to control transmission. Between January and March 2011, 13 K. pneumoniae isolates were collected from nine patients at hospital A and two patients at hospital B. Meropenem, imipenem and ertapenem MICs were determined by Etest, carbapenemase production was confirmed by the modified Hodge method and by a disc synergy test, and confirmed carbapenemase producers were tested for the presence of carbapenemase-encoding genes by PCR. PFGE, plasmid analysis, multilocus variable-number tandem-repeat analysis (MLVA) and multilocus sequence typing (MLST) analysis were performed on all or a subset of isolates. Meropenem, imipenem and ertapenem MICs were 4 to >32, 8-32 and >16 mg/L, respectively. PCR and sequencing confirmed the presence of bla(KPC-2). PFGE identified four distinguishable (≥88%) pulsed-field profiles (PFPs). Isolates distinguishable by PFGE had identical MLVA profiles, and MLST analysis indicated all isolates belonged to the ST258 clone. Stringent infection prevention and control measures were implemented. Over a period of almost 8 months no further carbapenemase-producing Enterobacteriaceae (CPE) were isolated. However, KPC-2-producing K. pneumoniae was detected in two further patients in hospital A in August (PFP indistinguishable from previous isolates) and October 2011 (PFP similar to but distinguishable from previous isolates). Stringent infection prevention and control measures help contain CPE in the healthcare setting; however, in the case of hospital A, where CPE appears to be established in the population served, it may be virtually impossible to achieve eradication or avoid reintroduction into the hospital.

  3. Evaluation of the veracity of one work by the artist Di Cavalcanti through non-destructive techniques: XRF, imaging and brush stroke analysis

    NASA Astrophysics Data System (ADS)

    Kajiya, E. A. M.; Campos, P. H. O. V.; Rizzutto, M. A.; Appoloni, C. R.; Lopes, F.

    2014-02-01

    This paper presents the systematic studies and analyses that contributed to the identification of the forgery of a work attributed to the artist Emiliano Augusto Cavalcanti de Albuquerque e Melo, known as Di Cavalcanti. The use of several areas of expertise, such as brush stroke analysis ("pinacologia"), applied physics, and art history, resulted in an accurate diagnosis for ascertaining the authenticity of the work entitled "Violeiro" (1950). For this work we used non-destructive methods such as infrared, ultraviolet, visible and tangential light imaging, combined with chemical analysis of the pigments by portable X-ray fluorescence (XRF) and graphic gesture analysis. Each applied method of analysis produced specific information that made possible the identification of the materials and techniques employed, and we concluded that the work is not consistent with patterns characteristic of the artist Di Cavalcanti.

  4. The Impact of Normalization Methods on RNA-Seq Data Analysis

    PubMed Central

    Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.

    2015-01-01

    High-throughput sequencing technologies, such as the Illumina Hi-seq, are powerful new tools for investigating a wide range of biological and medical problems. Massive and complex data sets produced by the sequencers create a need for development of statistical and computational methods that can tackle the analysis and management of data. The data normalization is one of the most crucial steps of data processing and this process must be carefully considered as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors as well as generation of the diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
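
    A minimal sketch of one common sequencing-depth normalization, the DESeq-style median-of-ratios method, which belongs to the family of procedures compared above. The count matrix is synthetic (genes x samples):

        import numpy as np

        rng = np.random.default_rng(6)
        true_depth = np.array([1.0, 1.5, 0.7, 2.0])        # per-sample depth factors
        base = rng.uniform(5, 50, size=(1000, 1))          # per-gene expression levels
        counts = rng.poisson(base * true_depth[None, :])

        mask = np.all(counts > 0, axis=1)                  # genes observed in all samples
        log_counts = np.log(counts[mask])
        log_geo_mean = log_counts.mean(axis=1)             # per-gene log geometric mean
        # size factor = median ratio of each sample to the per-gene geometric mean
        size_factors = np.exp(np.median(log_counts - log_geo_mean[:, None], axis=0))
        normalized = counts / size_factors
        print("estimated size factors:", np.round(size_factors, 3))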

  5. Robust and transferable quantification of NMR spectral quality using IROC analysis

    NASA Astrophysics Data System (ADS)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.
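
    A minimal sketch of the core IROC idea: inject synthetic peaks at known positions into data, then score how well the spectral intensities separate true-peak locations from empty ones via the ROC AUC. Everything below is synthetic and is not the published implementation:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n_points = 2000
        truth = np.zeros(n_points, dtype=int)
        peak_locs = rng.choice(n_points, size=40, replace=False)
        truth[peak_locs] = 1                   # ground truth: injected peak positions

        # "spectrum": noise plus signal at the injected positions
        spectrum = rng.normal(0, 1, n_points)
        spectrum[peak_locs] += rng.uniform(2, 5, size=40)

        print("ROC AUC:", roc_auc_score(truth, spectrum))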

  6. Analysis of Venetian-type glass fragments from the ancient city of Lezha (Albania)

    NASA Astrophysics Data System (ADS)

    Šmit, Ž.; Stamati, F.; Civici, N.; Vevecka-Priftaj, A.; Kos, M.; Jezeršek, D.

    2009-08-01

    A series of glasses excavated in the Albanian city of Lezha (ancient Lissos) was analyzed by the combined PIXE-PIGE method in air and by source-excited XRF. The analysis revealed two types of glass, identifiable as façon de Venise glass and its subsequent younger phase, produced from chemically purer components and using As2O3 as a decolorant.

  7. Deriving pathway maps from automated text analysis using a grammar-based approach.

    PubMed

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.

  8. Probability Density Functions of Observed Rainfall in Montana

    NASA Technical Reports Server (NTRS)

    Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.

    1995-01-01

    The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis on large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis easily allow development of radar reflectivity factor (and, by extension, rain rate) distributions. Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, a single PDF would exist for all cases, or many PDFs would share the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases will validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89% of the observed distributions. Further analysis indicates that the Type 1 curve approximates the shape of the distributions but quantitatively does not produce a great fit.
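
    A minimal sketch of the method-of-moments step behind Pearson curve selection: compute the squared skewness (beta1) and kurtosis (beta2) of a sample and evaluate Pearson's criterion kappa, which is negative for Type I curves. The sample is synthetic, rain-rate-like data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        sample = rng.gamma(shape=0.8, scale=5.0, size=5000)   # skewed, rain-rate-like

        b1 = stats.skew(sample) ** 2                          # beta1
        b2 = stats.kurtosis(sample, fisher=False)             # beta2 (non-excess kurtosis)

        kappa = b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))
        print(f"beta1={b1:.3f}, beta2={b2:.3f}, kappa={kappa:.3f} (Type I if kappa < 0)")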

  9. Determination of mycotoxins in foods: current state of analytical methods and limitations.

    PubMed

    Köppen, Robert; Koch, Matthias; Siegel, David; Merkel, Stefan; Maul, Ronald; Nehls, Irene

    2010-05-01

    Mycotoxins are natural contaminants produced by a range of fungal species. Their common occurrence in food and feed poses a threat to the health of humans and animals. This threat is caused either by the direct contamination of agricultural commodities or by a "carry-over" of mycotoxins and their metabolites into animal tissues, milk, and eggs after feeding of contaminated hay or corn. As a consequence of their diverse chemical structures and varying physical properties, mycotoxins exhibit a wide range of biological effects. Individual mycotoxins can be genotoxic, mutagenic, carcinogenic, teratogenic, and oestrogenic. To protect consumer health and to reduce economic losses, surveillance and control of mycotoxins in food and feed has become a major objective for producers, regulatory authorities and researchers worldwide. However, the variety of chemical structures makes it impossible to use one single technique for mycotoxin analysis. Hence, a vast number of analytical methods has been developed and validated. The heterogeneity of food matrices combined with the demand for a fast, simultaneous and accurate determination of multiple mycotoxins creates enormous challenges for routine analysis. The most crucial issues will be discussed in this review. These are (1) the collection of representative samples, (2) the performance of classical and emerging analytical methods based on chromatographic or immunochemical techniques, (3) the validation of official methods for enforcement, and (4) the limitations and future prospects of the current methods.

  10. Numerical and experimental analyses of the radiant heat flux produced by quartz heating systems

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Ash, Robert L.

    1994-01-01

    A method is developed for predicting the radiant heat flux distribution produced by tungsten filament, tubular fused-quartz envelope heating systems with reflectors. The method is an application of Monte Carlo simulation, which takes the form of a random walk or ray tracing scheme. The method is applied to four systems of increasing complexity: a single lamp without a reflector, a single lamp with a flat reflector, a single lamp with a parabolic reflector, and up to six lamps in a six-lamp contoured-reflector heating unit. The application of the Monte Carlo method to the simulation of the thermal radiation generated by these systems is discussed, and the procedures for numerical implementation are presented. Experiments were conducted to study these quartz heating systems and to acquire measurements of the corresponding empirical heat flux distributions for correlation with analysis. The experiments were conducted such that several complicating factors could be isolated and studied sequentially. Comparisons of the experimental results with analysis are presented and discussed. Good agreement between the experimental and simulated results was obtained in all cases. This study shows that the method can be used to analyze very complicated quartz heating systems and can account for factors such as spectral properties, specular reflection from curved surfaces, source enhancement due to reflectors and/or adjacent sources, and interaction with a participating medium in a straightforward manner.
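
    A minimal sketch of the random-walk / ray-tracing idea in two dimensions: estimate the flux distribution on a target plane from an isotropic line source with a flat specular reflector behind it. The geometry and reflectivity are synthetic parameters, not the paper's configuration:

        import numpy as np

        rng = np.random.default_rng(9)
        n_rays = 50_000
        lamp_y, reflector_y, target_y = 1.0, 2.0, 0.0   # plane heights (arbitrary units)
        reflectivity = 0.9

        theta = rng.uniform(0, 2 * np.pi, n_rays)       # isotropic emission angles
        dx, dy = np.cos(theta), np.sin(theta)

        hits = []
        for i in range(n_rays):
            x, y, vx, vy = 0.0, lamp_y, dx[i], dy[i]
            if vy > 0:                                  # heading toward the reflector
                if rng.random() > reflectivity:
                    continue                            # ray absorbed at the reflector
                x += vx * (reflector_y - y) / vy        # travel up to the reflector
                y, vy = reflector_y, -vy                # specular reflection
            if vy < 0:                                  # heading toward the target plane
                hits.append(x + vx * (target_y - y) / vy)

        flux, edges = np.histogram(hits, bins=50, range=(-5, 5))
        print("peak-bin flux fraction:", flux.max() / n_rays)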

  11. A comparison of imputation techniques for handling missing predictor values in a risk model with a binary outcome.

    PubMed

    Ambler, Gareth; Omar, Rumana Z; Royston, Patrick

    2007-06-01

    Risk models that aim to predict the future course and outcome of disease processes are increasingly used in health research, and it is important that they are accurate and reliable. Most of these risk models are fitted using routinely collected data in hospitals or general practices. Clinical outcomes such as short-term mortality will be near-complete, but many of the predictors may have missing values. A common approach to dealing with this is to perform a complete-case analysis. However, this may lead to overfitted models and biased estimates if entire patient subgroups are excluded. The aim of this paper is to investigate a number of methods for imputing missing data to evaluate their effect on risk model estimation and the reliability of the predictions. Multiple imputation methods, including hotdecking and multiple imputation by chained equations (MICE), were investigated along with several single imputation methods. A large national cardiac surgery database was used to create simulated yet realistic datasets. The results suggest that complete case analysis may produce unreliable risk predictions and should be avoided. Conditional mean imputation performed well in our scenario, but may not be appropriate if using variable selection methods. MICE was amongst the best performing multiple imputation methods with regards to the quality of the predictions. Additionally, it produced the least biased estimates, with good coverage, and hence is recommended for use in practice.
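
    A minimal sketch of chained-equation (MICE-style) imputation using scikit-learn's IterativeImputer, a related but not identical implementation to the MICE procedure evaluated in the paper. The predictor matrix is synthetic, with values removed at random:

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(10)
        cov = [[1, .6, .3], [.6, 1, .5], [.3, .5, 1]]
        X = rng.multivariate_normal([0, 0, 0], cov, size=500)
        mask = rng.random(X.shape) < 0.15       # 15% of predictor values set missing
        X_missing = np.where(mask, np.nan, X)

        # sample_posterior=True draws imputations rather than using point estimates
        imputer = IterativeImputer(max_iter=10, sample_posterior=True, random_state=0)
        X_imputed = imputer.fit_transform(X_missing)
        print("mean absolute imputation error:", np.mean(np.abs(X_imputed[mask] - X[mask])))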

  12. Sequence-related amplified polymorphism (SRAP) marker as a new method for identification of endophytic fungi from Taxus.

    PubMed

    Ren, Na; Liu, Jiajia; Yang, Dongliang; Chen, Jianhua; Luan, Mingbao; Hong, Juan

    2012-01-01

    A total of 20 endophytic fungal strains were classified into four groups using traditional morphological identification and were studied for genetic diversity by the sequence-related amplified polymorphism (SRAP) technique. Genomic DNA (deoxyribonucleic acid) of these strains was extracted with the CTAB method, and SRAP analysis was performed with 24 pairs of primers. All strains could be uniquely distinguished by the 584 bands generated, of which 446 were polymorphic (a polymorphism ratio of 76.4%). Unweighted pair-group method with arithmetic averages (UPGMA) cluster analysis enabled construction of a dendrogram for estimating genetic distances between strains. All strains, which had been divided into only four groups by traditional morphological identification, were clustered into four major groups at GS = 0.603 and further separated into eight sub-groups at GS = 0.921. The dendrogram also revealed large genetic variation among the 20 strains; different primer combinations clearly distinguished each strain from the others, with relatively low genetic similarity. The results show that the SRAP technique is more efficient than traditional morphological identification. SRAP markers more accurately reflect the genetic diversity of endophytic fungal strains from Taxus and can be used as a method for their identification. This also establishes a foundation for further screening of endophytic fungal strains that produce high levels of paclitaxel (taxol).

  13. Top quark produced through the electroweak force: Discovery using the matrix element analysis and search for heavy gauge bosons using boosted decision trees

    NASA Astrophysics Data System (ADS)

    Pangilinan, Monica

    The top quark produced through the electroweak channel provides a direct measurement of the Vtb element in the CKM matrix, which can be viewed as the transition rate of a top quark to a bottom quark. This production channel of the top quark is also sensitive to theories beyond the Standard Model, such as heavy charged gauge bosons termed W'. This thesis measures the cross section of electroweak-produced top quarks using a technique based on the matrix elements of the processes under consideration. The technique is applied to 2.3 fb⁻¹ of data from the DØ detector. From a comparison of the matrix element discriminants between data and the signal and background model using Bayesian statistics, we measure the cross section of top quarks produced through the electroweak mechanism: σ(pp̄ → tb+X, tqb+X) = 4.30 +0.98/-1.20 pb. The measured result corresponds to a 4.9σ Gaussian-equivalent significance. By combining this analysis with other analyses based on the Bayesian Neural Network (BNN) and Boosted Decision Tree (BDT) methods, the measured cross section is 3.94 ± 0.88 pb with a significance of 5.0σ, resulting in the discovery of electroweak-produced top quarks. Using this measured cross section and constraining |Vtb| ≤ 1, the 95% confidence level (C.L.) lower limit is |Vtb| > 0.78. Additionally, a search is made for the production of W' bosons using the same samples as in the electroweak top quark analysis. An analysis based on the BDT method is used to separate the signal from expected backgrounds. No significant excess is found, and 95% C.L. upper limits on the production cross section are set for W' with masses within 600-950 GeV. For four general models of W' boson production using the decay channel W' → tb̄, the lower mass limits are the following: M(W'_L with SM couplings) > 840 GeV; M(W'_R) > 880 GeV or 890 GeV if the right-handed neutrino is lighter or heavier than the W'_R, respectively; and M(W'_L+R) > 915 GeV.

  14. Web-Based Phylogenetic Assignment Tool for Analysis of Terminal Restriction Fragment Length Polymorphism Profiles of Microbial Communities

    PubMed Central

    Kent, Angela D.; Smith, Dan J.; Benson, Barbara J.; Triplett, Eric W.

    2003-01-01

    Culture-independent DNA fingerprints are commonly used to assess the diversity of a microbial community. However, relating species composition to community profiles produced by community fingerprint methods is not straightforward. Terminal restriction fragment length polymorphism (T-RFLP) is a community fingerprint method in which phylogenetic assignments may be inferred from the terminal restriction fragment (T-RF) sizes through the use of web-based resources that predict T-RF sizes for known bacteria. The process quickly becomes computationally intensive due to the need to analyze profiles produced by multiple restriction digests and the complexity of profiles generated by natural microbial communities. A web-based tool is described here that rapidly generates phylogenetic assignments from submitted community T-RFLP profiles based on a database of fragments produced by known 16S rRNA gene sequences. Users have the option of submitting a customized database generated from unpublished sequences or from a gene other than the 16S rRNA gene. This phylogenetic assignment tool allows users to employ T-RFLP to simultaneously analyze microbial community diversity and species composition. An analysis of the variability of bacterial species composition throughout the water column in a humic lake was carried out to demonstrate the functionality of the phylogenetic assignment tool. This method was validated by comparing the results generated by this program with results from a 16S rRNA gene clone library. PMID:14602639
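
    A minimal sketch of the assignment logic, under the assumption that the tool matches observed T-RF sizes to database-predicted sizes within a tolerance and requires agreement across multiple digests; the database entries, sizes, and tolerance below are hypothetical.

        # Match observed T-RF sizes to predicted sizes across restriction digests
        predicted = {  # organism -> predicted T-RF size (bp) per enzyme (invented)
            "Polynucleobacter": {"HhaI": 204, "MspI": 492},
            "Actinobacteria_acI": {"HhaI": 368, "MspI": 160},
        }

        def assign(observed, tol=1.0):
            """observed: enzyme -> list of T-RF sizes from the community profile."""
            hits = []
            for org, frags in predicted.items():
                if all(any(abs(o - size) <= tol for o in observed[enz])
                       for enz, size in frags.items()):
                    hits.append(org)
            return hits

        profile = {"HhaI": [204.3, 368.1, 90.0], "MspI": [492.2, 160.4]}
        print(assign(profile))   # both organisms are consistent with this profile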

  15. Effect of polymer coating on the osseointegration of CP-Ti dental implant

    NASA Astrophysics Data System (ADS)

    Al-Hassani, Emad; Al-Hassani, Fatima; Najim, Manar

    2018-05-01

    Coatings of titanium samples were investigated in order to improve their surface characteristics and facilitate bio-integration. A chitosan coating was applied to commercially pure Ti implants manufactured by two different methods: in the first, a commercially pure titanium rod was converted into an implant screw using a wire-cut machine and lathe; the second used powder technology to produce the implant screws. The chitosan coating was applied by electrospinning to create a fibrous structure, from the nano- to the micro-scale, on the implant surface, resulting in a bioactive surface. The characterization included microstructure observation, surface chemical composition analysis (EDS), surface roughness measurement (AFM), and histological analysis. SEM revealed no morphological differences among the implant surfaces other than some minor differences resulting from the manufacturing process. EDX analysis showed that the surface chemical composition was completely changed, with a large decrease in the percentage of titanium at the surface, indicating that the surface was covered with chitosan and had a new composition and topography. The sample produced by powder technology had a higher roughness (845.36 nm) than the sample produced by machining without any surface treatment (531.7 nm). Finally, histological examination of the implant samples 4 weeks after implantation showed active bone formation over the entire implant surface, a clear indication of tissue acceptance.

  16. Application of the Monte Carlo method to estimate doses due to neutron activation of different materials in a nuclear reactor

    NASA Astrophysics Data System (ADS)

    Ródenas, José

    2017-11-01

    All materials exposed to a neutron flux can be activated, regardless of the kind of neutron source. In this study, a nuclear reactor was considered as the neutron source. In particular, the activation of control rods in a BWR is studied to obtain the doses produced around the plant's storage pool for irradiated fuel when control rods are withdrawn from the reactor and placed into this pool. It is important to calculate these doses because they can affect plant workers in the area. The MCNP code, based on the Monte Carlo method, has been applied to simulate the activation reactions produced in the control rods inserted into the reactor. The calculated activities are then used as input to another MC model to estimate the doses they produce. Comparison of simulation results with experimental measurements allows validation of the developed models. The MC models have also been applied to simulate the activation of other materials, such as the components of a stainless steel sample introduced into a training reactor. These models, once validated, can be applied to other situations and materials where a neutron flux is present, not only nuclear reactors: for instance, activation analysis with an Am-Be source, neutrography techniques in both medical applications and non-destructive analysis of materials, civil engineering applications using a Troxler gauge, and analysis of materials in the decommissioning of nuclear power plants.
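
    The activation step that such models quantify follows the standard buildup formula A = φσN(1 − e^(−λt)). A back-of-the-envelope sketch with placeholder flux, target, and irradiation time (not values from the study):

        # Induced activity from neutron activation: A = phi * sigma * N * (1 - exp(-lambda*t))
        import numpy as np

        N_A = 6.022e23
        phi = 1e13                        # neutron flux, n/cm^2/s (placeholder)
        sigma = 37.18e-24                 # Co-59(n,gamma) thermal cross section, cm^2 (~37 b)
        half_life = 5.27 * 3.156e7        # Co-60 half-life in seconds
        lam = np.log(2) / half_life
        mass_g, molar_mass = 1.0, 58.93   # 1 g of cobalt target (placeholder)
        N = mass_g / molar_mass * N_A     # number of target atoms
        t_irr = 3.156e7                   # one year of irradiation (placeholder)

        A = phi * sigma * N * (1.0 - np.exp(-lam * t_irr))   # activity in Bq
        print(f"Induced Co-60 activity: {A:.3e} Bq")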

  17. Finding Imaging Patterns of Structural Covariance via Non-Negative Matrix Factorization

    PubMed Central

    Sotiras, Aristeidis; Resnick, Susan M.; Davatzikos, Christos

    2015-01-01

    In this paper, we investigate the use of Non-Negative Matrix Factorization (NNMF) for the analysis of structural neuroimaging data. The goal is to identify the brain regions that co-vary across individuals in a consistent way, hence potentially being part of underlying brain networks or otherwise influenced by underlying common mechanisms such as genetics and pathologies. NNMF offers a directly data-driven way of extracting relatively localized co-varying structural regions, thereby transcending limitations of Principal Component Analysis (PCA), Independent Component Analysis (ICA) and other related methods that tend to produce dispersed components of positive and negative loadings. In particular, leveraging upon the well known ability of NNMF to produce parts-based representations of image data, we derive decompositions that partition the brain into regions that vary in consistent ways across individuals. Importantly, these decompositions achieve dimensionality reduction via highly interpretable ways and generalize well to new data as shown via split-sample experiments. We empirically validate NNMF in two data sets: i) a Diffusion Tensor (DT) mouse brain development study, and ii) a structural Magnetic Resonance (sMR) study of human brain aging. We demonstrate the ability of NNMF to produce sparse parts-based representations of the data at various resolutions. These representations seem to follow what we know about the underlying functional organization of the brain and also capture some pathological processes. Moreover, we show that these low dimensional representations favorably compare to descriptions obtained with more commonly used matrix factorization methods like PCA and ICA. PMID:25497684
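
    The core decomposition can be illustrated with scikit-learn's generic NMF, used here as a stand-in for the authors' formulation; the subjects-by-features matrix and the component count are invented.

        # Parts-based decomposition of a subjects-by-voxels matrix with NMF
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(2)
        X = np.abs(rng.normal(size=(100, 5000)))   # 100 subjects x 5000 voxel features

        model = NMF(n_components=10, init="nndsvd", max_iter=500, random_state=0)
        W = model.fit_transform(X)   # subject loadings on each component
        H = model.components_        # spatial components (non-negative, fairly sparse)
        print(W.shape, H.shape)      # (100, 10) (10, 5000)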

  18. New Tools for Comparing Microscopy Images: Quantitative Analysis of Cell Types in Bacillus subtilis

    PubMed Central

    van Gestel, Jordi; Vlamakis, Hera

    2014-01-01

    Fluorescence microscopy is a method commonly used to examine individual differences between bacterial cells, yet many studies still lack a quantitative analysis of fluorescence microscopy data. Here we introduce some simple tools that microbiologists can use to analyze and compare their microscopy images. We show how image data can be converted to distribution data. These data can be subjected to a cluster analysis that makes it possible to objectively compare microscopy images. The distribution data can further be analyzed using distribution fitting. We illustrate our methods by scrutinizing two independently acquired data sets, each containing microscopy images of a doubly labeled Bacillus subtilis strain. For the first data set, we examined the expression of srfA and tapA, two genes which are expressed in surfactin-producing and matrix-producing cells, respectively. For the second data set, we examined the expression of eps and tapA; these genes are expressed in matrix-producing cells. We show that srfA is expressed by all cells in the population, a finding which contrasts with a previously reported bimodal distribution of srfA expression. In addition, we show that eps and tapA do not always have the same expression profiles, despite being expressed in the same cell type: both operons are expressed in cell chains, while single cells mainly express eps. These findings exemplify that the quantification and comparison of microscopy data can yield insights that otherwise would go unnoticed. PMID:25448819
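
    One way to sketch the image-to-distribution-to-cluster pipeline: each image becomes a distribution of single-cell fluorescence values, distributions are compared pairwise, and the distance matrix is clustered. The Kolmogorov-Smirnov distance and the simulated data below are illustrative assumptions, not the authors' exact procedure.

        # Compare per-image fluorescence distributions and cluster the images
        import numpy as np
        from scipy.stats import ks_2samp
        from scipy.spatial.distance import squareform
        from scipy.cluster.hierarchy import linkage

        rng = np.random.default_rng(3)
        # Each "image" is reduced to a distribution of single-cell intensities
        images = [rng.lognormal(mean=m, sigma=0.5, size=300)
                  for m in (1.0, 1.1, 2.0, 2.1)]

        n = len(images)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                D[i, j] = D[j, i] = ks_2samp(images[i], images[j]).statistic

        tree = linkage(squareform(D), method="average")  # images 0/1 and 2/3 pair up
        print(tree)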

  19. New tools for comparing microscopy images: quantitative analysis of cell types in Bacillus subtilis.

    PubMed

    van Gestel, Jordi; Vlamakis, Hera; Kolter, Roberto

    2015-02-15

    Fluorescence microscopy is a method commonly used to examine individual differences between bacterial cells, yet many studies still lack a quantitative analysis of fluorescence microscopy data. Here we introduce some simple tools that microbiologists can use to analyze and compare their microscopy images. We show how image data can be converted to distribution data. These data can be subjected to a cluster analysis that makes it possible to objectively compare microscopy images. The distribution data can further be analyzed using distribution fitting. We illustrate our methods by scrutinizing two independently acquired data sets, each containing microscopy images of a doubly labeled Bacillus subtilis strain. For the first data set, we examined the expression of srfA and tapA, two genes which are expressed in surfactin-producing and matrix-producing cells, respectively. For the second data set, we examined the expression of eps and tapA; these genes are expressed in matrix-producing cells. We show that srfA is expressed by all cells in the population, a finding which contrasts with a previously reported bimodal distribution of srfA expression. In addition, we show that eps and tapA do not always have the same expression profiles, despite being expressed in the same cell type: both operons are expressed in cell chains, while single cells mainly express eps. These findings exemplify that the quantification and comparison of microscopy data can yield insights that otherwise would go unnoticed. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  20. Comparison of detrending methods for fluctuation analysis in hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David

    2011-03-01

    Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with obvious, relatively regular periodic trends. Results indicated that: (1) the Fourier-based detrending method and ADA were similar in detrending practice and, given proper parameters, can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose fluctuation information at larger time scales, and the location of crossover points is heavily influenced by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods in that fluctuation information at larger time scales is kept well, an indication of relatively reliable detrending performance. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with relatively regular periodic trends using MF-DFA.
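
    For reference, a minimal monofractal DFA implementation shows the fluctuation function whose scaling these detrending choices affect. This is a simplified sketch; real MF-DFA adds the q-order generalization.

        # Minimal detrended fluctuation analysis (DFA) sketch
        import numpy as np

        def dfa(x, scales, order=1):
            y = np.cumsum(x - np.mean(x))              # integrated profile
            F = []
            for s in scales:
                n = len(y) // s
                segs = y[: n * s].reshape(n, s)
                t = np.arange(s)
                rms = []
                for seg in segs:
                    coef = np.polyfit(t, seg, order)   # local polynomial trend
                    rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                F.append(np.sqrt(np.mean(rms)))
            return np.array(F)

        x = np.random.default_rng(4).normal(size=4096)   # white noise: alpha ~ 0.5
        scales = np.unique(np.logspace(2, 9, 12, base=2).astype(int))
        F = dfa(x, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # scaling exponent
        print(round(alpha, 2))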

  1. An integrated one-step system to extract, analyze and annotate all relevant information from image-based cell screening of chemical libraries.

    PubMed

    Rabal, Obdulia; Link, Wolfgang; Serelde, Beatriz G; Bischoff, James R; Oyarzabal, Julen

    2010-04-01

    Here we report the development and validation of a complete solution to manage and analyze the data produced by image-based phenotypic screening campaigns of small-molecule libraries. In one step, initial crude images are analyzed for multiple cytological features, statistical analysis is performed, and molecules that produce the desired phenotypic profile are identified. A naïve Bayes classifier integrating the chemical and phenotypic spaces is built and used during the process to assess those images initially classified as "fuzzy", providing automated iterative feedback tuning. Simultaneously, all this information is annotated directly in a relational database containing the chemical data. This fully automated method was validated by re-analyzing the results of a high-content screening campaign involving 33 992 molecules used to identify inhibitors of the PI3K/Akt signaling pathway. Ninety-two percent of the confirmed hits identified by the conventional multistep analysis method were recovered by this integrated one-step system, along with 40 new hits (14.9% of the total) that were originally false negatives; 96% of true negatives were also properly recognized. Web-based access to the database, with customizable data retrieval and visualization tools, facilitates subsequent analysis of the annotated cytological features, allowing identification of additional phenotypic profiles without requiring further analysis of the original crude images.
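
    The classifier component might look like the following sketch: a naive Bayes model over concatenated binary chemical and phenotypic features, re-scoring borderline images. The feature encoding and data are invented for illustration.

        # Naive Bayes over combined chemical + phenotypic binary features
        import numpy as np
        from sklearn.naive_bayes import BernoulliNB

        rng = np.random.default_rng(5)
        chem = rng.integers(0, 2, size=(1000, 166))   # MACCS-like fingerprint bits
        pheno = rng.integers(0, 2, size=(1000, 20))   # binarized cytological features
        X = np.hstack([chem, pheno])
        y = rng.integers(0, 2, size=1000)             # hit / non-hit labels

        clf = BernoulliNB().fit(X, y)
        fuzzy = X[:5]                                  # images flagged as "fuzzy"
        print(clf.predict_proba(fuzzy)[:, 1])          # posterior probability of "hit"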

  2. GIS and statistical analysis for landslide susceptibility mapping in the Daunia area, Italy

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Ritrovato, G.

    2010-09-01

    This study focuses on landslide susceptibility mapping in the Daunia area (Apulian Apennines, Italy) and achieves this by using a multivariate statistical method and data processing in a Geographical Information System (GIS). The Logistic Regression (hereafter LR) method was chosen to produce a susceptibility map over an area of 130 000 ha where small settlements are historically threatened by landslide phenomena. By means of LR analysis, the tendency to landslide occurrences was, therefore, assessed by relating a landslide inventory (dependent variable) to a series of causal factors (independent variables) which were managed in the GIS, while the statistical analyses were performed by means of the SPSS (Statistical Package for the Social Sciences) software. The LR analysis produced a reliable susceptibility map of the investigated area and the probability level of landslide occurrence was ranked in four classes. The overall performance achieved by the LR analysis was assessed by local comparison between the expected susceptibility and an independent dataset extrapolated from the landslide inventory. Of the samples classified as susceptible to landslide occurrences, 85% correspond to areas where landslide phenomena have actually occurred. In addition, the consideration of the regression coefficients provided by the analysis demonstrated that a major role is played by the "land cover" and "lithology" causal factors in determining the occurrence and distribution of landslide phenomena in the Apulian Apennines.
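
    A compact sketch of the LR susceptibility step, with a synthetic factor table standing in for the GIS layers and the predicted probabilities binned into four classes as in the study:

        # Logistic-regression susceptibility mapping sketch (synthetic factors)
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(6)
        # columns: slope, lithology code, land-cover code, ... (placeholders)
        X = rng.normal(size=(2000, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 1).astype(int)

        lr = LogisticRegression().fit(X, y)
        p = lr.predict_proba(X)[:, 1]                   # landslide probability
        classes = np.digitize(p, [0.25, 0.5, 0.75])     # four susceptibility classes
        print(lr.coef_[0], np.bincount(classes))        # factor weights, class counts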

  3. The effects of physical activity on impulsive choice: Influence of sensitivity to reinforcement amount and delay.

    PubMed

    Strickland, Justin C; Feinstein, Max A; Lacy, Ryan T; Smith, Mark A

    2016-05-01

    Impulsive choice is a diagnostic feature and/or complicating factor for several psychological disorders and may be examined in the laboratory using delay-discounting procedures. Recent investigators have proposed using quantitative measures of analysis to examine the behavioral processes contributing to impulsive choice. The purpose of this study was to examine the effects of physical activity (i.e., wheel running) on impulsive choice in a single-response, discrete-trial procedure using two quantitative methods of analysis. To this end, rats were assigned to physical activity or sedentary groups and trained to respond in a delay-discounting procedure. In this procedure, one lever always produced one food pellet immediately, whereas a second lever produced three food pellets after a 0, 10, 20, 40, or 80-s delay. Estimates of sensitivity to reinforcement amount and sensitivity to reinforcement delay were determined using (1) a simple linear analysis and (2) an analysis of logarithmically transformed response ratios. Both analyses revealed that physical activity decreased sensitivity to reinforcement amount and sensitivity to reinforcement delay. These findings indicate that (1) physical activity has significant but functionally opposing effects on the behavioral processes that contribute to impulsive choice and (2) both quantitative methods of analysis are appropriate for use in single-response, discrete-trial procedures. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    PubMed

    Patlak, J B

    1993-07-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low-variance regions. The total number of events in such low-variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell-time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open-channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell-time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open-channel noise and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
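
    The histogram construction itself is straightforward to sketch; the simulated trace, window width, and bin counts below are arbitrary choices for illustration.

        # Mean-variance histogram of a simulated single-channel current trace
        import numpy as np

        rng = np.random.default_rng(7)
        levels = rng.choice([0.0, 0.5, 1.0], size=5000, p=[0.5, 0.2, 0.3])
        trace = levels + rng.normal(scale=0.05, size=5000)   # noisy channel record

        N = 10                                   # window width (samples)
        windows = np.lib.stride_tricks.sliding_window_view(trace, N)
        means = windows.mean(axis=1)
        variances = windows.var(axis=1)

        H, mean_edges, var_edges = np.histogram2d(means, variances, bins=(60, 60))
        # Peaks of H at low variance mark the closed, sub-, and open current levels
        print(H.sum(), means.shape)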

  5. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    USGS Publications Warehouse

    Rakovec, O.; Hill, Mary C.; Clark, M.P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based “local” methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative “bucket-style” hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
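
    A rough sketch of the DELSA idea, offered as an illustration rather than the authors' exact formulation: finite-difference local derivatives are computed at many sampled points and converted to first-order variance fractions using assumed prior parameter variances.

        # DELSA-style distributed local sensitivity sketch (hedged illustration)
        import numpy as np

        def model(theta):                        # toy nonlinear model (placeholder)
            return theta[0] ** 2 + np.sin(theta[1]) + 0.1 * theta[2]

        rng = np.random.default_rng(8)
        prior_var = np.array([1.0, 1.0, 1.0])    # assumed prior parameter variances
        samples = rng.uniform(-2, 2, size=(200, 3))

        def delsa_point(theta, h=1e-4):
            grad = np.zeros_like(theta)
            for j in range(len(theta)):          # central finite differences
                d = np.zeros_like(theta); d[j] = h
                grad[j] = (model(theta + d) - model(theta - d)) / (2 * h)
            contrib = grad ** 2 * prior_var
            return contrib / contrib.sum()       # fraction of local output variance

        S = np.array([delsa_point(t) for t in samples])
        print(S.mean(axis=0))                    # importance varies across the space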

  6. Qualitative Comparison of Streamflow Information Programs of the U.S. Geological Survey and Three Non-Federal Agencies

    USGS Publications Warehouse

    Norris, J. Michael; Lewis, Michael; Dorsey, Michael; Kimbrough, Robert; Holmes, Robert R.; Staubitz, Ward

    2008-01-01

    A qualitative comparison was made of the streamgaging programs of the U.S. Geological Survey (USGS) and three non-Federal agencies in terms of approximate costs and streamflow-information products produced. The three non-Federal agencies provided the USGS with detailed information on their streamgaging program and related costs, and the USGS explored, through publicly available Web sites and one-on-one discussions, the comparability of the streamflow information produced. The type and purpose of streamgages operated, the quality of streamflow record produced, and cost-accounting methods have a great effect on streamgaging costs. There are many uses of streamflow information, and the information requirements for streamgaging programs differ greatly across this range of purposes. A premise of the USGS streamgaging program is that the network must produce consistent data of sufficient quality to support the broadest range of possible uses. Other networks may have a narrower range of purposes; as a consequence, the method of operation, data-quality objectives, and information delivery may be different from those for a multipurpose network. As a result, direct comparison of the overall cost (or of the cost per streamgage) among these programs is not possible. The analysis is, nonetheless, very instructive and provides USGS program managers, agency leadership, and other agency streamgaging program managers useful insight to influence future decisions. Even though the comparison of streamgaging costs and streamflow information products was qualitative, this analysis does offer useful insights on longstanding questions of USGS streamgaging costs.

  7. A Comparison of Alternating Current and Direct Current Electrospray Ionization for Mass Spectrometry

    PubMed Central

    Sarver, Scott A.; Gartner, Carlos A.; Chetwani, Nishant; Go, David B.; Dovichi, Norman J.

    2014-01-01

    A series of studies comparing the performance of alternating current electrospray ionization (AC ESI) mass spectrometry (MS) and direct current electrospray ionization (DC ESI) MS has been conducted, exploring the absolute signal intensities and signal-to-background ratios produced by both methods using caffeine and a model peptide as targets. Because the high-voltage AC signal was more susceptible to generating gas discharges, the operating voltage range of AC ESI was significantly smaller than that for DC ESI, such that the absolute signal intensities produced by DC ESI at peak voltages were 1-2 orders of magnitude greater than those for AC ESI. Using an electronegative nebulizing gas, sulfur hexafluoride (SF6), instead of nitrogen (N2) increased the operating range of AC ESI by ~50% but did not appreciably improve signal intensities. While DC ESI generated far greater signal intensities, both ionization methods produced comparable signal-to-background ratios, with AC ESI spectra appearing qualitatively cleaner. A quantitative calibration analysis was performed for two analytes, caffeine and the peptide MRFA. AC ESI utilizing SF6 outperformed all other techniques for the detection of MRFA, producing chromatographic limits of detection nearly one order of magnitude lower than that of DC ESI utilizing N2, and one half that of DC ESI utilizing SF6. However, DC ESI outperformed AC ESI for the analysis of caffeine, indicating that improvements in spectral quality may benefit certain compounds, or classes of compounds, on an individual basis. PMID:24464359

  8. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  9. Development of an Analytic Method for Sulfur Compounds in Aged Garlic Extract with the Use of a Postcolumn High Performance Liquid Chromatography Method with Sulfur-Specific Detection.

    PubMed

    Matsutomo, Toshiaki; Kodera, Yukihiro

    2016-02-01

    Garlic and its processed preparations contain numerous sulfur compounds that are difficult to analyze in a single run using HPLC. The aim of this study was to develop a rapid and convenient sulfur-specific HPLC method to analyze sulfur compounds in aged garlic extract (AGE). We modified a conventional postcolumn HPLC method by employing a hexaiodoplatinate reagent. Identification and structural analysis of sulfur compounds were conducted by LC-mass spectrometry (LC-MS) and nuclear magnetic resonance. The production mechanisms of cis-S-1-propenylcysteine (cis-S1PC) and S-allylmercaptocysteine (SAMC) were examined by model reactions. Our method has the following advantages: less interference from nonsulfur compounds, high sensitivity, good correlation coefficients (r > 0.98), and high resolution that can separate >20 sulfur compounds, including several isomers, in garlic preparations in a single run. This method was adapted for LC-MS analysis. We identified cis-S1PC and γ-glutamyl-S-allyl-mercaptocysteine in AGE. The results of model reactions suggest that cis-S1PC is produced from trans-S1PC through an isomerization reaction and that SAMC is produced by a reaction involving S-allylcysteine/S1PC and diallyldisulfide during the aging period. We developed a rapid postcolumn HPLC method for both qualitative and quantitative analyses of sulfur compounds, and this method helped elucidate a potential mechanism of cis-S1PC and SAMC action in AGE. © 2016 American Society for Nutrition.

  10. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    NASA Astrophysics Data System (ADS)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroidism. Diagnosis of these diseases usually involves radiology and laboratory tests, whereas the method we propose uses thermographic images and image analysis techniques and is intended as a prescreening tool. Images in each pathology category are first filtered by Gabor filters, and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters characterized by two parameters: wavelength λ and orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produced different tests, and the sensitivity, specificity, and success rate for each test were computed. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
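
    The filter bank is easy to reproduce in outline: two wavelengths times five orientations gives ten Gabor filters, with simple response statistics as features. The random image and placeholder wavelengths below are assumptions, and skimage's gabor is used as a generic implementation.

        # Ten-filter Gabor bank (2 wavelengths x 5 orientations) with response stats
        import numpy as np
        from skimage.filters import gabor

        image = np.random.default_rng(9).random((128, 128))  # stands in for a thermogram

        features = []
        for wavelength in (4.0, 8.0):                 # placeholder wavelengths (pixels)
            for k in range(5):                        # five orientations over 0..pi
                theta = k * np.pi / 5
                real, imag = gabor(image, frequency=1.0 / wavelength, theta=theta)
                mag = np.hypot(real, imag)
                features += [mag.mean(), mag.std()]   # feature vector entries

        print(len(features))                          # 10 filters x 2 stats = 20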

  11. Investigation of Volcanic Seismo-Acoustic Signals: Applying Subspace Detection to Lava Fountain Activity at Etna Volcano

    NASA Astrophysics Data System (ADS)

    Sciotto, M.; Rowe, C. A.; Cannata, A.; Arrowsmith, S.; Privitera, E.; Gresta, S.

    2011-12-01

    The current eruption of Mount Etna, which began in January 2011, has produced numerous energetic episodes of lava fountaining, which have been recorded by the INGV seismic and acoustic sensors located on and around the volcano. The source of these events was the pit crater on the east flank of the Southeast Crater of Etna. Simultaneously, small levels of activity were noted in the Bocca Nuova as well, prior to its lava fountaining activity. We will present an analysis of seismic and acoustic signals related to the 2011 activity in which we apply the method of subspace detection to determine whether the source exhibits a temporal evolution within or between fountaining events, or otherwise produces repeating, classifiable events during the continuous explosive degassing. We will examine not only the raw waveforms but also spectral variations in time, as well as time-varying statistical functions such as signal skewness and kurtosis. These results will be compared to straightforward cross-correlation analysis. In addition to its classification performance, the subspace method promises to outperform standard STA/LTA methods for real-time event detection in cases where similar events can be expected.

  12. Utilizing formative evaluation to enhance the understanding of chemistry and the methods and procedures of science

    NASA Astrophysics Data System (ADS)

    Pizzini, Edward L.; Treagust, David F.; Cody, John

    The purpose of this study was to determine whether or not formative evaluation could facilitate goal attainment in a biochemistry course and produce desired learning outcomes consistently by altering course materials and/or instruction. Formative evaluation procedures included the administration of the Inorganic-Organic-Biological Chemistry Test Form 1974 and the Methods and Procedures of Science test to course participants over three consecutive years. A one group pretest-post-test design was used. The statistical analysis involved the use of the Wilcoxon matched-pairs signed-ranks test. The study involved 64 participants. The findings indicate that the use of formative evaluation can be effective in producing desired learning outcomes to facilitate goal attainment.

  13. Effective application of multiple locus variable number of tandem repeats analysis to tracing Staphylococcus aureus in food-processing environment.

    PubMed

    Rešková, Z; Koreňová, J; Kuchta, T

    2014-04-01

    A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by an automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.

  14. Analysis of Soluble Proteins in Natural Cordyceps sinensis from Different Producing Areas by Sodium Dodecyl Sulfate-Polyacrylamide Gel Electrophoresis and Two-dimensional Electrophoresis

    PubMed Central

    Li, Chun-Hong; Zuo, Hua-Li; Zhang, Qian; Wang, Feng-Qin; Hu, Yuan-Jia; Qian, Zheng-Ming; Li, Wen-Jia; Xia, Zhi-Ning; Yang, Feng-Qing

    2017-01-01

    Background: As one of the bioactive components of Cordyceps sinensis (CS), proteins have rarely been used as index components to study the correlation between protein composition and the producing areas of natural CS. Objective: The protein components of 26 natural CS samples produced in Qinghai, Tibet, and Sichuan provinces were analyzed and compared to investigate the relationships among the 26 producing areas. Materials and Methods: Proteins from the 26 producing areas were extracted with Tris-HCl buffer containing Triton X-100 and separated using sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) and two-dimensional electrophoresis (2-DE). Results: The SDS-PAGE results indicated that the number of protein bands and the optical density curves of proteins in the 26 CS samples differed slightly. However, the 2-DE results showed that the numbers and abundance of protein spots in the protein profiles of the 26 samples differed markedly and showed a certain association with producing area. Conclusions: Based on the expression values of matched protein spots, the 26 batches of CS samples could be divided into two main categories (Tibet and Qinghai) by hierarchical cluster analysis. PMID:28250651

  15. Prevalence, Characterization, and Mycotoxin Production Ability of Fusarium Species on Korean Adlay (Coix lacrymal-jobi L.) Seeds

    PubMed Central

    An, Tae Jin; Shin, Kyu Seop; Paul, Narayan Chandra; Kim, Young Guk; Cha, Seon Woo; Moon, Yuseok; Yu, Seung Hun; Oh, Sang-Keun

    2016-01-01

    Adlay seed samples were collected from three adlay growing regions (Yeoncheon, Hwasun, and Eumseong region) in Korea during 2012. Among all the samples collected, 400 seeds were tested for fungal occurrence by standard blotter and test tube agar methods and different taxonomic groups of fungal genera were detected. The most predominant fungal genera encountered were Fusarium, Phoma, Alternaria, Cladosporium, Curvularia, Cochliobolus and Leptosphaerulina. Fusarium species accounted for 45.6% of all species found; and, with phylogenetic analysis based on the combined sequences of two protein coding genes (EF-1α and β-tubulin), 10 Fusarium species were characterized namely, F. incarnatum (11.67%), F. kyushuense (10.33%), F. fujikuroi (8.67%), F. concentricum (6.00%), F. asiaticum (5.67%), F. graminearum (1.67%), F. miscanthi (0.67%), F. polyphialidicum (0.33%), F. armeniacum (0.33%), and F. thapsinum (0.33%). The Fusarium species were then examined for their morphological characteristics to confirm their identity. Morphological observations of the species correlated well with and confirmed their molecular identification. The ability of these isolates to produce the mycotoxins fumonisin (FUM) and zearalenone (ZEN) was tested by the ELISA quantitative analysis method. The result revealed that FUM was produced only by F. fujikuroi and that ZEN was produced by F. asiaticum and F. graminearum. PMID:27801779

  16. Prevalence, Characterization, and Mycotoxin Production Ability of Fusarium Species on Korean Adlay (Coix lacrymal-jobi L.) Seeds.

    PubMed

    An, Tae Jin; Shin, Kyu Seop; Paul, Narayan Chandra; Kim, Young Guk; Cha, Seon Woo; Moon, Yuseok; Yu, Seung Hun; Oh, Sang-Keun

    2016-10-27

    Adlay seed samples were collected from three adlay growing regions (Yeoncheon, Hwasun, and Eumseong region) in Korea during 2012. Among all the samples collected, 400 seeds were tested for fungal occurrence by standard blotter and test tube agar methods and different taxonomic groups of fungal genera were detected. The most predominant fungal genera encountered were Fusarium, Phoma, Alternaria, Cladosporium, Curvularia, Cochliobolus and Leptosphaerulina. Fusarium species accounted for 45.6% of all species found; and, with phylogenetic analysis based on the combined sequences of two protein coding genes (EF-1α and β-tubulin), 10 Fusarium species were characterized namely, F. incarnatum (11.67%), F. kyushuense (10.33%), F. fujikuroi (8.67%), F. concentricum (6.00%), F. asiaticum (5.67%), F. graminearum (1.67%), F. miscanthi (0.67%), F. polyphialidicum (0.33%), F. armeniacum (0.33%), and F. thapsinum (0.33%). The Fusarium species were then examined for their morphological characteristics to confirm their identity. Morphological observations of the species correlated well with and confirmed their molecular identification. The ability of these isolates to produce the mycotoxins fumonisin (FUM) and zearalenone (ZEN) was tested by the ELISA quantitative analysis method. The result revealed that FUM was produced only by F. fujikuroi and that ZEN was produced by F. asiaticum and F. graminearum.

  17. Quantitation of lysergic acid diethylamide in urine using atmospheric pressure matrix-assisted laser desorption/ionization ion trap mass spectrometry.

    PubMed

    Cui, Meng; McCooeye, Margaret A; Fraser, Catharine; Mester, Zoltán

    2004-12-01

    A quantitative method was developed for the analysis of lysergic acid diethylamide (LSD) in urine using atmospheric pressure matrix-assisted laser desorption/ionization ion trap mass spectrometry (AP MALDI-ITMS). Following solid-phase extraction of LSD from urine samples, extracts were analyzed by AP MALDI-ITMS. The identity of LSD was confirmed by fragmentation of the [M + H]+ ion using tandem mass spectrometry. Quantification of LSD was achieved using stable-isotope-labeled LSD (LSD-d3) as the internal standard. The [M + H]+ ion fragmented to produce a dominant fragment ion, which was used in a selected reaction monitoring (SRM) method for quantitative analysis of LSD. SRM was compared with selected ion monitoring and produced a wider linear range and a lower limit of quantification. For SRM analysis of LSD spiked into urine, the calibration curve was linear in the range of 1-100 ng/mL with a coefficient of determination, r², of 0.9917. This assay was used to determine LSD in urine samples, and the AP MALDI-MS results were comparable to the HPLC/ESI-MS results.

  18. DSC, X-ray and FTIR studies of a gemfibrozil/dimethyl-β-cyclodextrin inclusion complex produced by co-grinding.

    PubMed

    Aigner, Z; Berkesi, O; Farkas, G; Szabó-Révész, P

    2012-01-05

    The steps of formation of an inclusion complex produced by the co-grinding of gemfibrozil and dimethyl-β-cyclodextrin were investigated by differential scanning calorimetry (DSC), X-ray powder diffractometry (XRPD) and Fourier transform infrared (FTIR) spectroscopy with curve-fitting analysis. The endothermic peak at 59.25°C, reflecting the melting of gemfibrozil, progressively disappeared from the DSC curves of the products as the duration of co-grinding increased. The crystallinity of the samples also gradually decreased, and after 35 min of co-grinding the product was totally amorphous. Up to this co-grinding time, XRPD and FTIR investigations indicated a linear correlation between cyclodextrin complexation and co-grinding time. After co-grinding for 30 min, the ratio of complex formation did not increase further. These studies demonstrated that co-grinding is a suitable method for the complexation of gemfibrozil with dimethyl-β-cyclodextrin. XRPD analysis revealed the amorphous state of the gemfibrozil-dimethyl-β-cyclodextrin product. FTIR spectroscopy with curve-fitting analysis may be useful as a semiquantitative analytical method for discriminating between the molecular and amorphous states of gemfibrozil. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Analyzing psychotherapy process as intersubjective sensemaking: an approach based on discourse analysis and neural networks.

    PubMed

    Nitti, Mariangela; Ciavolino, Enrico; Salvatore, Sergio; Gennaro, Alessandro

    2010-09-01

    The authors propose a method for analyzing the psychotherapy process: discourse flow analysis (DFA). DFA is a technique representing the verbal interaction between therapist and patient as a discourse network, aimed at measuring the therapist-patient discourse ability to generate new meanings through time. DFA assumes that the main function of psychotherapy is to produce semiotic novelty. DFA is applied to the verbatim transcript of the psychotherapy. It defines the main meanings active within the therapeutic discourse by means of the combined use of text analysis and statistical techniques. Subsequently, it represents the dynamic interconnections among these meanings in terms of a "discursive network." The dynamic and structural indexes of the discursive network have been shown to provide a valid representation of the patient-therapist communicative flow as well as an estimation of its clinical quality. Finally, a neural network is designed specifically to identify patterns of functioning of the discursive network and to verify the clinical validity of these patterns in terms of their association with specific phases of the psychotherapy process. An application of the DFA to a case of psychotherapy is provided to illustrate the method and the kinds of results it produces.

  20. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior regarding its comprehensive performance, and the computational cost is significantly lower than that for other methods. The proposed method can be regarded as a new valid alternative general-purpose feature extraction method for various tasks in spectral data analysis.
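
    As a heavily hedged illustration of layer-wise analytic training (not the authors' algorithm), an ELM-style autoencoder fixes random hidden weights and obtains output weights from closed-form ridge regression, so no iterative optimization is needed:

        # ELM-style autoencoder layer trained analytically (illustrative stand-in)
        import numpy as np

        def analytic_layer(X, n_hidden, reg=1e-3, seed=0):
            rng = np.random.default_rng(seed)
            W = rng.normal(size=(X.shape[1], n_hidden))
            H = np.tanh(X @ W)                          # random hidden responses
            # Closed-form ridge solution mapping H back to X (autoencoder target)
            B = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
            return np.tanh(X @ B.T)                     # learned features

        rng = np.random.default_rng(10)
        spectra = rng.random((256, 3000))        # 256 spectra x 3000 flux bins (fake)
        f1 = analytic_layer(spectra, 500, seed=1)    # first abstraction level
        f2 = analytic_layer(f1, 64, seed=2)          # deeper, more compact features
        print(f2.shape)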

  1. Comparison of Analytical Methods for the Determination of Uranium in Seawater Using Inductively Coupled Plasma Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Jordana R.; Gill, Gary A.; Kuo, Li-Jung

    2016-04-20

    Trace element determinations in seawater by inductively coupled plasma mass spectrometry are analytically challenging due to the typically very low concentrations of the trace elements and the potential interference of the salt matrix. In this study, we compared seven analytical approaches for uranium analysis by inductively coupled plasma mass spectrometry (ICP-MS), using Sequim Bay seawater samples and three seawater certified reference materials (SLEW-3, CASS-5 and NASS-6). The methods evaluated include: direct analysis, Fe/Pd reductive precipitation, standard addition calibration, online automated dilution using an external calibration with and without matrix matching, and online automated pre-concentration. The method which produced the most accurate results was standard addition calibration, recovering uranium from a Sequim Bay seawater sample at 101 ± 1.2%. The on-line pre-concentration method and the automated dilution with matrix-matched calibration method also performed well. The two least effective methods were direct analysis and Fe/Pd reductive precipitation using sodium borohydride.
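
    The standard addition calculation that performed best reduces to fitting signal against spiked concentration and reading the x-intercept; a sketch with invented numbers:

        # Standard addition calibration: extrapolate to the x-intercept
        import numpy as np

        added = np.array([0.0, 1.0, 2.0, 4.0])      # ng/mL of U spiked into sample
        signal = np.array([5.1, 6.9, 8.8, 12.5])    # ICP-MS response (arbitrary units)

        slope, intercept = np.polyfit(added, signal, 1)
        native = intercept / slope                  # magnitude of the x-intercept
        print(f"Native U concentration: {native:.2f} ng/mL")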

  2. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications

    USGS Publications Warehouse

    Zlotnik, V.A.; McGuire, V.L.

    1998-01-01

    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.

  3. Monte Carlo source simulation technique for solution of interference reactions in INAA experiments: a preliminary report

    NASA Astrophysics Data System (ADS)

    Allaf, M. Athari; Shahriari, M.; Sohrabpour, M.

    2004-04-01

    A new method using Monte Carlo source simulation of interference reactions in neutron activation analysis experiments has been developed. The neutron spectrum at the sample location has been simulated using the Monte Carlo code MCNP, and the contributions of different elements to the production of a specified gamma line have been determined. The resulting response matrix has been used, together with the measured peak areas, to determine the sample masses of the elements of interest. A number of benchmark experiments have been performed and the calculated results verified against known values. The good agreement obtained between the calculated and known values suggests that this technique may be useful for eliminating interference reactions in neutron activation analysis.
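
    The unfolding step can be sketched as a least-squares solve: given a simulated response matrix R (counts per gram of each candidate element in each gamma line), the measured peak areas yield the element masses. All values below are placeholders.

        # Solve element masses from gamma peak areas via a simulated response matrix
        import numpy as np

        # rows: gamma lines, columns: candidate elements (counts per gram, simulated)
        R = np.array([[120.0,  15.0],
                      [  8.0,  95.0],
                      [ 40.0,  22.0]])
        peak_areas = np.array([1450.0, 1010.0, 690.0])   # measured net peak areas

        masses, *_ = np.linalg.lstsq(R, peak_areas, rcond=None)
        print(masses)        # estimated grams of each element in the sample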

  4. The John Charnley Award: an accurate and sensitive method to separate, display, and characterize wear debris: part 1: polyethylene particles.

    PubMed

    Billi, Fabrizio; Benya, Paul; Kavanaugh, Aaron; Adams, John; Ebramzadeh, Edward; McKellop, Harry

    2012-02-01

    Numerous studies indicate highly crosslinked polyethylenes reduce the wear debris volume generated by hip arthroplasty acetabular liners. This, in turn, requires new methods to isolate and characterize the particles. We describe a method for extracting polyethylene wear particles from bovine serum typically used in wear tests and for characterizing their size, distribution, and morphology. Serum proteins were completely digested using an optimized enzymatic digestion method that prevented the loss of the smallest particles and minimized their clumping. Density-gradient ultracentrifugation was designed to remove contaminants and recover the particles without filtration, depositing them directly onto a silicon wafer. This provided uniform distribution of the particles and high contrast against the background, facilitating accurate, automated, morphometric image analysis. The accuracy and precision of the new protocol were assessed by recovering and characterizing particles from wear tests of three types of polyethylene acetabular cups (no crosslinking and 5 Mrads and 7.5 Mrads of gamma irradiation crosslinking). The new method demonstrated important differences in the particle size distributions and morphologic parameters among the three types of polyethylene that could not be detected using prior isolation methods. The new protocol overcomes a number of limitations, such as loss of nanometer-sized particles and artifactual clumping, among others. The analysis of polyethylene wear particles produced in joint simulator wear tests of prosthetic joints is a key tool to identify the wear mechanisms that produce the particles and predict and evaluate their effects on periprosthetic tissues.

  5. Cloning and sequencing of the histidine decarboxylase genes of gram-negative, histamine-producing bacteria and their application in detection and identification of these organisms in fish.

    PubMed

    Takahashi, Hajime; Kimura, Bon; Yoshikawa, Miwako; Fujii, Tateo

    2003-05-01

    The use of molecular tools for early and rapid detection of gram-negative histamine-producing bacteria is important for preventing the accumulation of histamine in fish products. To date, no molecular detection or identification system for gram-negative histamine-producing bacteria has been developed. A molecular method that allows the rapid detection of gram-negative histamine producers by PCR and simultaneous differentiation by single-strand conformation polymorphism (SSCP) analysis using the amplification product of the histidine decarboxylase genes (hdc) was developed. A collection of 37 strains of histamine-producing bacteria (8 reference strains from culture collections and 29 isolates from fish) and 470 strains of non-histamine-producing bacteria isolated from fish were tested. Histamine production of bacteria was determined by paper chromatography and confirmed by high-performance liquid chromatography. Among 37 strains of histamine-producing bacteria, all histidine-decarboxylating gram-negative bacteria produced a PCR product, except for a strain of Citrobacter braakii. In contrast, none of the non-histamine-producing strains (470 strains) produced an amplification product. Specificity of the amplification was further confirmed by sequencing the 0.7-kbp amplification product. A phylogenetic tree of the isolates constructed using newly determined sequences of partial hdc was similar to the phylogenetic tree generated from 16S ribosomal DNA sequences. Histamine accumulation occurred when PCR amplification of hdc was positive in all of fish samples tested and the presence of powerful histamine producers was confirmed by subsequent SSCP identification. The potential application of the PCR-SSCP method as a rapid monitoring tool is discussed.

  6. Analysis of prototypical narratives produced by aphasic individuals and cognitively healthy subjects

    PubMed Central

    Silveira, Gabriela; Mansur, Letícia Lessa

    2015-01-01

    Aphasia can globally or selectively affect comprehension and production of verbal and written language. Discourse analysis can aid language assessment and diagnosis. Objectives: [1] to explore narratives that produce a number of valid indicators for diagnosing aphasia in speakers of Brazilian Portuguese; [2] to analyze the macrostructural aspects of the discourse of normal individuals; [3] to analyze the macrostructural aspects of the discourse of aphasic individuals. Methods: the macrostructural aspects of three narratives produced by aphasic individuals and cognitively healthy subjects were analyzed. Results: a total of 30 volunteers were examined, comprising 10 aphasic individuals (AG) and 20 healthy controls (CG). The CG included 5 males. The CG had a mean age of 38.9 years (SD=15.61) and mean schooling of 13 years (SD=2.67), whereas the AG had a mean age of 51.7 years (SD=17.3) and mean schooling of 9.1 years (SD=3.69). Participants were asked to narrate three fairy tales as a basis for analyzing the macrostructure of discourse. Comparison of the three narratives revealed no statistically significant difference in the number of propositions produced by the groups. A significant negative correlation was found between age and number of propositions produced. Also, statistically significant differences were observed in the number of propositions produced by the individuals in the CG and the AG for the three tales. Conclusion: the three tales are applicable for discourse assessment, containing a similar number of propositions and differentiating aphasic individuals from cognitively healthy subjects based on analysis of the macrostructure of discourse. PMID:29213973

  7. Method to mosaic gratings that relies on analysis of far-field intensity patterns in two wavelengths

    NASA Astrophysics Data System (ADS)

    Hu, Yao; Zeng, Lijiang; Li, Lifeng

    2007-01-01

    We propose an experimental method to coherently mosaic two planar diffraction gratings. The method uses a Twyman-Green interferometer to guarantee the planar parallelism of the two sub-aperture gratings, and obtains the in-plane rotational error and the two translational errors from analysis of the far-field diffraction intensity patterns in two alignment wavelengths. We adjust the relative attitude and position of the two sub-aperture gratings to produce Airy disk diffraction patterns in both wavelengths. In our experiment, the repeatability of in-plane rotation adjustment was 2.35 μrad and that of longitudinal adjustment was 0.11 μm. The accuracy of lateral adjustment was about 2.9% of the grating period.

  8. Second Generation International Space Station (ISS) Total Organic Carbon Analyzer (TOCA) Verification Testing and On-Orbit Performance Results

    NASA Technical Reports Server (NTRS)

    Bentley, Nicole L.; Thomas, Evan A.; VanWie, Michael; Morrison, Chad; Stinson, Richard G.

    2010-01-01

    The Total Organic Carbon Analyzer (TOCA) is designed to autonomously determine recovered water quality as a function of TOC. The current TOCA has been on the International Space Station since November 2008. Functional checkout and operations revealed complex operating considerations. Specifically, failure of the hydrogen catalyst resulted in the development of an innovative oxidation analysis method. This method reduces the activation time and limits the hydrogen produced during analysis, while retaining the ability to indicate TOC concentrations within 25% accuracy. Subsequent testing and comparison to archived samples returned from the Station and tested on the ground yield high confidence in this method, and in the quality of the recovered water.

  9. On 3-D inelastic analysis methods for hot section components. Volume 1: Special finite element models

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1988-01-01

    This annual status report presents the results of work performed during the fourth year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes permitting more accurate and efficient 3-D analysis of selected hot section components, i.e., combustor liners, turbine blades and turbine vanes. The computer codes embody a progression of math models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. Volume 1 of this report discusses the special finite element models developed during the fourth year of the contract.

  10. Analysis of macromolecules, ligands and macromolecule-ligand complexes

    DOEpatents

    Von Dreele, Robert B [Los Alamos, NM]

    2008-12-23

    A method for determining atomic level structures of macromolecule-ligand complexes through high-resolution powder diffraction analysis and a method for providing suitable microcrystalline powder for diffraction analysis are provided. In one embodiment, powder diffraction data is collected from samples of polycrystalline macromolecule and macromolecule-ligand complex and the refined structure of the macromolecule is used as an approximate model for a combined Rietveld and stereochemical restraint refinement of the macromolecule-ligand complex. A difference Fourier map is calculated and the ligand position and points of interaction between the atoms of the macromolecule and the atoms of the ligand can be deduced and visualized. A suitable polycrystalline sample of macromolecule-ligand complex can be produced by physically agitating a mixture of lyophilized macromolecule, ligand and a solvent.

  11. Fast Image Texture Classification Using Decision Trees

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2011-01-01

    Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation-hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
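
    As a rough illustration of the two ingredients named above — integer-only box features from an integral image, feeding a decision-tree classifier — consider the sketch below. It is not the flight implementation: scikit-learn's standard tree stands in for the custom k-means-criterion learner, and the image and labels are synthetic.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def integral_image(img):
            # ii[r, c] = sum of img[:r+1, :c+1]
            return img.cumsum(axis=0).cumsum(axis=1)

        def box_sum(ii, r0, c0, r1, c1):
            # Sum of img[r0:r1, c0:c1] in O(1) from the four corner values.
            s = ii[r1 - 1, c1 - 1]
            if r0 > 0:
                s -= ii[r0 - 1, c1 - 1]
            if c0 > 0:
                s -= ii[r1 - 1, c0 - 1]
            if r0 > 0 and c0 > 0:
                s += ii[r0 - 1, c0 - 1]
            return s

        def pixel_features(ii, shape, r, c, sizes=(2, 4, 8)):
            # Mean intensity over boxes of several sizes: integer arithmetic only.
            h, w = shape
            feats = []
            for s in sizes:
                r0, c0 = max(r - s, 0), max(c - s, 0)
                r1, c1 = min(r + s, h), min(c + s, w)
                feats.append(box_sum(ii, r0, c0, r1, c1) // ((r1 - r0) * (c1 - c0)))
            return feats

        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, (64, 64))          # synthetic test image
        ii = integral_image(img.astype(np.int64))
        X = [pixel_features(ii, img.shape, r, c) for r in range(64) for c in range(64)]
        y = rng.integers(0, 3, len(X))                # placeholder texture labels
        clf = DecisionTreeClassifier(max_depth=8).fit(X, y)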

  12. An improved soft-chemistry approach to the preparation of spinel powders

    NASA Astrophysics Data System (ADS)

    Cook, Ronald

    2007-04-01

    Spinel powders for the production of transparent polycrystalline ceramic windows have been produced using a number of traditional ceramic and sol-gel methods. We have demonstrated that magnesium aluminate spinel powders produced from the reaction of organo-magnesium compounds with surface-modified boehmite precursors can be used to produce high quality transparent spinel parts. In previous work, the spinel powders were prepared by the reaction of surface-modified boehmite nanoparticles with magnesium acetylacetonate. While magnesium acetylacetonate can produce small quantities of high quality spinel powders, its use for large-scale production of spinel powders is problematic. Through a thermodynamic analysis we have identified a new high-purity, low-cost, low-toxicity organomagnesium compound that reacts with the surface-modified boehmite nanoparticles to produce a spinel precursor. The magnesium-doped precursor readily transforms into pure-phase spinel at temperatures between 900°C and 1200°C.

  13. Method and apparatus for assessing weld quality

    DOEpatents

    Smartt, Herschel B.; Kenney, Kevin L.; Johnson, John A.; Carlson, Nancy M.; Clark, Denis E.; Taylor, Paul L.; Reutzel, Edward W.

    2001-01-01

    Apparatus for determining a quality of a weld produced by a welding device according to the present invention includes a sensor operatively associated with the welding device. The sensor is responsive to at least one welding process parameter during a welding process and produces a welding process parameter signal that relates to the at least one welding process parameter. A computer connected to the sensor is responsive to the welding process parameter signal produced by the sensor. A user interface operatively associated with the computer allows a user to select a desired welding process. The computer processes the welding process parameter signal produced by the sensor in accordance with one of a constant voltage algorithm, a short duration weld algorithm or a pulsed current analysis module depending on the desired welding process selected by the user. The computer produces output data indicative of the quality of the weld.

  14. [Extended-spectrum beta-lactamase detection in Enterobacteriaceae and antibiotic susceptibility analysis].

    PubMed

    Cao, Wei; Tong, Ming-hua; Wang, Ji-gui

    2002-02-28

    To detect the extended-spectrum beta-lactamases (ESBLs) in the family Enterobacteriaceae and analyze the antibiotic susceptibility of the ESBL-producing strains. ESBLs were determined by the double-disk confirmatory test, and susceptibility to 8 antibiotics was tested with the disk diffusion method in the ESBL-producing strains. Forty-seven ESBL-producing strains were identified: 25 E. coli, 14 K. pneumoniae, 5 E. cloacae, 1 K. oxytoca, 1 K. rhinoscleromatis, and 1 S. liquefaciens. The susceptibility rates of those strains were: 100% for imipenem and meropenem, 89.4% for piperacillin/tazobactam, 72.4% for cefoxitin and 65.9% for cefotetan. E. coli and K. pneumoniae are the prime ESBL-producing strains in Enterobacteriaceae. Imipenem and meropenem are the best drugs to deal with those ESBL-producing strains. Piperacillin/tazobactam is better than cephamycins and other beta-lactam/beta-lactamase inhibitor combinations.

  15. Combining conversation analysis and event sequencing to study health communication.

    PubMed

    Pecanac, Kristen E

    2018-06-01

    Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication.
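
    A transitional probability of the kind described is simply a conditional frequency over adjacent pairs of coded events. The sketch below shows the calculation on invented codes; it illustrates the idea, not the study's actual coding scheme.

        # Sketch: transitional probabilities between coded conversation events.
        from collections import Counter, defaultdict

        # Hypothetical coded sequences, one per clinician-surrogate interaction.
        sequences = [
            ["assert_decision", "resistance", "alignment"],
            ["solicit_input", "alignment"],
            ["assert_decision", "resistance", "resistance"],
        ]

        # Count each observed transition a -> b.
        pair_counts = defaultdict(Counter)
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                pair_counts[a][b] += 1

        # Normalize counts into conditional probabilities P(b | a).
        for a, nxt in pair_counts.items():
            total = sum(nxt.values())
            for b, n in nxt.items():
                print(f"P({b} | {a}) = {n / total:.2f}")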

  16. Two-Step Cycle for Producing Multiple Anodic Aluminum Oxide (AAO) Films with Increasing Long-Range Order

    PubMed Central

    2017-01-01

    Nanoporous anodic aluminum oxide (AAO) membranes are being used for an increasing number of applications. However, the original two-step anodization method in which the first anodization is sacrificial to pre-pattern the second is still widely used to produce them. This method provides relatively low throughput and material utilization as half of the films are discarded. An alternative scheme that relies on alternating anodization and cathodic delamination is demonstrated that allows for the fabrication of several AAO films with only one sacrificial layer thus greatly improving total aluminum to alumina yield. The thickness for which the cathodic delamination performs best to yield full, unbroken AAO sheets is around 85 μm. Additionally, an image analysis method is used to quantify the degree of long-range ordering of the unit cells in the AAO films which was found to increase with each successive iteration of the fabrication cycle. PMID:28630684

  17. Two-Step Cycle for Producing Multiple Anodic Aluminum Oxide (AAO) Films with Increasing Long-Range Order.

    PubMed

    Choudhary, Eric; Szalai, Veronika

    2016-01-01

    Nanoporous anodic aluminum oxide (AAO) membranes are being used for an increasing number of applications. However, the original two-step anodization method in which the first anodization is sacrificial to pre-pattern the second is still widely used to produce them. This method provides relatively low throughput and material utilization as half of the films are discarded. An alternative scheme that relies on alternating anodization and cathodic delamination is demonstrated that allows for the fabrication of several AAO films with only one sacrificial layer thus greatly improving total aluminum to alumina yield. The thickness for which the cathodic delamination performs best to yield full, unbroken AAO sheets is around 85 μm. Additionally, an image analysis method is used to quantify the degree of long-range ordering of the unit cells in the AAO films which was found to increase with each successive iteration of the fabrication cycle.

  18. Design for producing fiberglass fabric in a lunar environment

    NASA Technical Reports Server (NTRS)

    Benson, Rafer M.; Causby, Dana R.; Johnson, Michael C.; Storey, Mark A.; Tran, Dal T.; Zahr, Thomas A.

    1992-01-01

    The purpose of this project was to design a method of producing a fabric material on the lunar surface from readily available glass fibers. Various methods for forming fabrics were analyzed to determine which methods were appropriate for the lunar conditions. A nonwoven process was determined to be the most suitable process for making a fabric material out of fiberglass under these conditions. Various resins were considered for adhering the fibers. A single thermoplastic resin (AURUM) was found to be the only applicable resin. The end product of the process was determined to be suitable for use as a roadway surfacing material, canopy material, reflective material, or packaging material. A cost analysis of the lunar process versus shipping the end-product from the Earth suggests that the lunar formation is highly feasible. A design for a lunar, nonwoven process was determined and is included.

  19. Design for producing fiberglass fabric in a lunar environment

    NASA Technical Reports Server (NTRS)

    Dorrity, J. Lewis; Patel, Suneer; Benson, Rafer M.; Johnson, Michael C.; Storey, Mark A.; Tran, Dai T.; Zahr, Thomas A.; Causby, Dana R.

    1992-01-01

    The purpose of this project was to design a method of producing a fabric material on the lunar surface from readily available glass fibers. Various methods for forming fabrics were analyzed to determine which methods were appropriate for the lunar conditions. A nonwoven process was determined to be the most suitable process for making a fabric material out of fiberglass under these conditions. Various resins were considered for adhering the fibers. A single thermoplastic resin (AURUM) was found to be the only applicable resin. The end product of the process was determined to be suitable for use as a roadway surfacing material, canopy materials, reflective material, or packaging material. A cost analysis of the lunar process versus shipping the end-product from the earth suggests that the lunar formation is highly feasible. A design for a lunar, nonwoven process was determined and included in the following document.

  20. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)' based on Lamb wave theory and time-frequency analysis that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.
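
    For orientation, the baseline TOA technique mentioned above amounts to a nonlinear least-squares fit of the source position (and emission time) to the measured arrival times. A minimal planar sketch, with invented sensor positions, wave speed, and arrival times:

        import numpy as np
        from scipy.optimize import least_squares

        sensors = np.array([[0.0, 0.0], [1.5, 0.0], [0.0, 0.7], [1.5, 0.7]])  # m
        v = 3000.0                                       # assumed wave speed, m/s
        t = np.array([2.1e-4, 3.4e-4, 1.8e-4, 3.0e-4])   # measured arrival times, s

        def residuals(p):
            x, y, t0 = p                                 # source position, emission time
            d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
            return d / v + t0 - t                        # predicted minus measured arrivals

        sol = least_squares(residuals, x0=[0.5, 0.3, 0.0])
        print("source at (%.3f, %.3f) m" % tuple(sol.x[:2]))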

  1. An attempt to understand flicker vertigo: changes in pupil size and choroidal blood flow under flickering conditions

    NASA Astrophysics Data System (ADS)

    Masi, Maria Gabriella; Peretto, Lorenzo; Rovati, Luigi; Ansari, Rafat R.

    2010-02-01

    Light flickering at a rate of 4-20 cycles per second can produce unpleasant reactions such as nausea and vertigo. In this paper, the possibility of achieving an objective evaluation/prediction of the physiological effects induced by flicker is investigated using a new imaging method based on pupil size determination. This method is also compared with blood flow analysis in the choroid.

  2. Diagnostic support for glaucoma using retinal images: a hybrid image analysis and data mining approach.

    PubMed

    Yu, Jin; Abidi, Syed Sibte Raza; Artes, Paul; McIntyre, Andy; Heywood, Malcolm

    2005-01-01

    The availability of modern imaging techniques such as Confocal Scanning Laser Tomography (CSLT) for capturing high-quality optic nerve images offers the potential for developing automatic and objective methods for diagnosing glaucoma. We present a hybrid approach that features the analysis of CSLT images using moment methods to derive abstract image-defining features. The features are then used to train classifiers for automatically distinguishing CSLT images of normal and glaucoma patients. As a first step, we present investigations of feature subset selection methods for reducing the relatively large input space produced by the moment methods. We use neural networks and support vector machines to determine a subset of moments that offer high classification accuracy. We demonstrate the efficacy of our methods to discriminate between healthy and glaucomatous optic disks based on shape information automatically derived from optic disk topography and reflectance images.

  3. Antimicrobial Resistance of Thermotolerant Campylobacter Species Isolated from Humans, Food-Producing Animals, and Products of Animal Origin: A Worldwide Meta-Analysis.

    PubMed

    Signorini, Marcelo L; Rossler, Eugenia; Díaz David, Diego C; Olivero, Carolina R; Romero-Scharpen, Analía; Soto, Lorena P; Astesana, Diego M; Berisvil, Ayelen P; Zimmermann, Jorge A; Fusari, Marcia L; Frizzo, Laureano S; Zbrun, María V

    2018-04-30

    The objective of this meta-analysis was to summarize available information on the prevalence of antimicrobial-resistant Campylobacter species in humans, food-producing animals, and products of animal origin. A number of multilevel random-effect meta-analysis models were fitted to estimate the mean occurrence rate of antimicrobial-resistant thermotolerant Campylobacter and to compare it across years, Campylobacter species, food-producing animals (i.e., cattle, pigs, broilers, hens, goats, and sheep), countries of origin, sample types, and methodologies used to determine antimicrobial susceptibility. Among the considered antibiotics, thermotolerant Campylobacter showed the highest resistance to tetracycline (pool estimate [PE] = 0.493; 95% CI 0.466-0.519), nalidixic acid (PE = 0.385; 95% CI 0.348-0.423), and ciprofloxacin (PE = 0.376; 95% CI 0.339-0.415). In general, the prevalence of antimicrobial-resistant Campylobacter spp. was higher in hens, broilers, and swine. Campylobacter coli showed a higher prevalence of antimicrobial resistance than Campylobacter jejuni. Independent of the antimicrobial evaluated, the disk diffusion method showed higher prevalence of antimicrobial-resistant Campylobacter than the methods based on the minimum inhibitory concentration estimation. The meta-analysis showed that the prevalence of antimicrobial-resistant Campylobacter is relevant essentially in foods derived from hens and broilers, and it was observed worldwide. The prevalence of this pathogen is of public health importance, and the increase in the prevalence of Campylobacter strains resistant to the antimicrobials of choice worsens the situation; hence, national authorities must monitor the situation in each country with the aim of establishing appropriate risk management measures.
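
    The pooled estimates (PE) quoted above come from multilevel random-effects models. As a simplified stand-in for that machinery, a basic DerSimonian-Laird random-effects pooling of study-level resistance proportions looks like the sketch below; the study proportions and sample sizes are invented.

        import numpy as np

        p = np.array([0.52, 0.41, 0.55, 0.38, 0.60])   # per-study resistance rates
        n = np.array([120, 80, 200, 150, 90])          # per-study sample sizes

        var = p * (1 - p) / n                          # within-study variance
        w = 1 / var                                    # fixed-effect weights
        q = np.sum(w * (p - np.sum(w * p) / w.sum()) ** 2)   # heterogeneity statistic
        c = w.sum() - np.sum(w ** 2) / w.sum()
        tau2 = max(0.0, (q - (len(p) - 1)) / c)        # between-study variance

        w_star = 1 / (var + tau2)                      # random-effects weights
        pe = np.sum(w_star * p) / w_star.sum()
        se = np.sqrt(1 / w_star.sum())
        print(f"PE = {pe:.3f}; 95% CI {pe - 1.96 * se:.3f}-{pe + 1.96 * se:.3f}")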

  4. Fungicide residue identification and discrimination using a conducting polymer electronic-nose

    Treesearch

    Alphus D. Wilson

    2013-01-01

    The identification of fungicide residues on crop foliage is necessary to make periodic pest management decisions. The determination of fungicide residue identities currently is difficult and time consuming using conventional chemical analysis methods such as gas chromatography-mass spectrometry. Different fungicide types produce unique electronic aroma signature...

  5. Critical Path Method Networks and Their Use in Claims Analysis.

    DTIC Science & Technology

    1984-01-01

    ...produced will only be as good as the time invested and the knowledge of the scheduler. A schedule which is based on faulty logic or which contains... fundamentals of putting a schedule together but also how the construction process functions so that the delays can be accurately inserted. When...
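
    The excerpt is fragmentary, but the underlying critical path computation is standard: a forward pass for earliest start/finish times, a backward pass for latest times, and zero total float marking the critical path. A minimal sketch over an invented four-activity network:

        # Critical path method on a small illustrative network (durations in days).
        durations = {"A": 3, "B": 2, "C": 4, "D": 2}
        preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
        order = ["A", "B", "C", "D"]                   # topological order

        # Forward pass: earliest start/finish.
        es, ef = {}, {}
        for a in order:
            es[a] = max((ef[p] for p in preds[a]), default=0)
            ef[a] = es[a] + durations[a]

        # Backward pass: latest start/finish.
        project_end = max(ef.values())
        ls, lf = {}, {}
        for a in reversed(order):
            succs = [s for s, ps in preds.items() if a in ps]
            lf[a] = min((ls[s] for s in succs), default=project_end)
            ls[a] = lf[a] - durations[a]

        # Zero total float identifies the critical activities.
        critical = [a for a in order if es[a] == ls[a]]
        print("critical path:", critical)              # ['A', 'C', 'D']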

  6. The Internationalisation of Higher Education: Perspectives on Self-Conceptions in Teaching

    ERIC Educational Resources Information Center

    Korhonen, Vesa; Weil, Markus

    2015-01-01

    In this article, we aim to identify university teachers' experiences of their roles and practices in different internationalising higher education environments. Reflections on the university teachers' self-conceptions are examined using a specific story-writing assignment and content analysis of the stories teachers produce. The method was adapted…

  7. ELISA MEASUREMENT OF STACHYLYSIN (TM) IN SERUM TO QUANTIFY HUMAN EXPOSURES TO THE INDOOR MOLD STACHYBOTRYS CHARTARUM

    EPA Science Inventory

    Antibodies were produced against the hemolytic agent stachylysin obtained from the mold Stachybotryis chartarum. These antibodies were used to develop two enzyme-linked immunosorbent assay (ELISA) methods for the analysis of stachylysin in human and rat sera and environmental sa...

  8. METHOD EVALUATION TO MEASURE PERSISTENT BIOACCUMULATIVE TOXIC POLLUTANTS IN COW MILK

    EPA Science Inventory

    It is important to understand the persistent and bioaccumulative toxic (PBT) levels in milk, as milk fat may be one of the highest dietary sources of PBT exposure. Analysis of milk also allows the opportunity to investigate geographic variability, as milk is produced and distrib...

  9. Stages in Educational Reform; The Max Planck Institute Has Produced a Report on Education.

    ERIC Educational Resources Information Center

    Pfeffer, Gottfried

    1981-01-01

    Outlines the Max Planck Institute's exhaustive report on West German educational trends since World War II. An analysis of the effects of changing social values and demographic factors on educational policy, school organization, enrollment trends, curriculum design, and teaching methods is included. (AM)

  10. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  11. Pyrocatechol violet in pharmaceutical analysis. Part I. A spectrophotometric method for the determination of some beta-lactam antibiotics in pure and in pharmaceutical dosage forms.

    PubMed

    Amin, A S

    2001-03-01

    A fairly sensitive, simple and rapid spectrophotometric method for the determination of some beta-lactam antibiotics, namely ampicillin (Amp), amoxycillin (Amox), 6-aminopenicillanic acid (6APA), cloxacillin (Clox), dicloxacillin (Diclox) and flucloxacillin sodium (Fluclox) in bulk samples and in pharmaceutical dosage forms is described. The proposed method involves the use of pyrocatechol violet as a chromogenic reagent. These drugs produce a reddish brown coloured ion pair with absorption maximum at 604, 641, 645, 604, 649 and 641 nm for Amp, Amox, 6APA, Clox, Diclox and Fluclox, respectively. The colours produced obey Beer's law and are suitable for the quantitative determination of the named compounds. The optimization of different experimental conditions is described. The molar ratio of the ion pairs was established and a proposal for the reaction pathway is given. The procedure described was applied successfully to determine the examined drugs in dosage forms and the results obtained were comparable to those obtained with the official methods.
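
    Since the method rests on Beer's law, A = εlc, quantitation reduces to a one-line calculation once the molar absorptivity is calibrated. The sketch below uses an invented absorptivity, not a value from the paper:

        # Beer-Lambert quantitation: c = A / (epsilon * l).
        epsilon = 1.8e4       # L mol^-1 cm^-1 at lambda_max (hypothetical)
        path = 1.0            # cuvette path length, cm
        absorbance = 0.45     # measured absorbance at 604 nm

        conc = absorbance / (epsilon * path)           # mol/L
        print(f"c = {conc:.2e} mol/L")                 # 2.50e-05 mol/L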

  12. Mass Spectrometric Analysis of Synthetic Organic Pigments.

    PubMed

    Sugaya, Naeko; Takahashi, Mitsuko; Sakurai, Katsumi; Tanaka, Nobuko; Okubo, Ichiro; Kawakami, Tsuyoshi

    2018-04-18

    Although synthetic organic colorants are used in a wide range of applications nowadays, there is concern that impurities generated as by-products during manufacturing, as well as degradation products, of some of these colorants are persistent organic pollutants and carcinogens. Thus, it is important to identify the synthetic organic colorants in various products, such as commercial paints, ink, cosmetics, food, textile, and plastics. Dyes, which are soluble in water and other solvents, can be analyzed by chromatographic methods. In contrast, it is difficult to analyze synthetic organic pigments by these methods because of their insolubility. This review is an overview of mass spectrometric analysis of synthetic organic pigments by various ionization methods. We highlight a recent study of textile samples by atmospheric pressure solid analysis probe MS. Furthermore, the mass spectral features of synthetic organic pigments and their separation from other components such as paint media and plasticizers are discussed.

  13. Matrix Assisted Ionization Vacuum (MAIV), a New Ionization Method for Biological Materials Analysis Using Mass Spectrometry*

    PubMed Central

    Inutan, Ellen D.; Trimpin, Sarah

    2013-01-01

    The introduction of electrospray ionization (ESI) and matrix-assisted laser desorption/ionization (MALDI) for the mass spectrometric analysis of peptides and proteins had a dramatic impact on biological science. We now report that a wide variety of compounds, including peptides, proteins, and protein complexes, are transported directly from a solid-state small molecule matrix to gas-phase ions when placed into the vacuum of a mass spectrometer without the use of high voltage, a laser, or added heat. This ionization process produces ions having charge states similar to ESI, making the method applicable for high performance mass spectrometers designed for atmospheric pressure ionization. We demonstrate highly sensitive ionization using intermediate pressure MALDI and modified ESI sources. This matrix and vacuum assisted soft ionization method is suitable for the direct surface analysis of biological materials, including tissue, via mass spectrometry. PMID:23242551

  14. A new approach to monitoring the social environment for natural resource management and policy: The case of US national forest benefits and values

    Treesearch

    David N. Bengston; David P. Fan; D. N. Celarier

    1999-01-01

    This paper describes a new approach for monitoring the social environment for natural resource management and policy, based on content analysis of online news media stories. Content analysis of the media has repeatedly been shown to produce results that are closely correlated with attitude surveys and opinion polls. Computer methods were used to analyse almost 30,000...

  15. An Analysis of the Navy’s Fiscal Year 2017 Shipbuilding Plan

    DTIC Science & Technology

    2017-02-01

    Navy would build a larger fleet of about 350 ships (see Table 5). Those three alternatives were chosen for illustrative purposes because variations... 3.2 billion. 2. For more on procedures for estimating and applying learning curves, see Matthew S. Goldberg and Anduin E. Touw, Statistical Methods... guidance from Matthew Goldberg (formerly of CBO) and David Mosher. Raymond Hall of CBO's Budget Analysis Division produced the cost estimates with...
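
    The learning curves referred to in the excerpt follow a standard power-law form: under unit theory, the cost of unit n is T1·n^b with b = log2(slope). A sketch with an invented first-unit cost and slope (not CBO's figures):

        import math

        t1 = 3.2e9                 # cost of first ship, dollars (illustrative)
        slope = 0.90               # 90% curve: each doubling cuts unit cost 10%
        b = math.log(slope, 2)

        def unit_cost(n):
            return t1 * n ** b

        total = sum(unit_cost(n) for n in range(1, 11))   # cost of a 10-ship class
        print(f"unit 10: ${unit_cost(10):,.0f};  10-ship total: ${total:,.0f}")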

  16. Principles of cost-effective resource allocation in health care organizations.

    PubMed

    Weinstein, M C

    1990-01-01

    Cost-effectiveness analysis (CEA) is a method of economic evaluation that can be used to assess the efficiency with which health care technologies use limited resources to produce health outputs. However, inconsistencies in the way that such ratios are constructed often lead to misleading conclusions when CEAs are compared. Some of these inconsistencies, such as failure to discount or to calculate incremental ratios correctly, reflect analytical errors that, if corrected, would resolve the inconsistencies. Others reflect fundamental differences in the viewpoint of the analysis. The perspectives of different decision-making entities can properly lead to different items in the numerator and denominator of the cost-effectiveness (C/E) ratio. Producers and consumers of CEA need to be more conscious of the perspectives of analysis, so that C/E comparisons from a given perspective are based upon a common understanding of the elements that are properly included.
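
    The two analytical errors singled out above, failure to discount and incorrectly calculated incremental ratios, are easy to make concrete. A minimal sketch of an incremental cost-effectiveness ratio with discounted costs and health effects, using invented numbers:

        # ICER = (discounted incremental cost) / (discounted incremental effect).
        costs_new = [12000, 4000, 4000]    # per-year costs, intervention
        costs_old = [8000, 3000, 3000]     # per-year costs, comparator
        qaly_new  = [0.80, 0.78, 0.76]     # per-year health effects (QALYs)
        qaly_old  = [0.70, 0.66, 0.62]
        r = 0.03                           # annual discount rate

        def pv(stream, r):
            # Present value of a yearly stream, discounted from year 0.
            return sum(x / (1 + r) ** t for t, x in enumerate(stream))

        icer = ((pv(costs_new, r) - pv(costs_old, r)) /
                (pv(qaly_new, r) - pv(qaly_old, r)))
        print(f"ICER = ${icer:,.0f} per QALY gained")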

  17. Analysis of Exhaled Breath for Disease Detection

    NASA Astrophysics Data System (ADS)

    Amann, Anton; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Ligor, Tomasz; Jezierski, Tadeusz; Pleil, Joachim; Risby, Terence

    2014-06-01

    Breath analysis is a young field of research with great clinical potential. As a result of this interest, researchers have developed new analytical techniques that permit real-time analysis of exhaled breath with breath-to-breath resolution in addition to the conventional central laboratory methods using gas chromatography-mass spectrometry. Breath tests are based on endogenously produced volatiles, metabolites of ingested precursors, metabolites produced by bacteria in the gut or the airways, or volatiles appearing after environmental exposure. The composition of exhaled breath may contain valuable information for patients presenting with asthma, renal and liver diseases, lung cancer, chronic obstructive pulmonary disease, inflammatory lung disease, or metabolic disorders. In addition, oxidative stress status may be monitored via volatile products of lipid peroxidation. Measurement of enzyme activity provides phenotypic information important in personalized medicine, whereas breath measurements provide insight into perturbations of the human exposome and can be interpreted as preclinical signals of adverse outcome pathways.

  18. An innovative method for nondestructive analysis of cast iron artifacts at Hopewell Furnace National Historic Site, Pennsylvania

    USGS Publications Warehouse

    Sloto, Ronald A.; Helmke, Martin F.

    2014-01-01

    Sampling cast iron produced by the furnace posed two problems. First, verification that the iron was actually cast at Hopewell Furnace was necessary, as some iron objects found at Hopewell may not have originated there. This was accomplished by using artifacts on display at the Hopewell visitor center (fig. 2). All artifacts on display have been positively attributed to the furnace, and stoves produced by the furnace are easily recognized by the name “Hopewell” cast into them. The second problem was the analysis of the trace metal content of the cast iron, because it was not possible to break off part of a historically important artifact and send it to a laboratory for analysis. This problem was solved when the USGS collaborated with West Chester University, which owns a portable X-ray fluorescence (XRF) spectrometer.

  19. Repertoire and classification of non-song calls in Southeast Alaskan humpback whales (Megaptera novaeangliae).

    PubMed

    Fournet, Michelle E; Szabo, Andy; Mellinger, David K

    2015-01-01

    On low-latitude breeding grounds, humpback whales produce complex and highly stereotyped songs as well as a range of non-song sounds associated with breeding behaviors. While on their Southeast Alaskan foraging grounds, humpback whales produce a range of previously unclassified non-song vocalizations. This study investigates the vocal repertoire of Southeast Alaskan humpback whales from a sample of 299 non-song vocalizations collected over a 3-month period on foraging grounds in Frederick Sound, Southeast Alaska. Three classification systems were used, including aural spectrogram analysis, statistical cluster analysis, and discriminant function analysis, to describe and classify vocalizations. A hierarchical acoustic structure was identified; vocalizations were classified into 16 individual call types nested within four vocal classes. The combined classification method shows promise for identifying variability in call stereotypy between vocal groupings and is recommended for future classification of broad vocal repertoires.

  20. An introduction to wavelet analysis in oceanography and meteorology - With application to the dispersion of Yanai waves

    NASA Technical Reports Server (NTRS)

    Meyers, Steven D.; Kelly, B. G.; O'Brien, J. J.

    1993-01-01

    Wavelet analysis is a relatively new technique that is an important addition to standard signal analysis methods. Unlike Fourier analysis that yields an average amplitude and phase for each harmonic in a dataset, the wavelet transform produces an instantaneous estimate or local value for the amplitude and phase of each harmonic. This allows detailed study of nonstationary spatial or time-dependent signal characteristics. The wavelet transform is discussed, examples are given, and some methods for preprocessing data for wavelet analysis are compared. By studying the dispersion of Yanai waves in a reduced gravity equatorial model, the usefulness of the transform is demonstrated. The group velocity is measured directly over a finite range of wavenumbers by examining the time evolution of the transform. The results agree well with linear theory at higher wavenumber but the measured group velocity is reduced at lower wavenumbers, possibly due to interaction with the basin boundaries.
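
    A minimal Morlet-wavelet transform illustrates how the method yields a local amplitude and phase per scale, in contrast with Fourier's global averages. This is a generic numpy sketch, not the authors' code; the test signal switches frequency halfway through to show the local estimate at work.

        import numpy as np

        def morlet(t, scale, w0=6.0):
            x = t / scale
            return np.pi ** -0.25 * np.exp(1j * w0 * x - x ** 2 / 2) / np.sqrt(scale)

        def cwt(signal, dt, scales):
            n = len(signal)
            t = (np.arange(n) - n // 2) * dt
            # Correlate the signal with a conjugated, reversed wavelet at each scale.
            return np.array([np.convolve(signal, np.conj(morlet(t, s))[::-1],
                                         mode="same") * dt for s in scales])

        dt = 0.01
        time = np.arange(0, 10, dt)
        sig = np.where(time < 5, np.sin(2 * np.pi * 2 * time),
                                 np.sin(2 * np.pi * 5 * time))
        coefs = cwt(sig, dt, np.arange(0.05, 1.0, 0.05))
        amplitude, phase = np.abs(coefs), np.angle(coefs)   # local, per scale and time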

  1. A generic, cost-effective, and scalable cell lineage analysis platform

    PubMed Central

    Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud

    2016-01-01

    Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250

  2. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations, which are more useful to someone trying to understand the general design or high level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function oriented software development, while taking advantage of the descriptive power available in object oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  3. Design rainfall depth estimation through two regional frequency analysis methods in Hanjiang River Basin, China

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Ping; Yu, Chaofeng; Zhang, Xujie; Zhang, Qingqing; Xu, Xiao

    2012-02-01

    Hydrological predictions in ungauged basins are of significant importance for water resources management. In hydrological frequency analysis, regional methods are regarded as useful tools in estimating design rainfall/flood for areas with only little data available. The purpose of this paper is to investigate the performance of two regional methods, namely Hosking's approach and the cokriging approach, in hydrological frequency analysis. These two methods are employed to estimate 24-h design rainfall depths in Hanjiang River Basin, one of the largest tributaries of Yangtze River, China. Validation is made through comparing the results to those calculated from the provincial handbook approach which uses hundreds of rainfall gauge stations. Also for validation purposes, five hypothetically ungauged sites from the middle basin are chosen. The final results show that compared to the provincial handbook approach, Hosking's approach often overestimated the 24-h design rainfall depths while the cokriging approach most of the time underestimated. Overall, Hosking's approach produced more accurate results than the cokriging approach.
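
    Hosking's approach is built on L-moments. As a flavour of the machinery, the sketch below computes the first sample L-moments from probability-weighted moments, with invented 24-h rainfall maxima; the regionalization and distribution-fitting steps of the full method are not shown.

        import numpy as np

        def l_moments(data):
            # Sample L-moments via unbiased probability-weighted moments b0, b1, b2.
            x = np.sort(np.asarray(data, dtype=float))
            n = len(x)
            i = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((i - 1) / (n - 1) * x) / n
            b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
            l1 = b0                      # location (mean)
            l2 = 2 * b1 - b0             # L-scale
            l3 = 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2       # mean, L-scale, L-skewness

        rain = [62.1, 45.3, 88.0, 51.7, 73.9, 95.2, 40.8, 66.4]  # hypothetical maxima, mm
        print(l_moments(rain))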

  4. Analysis of anabolic steroids in urine by gas chromatography-microchip atmospheric pressure photoionization-mass spectrometry with chlorobenzene as dopant.

    PubMed

    Hintikka, Laura; Haapala, Markus; Kuuranne, Tiia; Leinonen, Antti; Kostiainen, Risto

    2013-10-18

    A gas chromatography-microchip atmospheric pressure photoionization-tandem mass spectrometry (GC-μAPPI-MS/MS) method was developed for the analysis of anabolic androgenic steroids in urine as their trimethylsilyl derivatives. The method utilizes a heated nebulizer microchip in atmospheric pressure photoionization mode (μAPPI) with chlorobenzene as dopant, which provides high ionization efficiency by producing abundant radical cations with minimal fragmentation. The performance of GC-μAPPI-MS/MS was evaluated with respect to repeatability, linearity, linear range, and limit of detection (LOD). The results confirmed the potential of the method for doping control analysis of anabolic steroids. Repeatability (RSD<10%), linearity (R(2)≥0.996) and sensitivity (LODs 0.05-0.1ng/mL) were acceptable. Quantitative performance of the method was tested and compared with that of conventional GC-electron ionization-MS, and the results were in good agreement.

  5. Using budget-friendly methods to analyze sport specific movements

    NASA Astrophysics Data System (ADS)

    Jackson, Lindsay; Williams, Sarah; Ferrara, Davon

    2015-03-01

    When breaking down the physics behind sport specific movements, athletes, usually professional, are often assessed in multimillion-dollar laboratories and facilities. Budget-friendly methods, such as video analysis using low-cost cameras, iPhone sensors, or inexpensive force sensors, can make this process more accessible to amateur athletes, which in turn can give insight into injury mechanisms. Here we present a comparison of two methods of determining the forces experienced by a cheerleader during co-education stunting and soccer goalies while side-diving. For the cheerleader, accelerometer measurements were taken by an iPhone 5 and compared to video analysis. The measurements done on the soccer players were taken using FlexiForce force sensors and again compared to video analysis. While these budget-friendly methods could use some refining, they show promise for producing usable measurements for possibly increasing our understanding of injury in amateur players. Furthermore, low-cost physics experiments with sports can foster an active learning environment for students with minimum physics and mathematical background.

  6. A Review on overboard CEOR discharged produced water treatment and remediation

    NASA Astrophysics Data System (ADS)

    Rawindran, H.; Krishnan, S.; Sinnathambi, C. M.

    2017-06-01

    Produced water is a waste by-product generated during oil and gas recovery operations. It contains a mixture of organic and inorganic compounds. Produced water management is a challenge faced by petroleum practitioners worldwide. Build-up of chemical wastes from produced water causes a large environmental footprint and results in high CapEx and OpEx. Different technologies are practiced by various practitioners to treat the produced waste water. However, the constituents removed by each technology and the degree of organic compound removal have to be considered to identify the potential and effective treatment technologies for offshore industrial applications. Current produced water technologies and their successful applications have advantages and disadvantages and can be ranked on the basis of several factors, such as discharge limits into water bodies, reinjection into the producing well, or any miscellaneous beneficial use. This paper attempts to provide a review of existing physical and chemical treatment technologies used for management of produced water. Based on our analysis, suitable methods will be recommended for offshore waste water treatment technologies.

  7. Gaia DR2 documentation Chapter 7: Variability

    NASA Astrophysics Data System (ADS)

    Eyer, L.; Guy, L.; Distefano, E.; Clementini, G.; Mowlavi, N.; Rimoldini, L.; Roelens, M.; Audard, M.; Holl, B.; Lanzafame, A.; Lebzelter, T.; Lecoeur-Taïbi, I.; Molnár, L.; Ripepi, V.; Sarro, L.; Jevardat de Fombelle, G.; Nienartowicz, K.; De Ridder, J.; Juhász, Á.; Molinaro, R.; Plachy, E.; Regibo, S.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the models and methods used on the 22 months of data to produce the Gaia variable star results for Gaia DR2. The variability processing and analysis was based mostly on the calibrated G and integrated BP and RP photometry. The variability analysis approach to the Gaia data has been described in Eyer et al. (2017), and the Gaia DR2 results are presented in Holl et al. (2018). Detailed methods on specific topics will be published in a number of separate articles. Variability behaviour in the colour magnitude diagram is presented in Gaia Collaboration et al. (2018c).

  8. Digital color analysis of color-ratio composite LANDSAT scenes. [Nevada

    NASA Technical Reports Server (NTRS)

    Raines, G. L.

    1977-01-01

    A method is presented that can be used to calculate approximate Munsell coordinates of the colors produced by making a color composite from three registered images. Applied to the LANDSAT MSS data of the Goldfield, Nevada, area, this method permits precise and quantitative definition of the limonitic areas originally observed in a LANDSAT color ratio composite. In addition, areas of transported limonite can be discriminated from the limonite in the hydrothermally altered areas of the Goldfield mining district. From the analysis, the numerical distinction between limonitic and nonlimonitic ground is generally less than 3% using the LANDSAT bands and as much as 8% in ratios of LANDSAT MSS bands.

  9. Thermochromatography and activation analysis

    NASA Astrophysics Data System (ADS)

    Stattarov, G. S.; Kist, A. A.

    1999-01-01

    Gas thermochromatography is a promising method in combination with neutron activation analysis. The procedure includes heating of irradiated samples in a stream of reacting carrier gas (air, chlorine, etc.) or heating in the presence of compounds evolving gas at high temperatures. Gaseous products are passed through a tube with a defined temperature gradient filled with various sorbents, and the gases condense in different parts of the column. Studies of the processes of producing and trapping volatile compounds allowed various apparatus set-ups to be developed, with sorption tubes of various lengths, various temperature gradients, various filters, sorbents, etc. The sensitivity of these methods is substantially better than that of INAA.

  10. 3-D inelastic analysis methods for hot section components. Volume 2: Advanced special functions models

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Sections Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.

  11. Cross validation issues in multiobjective clustering

    PubMed Central

    Brusco, Michael J.; Steinley, Douglas

    2018-01-01

    The implementation of multiobjective programming methods in combinatorial data analysis is an emergent area of study with a variety of pragmatic applications in the behavioural sciences. Most notably, multiobjective programming provides a tool for analysts to model trade-offs among competing criteria in clustering, seriation, and unidimensional scaling tasks. Although multiobjective programming has considerable promise, the technique can produce numerically appealing results that lack empirical validity. With this issue in mind, the purpose of this paper is to briefly review viable areas of application for multiobjective programming and, more importantly, to outline the importance of cross-validation when using this method in cluster analysis. PMID:19055857

  12. Comparative Analysis of Volatile Defensive Secretions of Three Species of Pyrrhocoridae (Insecta: Heteroptera) by Gas Chromatography-Mass Spectrometric Method

    PubMed Central

    Krajicek, Jan; Havlikova, Martina; Bursova, Miroslava; Ston, Martin; Cabala, Radomir; Exnerova, Alice; Stys, Pavel; Bosakova, Zuzana

    2016-01-01

    The true bugs (Hemiptera: Heteroptera) have evolved a system of well-developed scent glands that produce diverse and frequently strongly odorous compounds that act mainly as chemical protection against predators. A new method of non-lethal sampling with subsequent separation using gas chromatography with mass spectrometric detection was proposed for analysis of these volatile defensive secretions. Separation was performed on an Rtx-200 column containing fluorinated polysiloxane stationary phase. Various mechanical irritation methods (ultrasonics, shaking, pressing bugs with the plunger of a syringe) were tested for secretion sampling with a special focus on non-lethal irritation. The preconcentration step was performed by sorption on solid phase microextraction (SPME) fibers with different polarity. For optimization of the sampling procedure, Pyrrhocoris apterus was selected. The entire multi-parameter optimization procedure of secretion sampling was performed using response surface methodology. The irritation of bugs by pressing them with the plunger of a syringe was shown to be the most suitable. The developed method was applied to analysis of secretions produced by adult males and females of Pyrrhocoris apterus, Pyrrhocoris tibialis and Scantius aegyptius (all Heteroptera: Pyrrhocoridae). The chemical composition of secretion, particularly that of alcohols, aldehydes and esters, is species-specific in all three pyrrhocorid species studied. The sexual dimorphism in occurrence of particular compounds is largely limited to alcohols and suggests their epigamic intraspecific function. The overall phenetic similarities in composition of secretion do not reflect either relationship of species or similarities in antipredatory color pattern. The similarities of secretions may be linked with antipredatory strategies. The proposed method requires only a few individuals which remain alive after the procedure. Thus secretions of a number of species including even the rare ones can be analyzed and broadly conceived comparative studies can be carried out. PMID:27997627

  13. A step-by-step guide to non-linear regression analysis of experimental data using a Microsoft Excel spreadsheet.

    PubMed

    Brown, A M

    2001-06-01

    The objective of the present study was to introduce a simple, easily understood method for carrying out non-linear regression analysis based on user input functions. While it is relatively straightforward to fit data with simple functions such as linear or logarithmic functions, fitting data with more complicated non-linear functions is more difficult. Commercial specialist programmes are available that will carry out this analysis, but these programmes are expensive and are not intuitive to learn. An alternative method described here is to use the SOLVER function of the ubiquitous spreadsheet programme Microsoft Excel, which employs an iterative least squares fitting routine to produce the optimal goodness of fit between data and function. The intent of this paper is to lead the reader through an easily understood step-by-step guide to implementing this method, which can be applied to any function in the form y=f(x), and is well suited to fast, reliable analysis of data in all fields of biology.
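
    The same iterative least-squares idea is available outside Excel; for instance, scipy's curve_fit accepts an arbitrary user-defined function y = f(x). The sketch below fits a Michaelis-Menten function to invented data:

        import numpy as np
        from scipy.optimize import curve_fit

        def f(x, vmax, km):
            # Example user function: Michaelis-Menten saturation kinetics.
            return vmax * x / (km + x)

        x = np.array([0.5, 1, 2, 4, 8, 16])
        y = np.array([0.42, 0.71, 1.08, 1.46, 1.74, 1.89])   # hypothetical data

        popt, pcov = curve_fit(f, x, y, p0=[2.0, 2.0])       # initial guesses
        perr = np.sqrt(np.diag(pcov))                        # parameter standard errors
        print("Vmax = %.2f +/- %.2f, Km = %.2f +/- %.2f"
              % (popt[0], perr[0], popt[1], perr[1]))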

  14. Elemental X-ray Imaging Using the Maia Detector Array: The Benefits and Challenges of Large Solid-Angle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan, C.G.; De Geronimo, G.; Kirkham, R.

    2009-11-13

    The fundamental parameter method for quantitative SXRF and PIXE analysis and imaging using the dynamic analysis method is extended to model the changing X-ray yields and detector sensitivity with angle across large detector arrays. The method is implemented in the GeoPIXE software and applied to cope with the large solid-angle of the new Maia 384 detector array and its 96-detector prototype developed by CSIRO and BNL for SXRF imaging applications at the Australian and NSLS synchrotrons. Peak-to-background is controlled by mitigating charge-sharing between detectors through careful optimization of a patterned molybdenum absorber mask. A geological application demonstrates the capability of the method to produce high definition elemental images up to ~100 M pixels in size.

  15. Accelerating the weighted histogram analysis method by direct inversion in the iterative subspace.

    PubMed

    Zhang, Cheng; Lai, Chun-Liang; Pettitt, B Montgomery

    The weighted histogram analysis method (WHAM) for free energy calculations is a valuable tool to produce free energy differences with minimal errors. Given multiple simulations, WHAM obtains from the distribution overlaps the optimal statistical estimator of the density of states, from which the free energy differences can be computed. The WHAM equations are often solved by an iterative procedure. In this work, we use a well-known linear algebra algorithm which allows for more rapid convergence to the solution. We find that the computational complexity of the iterative solution to WHAM and the closely related multiple Bennett acceptance ratio (MBAR) method can be improved by using the method of direct inversion in the iterative subspace. We give examples from a lattice model, a simple liquid and an aqueous protein solution.
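
    For context, the fixed-point iteration that DIIS accelerates alternates between two coupled WHAM equations: the density-of-states (or unbiased distribution) estimate from the current window free energies, and the free energies recomputed from that estimate. A minimal binned sketch, with invented toy inputs standing in for umbrella-sampling histograms:

        import numpy as np

        def wham(hist, bias, n_samples, beta, tol=1e-10, max_iter=100000):
            # hist:      (K, M) histogram counts from K biased windows over M bins
            # bias:      (K, M) bias potential U_k evaluated at each bin centre
            # n_samples: (K,)   total samples per window
            f = np.zeros(hist.shape[0])                  # window free energies f_k
            for _ in range(max_iter):
                # Equation 1: unbiased distribution from the current f_k.
                denom = (n_samples[:, None] *
                         np.exp(beta * (f[:, None] - bias))).sum(axis=0)
                p = hist.sum(axis=0) / denom
                # Equation 2: recompute f_k from the new distribution.
                f_new = -np.log((p[None, :] * np.exp(-beta * bias)).sum(axis=1)) / beta
                f_new -= f_new[0]                        # fix the arbitrary constant
                converged = np.max(np.abs(f_new - f)) < tol
                f = f_new
                if converged:
                    break
            return p / p.sum(), f

        # Toy call: 2 windows, 3 bins (counts and bias energies are invented).
        hist = np.array([[40.0, 50.0, 10.0], [5.0, 45.0, 50.0]])
        bias = np.array([[0.0, 0.5, 2.0], [2.0, 0.5, 0.0]])
        p, f = wham(hist, bias, n_samples=hist.sum(axis=1), beta=1.0)
        print(p, f)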

  16. Design of optical element combining Fresnel lens with microlens array for uniform light-emitting diode lighting.

    PubMed

    Wang, Guangzhen; Wang, Lili; Li, Fuli; Kong, Depeng

    2012-09-01

    An optical element combining a Fresnel lens with a microlens array is designed for LED lighting based on geometrical optics and nonimaging optics. This design method imposes no restriction on the source intensity pattern. The designed element has compact construction and can produce multiple shapes of illumination distribution. Taking square lighting as an example, tolerance analysis is carried out to determine tolerance limits for applying the element in the assembly process. This element can produce on-axis lighting and off-axis lighting.

  17. Annually recurring parthenogenesis in a zebra shark Stegostoma fasciatum.

    PubMed

    Robinson, D P; Baverstock, W; Al-Jaru, A; Hyland, K; Khazanehdari, K A

    2011-11-01

    A zebra shark, Stegostoma fasciatum, held in captivity at the Burj Al Arab aquarium, produced embryos and pups in the absence of a male. A total of 15 pups were produced from eggs laid within the aquarium over a period of four consecutive years commencing in 2007. Parthenogenesis was confirmed through DNA analysis for three pups sampled during the first two consecutive egg cycles and is presumed to be the method of reproduction responsible thereafter.

  18. The analysis of bottom forming process for hybrid heating device

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Świątoniowski, Andrzej; Kiełbasa, Bartłomiej

    2017-10-01

    In this paper the authors present an unusual method of bottom forming applicable to various industrial purposes, including the manufacture of water heaters and pressure equipment. The method allows the bottom of a stainless steel workpiece to be fabricated into a pre-determined shape conforming to the DIN standard, which specifies the most advantageous dimensions of the bottom cross-section in terms of working-pressure loading. The authors verified the validity of the method numerically and experimentally, producing a tool designed to form bottoms of the specified geometry. Many problems are encountered during the design and production of such parts, especially excessive sheet wrinkling over a large area of the part. The experiment showed that designing such elements without experience and without numerical analysis results in highly wrinkled parts, a defect that renders them impossible to assemble with the cylindrical part. Many tool shops form elements with a spherical surface using additional spinning, stamping, and grading operations, which greatly increase the cost of production. The authors present and compare two forming methods for spherical and parabolic objects, and experimentally confirm the validity of the sheet-reversing method with adequate pressure force. The applied method produces parts in one drawing operation, followed by an operation based on laser or water cutting to obtain a round blank. This reduces tooling costs by requiring just one tool, which can be placed on any hydraulic press with a minimum force of 2 000 kN.

  19. A multi-criteria analysis approach for ranking and selection of microorganisms for the production of oils for biodiesel production.

    PubMed

    Ahmad, Farah B; Zhang, Zhanying; Doherty, William O S; O'Hara, Ian M

    2015-08-01

    Oleaginous microorganisms have the potential to produce oils as an alternative feedstock for biodiesel production. Microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) were investigated for their ability to produce oil from glucose, xylose and glycerol. Multi-criteria analysis (MCA) using the analytic hierarchy process (AHP) and the preference ranking organization method for enrichment of evaluations (PROMETHEE) with graphical analysis for interactive aid (GAIA) was used to rank and select the preferred microorganisms for oil production for biodiesel application. The ranking was based on a number of criteria: oil concentration, oil content, oil production rate and yield, substrate consumption rate, fatty acid composition, and biomass harvesting and nutrient costs. PROMETHEE selected A. oryzae, M. plumbeus and R. mucilaginosa as the most prospective species for oil production. However, further analysis by GAIA Webs identified A. oryzae and M. plumbeus as the best-performing microorganisms.
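
    A minimal sketch of the PROMETHEE II ranking step follows: alternatives are ordered by net outranking flow over weighted pairwise comparisons. The decision matrix, weights and the simple "usual" preference function are illustrative assumptions, not the paper's data.

      # PROMETHEE II: rank alternatives by net outranking flow.
      import numpy as np

      def promethee_ii(scores, weights, maximize):
          """scores: (n_alternatives, n_criteria); weights sum to 1;
          maximize: bool per criterion (False means lower is better)."""
          n, k = scores.shape
          s = np.where(maximize, scores, -scores)    # convert all criteria to "max"
          phi_plus = np.zeros(n); phi_minus = np.zeros(n)
          for a in range(n):
              for b in range(n):
                  if a == b:
                      continue
                  # "usual" preference function: P = 1 if a beats b on a criterion
                  pref = (s[a] > s[b]).astype(float)
                  pi_ab = float(weights @ pref)      # weighted preference of a over b
                  phi_plus[a] += pi_ab / (n - 1)     # leaving (positive) flow
                  phi_minus[b] += pi_ab / (n - 1)    # entering (negative) flow
          return phi_plus - phi_minus                # net flow: higher is better

      orgs = ["A. oryzae", "M. plumbeus", "R. mucilaginosa"]
      scores = np.array([[0.9, 0.7, 0.4],            # hypothetical: oil yield,
                         [0.8, 0.6, 0.3],            # growth rate, nutrient cost
                         [0.6, 0.8, 0.5]])
      weights = np.array([0.5, 0.3, 0.2])
      net = promethee_ii(scores, weights, maximize=np.array([True, True, False]))
      for o, f in sorted(zip(orgs, net), key=lambda t: -t[1]):
          print(f"{o}: {f:+.3f}")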

  20. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    NASA Technical Reports Server (NTRS)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high angular-, energy-, and time-resolution Cluster data obtained from the PEACE spectrometer instrument, to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained, from which the moments (up to the heat flux), anisotropies, and asymmetries of the velocity distribution function were calculated. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches: standard traditional integration, integration of the spherical harmonic (SPH) spectral coefficients, and singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both the SPH and SVD approaches agree remarkably well with the standard moment integration method.
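
    The sketch below fits spherical-harmonic coefficients to samples of a function on the sphere by linear least squares, in the spirit of the SPH/SVD approaches compared in the paper; note that lmax = 2 gives exactly nine coefficients. The synthetic distribution is an illustrative assumption.

      # Least-squares fit of spherical-harmonic coefficients to samples on a sphere.
      import numpy as np
      from scipy.special import sph_harm

      rng = np.random.default_rng(0)
      theta = rng.uniform(0, 2 * np.pi, 500)     # azimuth (scipy convention)
      phi = np.arccos(rng.uniform(-1, 1, 500))   # polar angle, uniform on sphere

      # synthetic anisotropic distribution: isotropic part plus a Y_1^0-like beam
      f = 1.0 + 0.5 * np.cos(phi) + 0.05 * rng.standard_normal(500)

      lmax = 2
      terms = [(l, m) for l in range(lmax + 1) for m in range(-l, l + 1)]
      A = np.column_stack([sph_harm(m, l, theta, phi) for l, m in terms])

      # numpy's lstsq uses an SVD internally, mirroring the SVD variant discussed
      coef, *_ = np.linalg.lstsq(A, f.astype(complex), rcond=None)
      for (l, m), c in zip(terms, coef):
          print(f"l={l} m={m:+d}  |c|={abs(c):.3f}")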

  1. Regional analysis of annual maximum rainfall using TL-moments method

    NASA Astrophysics Data System (ADS)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information on the distribution of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, regional frequency analysis using L-moments is revisited, and an alternative regional frequency analysis using the TL-moments method is then employed. The results from both methods were compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed to determine the best-fit distribution. Comparison between the two approaches showed that L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the TL-moments method was more efficient for lower-quantile estimation than the L-moments method.
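
    For reference, the sketch below computes sample L-moments via probability-weighted moments; this is the untrimmed (t = 0) case, which TL-moments generalise by trimming the smallest and largest order statistics. The annual-maximum data are invented.

      # Sample L-moments (l1, l2, L-skewness) via probability-weighted moments.
      import numpy as np

      def sample_l_moments(data):
          x = np.sort(np.asarray(data, dtype=float))
          n = len(x)
          j = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((j - 1) / (n - 1) * x) / n
          b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
          l1 = b0                      # L-location (mean)
          l2 = 2 * b1 - b0             # L-scale
          l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
          return l1, l2, l3 / l2       # (l1, l2, L-skewness t3)

      annual_max = [87.0, 102.5, 120.3, 95.1, 140.8, 110.2, 99.7, 131.4]
      l1, l2, t3 = sample_l_moments(annual_max)
      print(f"l1={l1:.2f}  l2={l2:.2f}  t3={t3:.3f}")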

  2. Three-dimensional murine airway segmentation in micro-CT images

    NASA Astrophysics Data System (ADS)

    Shi, Lijun; Thiesse, Jacqueline; McLennan, Geoffrey; Hoffman, Eric A.; Reinhardt, Joseph M.

    2007-03-01

    Thoracic imaging for small animals has emerged as an important tool for monitoring pulmonary disease progression and therapy response in genetically engineered animals. Micro-CT is becoming the standard thoracic imaging modality in small animal imaging because it can produce high-resolution images of the lung parenchyma, vasculature, and airways. Segmentation, measurement, and visualization of the airway tree is an important step in pulmonary image analysis. However, manual analysis of the airway tree in micro-CT images can be extremely time-consuming since a typical dataset is usually on the order of several gigabytes in size. Automated and semi-automated tools for micro-CT airway analysis are desirable. In this paper, we propose an automatic airway segmentation method for in vivo micro-CT images of the murine lung and validate our method by comparing the automatic results to manual tracing. Our method is based primarily on grayscale morphology. The results show good visual matches between manually segmented and automatically segmented trees. The average true positive volume fraction compared to manual analysis is 91.61%. The overall runtime for the automatic method is on the order of 30 minutes per volume compared to several hours to a few days for manual analysis.
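
    A minimal sketch of a grayscale-morphology segmentation in this spirit is given below: a black top-hat highlights dark, air-filled tubes, followed by thresholding and 3D connected-component analysis. It is not the authors' exact pipeline, and the parameter values are assumptions.

      # Grayscale-morphology airway segmentation sketch for a 3-D CT volume.
      import numpy as np
      from scipy import ndimage
      from skimage import morphology

      def segment_airways(volume, ball_radius=3, tophat_thresh=200):
          """volume: 3-D micro-CT intensity array (air dark, tissue bright)."""
          footprint = morphology.ball(ball_radius)
          # closing(volume) - volume: bright response inside thin dark lumens
          tophat = morphology.black_tophat(volume, footprint)
          candidate = tophat > tophat_thresh
          labels, n = ndimage.label(candidate)
          if n == 0:
              return candidate
          # keep the largest connected component (assumed to be the airway tree)
          sizes = ndimage.sum(candidate, labels, index=np.arange(1, n + 1))
          return labels == (1 + int(np.argmax(sizes)))

      # airway_mask = segment_airways(ct_volume)   # ct_volume: np.ndarray, e.g. uint16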

  3. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  4. Analysis of decay chains of superheavy nuclei produced in the 249Bk+48Ca and 243Am+48Ca reactions

    NASA Astrophysics Data System (ADS)

    Zlokazov, V. B.; Utyonkov, V. K.

    2017-07-01

    The analysis of decay chains starting at the superheavy nuclei 293Ts and 289Mc is presented. The spectroscopic properties of nuclei identified in experiments using the 249Bk+48Ca and 243Am+48Ca reactions, studied at the gas-filled separators DGFRS, TASCA and BGS, are considered. We present an analysis of the decay data using widely adopted statistical methods, applying them to the short decay chains of parent odd-Z nuclei. We find that the recently suggested method of analyzing decay chains by Forsberg et al. may lead to questionable conclusions when applied to the analysis of radioactive decays. Our discussion demonstrates reasonable congruence of the α-particle energies and decay times of nuclei assigned to the isotopes 289Mc, 285Nh and 281Rg observed in both reactions.

  5. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could be the result of the traditional approach of comparing a haplotype against the remaining ones, whereas, they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such an uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms, suggests that the studies reporting meta-analysis of haplotypes contain approximately half of the included studies and produce significant results twice more often. We show that this excess of statistically significant results, stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
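
    For orientation, the sketch below implements a generic summary-data random-effects meta-analysis (DerSimonian-Laird) in Python; it is a simplified stand-in for the haplotype-specific multivariate models, and the paper's own illustrative code is in Stata. The study estimates are invented.

      # DerSimonian-Laird random-effects meta-analysis of per-study log odds ratios.
      import numpy as np
      from scipy import stats

      def dersimonian_laird(effects, variances):
          y = np.asarray(effects, float); v = np.asarray(variances, float)
          w = 1.0 / v                                # fixed-effect weights
          y_fe = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_fe) ** 2)            # Cochran's Q heterogeneity
          k = len(y)
          tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
          w_re = 1.0 / (v + tau2)                    # random-effects weights
          y_re = np.sum(w_re * y) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          p = 2 * stats.norm.sf(abs(y_re / se))
          return y_re, se, tau2, p

      log_or = [0.42, 0.18, 0.55, 0.05]              # hypothetical study estimates
      var = [0.04, 0.09, 0.06, 0.02]
      est, se, tau2, p = dersimonian_laird(log_or, var)
      print(f"pooled logOR={est:.3f} (SE {se:.3f}), tau^2={tau2:.3f}, p={p:.3g}")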

  6. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is stochastic inversion integrated with seismic multi-attributes through a probabilistic neural network (PNN). Stochastic methods are used to predict the probability of sandstone from the impedance varied over 50 realizations, which yields a good probability estimate. Stochastic seismic inversion is more interpretive because it directly gives the value of the property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures more diverse uncertainty, so that the probability values are close to the actual values. The produced AI is then used as an input to a multi-attribute analysis, which is used to predict the gamma-ray, density and porosity logs. To select the attributes used, a stepwise regression algorithm is applied; the selected attributes are the inputs to the PNN. The PNN method was chosen because it gave the best correlation among the neural network methods tested. Finally, we interpret the products of the multi-attribute analysis, in the form of pseudo-gamma-ray, density and pseudo-porosity volumes, to delineate the reservoir distribution. Our interpretation shows that a structural trap is identified in the southeastern part of the study area, along the anticline.
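
    The sketch below illustrates the kernel-regression core that PNN-style log prediction shares: the prediction at a query point is a Gaussian-weighted average of training targets. The attribute values, target log and smoothing width are invented, and this is not the software used in the study.

      # Kernel-regression sketch in the spirit of a PNN predictor of well logs
      # from seismic attributes.
      import numpy as np

      def pnn_predict(train_attrs, train_log, query_attrs, sigma=1.0):
          """train_attrs: (n, k) attributes at wells; train_log: (n,) log values
          (e.g. porosity); query_attrs: (m, k) attributes at trace locations."""
          d2 = ((query_attrs[:, None, :] - train_attrs[None, :, :]) ** 2).sum(-1)
          w = np.exp(-d2 / (2 * sigma ** 2))         # Gaussian kernel weights
          return (w @ train_log) / w.sum(axis=1)     # weighted-average prediction

      rng = np.random.default_rng(1)
      attrs = rng.normal(size=(50, 3))               # 3 attributes at 50 samples
      porosity = 0.2 + 0.05 * attrs[:, 0] + 0.01 * rng.normal(size=50)
      print(pnn_predict(attrs, porosity, attrs[:5], sigma=0.8))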

  7. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. The method directly relates digitally measured intensities to the water content of the porous medium and requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no separate calibration experiment is needed, because the calibration curve relating water content to reflected light intensity is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) was carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis, together with numerical simulations using a state-of-the-art computational code that solves the Richards equation. Comparison of the cumulative mass leaving and entering the flow tank, and of the water content maps produced by the photographic measurement technique and the numerical simulations, demonstrates the efficiency and high accuracy of the proposed method for investigating vadose-zone flow processes. Finally, the photometric procedure has been developed expressly with a view to its extension to heterogeneous media. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for validating numerical codes.
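
    The sketch below shows one way the per-pixel processing chain described above can look in code. The linear calibration between a dry and a fully saturated reference image is an assumption for illustration; the paper builds its calibration curve during the monitoring phase itself.

      # Per-pixel intensity-to-water-content conversion: background subtraction,
      # normalisation, and calibration against reference images.
      import numpy as np

      def water_content_map(raw, dark, dry, saturated, theta_s=0.35):
          """raw/dark/dry/saturated: grayscale images as float arrays;
          theta_s: assumed saturated water content of the porous medium."""
          img = raw.astype(float) - dark              # background subtraction
          lo = dry.astype(float) - dark               # intensity of dry medium
          hi = saturated.astype(float) - dark         # intensity when saturated
          s = (img - lo) / np.maximum(hi - lo, 1e-9)  # normalised intensity
          return theta_s * np.clip(s, 0.0, 1.0)       # 2-D water-content map

      # theta = water_content_map(frame, dark_frame, dry_frame, wet_frame)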

  8. PyMICE: A Python library for analysis of IntelliCage data.

    PubMed

    Dzik, Jakub M; Puścian, Alicja; Mijakowska, Zofia; Radwanska, Kasia; Łęski, Szymon

    2018-04-01

    IntelliCage is an automated system for recording the behavior of a group of mice housed together. It produces rich, detailed behavioral data, calling for new methods and software for their analysis. Here we present PyMICE, a free and open-source library for analysis of IntelliCage data in the Python programming language. We describe the design and demonstrate the use of the library through a series of examples. PyMICE provides easy and intuitive access to IntelliCage data, and thus makes it possible to use numerous other Python scientific libraries to form a complete data analysis workflow.
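
    A usage sketch along the lines described in the paper is given below. The file name is a placeholder, and the loader and visit-object attribute names are assumptions based on the library's description; check them against the PyMICE documentation before use.

      # Hypothetical PyMICE usage: load one IntelliCage archive and total
      # the corner-visit time per mouse.
      import pymice as pm

      ml = pm.Loader('intellicage_session.zip')      # assumed archive name
      visits = ml.getVisits(order='Start')           # corner visits, chronological

      totals = {}
      for v in visits:
          name = str(v.Animal)                       # assumed visit attributes
          totals[name] = totals.get(name, 0.0) + v.Duration.total_seconds()
      for name, seconds in sorted(totals.items()):
          print(f"{name}: {seconds:.0f} s in corners")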

  9. Evaluation of Rock Powdering Methods to Obtain Fine-grained Samples for CHEMIN, a Combined XRD/XRF Instrument

    NASA Technical Reports Server (NTRS)

    Chipera, S. J.; Vaniman, D. T.; Bish, D. L.; Sarrazin, P.; Feldman, S.; Blake, D. F.; Bearman, G.; Bar-Cohen, Y.

    2004-01-01

    A miniature XRD/XRF (X-ray diffraction / X-ray fluorescence) instrument, CHEMIN, is currently being developed for definitive mineralogic analysis of soils and rocks on Mars. One of the technical issues that must be addressed to enable remote XRD analysis is how best to obtain a representative sample powder for analysis. For powder XRD analyses, it is beneficial to have a fine-grained sample to reduce preferred orientation effects and to provide a statistically significant number of crystallites to the X-ray beam. Although a two-dimensional detector as used in the CHEMIN instrument will produce good results even with poorly prepared powder, the quality of the data will improve and the time required for data collection will be reduced if the sample is fine-grained and randomly oriented. A variety of methods have been proposed for XRD sample preparation. Chipera et al. presented grain size distributions and XRD results from powders generated with an Ultrasonic/Sonic Driller/Corer (USDC) currently being developed at JPL. The USDC was shown to be an effective instrument for sampling rock to produce powder suitable for XRD. In this paper, we compare powder prepared using the USDC with powder obtained with a miniaturized rock crusher developed at JPL and with powder obtained with a rotary tungsten carbide bit to powders obtained from a laboratory bench-scale Retsch mill (provides benchmark mineralogical data). These comparisons will allow assessment of the suitability of these methods for analysis by an XRD/XRF instrument such as CHEMIN.

  10. Examination of a Rotorcraft Noise Prediction Method and Comparison to Flight Test Data

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Greenwood, Eric; Watts, Michael E.; Lopes, Leonard V.

    2017-01-01

    With a view that rotorcraft noise should be included in the preliminary design process, a relatively fast noise prediction method is examined in this paper. A comprehensive rotorcraft analysis is combined with a noise prediction method to compute several noise metrics of interest, and these predictions are compared to flight test data. Results show that including only the main rotor noise severely underpredicts the integrated metrics of interest; inclusion of the tail rotor frequency content is essential for accurately predicting them.

  11. Motor monitoring method and apparatus using high frequency current components

    DOEpatents

    Casada, D.A.

    1996-05-21

    A motor current analysis method and apparatus for monitoring electrical-motor-driven devices are disclosed. The method and apparatus utilize high frequency portions of the motor current spectra to evaluate the condition of the electric motor and the device driven by the electric motor. The motor current signal produced as a result of an electric motor is monitored and the low frequency components of the signal are removed by a high-pass filter. The signal is then analyzed to determine the condition of the electrical motor and the driven device. 16 figs.
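
    As a rough illustration of the signal path just described (not the patented apparatus itself), the sketch below high-pass filters a synthetic motor-current waveform and inspects the remaining high-frequency spectrum. The sampling rate, cutoff frequency and signal components are assumptions.

      # High-pass filter a sampled motor-current waveform to strip the line
      # frequency, then locate the dominant high-frequency component.
      import numpy as np
      from scipy import signal

      fs = 50_000                                    # sampling rate, Hz (assumed)
      t = np.arange(0, 1.0, 1 / fs)
      current = (10 * np.sin(2 * np.pi * 60 * t)     # 60 Hz supply component
                 + 0.02 * np.sin(2 * np.pi * 4_800 * t))  # high-frequency signature

      sos = signal.butter(4, 1_000, btype='highpass', fs=fs, output='sos')
      hf = signal.sosfiltfilt(sos, current)          # remove low-frequency content

      freqs = np.fft.rfftfreq(len(hf), 1 / fs)
      spectrum = np.abs(np.fft.rfft(hf))
      print("dominant HF component: %.0f Hz" % freqs[np.argmax(spectrum)])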

  12. Discovering Central Practitioners in a Medical Discussion Forum Using Semantic Web Analytics.

    PubMed

    Rajabi, Enayat; Abidi, Syed Sibte Raza

    2017-01-01

    The aim of this paper is to investigate semantic web based methods to enrich and transform a medical discussion forum in order to perform semantics-driven social network analysis. We use the centrality measures as well as semantic similarity metrics to identify the most influential practitioners within a discussion forum. The centrality results of our approach are in line with centrality measures produced by traditional SNA methods, thus validating the applicability of semantic web based methods for SNA, particularly for analyzing social networks for specialized discussion forums.
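
    The sketch below illustrates the conventional-SNA half of this comparison: computing standard centrality measures over a reply graph with networkx. The edge list is hypothetical; the paper derives its graph from semantically enriched forum data.

      # Centrality measures over a directed reply graph of forum practitioners.
      import networkx as nx

      replies = [("dr_a", "dr_b"), ("dr_c", "dr_b"), ("dr_b", "dr_a"),
                 ("dr_d", "dr_b"), ("dr_c", "dr_a")]   # (replier, original poster)
      G = nx.DiGraph(replies)

      degree = nx.in_degree_centrality(G)              # who attracts the most replies
      betweenness = nx.betweenness_centrality(G)       # who bridges discussions
      for node in G:
          print(f"{node}: in-degree={degree[node]:.2f} "
                f"betweenness={betweenness[node]:.2f}")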

  13. Symetrica Measurements at PNNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouzes, Richard T.; Mace, Emily K.; Redding, Rebecca L.

    2009-01-26

    Symetrica is a small company based in Southampton, England, that has developed an algorithm for processing gamma ray spectra obtained from a variety of scintillation detectors. Their analysis method applied to NaI(Tl), BGO, and LaBr spectra results in deconvoluted spectra with the “resolution” improved by about a factor of three to four. This method has also been applied by Symetrica to plastic scintillator with the result that full energy peaks are produced. If this method is valid and operationally viable, it could lead to a significantly improved plastic scintillator based radiation portal monitor system.

  14. Motor monitoring method and apparatus using high frequency current components

    DOEpatents

    Casada, Donald A.

    1996-01-01

    A motor current analysis method and apparatus for monitoring electrical-motor-driven devices. The method and apparatus utilize high frequency portions of the motor current spectra to evaluate the condition of the electric motor and the device driven by the electric motor. The motor current signal produced as a result of an electric motor is monitored and the low frequency components of the signal are removed by a high-pass filter. The signal is then analyzed to determine the condition of the electrical motor and the driven device.

  15. Determination of organo-zinc based fungicides in timber treatments employing gas chromatographic analysis with mass selective detection and/or inductively coupled plasma atomic emission spectroscopy.

    PubMed

    Rimmer, D A; Johnson, P D; Bradley, S D

    2001-09-14

    A method for the determination of zinc octoate (zinc 2-ethylhexanoate) and acypetacs zinc in occupational hygiene samples and wood treatment formulations is described. The zinc carboxylates are liquid-liquid partitioned between toluene and 1 M HCl, with the liberated acids being extracted into the toluene and the zinc (as chloride) into the acid. The carboxylic acids are then methylated using trimethylsilyldiazomethane-methanol and the resultant methyl esters are selectively and sensitively analysed by gas chromatography with mass selective detection (GC-MS). Alternatively, the zinc content of the acid extract can be analysed by inductively coupled plasma atomic emission spectroscopy (ICP-AES). GC-MS is the preferred method of analysis for zinc octoate, where a single analyte (methyl 2-ethylhexanoate) is produced for analysis. Because acypetacs zinc contains a complex mixture of carboxylates, quantitative GC-MS analysis of the methyl esters produced is impractical and ICP-AES is the preferred method for quantitation; in this case, GC-MS can be used to confirm the identity of the product used. The analysis of occupational hygiene samples (cotton pads, gloves and socks, as well as Tenax tubes and GF/A filters) spiked with metal carboxylates is demonstrated. Recoveries of around 70-90% and reproducibilities of 5-23% (n=6-8) were typically achieved for the determination of tin octoate (a surrogate for zinc octoate) at spiking levels ranging from 4 to 190 microg per sampling device. Recoveries of around 102-106% and reproducibilities of 10-12% (n=5-6) were typically achieved for acypetacs zinc at spiking levels ranging from 100 mg per sampling device. Reaction yields for the octoate methylation reaction were in the region of 85-87%. The method was used to monitor occupational exposure to zinc octoate and acypetacs zinc during the application of wood treatments to fences.

  16. Automating Structural Analysis of Spacecraft Vehicles

    NASA Technical Reports Server (NTRS)

    Hrinda, Glenn A.

    2004-01-01

    A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time consuming and limits model size during conceptual design. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and a planetary exploration spacecraft. The program couples with FEA to enable system-level performance assessments and weight predictions, including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increase productivity and bring standardization to a systems study. These benefits are illustrated by examining two different types of conceptual spacecraft designed using the software: a hypersonic air-breathing, single stage to orbit (SSTO), reusable launch vehicle (RLV), and an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. The two vehicle types demonstrate the software's flexibility, with an emphasis on reducing aeroshell structural weight. Member sizes, concepts and material selections are discussed, along with the analysis methods used in optimizing the structure and the design trades required to minimize structural weight.

  17. Head-to-head comparisons of metabolic side effects of second generation antipsychotics in the treatment of schizophrenia: a systematic review and meta-analysis

    PubMed Central

    Rummel-Kluge, Christine; Komossa, Katja; Schwarz, Sandra; Hunger, Heike; Schmid, Franziska; Lobos, Claudia Asenjo; Kissling, Werner; Davis, John M; Leucht, Stefan

    2010-01-01

    Objective The metabolic side effects of second-generation antipsychotics (SGA) are serious and have not been compared head to head in a meta-analysis. We conducted a meta-analysis of studies comparing the metabolic side effects of the following SGAs head-to-head: amisulpride, aripiprazole, clozapine, olanzapine, quetiapine, risperidone, sertindole, ziprasidone, zotepine. Method We searched the register of the Cochrane schizophrenia group (last search May 2007), supplemented by MEDLINE and EMBASE (last search January 2009) for randomized, blinded studies comparing the above mentioned SGA in the treatment of schizophrenia or related disorders. At least three reviewers extracted the data independently. The primary outcome was weight change. We also assessed changes of cholesterol and glucose. The results were combined in a meta-analysis. Results We included 48 studies with 105 relevant arms. Olanzapine produced more weight gain than all other second-generation antipsychotics except for clozapine where no difference was found. Clozapine produced more weight gain than risperidone, risperidone more than amisulpride, and sertindole more than risperidone. Olanzapine produced more cholesterol increase than aripiprazole, risperidone and ziprasidone. (No differences with amisulpride, clozapine and quetiapine were found). Quetiapine produced more cholesterol increase than risperidone and ziprasidone. Olanzapine produced more increase in glucose than amisulpride, aripiprazole, quetiapine, risperidone and ziprasidone; no difference was found with clozapine. Conclusions Some SGAs lead to substantially more metabolic side effects than other SGAs. When choosing an SGA for an individual patient these side effects with their potential cause of secondary diseases must be weighed against efficacy and characteristics of the individual patient. PMID:20692814

  18. Freeform fabrication of tissue-simulating phantom for potential use of surgical planning in conjoined twins separation surgery.

    PubMed

    Shen, Shuwei; Wang, Haili; Xue, Yue; Yuan, Li; Zhou, Ximing; Zhao, Zuhua; Dong, Erbao; Liu, Bin; Liu, Wendong; Cromeens, Barrett; Adler, Brent; Besner, Gail; Xu, Ronald X

    2017-09-08

    Preoperative assessment of tissue anatomy and accurate surgical planning are crucial in conjoined twin separation surgery. We developed a new method that combines three-dimensional (3D) printing, assembling, and casting to produce anatomic models of high fidelity for surgical planning. The relevant anatomic features of the conjoined twins were captured by computed tomography (CT), classified into five organ groups, and reconstructed as five computer models. Among these organ groups, the skeleton was produced by fused deposition modeling (FDM) using acrylonitrile-butadiene-styrene. For the other four organ groups, shell molds were prepared by FDM and cast with silica gel to simulate soft tissues, with contrast-enhancement pigments added to simulate different CT and visual contrasts. The produced models were assembled, positioned firmly within a 3D-printed shell mold simulating the skin boundary, and cast with transparent silica gel. The produced phantom was subjected to a further CT scan, which was compared with the patient data for fidelity evaluation. Data analysis showed that the produced model reproduced the geometric features of the original CT data with an overall mean deviation of less than 2 mm, indicating the clinical potential of this method for surgical planning in conjoined twin separation surgery.

  19. Strain improvement of chymosin-producing strains of Aspergillus niger var. awamori using parasexual recombination.

    PubMed

    Bodie, E A; Armstrong, G L; Dunn-Coleman, N S

    1994-05-01

    Parasexual recombination was used to obtain improved chymosin-producing strains and to perform genetic analysis on existing strains. Chlorate resistance was used to select for a variety of spontaneous nitrate assimilation pathway mutations in strains previously improved for chymosin production using classical strain improvement methods including mutation and screening, and selection for 2-deoxyglucose resistance (dgr). Diploids of these improved strains were generated via parasexual recombination and were isolated on selective media by complementation of nitrate assimilation mutations. A preliminary genetic analysis of diploid and haploid segregants indicated that the dgr trait, resulting in overexpression of chymosin, was recessive. Also, mutations in two different dgr genes resulted in an increased level of chymosin production. When these mutations were combined via parasexual recombination, the resulting haploid segregants produced about 15% more chymosin than either parental strain. CHEF gel electrophoresis was used to determine the chromosomal location of the integrated chymosin DNA sequences, and to verify diploidy in one case where the chromosome composition of two haploid parents differed.

  20. Missing in space: an evaluation of imputation methods for missing data in spatial analysis of risk factors for type II diabetes.

    PubMed

    Baker, Jannah; White, Nicole; Mengersen, Kerrie

    2014-11-20

    Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. The best choice of imputation method depends upon the application and is not necessarily the most complex method; mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease, with more confidence in the results to inform public policy decision-making.
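
    The sketch below shows the cross-validation idea in miniature: hide a fraction of known covariate values, impute them by two candidate methods, and compare reconstruction error. It uses scikit-learn's mean imputer and its experimental iterative imputer as a stand-in for a multivariate model; the data are synthetic, not the Queensland survey data, and the spatial CAR model is not represented.

      # Cross-validated comparison of imputation methods on synthetic data.
      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import SimpleImputer, IterativeImputer

      rng = np.random.default_rng(42)
      n, p = 200, 4
      latent = rng.normal(size=(n, 1))
      X = latent + 0.3 * rng.normal(size=(n, p))     # correlated covariates

      mask = rng.random(X.shape) < 0.15              # hide 15% of known entries
      X_miss = X.copy(); X_miss[mask] = np.nan

      for name, imputer in [("mean", SimpleImputer(strategy="mean")),
                            ("multivariate", IterativeImputer(random_state=0))]:
          X_hat = imputer.fit_transform(X_miss)
          rmse = np.sqrt(np.mean((X_hat[mask] - X[mask]) ** 2))
          print(f"{name} imputation RMSE: {rmse:.3f}")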
