Sample records for surface-based statistical analysis

  1. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correcting for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
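    The wild bootstrap at the heart of this procedure is easy to sketch. The toy below tests a single regression slope by resampling null-model residuals with Rademacher weights; it is a minimal one-predictor illustration, not the authors' heteroscedastic surface-model implementation, and all function and parameter names are ours.

```python
import numpy as np

def wild_bootstrap_pvalue(X, y, n_boot=2000, seed=0):
    """Wild-bootstrap p-value for the slope in y = b0 + b1*x + e,
    robust to heteroscedastic errors (Rademacher weights)."""
    rng = np.random.default_rng(seed)
    X1 = np.column_stack([np.ones_like(X), X])
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    t_obs = abs(beta[1])                      # observed slope magnitude
    # Fit under the null (slope = 0) and resample its residuals
    y0_hat = np.full_like(y, y.mean())
    resid0 = y - y0_hat
    count = 0
    for _ in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=y.shape)   # Rademacher weights
        y_star = y0_hat + resid0 * w
        b_star = np.linalg.lstsq(X1, y_star, rcond=None)[0]
        if abs(b_star[1]) >= t_obs:
            count += 1
    return (count + 1) / (n_boot + 1)
```

Because the weights only flip residual signs, the resampled data inherit each observation's own error variance, which is what makes the scheme robust to heteroscedasticity.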

  2. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
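    The two-parameter Weibull model named in this abstract is compact enough to illustrate. The sketch below evaluates the cumulative failure probability for a uniformly stressed volume (or surface area) and estimates the Weibull modulus and characteristic strength from rupture data by least squares, as the abstract describes; it is a simplified stand-in for SCARE's actual formulation, and the median-rank probability estimator is our assumption.

```python
import math
import numpy as np

def weibull_failure_probability(sigma, sigma0, m, volume=1.0):
    """Two-parameter Weibull cumulative failure probability for a
    uniformly stressed volume V: Pf = 1 - exp(-V * (sigma/sigma0)^m)."""
    if sigma <= 0:
        return 0.0
    return 1.0 - math.exp(-volume * (sigma / sigma0) ** m)

def fit_weibull_least_squares(strengths):
    """Estimate (m, sigma0) from rupture strengths by linear regression
    on ln(ln(1/(1-F))) = m*ln(s) - m*ln(sigma0), using median-rank
    plotting positions F = (i - 0.3) / (n + 0.4)."""
    s = np.sort(np.asarray(strengths, float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    x = np.log(s)
    ylin = np.log(np.log(1.0 / (1.0 - F)))
    m, c = np.polyfit(x, ylin, 1)          # slope is the Weibull modulus
    return m, math.exp(-c / m)
```

At sigma = sigma0 (unit volume) the failure probability is 1 - 1/e, about 63%, which is the defining property of the characteristic strength.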

  3. Detection of reflecting surfaces by a statistical model

    NASA Astrophysics Data System (ADS)

    He, Qiang; Chu, Chee-Hung H.

    2009-02-01

    Remote sensing is widely used to assess the destruction from natural disasters and to plan relief and recovery operations. Automatically extracting useful features and segmenting interesting objects from digital images, including remote sensing imagery, is a critical task for image understanding. Unfortunately, current research on automated feature extraction largely ignores contextual information. As a result, attributes for features and objects of interest cannot be populated with adequate fidelity. In this paper, we explore meaningful object extraction that incorporates the detection of reflecting surfaces. Detection of specular reflecting surfaces is useful in target identification and can be applied to environmental monitoring, disaster prediction and analysis, and military and counter-terrorism applications. Our method is based on a statistical model that captures the statistical properties of specular reflecting surfaces; the reflecting surfaces are then detected through cluster analysis.
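    The cluster-analysis step can be sketched with a minimal k-means routine. This is a generic stand-in (the abstract does not name a specific clustering algorithm), applied here to illustrative two-dimensional feature vectors rather than real reflectance statistics.

```python
import numpy as np

def kmeans(X, k=2, iters=50, seed=0):
    """Minimal k-means: alternate nearest-center assignment and
    center recomputation. Pixels whose feature statistics resemble
    specular reflection would fall into one cluster."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # squared distances of every point to every center
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers
```
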

  4. Detection of Fatty Acids from Intact Microorganisms by Molecular Beam Static Secondary Ion Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Jani Cheri; Lehman, Richard Michael; Bauer, William Francis

    We report the use of a surface analysis approach, static secondary ion mass spectrometry (SIMS) equipped with a molecular (ReO4-) ion primary beam, to analyze the surface of intact microbial cells. SIMS spectra of 28 microorganisms were compared to fatty acid profiles determined by gas chromatographic analysis of transesterified fatty acids extracted from the same organisms. The results indicate that surface bombardment using the molecular primary beam cleaved the ester linkage characteristic of bacteria at the glycerophosphate backbone of the phospholipid components of the cell membrane. This cleavage enables direct detection of the fatty acid conjugate base of intact microorganisms by static SIMS. The limit of detection for this approach is approximately 10^7 bacterial cells/cm^2. Multivariate statistical methods were applied in a graded approach to the SIMS microbial data. The results showed that the full data set could initially be statistically grouped based upon major differences in biochemical composition of the cell wall. The gram-positive bacteria were further statistically analyzed, followed by final analysis of a specific bacterial genus that was successfully grouped by species. Additionally, the use of SIMS to detect microbes on mineral surfaces is demonstrated by an analysis of Shewanella oneidensis on crushed hematite. The results of this study provide evidence for the potential of static SIMS to rapidly detect bacterial species based on ion fragments originating from cell membrane lipids directly from sample surfaces.

  5. A Method of Relating General Circulation Model Simulated Climate to the Observed Local Climate. Part I: Seasonal Statistics.

    NASA Astrophysics Data System (ADS)

    Karl, Thomas R.; Wang, Wei-Chyung; Schlesinger, Michael E.; Knight, Richard W.; Portman, David

    1990-10-01

    Important surface observations such as the daily maximum and minimum temperature, daily precipitation, and cloud ceilings often have localized characteristics that are difficult to reproduce with the current resolution and the physical parameterizations in state-of-the-art General Circulation Models (GCMs). Many of the difficulties can be partially attributed to mismatches in scale, local topography, regional geography, and boundary conditions between models and surface-based observations. Here, we present a method, called climatological projection by model statistics (CPMS), to relate GCM grid-point free-atmosphere statistics, the predictors, to these important local surface observations. The method can be viewed as a generalization of the model output statistics (MOS) and perfect prog (PP) procedures used in numerical weather prediction (NWP) models. It consists of the application of three statistical methods: 1) principal component analysis (PCA), 2) canonical correlation, and 3) inflated regression analysis. The PCA reduces the redundancy of the predictors. The canonical correlation is used to develop simultaneous relationships between linear combinations of the predictors, the canonical variables, and the surface-based observations. Finally, inflated regression is used to relate the important canonical variables to each of the surface-based observed variables. We demonstrate that even an early version of the Oregon State University two-level atmospheric GCM (with prescribed sea surface temperature) produces free-atmosphere statistics that can, when standardized using the model's internal means and variances (the MOS-like version of CPMS), closely approximate the observed local climate. When the model data are standardized by the observed free-atmosphere means and variances (the PP version of CPMS), however, the model does not reproduce the observed surface climate as well.
Our results indicate that in the MOS-like version of CPMS the differences between the output of a ten-year GCM control run and the surface-based observations are often smaller than the differences between the observations of two ten-year periods. Such positive results suggest that GCMs may already contain important climatological information that can be used to infer the local climate.
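    As a rough illustration of a CPMS-like chain, the sketch below standardizes the predictors, reduces them with PCA, regresses a local observation on the leading components, and inflates the predicted anomalies to restore the observed variance (ordinary regression damps it). It omits the canonical-correlation step, and all names and thresholds are ours, not the authors' implementation.

```python
import numpy as np

def cpms_downscale_sketch(predictors, target):
    """Standardize -> PCA -> regression -> variance inflation."""
    Z = (predictors - predictors.mean(0)) / predictors.std(0)
    # PCA via SVD; keep components explaining ~90% of the variance
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    var = S**2 / (S**2).sum()
    k = int(np.searchsorted(np.cumsum(var), 0.90)) + 1
    scores = Z @ Vt[:k].T
    # Regress the local surface observation on the leading components
    A = np.column_stack([np.ones(len(scores)), scores])
    beta = np.linalg.lstsq(A, target, rcond=None)[0]
    pred = A @ beta
    # Inflated regression: rescale anomalies so the predicted
    # variance matches the observed variance
    infl = target.std() / pred.std()
    return target.mean() + (pred - pred.mean()) * infl
```

By construction the output has exactly the observed mean and variance, which is the point of the inflation step.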

  6. Characterization of Surface Water and Groundwater Quality in the Lower Tano River Basin Using Statistical and Isotopic Approach.

    NASA Astrophysics Data System (ADS)

    Edjah, Adwoba; Stenni, Barbara; Cozzi, Giulio; Turetta, Clara; Dreossi, Giuliano; Tetteh Akiti, Thomas; Yidana, Sandow

    2017-04-01

    This research is part of a PhD research work, "Hydrogeological Assessment of the Lower Tano river basin for sustainable economic usage, Ghana, West Africa". In this study, we investigated surface water and groundwater quality in the Lower Tano river basin. The assessment was based on selected sampling sites associated with mining activities and the development of oil and gas. A statistical approach was applied to characterize the quality of surface water and groundwater. In addition, water stable isotopes, natural tracers of the hydrological cycle, were used to investigate the origin of groundwater recharge in the basin. The study revealed that Pb and Ni values of the surface water and groundwater samples exceeded the WHO standards for drinking water. A water quality index (WQI), based on physicochemical parameters (EC, TDS, pH) and major ions (Ca2+, Na+, Mg2+, HCO3-, NO3-, Cl-, SO42-, K+), indicated good water quality for 60% of the sampled surface water and groundwater. Other indices, such as the heavy metal pollution index (HPI), the degree of contamination (Cd), and the heavy metal evaluation index (HEI), based on trace element concentrations in the water samples, indicate a high level of pollution for 90% of the surface water and groundwater samples. Principal component analysis (PCA) also suggests that the water quality in the basin is likely affected by rock-water interaction and anthropogenic activities (sea water intrusion). 
This was confirmed by further statistical analysis (cluster analysis and correlation matrix) of the water quality parameters. The spatial distributions of water quality parameters, trace elements, and the results of the statistical analysis were mapped with a geographical information system (GIS). In addition, isotopic analysis revealed that most of the sampled surface water and groundwater were of meteoric origin, with little or no isotopic variation. It is expected that the outcomes of this research will form a baseline for appropriate water quality management decisions by decision makers in the Lower Tano river basin. Keywords: Water stable isotopes, Trace elements, Multivariate statistics, Evaluation indices, Lower Tano river basin.

  7. Two-dimensional random surface model for asperity-contact in elastohydrodynamic lubrication

    NASA Technical Reports Server (NTRS)

    Coy, J. J.; Sidik, S. M.

    1979-01-01

    Relations for the asperity-contact time function during elastohydrodynamic lubrication of a ball bearing are presented. The analysis is based on a two-dimensional random surface model, and actual profile traces of the bearing surfaces are used as statistical sample records. The results of the analysis show that transition from 90 percent contact to 1 percent contact occurs within a dimensionless film thickness range of approximately four to five. This thickness ratio is several times larger than reported in the literature where one-dimensional random surface models were used. It is shown that low pass filtering of the statistical records will bring agreement between the present results and those in the literature.
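    For an ideal Gaussian surface the contact statistics described here reduce to a tail probability of the height distribution. The sketch below computes the empirical contact fraction from a profile trace treated as a sample record, and the corresponding Gaussian value as a function of the dimensionless film parameter; it is a one-dimensional illustration, not the paper's two-dimensional model.

```python
from math import erfc, sqrt
import numpy as np

def contact_fraction(profile, film_thickness):
    """Fraction of sampled surface heights (about the mean line)
    exceeding the film thickness, i.e. the empirical probability
    of asperity contact for a profile trace."""
    z = np.asarray(profile, float)
    z = z - z.mean()
    return float(np.mean(z > film_thickness))

def contact_fraction_gaussian(lam):
    """Same quantity for an ideal Gaussian surface, as a function of
    the dimensionless film parameter lam = h / sigma."""
    return 0.5 * erfc(lam / sqrt(2.0))
```
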

  8. Multivariate Tensor-based Morphometry on Surfaces: Application to Mapping Ventricular Abnormalities in HIV/AIDS

    PubMed Central

    Wang, Yalin; Zhang, Jie; Gutman, Boris; Chan, Tony F.; Becker, James T.; Aizenstein, Howard J.; Lopez, Oscar L.; Tamburo, Robert J.; Toga, Arthur W.; Thompson, Paul M.

    2010-01-01

    Here we developed a new method, called multivariate tensor-based surface morphometry (TBM), and applied it to study lateral ventricular surface differences associated with HIV/AIDS. Using concepts from differential geometry and the theory of differential forms, we created mathematical structures known as holomorphic one-forms, to obtain an efficient and accurate conformal parameterization of the lateral ventricular surfaces in the brain. The new meshing approach also provides a natural way to register anatomical surfaces across subjects, and improves on prior methods as it handles surfaces that branch and join at complex 3D junctions. To analyze anatomical differences, we computed new statistics from the Riemannian surface metrics - these retain multivariate information on local surface geometry. We applied this framework to analyze lateral ventricular surface morphometry in 3D MRI data from 11 subjects with HIV/AIDS and 8 healthy controls. Our method detected a 3D profile of surface abnormalities even in this small sample. Multivariate statistics on the local tensors gave better effect sizes for detecting group differences, relative to other TBM-based methods including analysis of the Jacobian determinant, the largest and smallest eigenvalues of the surface metric, and the pair of eigenvalues of the Jacobian matrix. The resulting analysis pipeline may improve the power of surface-based morphometry studies of the brain. PMID:19900560

  9. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of the response surface development and feasibility of the method are shown using a sample problem in slope stability which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
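    The core idea, fit a cheap polynomial to a handful of expensive code runs and then query the polynomial instead of the code, fits in a few lines. The quadratic form and two-variable case below are illustrative assumptions on our part, not the five-parameter slope-stability model of the report.

```python
import numpy as np

def fit_response_surface(X, y):
    """Least-squares quadratic response surface in two variables:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2.
    X is (n, 2) design points; y holds the expensive code outputs."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    def surface(p1, p2):
        # cheap surrogate evaluated in place of the code
        return (coef[0] + coef[1]*p1 + coef[2]*p2
                + coef[3]*p1**2 + coef[4]*p2**2 + coef[5]*p1*p2)
    return surface
```

Once fitted, the returned `surface` can be called millions of times in a Monte Carlo loop at negligible cost.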

  10. Visual wetness perception based on image color statistics.

    PubMed

    Sawayama, Masataka; Adelson, Edward H; Nishida, Shin'ya

    2017-05-01

    Color vision provides humans and animals with the abilities to discriminate colors based on the wavelength composition of light and to determine the location and identity of objects of interest in cluttered scenes (e.g., ripe fruit among foliage). However, we argue that color vision can inform us about much more than color alone. Since a trichromatic image carries more information about the optical properties of a scene than a monochromatic image does, color can help us recognize complex material qualities. Here we show that human vision uses color statistics of an image for the perception of an ecologically important surface condition (i.e., wetness). Psychophysical experiments showed that overall enhancement of chromatic saturation, combined with a luminance tone change that increases the darkness and glossiness of the image, tended to make dry scenes look wetter. Theoretical analysis along with image analysis of real objects indicated that our image transformation, which we call the wetness enhancing transformation, is consistent with actual optical changes produced by surface wetting. Furthermore, we found that the wetness enhancing transformation operator was more effective for the images with many colors (large hue entropy) than for those with few colors (small hue entropy). The hue entropy may be used to separate surface wetness from other surface states having similar optical properties. While surface wetness and surface color might seem to be independent, there are higher order color statistics that can influence wetness judgments, in accord with the ecological statistics. The present findings indicate that the visual system uses color image statistics in an elegant way to help estimate the complex physical status of a scene.
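    The wetness enhancing transformation can be approximated with two generic image operations, a chromatic saturation boost and a darkening tone curve; the gain and gamma values below are illustrative choices of ours, not the values used in the psychophysical experiments.

```python
import colorsys
import numpy as np

def wetness_enhance(rgb, sat_gain=1.5, gamma=1.8):
    """Toy wetness-enhancing transform: boost saturation and apply a
    darkening tone curve (v**gamma for v in [0, 1]), which tends to
    make dry scenes look wetter. rgb is an (H, W, 3) float array."""
    out = np.empty_like(rgb, dtype=float)
    flat = rgb.reshape(-1, 3)
    oflat = out.reshape(-1, 3)
    for i, (r, g, b) in enumerate(flat):
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        s = min(1.0, s * sat_gain)   # overall saturation enhancement
        v = v ** gamma               # darker, glossier tone curve
        oflat[i] = colorsys.hsv_to_rgb(h, s, v)
    return out
```
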

  11. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    USGS Publications Warehouse

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
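    As a rough illustration of the 7Q10 statistic mentioned above: take the lowest 7-day moving-average flow in each year, then read off the flow with a 10-year recurrence interval. SWToolbox fits a probability distribution (e.g., log-Pearson Type III) to the annual minima; this sketch of ours substitutes the 0.1 empirical sample quantile, so it is only an approximation of the method.

```python
import numpy as np

def q7_10(daily_flows, days_per_year=365):
    """Empirical 7Q10 sketch: annual minima of the 7-day moving
    average, then the 0.1 sample quantile (10-year recurrence)."""
    flows = np.asarray(daily_flows, float)
    n_years = len(flows) // days_per_year
    minima = []
    for yr in range(n_years):
        year = flows[yr * days_per_year:(yr + 1) * days_per_year]
        ma7 = np.convolve(year, np.ones(7) / 7, mode="valid")
        minima.append(ma7.min())
    return float(np.quantile(minima, 0.1))
```
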

  12. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion which led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis that were used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  13. Surface inspection of flat products by means of texture analysis: on-line implementation using neural networks

    NASA Astrophysics Data System (ADS)

    Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A common feature of all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution (unzoomed) images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims at on-line performance in automated visual inspection applications where texture is present on flat product surfaces.

  14. Comparative evaluation of the effect of denture cleansers on the surface topography of denture base materials: An in-vitro study

    PubMed Central

    Jeyapalan, Karthigeyan; Kumar, Jaya Krishna; Azhagarasan, N. S.

    2015-01-01

    Aims: The aim was to evaluate and compare the effects of three chemically different commercially available denture cleansing agents on the surface topography of two different denture base materials. Materials and Methods: Three chemically different denture cleansers (sodium perborate, 1% sodium hypochlorite, 0.2% chlorhexidine gluconate) were used on two denture base materials (acrylic resin and chrome cobalt alloy) and the changes were evaluated at three time intervals (56 h, 120 h, 240 h). Changes in surface roughness from baseline were recorded quantitatively using a surface profilometer; qualitative surface analyses for all groups were done by scanning electron microscopy (SEM). Statistical Analysis Used: The values obtained were analyzed statistically using one-way ANOVA and paired t-test. Results: All three denture cleanser solutions showed no statistically significant surface changes on the acrylic resin portions at 56 h, 120 h, and 240 h of immersion. However, on the alloy portion, changes were significant at the end of 120 h and 240 h. Conclusion: Of the three denture cleansers used in the study, none produced significant changes on the two denture base materials for the short durations of immersion, whereas changes were seen as the immersion periods increased. PMID:26538915

  15. Multivariate analysis for stormwater quality characteristics identification from different urban surface types in Macau.

    PubMed

    Huang, J; Du, P; Ao, C; Ho, M; Lei, M; Zhao, D; Wang, Z

    2007-12-01

    Statistical analysis of stormwater runoff data enables general identification of runoff characteristics. Six catchments with different urban surface types, including roofs, roadway, park, and residential/commercial areas in Macau, were selected for sampling and study during the period from June 2005 to September 2006. Based on univariate statistical analysis of the sampled data, major pollutants discharged from the different urban surface types were identified. For iron roof runoff, Zn is the most significant pollutant. The major pollutants from urban roadway runoff are TSS and COD. Stormwater runoff from commercial/residential and park catchments shows high levels of COD, TN, and TP. Principal component analysis (PCA) was then applied to identify linkages between stormwater quality and urban surface types. Two potential pollution sources were identified for the study catchments: the first reflects nutrient losses, soil losses, and organic pollutant discharges; the second is related to heavy-metal losses. PCA proved to be a viable tool for explaining the types of pollution sources and their mechanisms across catchments with different urban surface types.

  16. A subdivision-based parametric deformable model for surface extraction and statistical shape modeling of the knee cartilages

    NASA Astrophysics Data System (ADS)

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K.; Ourselin, Sébastien

    2006-03-01

    Subdivision surfaces and parameterization are desirable for many algorithms that are commonly used in Medical Image Analysis. However, extracting an accurate surface and parameterization can be difficult for many anatomical objects of interest, due to noisy segmentations and the inherent variability of the object. The thin cartilages of the knee are an example of this, especially after damage is incurred from injuries or conditions like osteoarthritis. As a result, the cartilages can have different topologies or exist in multiple pieces. In this paper we present a topology preserving (genus 0) subdivision-based parametric deformable model that is used to extract the surfaces of the patella and tibial cartilages in the knee. These surfaces have minimal thickness in areas without cartilage. The algorithm inherently incorporates several desirable properties, including: shape based interpolation, sub-division remeshing and parameterization. To illustrate the usefulness of this approach, the surfaces and parameterizations of the patella cartilage are used to generate a 3D statistical shape model.

  17. Population analysis of the cingulum bundle using the tubular surface model for schizophrenia detection

    NASA Astrophysics Data System (ADS)

    Mohan, Vandana; Sundaramoorthi, Ganesh; Kubicki, Marek; Terry, Douglas; Tannenbaum, Allen

    2010-03-01

    We propose a novel framework for population analysis of DW-MRI data using the Tubular Surface Model. We focus on the Cingulum Bundle (CB) - a major tract for the Limbic System and the main connection of the Cingulate Gyrus, which has been associated with several aspects of Schizophrenia symptomatology. The Tubular Surface Model represents a tubular surface as a center-line with an associated radius function. It provides a natural way to sample statistics along the length of the fiber bundle and reduces the registration of fiber bundle surfaces to that of 4D curves. We apply our framework to a population of 20 subjects (10 normal, 10 schizophrenic) and obtain excellent results with neural network based classification (90% sensitivity, 95% specificity) as well as unsupervised clustering (k-means). Further, we apply statistical analysis to the feature data and characterize the discrimination ability of local regions of the CB, as a step towards localizing CB regions most relevant to Schizophrenia.

  18. Comparison of time-dependent changes in the surface hardness of different composite resins

    PubMed Central

    Ozcan, Suat; Yikilgan, Ihsan; Uctasli, Mine Betul; Bala, Oya; Kurklu, Zeliha Gonca Bek

    2013-01-01

    Objective: The aim of this study was to evaluate the change in surface hardness of a silorane-based composite resin (Filtek Silorane) over time and compare the results with the surface hardness of two methacrylate-based resins (Filtek Supreme and Majesty Posterior). Materials and Methods: From each composite material, 18 wheel-shaped samples (5-mm diameter and 2-mm depth) were prepared. Top and bottom surface hardness of these samples was measured using a Vickers hardness tester. The samples were then stored at 37°C and 100% humidity. After 24 h and 7, 30, and 90 days, the top and bottom surface hardness of the samples was measured. In each measurement, the ratio between the hardness of the top and bottom surfaces was recorded as the hardness ratio. Statistical analysis was performed by one-way analysis of variance, multiple comparisons by Tukey's test, and binary comparisons by t-test with a significance level of P = 0.05. Results: The highest hardness values were obtained from both surfaces of Majesty Posterior and the lowest from Filtek Silorane. Both the top and bottom surface hardness of the methacrylate-based composite resins was high, and there was a statistically significant difference between the top and bottom hardness values only for the silorane-based composite, Filtek Silorane (P < 0.05). The hardness values of all test groups increased after 24 h (P < 0.05). Conclusion: Although the silorane-based composite resin Filtek Silorane showed an adequate hardness ratio, the use of an incremental technique during application is more important for it than for methacrylate-based composites. PMID:24966724

  19. The potential of 2D Kalman filtering for soil moisture data assimilation

    USDA-ARS?s Scientific Manuscript database

    We examine the potential for parameterizing a two-dimensional (2D) land data assimilation system using spatial error auto-correlation statistics gleaned from a triple collocation analysis and the triplet of: (1) active microwave-, (2) passive microwave- and (3) land surface model-based surface soil ...

  20. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    PubMed

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
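    The kind of a priori power analysis described here can be approximated with the usual normal-approximation sample-size formula for comparing two group means. This generic sketch is not the authors' published tool; the effect is expressed directly as a mean difference `delta` against a standard deviation `sd`, both illustrative parameters of ours.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, power=0.8, alpha=0.05):
    """Normal-approximation sample size per group for a two-sample,
    two-sided comparison of means:
    n = 2 * ((z_{1-a/2} + z_{power}) * sd / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)
```

For a medium effect (delta = 0.5 sd) this gives the textbook figure of roughly 63 subjects per group at 80% power, which is why smoothing-driven gains in effect size translate directly into smaller required samples.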

  1. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
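
    A response surface in the sense surveyed here is simply a low-order polynomial fitted by least squares to sampled analysis outputs. A minimal one-dimensional sketch (illustrative data, not code from the survey):

    ```python
    import numpy as np

    def fit_quadratic_rs(x, y):
        """Least-squares fit of a 1-D quadratic response surface y ~ b0 + b1*x + b2*x^2."""
        A = np.vander(np.asarray(x, float), 3, increasing=True)  # columns: 1, x, x^2
        coef, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
        return coef

    def rs_predict(coef, x):
        """Evaluate the fitted surface at new points."""
        return np.vander(np.asarray(x, float), 3, increasing=True) @ coef

    # Pretend these five runs came from an expensive analysis code:
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = 2 + 3 * x + x ** 2
    coef = fit_quadratic_rs(x, y)
    ```

    Once fitted, `rs_predict` stands in for the expensive code during optimization; the danger discussed in the paper is that a deterministic code has no random error for the least-squares residuals to represent.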

  2. Statistical robustness of machine-learning estimates for characterizing a groundwater-surface water system, Southland, New Zealand

    NASA Astrophysics Data System (ADS)

    Friedel, M. J.; Daughney, C.

    2016-12-01

    The development of a successful surface-groundwater management strategy depends on the quality of data provided for analysis. This study evaluates the statistical robustness when using a modified self-organizing map (MSOM) technique to estimate missing values for three hypersurface models: synoptic groundwater-surface water hydrochemistry, time-series of groundwater-surface water hydrochemistry, and mixed-survey (combination of groundwater-surface water hydrochemistry and lithologies) hydrostratigraphic unit data. These models of increasing complexity are developed and validated based on observations from the Southland region of New Zealand. In each case, the estimation method is sufficiently robust to cope with groundwater-surface water hydrochemistry vagaries due to sample size and extreme data insufficiency, even when >80% of the data are missing. The estimation of surface water hydrochemistry time series values enabled the evaluation of seasonal variation, and the imputation of lithologies facilitated the evaluation of hydrostratigraphic controls on groundwater-surface water interaction. The robust statistical results for groundwater-surface water models of increasing data complexity provide justification to apply the MSOM technique in other regions of New Zealand and abroad.

  3. Incorporating Multi-criteria Optimization and Uncertainty Analysis in the Model-Based Systems Engineering of an Autonomous Surface Craft

    DTIC Science & Technology

    2009-09-01

    SAS: Statistical Analysis Software; SE: Systems Engineering; SEP: Systems Engineering Process; SHP: Shaft Horsepower; SIGINT: Signals Intelligence ... management occurs (OSD 2002). The Systems Engineering Process (SEP), displayed in Figure 2, is a comprehensive, iterative and recursive problem ...

  4. A data base of geologic field spectra

    NASA Technical Reports Server (NTRS)

    Kahle, A. B.; Goetz, A. F. H.; Paley, H. N.; Alley, R. E.; Abbott, E. A.

    1981-01-01

    It is noted that field samples measured in the laboratory do not always present an accurate picture of the ground surface sensed by airborne or spaceborne instruments because of the heterogeneous nature of most surfaces and because samples are disturbed and surface characteristics changed by collection and handling. The development of new remote sensing instruments relies on the analysis of surface materials in their natural state. The existence of thousands of Portable Field Reflectance Spectrometer (PFRS) spectra has necessitated a single, all-inclusive data base that permits greatly simplified searching and sorting procedures and facilitates further statistical analyses. The data base developed at JPL for cataloging geologic field spectra is discussed.

  5. The slip resistance of common footwear materials measured with two slipmeters.

    PubMed

    Chang, W R; Matz, S

    2001-12-01

    The slip resistance of 16 commonly used footwear materials was measured with the Brungraber Mark II and the English XL on 3 floor surfaces under dry, wet, oily and oily-wet surface conditions. Three samples were used for each material combination and surface condition. The results of a one-way ANOVA indicated that the differences among samples were statistically significant for a large number of material combinations and surface conditions. The results indicated that the ranking of materials based on their slip resistance values depends highly on the slipmeter, floor surface and surface condition. For contaminated surfaces, including wet, oily and oily-wet surfaces, the slip resistance obtained with the English XL was usually higher than that measured with the Brungraber Mark II. The correlation coefficients between the slip resistance values obtained with these two slipmeters, calculated for different surface conditions, indicated a strong correlation with statistical significance.
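
    The one-way ANOVA used to compare samples can be sketched directly from its sums of squares. This is the generic textbook form; the example groups are illustrative, not the paper's data:

    ```python
    from statistics import mean

    def one_way_anova(groups):
        """Return (F, df_between, df_within) for a one-way ANOVA over lists of samples."""
        k = len(groups)                                # number of groups
        n = sum(len(g) for g in groups)                # total observations
        grand = mean(x for g in groups for x in g)     # grand mean
        ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
        ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
        f = (ss_between / (k - 1)) / (ss_within / (n - k))
        return f, k - 1, n - k

    # Three illustrative sample groups (e.g. repeated slip-resistance readings):
    f_stat, df_b, df_w = one_way_anova([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
    ```

    The F statistic would then be compared against the F distribution with (df_between, df_within) degrees of freedom to obtain a p value.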

  6. Linear retrieval and global measurements of wind speed from the Seasat SMMR

    NASA Technical Reports Server (NTRS)

    Pandey, P. C.

    1983-01-01

    Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, based on the coefficient-of-determination selection criterion, to a statistical data base of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical data base. Global maps of WS were produced for one- and three-month periods.

  7. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, whose complexity increases as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
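
    Finding a basis on which a signal has a sparse decomposition is commonly done with a greedy matching-pursuit step. A minimal vector-space sketch in that spirit (a generic illustration, not the authors' currents-based implementation):

    ```python
    import numpy as np

    def matching_pursuit(signal, dictionary, n_atoms=3):
        """Greedy sparse decomposition of `signal` over unit-norm dictionary columns."""
        residual = signal.astype(float).copy()
        coeffs = np.zeros(dictionary.shape[1])
        for _ in range(n_atoms):
            corr = dictionary.T @ residual           # correlation with each atom
            k = int(np.argmax(np.abs(corr)))         # best-matching atom
            coeffs[k] += corr[k]
            residual -= corr[k] * dictionary[:, k]   # remove its contribution
        return coeffs, residual

    # Toy example: an orthonormal dictionary (identity) recovers the signal exactly.
    coeffs, residual = matching_pursuit(np.array([3.0, 0.0, 2.0, 0.0]), np.eye(4), n_atoms=2)
    ```

    With a redundant (non-orthogonal) dictionary the same loop still applies, but convergence is only asymptotic and atoms may be selected more than once.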

  8. Effect of tray-based and trayless tooth whitening systems on microhardness of enamel surface and subsurface.

    PubMed

    Teixeira, Erica C N; Ritter, André V; Thompson, Jeffrey Y; Leonard, Ralph H; Swift, Edward J

    2004-12-01

    To evaluate the effect of tray-based and trayless tooth whitening systems on surface and subsurface microhardness of human enamel. Enamel slabs were obtained from recently extracted human third molars. Specimens were randomly assigned to six groups according to tooth whitening treatment (n = 10): 6.0% hydrogen peroxide (HP) (Crest Whitestrips), 6.5% HP (Crest Professional Whitestrips), 7.5% HP (Day White Excel 3), 9.5% HP (Day White Excel 3), 10% carbamide peroxide (Opalescence), and a control group (untreated). Specimens were treated for 14 days following manufacturers' recommended protocols, and were immersed in artificial saliva between treatments. Enamel surface Knoop microhardness (KHN) was measured immediately before treatment, and at days 1, 7, and 14 of treatment. After treatment, subsurface microhardness was measured at depths of 50-500 microm. Data were analyzed for statistical significance using analysis of variance. Differences in microhardness for treated vs. untreated enamel surface were not statistically significant at any time interval. For 6.5% and 9.5% HP, there was a decrease in surface microhardness values during treatment, but at the end of treatment the microhardness values were not statistically different from the baseline values. For the enamel subsurface values, no differences were observed between treated vs. untreated specimens at each depth. Trayless and tray-based tooth whitening treatments do not significantly affect surface or subsurface enamel microhardness.

  9. Numerical analysis of the effect of surface roughness on mechanical fields in polycrystalline aggregates

    NASA Astrophysics Data System (ADS)

    Guilhem, Yoann; Basseville, Stéphanie; Curtit, François; Stéphan, Jean-Michel; Cailletaud, Georges

    2018-06-01

    This paper is dedicated to the study of the influence of surface roughness on local stress and strain fields in polycrystalline aggregates. Finite element computations are performed with a crystal plasticity model on a 316L stainless steel polycrystalline material element with different roughness states on its free surface. The subsequent analysis of the plastic strain localization patterns shows that surface roughness strongly affects the plastic strain localization induced by crystallography. Nevertheless, this effect mainly takes place at the surface and vanishes under the first layer of grains, which implies the existence of a critical perturbed depth. A statistical analysis based on the plastic strain distribution obtained for different roughness levels provides a simple rule to define the size of the affected zone depending on the rough surface parameters.

  10. Utility of Gram stain for the microbiological analysis of burn wound surfaces.

    PubMed

    Elsayed, Sameer; Gregson, Daniel B; Lloyd, Tracie; Crichton, Marilyn; Church, Deirdre L

    2003-11-01

    Surface swab cultures have attracted attention as a potential alternative to biopsy histology or quantitative culture methods for microbiological burn wound monitoring. To our knowledge, the utility of adding a Gram-stained slide in this context has not been evaluated previously. To determine the degree of correlation of Gram stain with culture for the microbiological analysis of burn wound surfaces. Prospective laboratory analysis. Urban health region/centralized diagnostic microbiology laboratory. Burn patients hospitalized in any Calgary Health Region burn center from November 2000 to September 2001. Gram stain plus culture of burn wound surface swab specimens obtained during routine dressing changes or based on clinical signs of infection. Degree of correlation (complete, high, partial, none), including weighted kappa statistic (kappa(w)), of Gram stain with culture based on quantitative microscopy and degree of culture growth. A total of 375 specimens from 50 burn patients were evaluated. Of these, 239 were negative by culture and Gram stain, 7 were positive by Gram stain only, 89 were positive by culture only, and 40 were positive by both methods. The degree of complete, high, partial, and no correlation of Gram stain with culture was 70.9% (266/375), 1.1% (4/375), 2.4% (9/375), and 25.6% (96/375), respectively. The degree of correlation for all 375 specimens, as expressed by the weighted kappa statistic, was found to be fair (kappa(w) = 0.32). Conclusion: The Gram stain is not suitable for the microbiological analysis of burn wound surfaces.
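
    A weighted kappa of the kind reported here can be computed from a square agreement table. This is a generic sketch with the standard linear or quadratic weights; the paper's exact weighting scheme is not specified in the abstract:

    ```python
    def weighted_kappa(table, weights="linear"):
        """Cohen's weighted kappa from a square confusion matrix (list of lists)."""
        k = len(table)
        n = sum(sum(row) for row in table)
        row_tot = [sum(row) for row in table]
        col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]

        def w(i, j):
            # disagreement weight: 0 on the diagonal, growing with |i - j|
            d = abs(i - j) / (k - 1)
            return d if weights == "linear" else d ** 2

        obs = sum(w(i, j) * table[i][j] for i in range(k) for j in range(k)) / n
        exp = sum(w(i, j) * row_tot[i] * col_tot[j]
                  for i in range(k) for j in range(k)) / n ** 2
        return 1 - obs / exp
    ```

    Perfect agreement gives kappa = 1, chance-level agreement gives 0, and systematic disagreement gives negative values; 0.32 falls in the conventional "fair" band.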

  11. Urban pavement surface temperature. Comparison of numerical and statistical approach

    NASA Astrophysics Data System (ADS)

    Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia

    2015-04-01

    The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, where it is used to manage the snow plowing and salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. Traffic was originally considered constant. Many changes were made to a numerical model to describe as accurately as possible the effects of traffic on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement with the forecasts from the numerical model based on this energy balance approach. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares (PLS) regression was also developed, using data from thermal mapping with infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained in the specific case of an urban configuration, with traffic taken into account in the measurements used for the statistical analysis. A comparison between the results of the numerical model based on the energy balance and those of PCA/PLS was then conducted, indicating the advantages and limits of each approach.
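
    The statistical branch of the comparison combines PCA with a regression step. A simplified principal-component-regression sketch in that spirit (a stand-in for the PCA/PLS pipeline, with illustrative data):

    ```python
    import numpy as np

    def pcr_fit(X, y, n_components=2):
        """Principal-component regression: PCA on X, then least squares on the scores."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xc = X - x_mean
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        V = Vt[:n_components].T          # retained loadings
        T = Xc @ V                       # component scores
        coef, *_ = np.linalg.lstsq(T, y - y_mean, rcond=None)
        return V, coef, x_mean, y_mean

    def pcr_predict(model, Xnew):
        V, coef, x_mean, y_mean = model
        return (Xnew - x_mean) @ V @ coef + y_mean

    # Illustrative predictors (e.g. air temperature and a traffic index) and target:
    X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
    y = 3 * X[:, 0] - X[:, 1] + 5
    model = pcr_fit(X, y, n_components=2)
    ```

    PLS differs from this sketch in that its components are chosen to maximize covariance with y rather than the variance of X alone, but the fit/predict structure is the same.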

  12. Scanning probe recognition microscopy investigation of tissue scaffold properties

    PubMed Central

    Fan, Yuan; Chen, Qian; Ayres, Virginia M; Baczewski, Andrew D; Udpa, Lalita; Kumar, Shiva

    2007-01-01

    Scanning probe recognition microscopy is a new scanning probe microscopy technique which enables selective scanning along individual nanofibers within a tissue scaffold. Statistically significant data for multiple properties can be collected by repetitively fine-scanning an identical region of interest. The results of a scanning probe recognition microscopy investigation of the surface roughness and elasticity of a series of tissue scaffolds are presented. Deconvolution and statistical methods were developed and used for data accuracy along curved nanofiber surfaces. Nanofiber features were also independently analyzed using transmission electron microscopy, with results that supported the scanning probe recognition microscopy-based analysis. PMID:18203431

  14. Method and algorithm of automatic estimation of road surface type for variable damping control

    NASA Astrophysics Data System (ADS)

    Dąbrowski, K.; Ślaski, G.

    2016-09-01

    In this paper the authors present an approach to road surface estimation (recognition) based on statistical analysis of suspension dynamic response signals. For a preliminary analysis the cumulative distribution function (CDF) was used, leading to the observation that different road surfaces produce response values within different ranges of limits for the same percentage of samples, or, for the same limits, different percentages of samples located within the range between the limit values. This observation was the basis for the algorithm developed and presented here, which was tested using suspension response signals recorded during road tests over various surfaces. The proposed algorithm can be an essential part of an adaptive damping control algorithm for a vehicle suspension, or of an adaptive control strategy for suspension damping control.
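
    The CDF observation above suggests a simple feature: the fraction of response samples falling within fixed limits. A minimal sketch, where the profile fractions and the limit are hypothetical values rather than the paper's calibrated ones:

    ```python
    def fraction_within(samples, limit):
        """Fraction of |response| samples within +/-limit (an empirical-CDF feature)."""
        return sum(abs(x) <= limit for x in samples) / len(samples)

    def classify_surface(samples, profiles, limit=1.0):
        """Pick the surface profile whose stored fraction is nearest the observed one."""
        f = fraction_within(samples, limit)
        return min(profiles, key=lambda name: abs(profiles[name] - f))

    # Hypothetical calibration: on a smooth road 95% of responses stay within the limit,
    # on cobblestone only 55% do.
    PROFILES = {"smooth": 0.95, "cobblestone": 0.55}
    ```

    In a real controller the fractions would be learned from recorded test rides, and the decision would feed the damping-mode selection.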

  15. Airport Surface Delays and Causes: A Preliminary Analysis

    NASA Technical Reports Server (NTRS)

    Chin, David K.; Goldberg, Jay; Tang, Tammy

    1997-01-01

    This report summarizes FAA Program Analysis and Operations Research Service (ASD-400)/Lockheed Martin activities and findings related to airport surface delays and causes, in support of NASA Langley Research Center's Terminal Area Productivity (TAP) Program. The activities described in this report were initiated in June 1995. A preliminary report was published on September 30, 1995. The final report incorporates data collection forms filled out by traffic managers, other FAA staff, and an airline for the New York City area, some updates, data previously requested from various sources to support this analysis, and further quantification and documentation than in the preliminary report. This final report is based on data available as of April 12, 1996. This report incorporates data obtained from review and analysis of data bases and literature, discussions/interviews with engineers, air-traffic staff, other FAA technical personnel, and airline staff, site visits, and a survey on surface delays and causes. It includes analysis of delay statistics; preliminary findings and conclusions on surface movement, surface delay sources and causes, runway occupancy time (ROT), and airport characteristics impacting surface operations and delays; and site-specific data on the New York City area airports, which are the focus airports for this report.

  16. Statistical characterization of short wind waves from stereo images of the sea surface

    NASA Astrophysics Data System (ADS)

    Mironov, Alexey; Yurovskaya, Maria; Dulov, Vladimir; Hauser, Danièle; Guérin, Charles-Antoine

    2013-04-01

    We propose a methodology to extract short-scale statistical characteristics of the sea surface topography by means of stereo image reconstruction. The possibilities and limitations of the technique are discussed and tested on a data set acquired from an oceanographic platform in the Black Sea. The analysis shows that reconstruction of the topography by the stereo method is an efficient way to derive non-trivial statistical properties of short and intermediate surface waves (from roughly 1 centimeter to 1 meter). Most technical issues pertaining to this type of dataset (limited range of scales, lacunarity of data, irregular sampling) can be partially overcome by appropriate processing of the available points. The proposed technique also allows one to avoid linear interpolation, which dramatically corrupts the properties of retrieved surfaces. The processing technique requires that the field of elevations be polynomially detrended, which has the effect of filtering out the large scales. Hence the statistical analysis can only address the small-scale components of the sea surface. The precise cut-off wavelength, which is approximately half the patch size, can be obtained by applying a high-pass frequency filter to the reference gauge time records. The results obtained for the one- and two-point statistics of small-scale elevations are shown to be consistent, at least in order of magnitude, with the corresponding gauge measurements as well as with other experimental measurements available in the literature. The calculation of the structure functions provides a powerful tool to investigate spectral and statistical properties of the field of elevations. An experimental parametrization of the third-order structure function, the so-called skewness function, is one of the most important and original outcomes of this study. This function is of primary importance in analytical models of scattering from the sea surface and was until now unavailable in field conditions.
    Due to the lack of precise reference measurements for the small-scale wave field, we could not quantify exactly the accuracy of the retrieval technique. However, it appeared clearly that the obtained accuracy is good enough for the estimation of second-order statistical quantities (such as the correlation function), acceptable for third-order quantities (such as the skewness function), and insufficient for fourth-order quantities (such as the kurtosis). Therefore, the stereo technique at the present stage should not be thought of as a self-contained universal tool to characterize the surface statistics. Instead, it should be used in conjunction with other well-calibrated but sparse reference measurements (such as wave gauges) for cross-validation and calibration. It then completes the statistical analysis inasmuch as it provides a snapshot of the three-dimensional field and allows for the evaluation of higher-order spatial statistics.
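
    The structure functions discussed above have a direct estimator along a 1-D elevation profile: S_p(r) is the mean p-th power of elevation increments at lag r. A minimal sketch:

    ```python
    import numpy as np

    def structure_function(z, lags, order=2):
        """S_p(r) = <(z(x+r) - z(x))**p> estimated along a 1-D elevation profile."""
        z = np.asarray(z, dtype=float)
        return np.array([np.mean((z[lag:] - z[:-lag]) ** order) for lag in lags])

    # Sanity check on a linear ramp: increments at lag r equal r, so S_2(r) = r^2.
    s2 = structure_function(np.arange(10.0), [1, 2], order=2)
    ```

    order=2 gives the correlation-related second-order statistics, order=3 the skewness function, and order=4 the kurtosis-related quantity; as the abstract notes, higher orders demand progressively better measurement accuracy.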

  17. The EUSTACE project: delivering global, daily information on surface air temperature

    NASA Astrophysics Data System (ADS)

    Ghent, D.; Rayner, N. A.

    2017-12-01

    Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-2018, https://www.eustaceproject.eu) we have developed an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. This includes developing new "Big Data" analysis methods as the data volumes involved are considerable. We will present recent progress along this road in the EUSTACE project, i.e.: • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.

  18. Comparison of Response Surface and Kriging Models in the Multidisciplinary Design of an Aerospike Nozzle

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.

    1998-01-01

    The use of response surface models and kriging models is compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial approximation models, kriging is presented as an alternative statistics-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle, which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed, along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models, using only a constant for the underlying global model and a Gaussian correlation function, perform as well as the second-order polynomial response surface models.
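
    A kriging predictor with a constant global model and Gaussian correlation, as described above, can be sketched in a few lines. This is the simple-kriging form with a fixed correlation parameter; an illustration, not the paper's implementation:

    ```python
    import numpy as np

    def kriging_fit(X, y, theta=1.0):
        """Kriging with constant mean and Gaussian correlation R_ij = exp(-theta*d^2)."""
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        R = np.exp(-theta * d2)
        Rinv = np.linalg.inv(R + 1e-10 * np.eye(len(X)))   # small jitter for stability
        ones = np.ones(len(X))
        mu = (ones @ Rinv @ y) / (ones @ Rinv @ ones)      # generalized LS constant mean
        return X, y, mu, Rinv, theta

    def kriging_predict(model, Xnew):
        X, y, mu, Rinv, theta = model
        d2 = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        r = np.exp(-theta * d2)
        return mu + r @ Rinv @ (y - mu)

    # Three sample points from a deterministic "analysis code":
    X = np.array([[0.0], [1.0], [2.0]])
    y = np.array([0.0, 1.0, 4.0])
    model = kriging_fit(X, y)
    ```

    Unlike a least-squares response surface, this predictor interpolates the training data exactly, which is the property that makes kriging attractive for deterministic codes.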

  19. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed, based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of the spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
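
    The signal envelope used as the basis for classification can be obtained from the analytic signal via an FFT-based Hilbert transform, and its amplitude spectrum then follows from a second Fourier transform. A minimal sketch of this generic method (not the authors' code):

    ```python
    import numpy as np

    def envelope(x):
        """Signal envelope via the analytic signal (FFT-based Hilbert transform)."""
        n = len(x)
        Xf = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1
        h[1:(n + 1) // 2] = 2          # double positive frequencies
        if n % 2 == 0:
            h[n // 2] = 1              # Nyquist bin for even-length signals
        return np.abs(np.fft.ifft(Xf * h))

    def envelope_spectrum(x):
        """Amplitude spectrum of the (demeaned) envelope: the classification features."""
        env = envelope(x)
        return np.abs(np.fft.rfft(env - env.mean()))

    # AM test signal: 50 Hz carrier modulated at 5 Hz over a 1 s window.
    t = np.arange(1000) / 1000.0
    x = (1 + 0.5 * np.cos(2 * np.pi * 5 * t)) * np.cos(2 * np.pi * 50 * t)
    env = envelope(x)
    ```

    For this periodic test signal the recovered envelope matches 1 + 0.5*cos(2*pi*5*t) and the envelope spectrum peaks at the 5 Hz modulation bin, which is exactly the kind of feature the classifier exploits.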

  20. Statistical characterization of surface features from tungsten-coated divertor inserts in the DIII-D Metal Rings Campaign

    NASA Astrophysics Data System (ADS)

    Adams, Jacob; Unterberg, Ezekial; Chrobak, Christopher; Stahl, Brian; Abrams, Tyler

    2017-10-01

    Continuing analysis of tungsten-coated inserts from the recent DIII-D Metal Rings Campaign utilizes a statistical approach to study carbon migration and deposition on W surfaces and to characterize the pre- versus post-exposure surface morphology. A TZM base was coated with W using both CVD and PVD, allowing comparison between the two coating methods. The W inserts were positioned in the lower DIII-D divertor in both the upper (shelf) region and lower (floor) region and subjected to multiple plasma shots, primarily in H-mode. Currently, the post-exposure W inserts are being characterized using SEM/EDX to qualify the surface morphology and to quantify the surface chemical composition. In addition, profilometry is being used to measure the surface roughness of the inserts both before and after plasma exposure. Preliminary results suggest a correlation between the pre-exposure surface roughness and the level of carbon deposited on the surface. Furthermore, ongoing in-depth analysis may reveal insights into the formation mechanism of nanoscale bumps found in the carbon-rich regions of the W surfaces that have not yet been explained. Work supported in part by US DoE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.

  1. Statistics-based optimization of the polarimetric radar hydrometeor classification algorithm and its application for a squall line in South China

    NASA Astrophysics Data System (ADS)

    Wu, Chong; Liu, Liping; Wei, Ming; Xi, Baozhu; Yu, Minghui

    2018-03-01

    A modified hydrometeor classification algorithm (HCA) is developed in this study for Chinese polarimetric radars. The algorithm is based on the U.S. operational HCA, and a methodology of statistics-based optimization is proposed, including calibration checking, dataset selection, membership-function modification, computation-threshold modification, and effect verification. These procedures are applied to Zhuhai radar, the first operational polarimetric radar in South China. The systematic calibration bias is corrected; the reliability of the radar measurements deteriorates when the signal-to-noise ratio is low, and the correlation coefficient within the melting layer is usually lower than that of the U.S. WSR-88D radar. Through modification based on statistical analysis of the polarimetric variables, a localized HCA specific to Zhuhai is obtained, and it performs well over a one-month test through comparison with sounding and surface observations. The algorithm is then utilized for analysis of a squall line on 11 May 2014 and is found to provide reasonable details with respect to horizontal and vertical structures, and the HCA results, especially in the mixed rain-hail region, can reflect the life cycle of the squall line. In addition, the kinematic and microphysical processes of cloud evolution and the differences between radar-detected hail and surface observations are analyzed. The results of this study provide evidence for the improvement of this HCA developed specifically for China.
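
    Operational HCAs of this family score each class by aggregating fuzzy membership functions of the polarimetric variables. A toy sketch of that scoring scheme; the trapezoid parameters and classes below are hypothetical, not the WSR-88D or Zhuhai values:

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership function, as used in fuzzy hydrometeor classification."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def classify(obs, classes):
        """Sum per-variable memberships for each class and return the best-scoring one."""
        scores = {name: sum(trapezoid(obs[var], *mf) for var, mf in mfs.items())
                  for name, mfs in classes.items()}
        return max(scores, key=scores.get)

    # Hypothetical membership functions over differential reflectivity (dB):
    CLASSES = {
        "rain": {"zdr": (0.5, 1.0, 3.0, 4.0)},
        "hail": {"zdr": (-1.0, -0.5, 0.5, 1.0)},
    }
    ```

    The statistics-based optimization described in the abstract corresponds to adjusting the (a, b, c, d) breakpoints so the membership functions match the local distribution of each variable.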

  2. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
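
    Grid-based Bayesian spectral fitting of this kind reduces to evaluating a likelihood for each synthetic spectrum and normalizing. A minimal sketch with a flat prior and independent Gaussian noise; the grid keys and values are illustrative, not TLUSTY output:

    ```python
    import math

    def grid_posterior(observed, models, sigma=0.1):
        """Posterior over a grid of synthetic spectra (flat prior, Gaussian noise)."""
        log_like = {}
        for params, model in models.items():
            log_like[params] = -sum((o - m) ** 2
                                    for o, m in zip(observed, model)) / (2 * sigma ** 2)
        top = max(log_like.values())                       # shift for numerical stability
        un = {p: math.exp(v - top) for p, v in log_like.items()}
        z = sum(un.values())
        return {p: v / z for p, v in un.items()}

    # Hypothetical two-point grid over effective temperature, three-pixel "spectra":
    MODELS = {(15000,): [1.0, 0.8, 0.6], (20000,): [1.0, 0.5, 0.2]}
    post = grid_posterior([1.0, 0.5, 0.2], MODELS, sigma=0.05)
    ```

    Using every available line simply extends the pixel sum in the likelihood, which is how simultaneous constraints on temperature, gravity, and abundances tighten the posterior.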

  3. A statistical motion model based on biomechanical simulations for data fusion during image-guided prostate interventions.

    PubMed

    Hu, Yipeng; Morgan, Dominic; Ahmed, Hashim Uddin; Pendsé, Doug; Sahu, Mahua; Allen, Clare; Emberton, Mark; Hawkes, David; Barratt, Dean

    2008-01-01

    A method is described for generating a patient-specific, statistical motion model (SMM) of the prostate gland. Finite element analysis (FEA) is used to simulate the motion of the gland using an ultrasound-based 3D FE model over a range of plausible boundary conditions and soft-tissue properties. By applying principal component analysis to the displacements of the FE mesh node points inside the gland, the simulated deformations are then used as training data to construct the SMM. The SMM is used to both predict the displacement field over the whole gland and constrain a deformable surface registration algorithm, given only a small number of target points on the surface of the deformed gland. Using 3D transrectal ultrasound images of the prostates of five patients, acquired before and after imposing a physical deformation, to evaluate the accuracy of predicted landmark displacements, the mean target registration error was found to be less than 1.9 mm.
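
    The SMM construction described, PCA applied to simulated node displacements, can be sketched generically. Synthetic training data stand in for the FE simulations; this is not the authors' pipeline:

    ```python
    import numpy as np

    def build_motion_model(displacements, n_modes=2):
        """PCA of simulated node displacements: mean field plus principal motion modes."""
        D = np.asarray(displacements, float)        # shape (n_simulations, n_dofs)
        mean = D.mean(axis=0)
        _, s, Vt = np.linalg.svd(D - mean, full_matrices=False)
        stds = s[:n_modes] / np.sqrt(max(len(D) - 1, 1))   # mode standard deviations
        return mean, Vt[:n_modes], stds

    def synthesize(mean, modes, stds, coeffs):
        """New plausible displacement field; coeffs are in units of std deviations."""
        return mean + np.asarray(coeffs) @ (np.asarray(stds)[:, None] * modes)

    # Synthetic training set: one true motion mode plus a constant offset.
    rng = np.random.default_rng(0)
    true_mode = np.array([1.0, 0.0, 1.0, 0.0]) / np.sqrt(2)
    D = np.outer(rng.normal(size=50), true_mode) + 5.0
    mean, modes, stds = build_motion_model(D, n_modes=1)
    ```

    During registration, only the few mode coefficients are optimized against the sparse target points, which is what constrains the whole-gland displacement prediction.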

  4. The Effect of Different Chemical Surface Treatments of Denture Teeth on Shear Bond Strength: A Comparative Study

    PubMed Central

    Palekar, Umesh; Awinashe, Vaibav; Mishra, Sunil Kumar; Kawadkar, Abhishek; Rahangdale, Tripti

    2014-01-01

    Background: The development of better cross-linked acrylic resin teeth has solved the problems related to wearing and discoloration of acrylic teeth. The same cross-linking at the ridge lap region acts as a double-edged sword, as it weakens the bond between the denture base and the tooth. Aim of Study: The purpose of this study was to evaluate the effect of surface treatment on the bond strength of resin teeth to denture base resin using monomethyl methacrylate monomer and dichloromethane, with no surface treatment acting as control. Settings and Design: Denture base cylinder samples in wax (n=180) were made with a maxillary central incisor attached at 45° (JIS T 6506). These samples were randomly and equally divided into three groups of 60 each. The specimens were then flasked and dewaxed as per the standard protocol. Materials and Methods: Before acrylization, the ridge lap area was treated as follows: Group A, no surface treatment (control); Group B, treated with monomethyl methacrylate monomer; Group C, treated with dichloromethane. A digitally controlled acryliser was used for acrylization as per the manufacturer's instructions, and shear bond strength was tested on a Universal Testing Machine (Servo Hydraulic, 50 kN High Strain, BISS Research). Statistical Analysis Used: Results were statistically analyzed with one-way analysis of variance (ANOVA) and post-hoc Tukey's HSD test at the 5% level of significance. Results: The application of dichloromethane showed increased bond strength between cross-linked acrylic resin teeth and heat-cure denture base resin, followed by monomethyl methacrylate monomer and the control group. Conclusion: The application of dichloromethane on the ridge lap surface of the resin teeth before packing of the dough into the mold significantly increased the bond strength between cross-linked acrylic resin teeth and heat-cure denture base resin. PMID:25121057

  5. Improved disparity map analysis through the fusion of monocular image segmentations

    NASA Technical Reports Server (NTRS)

    Perlant, Frederic P.; Mckeown, David M.

    1991-01-01

    The focus is to examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by analysis of the original monocular imagery. Surface illumination information is exploited by segmenting the monocular image into fine surface patches of nearly homogeneous intensity, which are used to remove mismatches generated during stereo matching. These patches guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely to physical surfaces in the scene. The technique is largely independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Stereo analysis results are presented for a complex urban scene containing various man-made and natural features. This scene poses a variety of problems, including low building height with respect to the stereo baseline, buildings and roads in complex terrain, and highly textured buildings and terrain. The improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated. The generality of this approach to stereo analysis and its utility in the development of general three-dimensional scene interpretation systems are also discussed.
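    One form such patch-guided statistical analysis can take is flagging disparities that deviate strongly from their segment's median, on the assumption that a nearly homogeneous patch lies on one physical surface. The segment labels, disparity values, and tolerance below are invented for illustration and are not from the paper.

```python
from statistics import median

def flag_mismatches(labels, disparities, tol=2.0):
    """Flag disparities deviating more than `tol` from their segment's median."""
    by_seg = {}
    for lab, d in zip(labels, disparities):
        by_seg.setdefault(lab, []).append(d)
    med = {lab: median(vals) for lab, vals in by_seg.items()}
    return [abs(d - med[lab]) > tol for lab, d in zip(labels, disparities)]

# Two segments; one gross stereo mismatch (disparity 30.0) inside segment "A":
flags = flag_mismatches(["A", "A", "A", "B", "B"], [10.0, 11.0, 30.0, 5.0, 5.5])
# flags -> [False, False, True, False, False]
```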

  6. The EUSTACE project: delivering global, daily information on surface air temperature

    NASA Astrophysics Data System (ADS)

    Rayner, Nick

    2017-04-01

    Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project: 1. providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; 2. identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; 3. estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; 4. using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.

  7. The EUSTACE project: delivering global, daily information on surface air temperature

    NASA Astrophysics Data System (ADS)

    Ghent, D.; Rayner, N. A.

    2016-12-01

    Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-June 2018, https://www.eustaceproject.eu) we are developing an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. As the data volumes involved are considerable, such work needs to include development of new "Big Data" analysis methods. We will present recent progress along this road in the EUSTACE project, i.e.: • providing new, consistent, multi-component estimates of uncertainty in surface skin temperature retrievals from satellites; • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.

  8. Digital recovery, modification, and analysis of Tetra Tech seismic horizon mapping, National Petroleum Reserve Alaska (NPRA), northern Alaska

    USGS Publications Warehouse

    Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.

    2002-01-01

    We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.

  9. Analysis of Cortical Shape in Children with Simplex Autism

    PubMed Central

    Dierker, Donna L.; Feczko, Eric; Pruett, John R.; Petersen, Steven E.; Schlaggar, Bradley L.; Constantino, John N.; Harwell, John W.; Coalson, Timothy S.; Van Essen, David C.

    2015-01-01

    We used surface-based morphometry to test for differences in cortical shape between children with simplex autism (n = 34, mean age 11.4 years) and typical children (n = 32, mean age 11.3 years). This entailed testing for group differences in sulcal depth and in 3D coordinates after registering cortical midthickness surfaces to an atlas target using 2 independent registration methods. We identified bilateral differences in sulcal depth in restricted portions of the anterior-insula and frontal-operculum (aI/fO) and in the temporoparietal junction (TPJ). The aI/fO depth differences are associated with and likely to be caused by a shape difference in the inferior frontal gyrus in children with simplex autism. Comparisons of average midthickness surfaces of children with simplex autism and those of typical children suggest that the significant sulcal depth differences represent local peaks in a larger pattern of regional differences that are below statistical significance when using coordinate-based analysis methods. Cortical regions that are statistically significant before correction for multiple measures are peaks of more extended, albeit subtle regional differences that may guide hypothesis generation for studies using other imaging modalities. PMID:24165833

  10. Effect of denture cleansers on color stability, surface roughness, and hardness of different denture base resins

    PubMed Central

    Porwal, Anand; Khandelwal, Meenakshi; Punia, Vikas; Sharma, Vivek

    2017-01-01

    Aim: The purpose of this study was to evaluate the effect of different denture cleansers on the color stability, surface hardness, and roughness of different denture base resins. Materials and Methods: Three denture base resin materials (conventional heat-cure resin, high-impact resin, and polyamide denture base resin) were immersed for 180 days in two commercially available denture cleansers (sodium perborate and sodium hypochlorite). Color, surface roughness, and hardness were measured for each sample before and after the immersion procedure. Statistical Analysis: One-way analysis of variance and Tukey's post hoc honestly significant difference test were used to evaluate the color, surface roughness, and hardness data before and after immersion in denture cleanser (α = 0.05). Results: All denture base resins tested exhibited some degree of change in color, surface roughness, and hardness in both denture cleansers. Polyamide resin immersed in sodium perborate showed the maximum change in color after immersion for 180 days. Conventional heat-cure resin immersed in sodium hypochlorite showed the maximum change in surface roughness, and conventional heat-cure resin immersed in sodium perborate showed the maximum change in hardness. Conclusion: Color changes of all denture base resins were within the clinically accepted range for color difference. The surface roughness change of conventional heat-cure resin was not within the clinically accepted range. The choice of denture cleanser for a given denture base resin should be based on the chemistry of the resin and cleanser, the denture cleanser concentration, and the duration of immersion. PMID:28216847

  11. Helioseismology of pre-emerging active regions. III. Statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, G.; Leka, K. D.; Braun, D. C.

    The subsurface properties of active regions (ARs) prior to their appearance at the solar surface may shed light on the process of AR formation. Helioseismic holography has been applied to samples taken from two populations of regions on the Sun (pre-emergence and without emergence), each sample having over 100 members, that were selected to minimize systematic bias, as described in Paper I. Paper II showed that there are statistically significant signatures in the average helioseismic properties that precede the formation of an AR. This paper describes a more detailed analysis of the samples of pre-emergence regions and regions without emergence based on discriminant analysis. The property that is best able to distinguish the populations is found to be the surface magnetic field, even a day before the emergence time. However, after accounting for the correlations between the surface field and the quantities derived from helioseismology, there is still evidence of a helioseismic precursor to AR emergence that is present for at least a day prior to emergence, although the analysis presented cannot definitively determine the subsurface properties prior to emergence due to the small sample sizes.
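    Discriminant analysis of the kind described can be illustrated, in its simplest one-variable form, by Fisher's linear discriminant: a threshold between the class means plus a separability score. The two samples below are synthetic stand-ins, not the helioseismic or magnetic-field quantities from the papers.

```python
def fisher_discriminant(x1, x2):
    """1-D Fisher discriminant: threshold midway between class means, and a
    separability score (squared mean gap over the sum of class variances)."""
    m1, m2 = sum(x1) / len(x1), sum(x2) / len(x2)
    v1 = sum((x - m1) ** 2 for x in x1) / (len(x1) - 1)
    v2 = sum((x - m2) ** 2 for x in x2) / (len(x2) - 1)
    threshold = (m1 + m2) / 2.0
    score = (m1 - m2) ** 2 / (v1 + v2)
    return threshold, score

# Synthetic one-property samples: pre-emergence vs. non-emerging regions.
thr, score = fisher_discriminant([5.0, 6.0, 7.0], [1.0, 2.0, 3.0])
# thr -> 4.0, score -> 8.0
```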

  12. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is one type of plastic that is widely used in our everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to response surface methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure, and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of the process parameters and their interactions, the ANOVA method was utilized alongside a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, outlier t plots, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model fit the experimental results well. At the optimum conditions, with a temperature of 75°C, system pressure of 25 bar, and hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed, with over a 95% confidence level, for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
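    The quadratic response-surface fit behind such an optimization can be sketched with ordinary least squares. The two normalized factors and the response below are synthetic, with a known optimum planted at T = 0.5, P = 0.3; they are not the pilot-plant data or its actual model.

```python
import numpy as np

# Synthetic two-factor response with a known maximum at (0.5, 0.3).
T, P = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
t, p = T.ravel(), P.ravel()
y = 5.0 - (t - 0.5) ** 2 - (p - 0.3) ** 2

# Quadratic RSM model: y = b0 + b1*t + b2*p + b3*t^2 + b4*p^2 + b5*t*p
X = np.column_stack([np.ones_like(t), t, p, t ** 2, p ** 2, t * p])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface (no cross-term in the truth,
# so the two factors decouple):
t_opt = -b[1] / (2 * b[3])
p_opt = -b[2] / (2 * b[4])
```

    In a real study the optimum of the fitted surface is then validated experimentally, as the authors did at 75°C, 25 bar, and 2% hydrogen.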

  13. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  14. Tolerancing aspheres based on manufacturing statistics

    NASA Astrophysics Data System (ADS)

    Wickenhagen, S.; Möhl, A.; Fuchs, U.

    2017-11-01

    A standard way of tolerancing optical elements or systems is to perform a Monte Carlo based analysis within a common optical design software package. Although different weightings and distributions are assumed, these methods all rely on statistics, which usually means several hundred or thousands of simulated systems for reliable results. Employing these methods for small batch sizes is therefore unreliable, especially when aspheric surfaces are involved. The large database of asphericon was used to investigate the correlation between the given tolerance values and measured data sets. The resulting probability distributions of these measured data were analyzed, aiming for a robust optical tolerancing process.
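    The Monte Carlo tolerancing idea can be reduced to a minimal sketch: draw surface errors from an assumed distribution and count the fraction of simulated systems that stay within spec. The uniform error model, spec limit, and sample count below are invented stand-ins for a measured error distribution.

```python
import random

def mc_yield(n_systems, draw_error, spec):
    """Fraction of simulated systems whose surface error stays within spec."""
    ok = sum(1 for _ in range(n_systems) if abs(draw_error()) <= spec)
    return ok / n_systems

random.seed(42)
# Uniform surface-error model on [-1, 1] (a stand-in for a measured
# probability distribution such as those extracted from the database):
y = mc_yield(10_000, lambda: random.uniform(-1.0, 1.0), spec=0.5)
# With this model the expected yield is 0.5; y will be close to that.
```

    Replacing the assumed distribution with one fitted to measured as-built data is precisely the substitution the abstract argues for when batch sizes are small.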

  15. REGRESSION ANALYSIS OF SEA-SURFACE-TEMPERATURE PATTERNS FOR THE NORTH PACIFIC OCEAN.

    DTIC Science & Technology

    SEA WATER, *SURFACE TEMPERATURE, *OCEANOGRAPHIC DATA, PACIFIC OCEAN, REGRESSION ANALYSIS, STATISTICAL ANALYSIS, UNDERWATER EQUIPMENT, DETECTION, UNDERWATER COMMUNICATIONS, DISTRIBUTION, THERMAL PROPERTIES, COMPUTERS.

  16. Terrain-analysis procedures for modeling radar backscatter

    USGS Publications Warehouse

    Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis

    1978-01-01

    The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscattering modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are: 1) formatting of data in readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.

  17. Studies of oceanic tectonics based on GEOS-3 satellite altimetry

    NASA Technical Reports Server (NTRS)

    Poehls, K. A.; Kaula, W. M.; Schubert, G.; Sandwell, D.

    1979-01-01

    Using statistical analysis, geoidal admittance (the relationship between the ocean geoid and seafloor topography) obtained from GEOS-3 altimetry was compared to various model admittances. Analysis of several altimetry tracks in the Pacific Ocean demonstrated a low coherence between altimetry and seafloor topography except where the track crosses active or recent tectonic features. However, global statistical studies using the much larger data base of all available gravimetry showed a positive correlation of oceanic gravity with topography. The oceanic lithosphere was modeled by simultaneously inverting surface wave dispersion, topography, and gravity data. Efforts to incorporate geoid data into the inversion showed that the base of the subchannel can be better resolved with geoid rather than gravity data. Thermomechanical models of seafloor spreading taking into account differing plate velocities, heat source distributions, and rock rheologies were discussed.

  18. An experimental analysis of the real contact area between an electrical contact and a glass plane

    NASA Astrophysics Data System (ADS)

    Down, Michael; Jiang, Liudi; McBride, John W.

    2013-06-01

    The exact contact between two rough surfaces is usually estimated using statistical mathematics and surface analysis before and after contact has occurred. To date, most studies of real contact between loaded surfaces have been theoretical or numerical. A method of analysing real contact area under various loads, utilizing a non-contact laser surface profiler, allows direct measurement of contact area and deformation in terms of contact force and plane displacement between two surfaces. A laser scans through a transparent flat supported in a fixed position above the base; a test contact, mounted atop a spring and a force sensor, is driven by a screw support into contact with the transparent surface. This paper presents the analysis of the real contact area of various surfaces under various loads. The surfaces analysed are a pair of Au-coated hemispherical contacts: one a used Au-coated multi-walled carbon nanotube surface from a MEMS relay application, the other a new contact surface of the same configuration.

  19. Gridding Cloud and Irradiance to Quantify Variability at the ARM Southern Great Plains Site

    NASA Astrophysics Data System (ADS)

    Riihimaki, L.; Long, C. N.; Gaustad, K.

    2017-12-01

    Ground-based radiometers provide the most accurate measurements of surface irradiance. However, geometry differences between surface point measurements and large area climate model grid boxes or satellite-based footprints can cause systematic differences in surface irradiance comparisons. In this work, irradiance measurements from a network of ground stations around Kansas and Oklahoma at the US Department of Energy Atmospheric Radiation Measurement (ARM) Southern Great Plains facility are examined. Upwelling and downwelling broadband shortwave and longwave radiometer measurements are available at each site as well as surface meteorological measurements. In addition to the measured irradiances, clear sky irradiance and cloud fraction estimates are analyzed using well established methods based on empirical fits to measured clear sky irradiances. Measurements are interpolated onto a 0.25 degree latitude and longitude grid using a Gaussian weight scheme in order to provide a more accurate statistical comparison between ground measurements and a larger area such as that used in climate models, plane parallel radiative transfer calculations, and other statistical and climatological research. Validation of the gridded product will be shown, as well as analysis that quantifies the impact of site location, cloud type, and other factors on the resulting surface irradiance estimates. The results of this work are being incorporated into the Surface Cloud Grid operational data product produced by ARM, and will be made publicly available for use by others.
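    The Gaussian-weight interpolation scheme can be sketched for a single grid point: each station contributes a weight that decays with its squared distance from the grid point. The station coordinates, irradiance values, and length scale below are hypothetical, not ARM data.

```python
from math import exp

def gaussian_grid_value(grid_pt, stations, sigma=0.25):
    """Gaussian-weighted average of station values at one grid point.
    `stations` is a list of ((lat, lon), value) pairs; sigma is in degrees."""
    num = den = 0.0
    for (lat, lon), val in stations:
        d2 = (lat - grid_pt[0]) ** 2 + (lon - grid_pt[1]) ** 2
        w = exp(-d2 / (2 * sigma ** 2))   # weight decays with squared distance
        num += w * val
        den += w
    return num / den

# Two equidistant hypothetical stations get equal weight, so the gridded
# value is their mean:
v = gaussian_grid_value((36.0, -97.5),
                        [((36.1, -97.5), 200.0), ((35.9, -97.5), 300.0)])
# v -> 250.0
```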

  20. Principal Component Analysis in Construction of 3D Human Knee Joint Models Using a Statistical Shape Model Method

    PubMed Central

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2013-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the 3D joint surface model has been reported in literature. In this study, we constructed a SSM database using 152 human CT knee joint models, including the femur, tibia and patella and analyzed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 seconds using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus it may have a broad application in computer assisted knee surgeries that require 3D surface models of the knee. PMID:24156375
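    At its core, building an SSM amounts to PCA on stacked surface-coordinate vectors: subtract the mean shape, take the principal components, and express new shapes as the mean plus a weighted sum of components. The tiny 2-D "shapes" below are synthetic point sets, not knee surfaces, and the training set is built to vary along a single mode.

```python
import numpy as np

# Each row is one training shape: flattened (x, y) coordinates of its points.
shapes = np.array([[0.0, 0.0, 1.0, 0.0, 1.0, 1.0],
                   [0.2, 0.0, 1.2, 0.0, 1.2, 1.0],
                   [0.4, 0.0, 1.4, 0.0, 1.4, 1.0]])

mean = shapes.mean(axis=0)
# PCA via SVD of the centered data; rows of Vt are the principal components.
U, S, Vt = np.linalg.svd(shapes - mean, full_matrices=False)

# A shape = mean + weighted sum of leading components.  This training set
# varies along one direction only, so one component reconstructs exactly.
w = (shapes[0] - mean) @ Vt[0]
recon = mean + w * Vt[0]
err = np.abs(recon - shapes[0]).max()
```

    In the actual method the component weights are not known in advance but are optimized so the projected 3D model matches the 2D bi-plane fluoroscopic silhouettes.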

  1. Base-flow characteristics of streams in the Valley and Ridge, the Blue Ridge, and the Piedmont physiographic provinces of Virginia

    USGS Publications Warehouse

    Nelms, David L.; Harlow, George E.; Hayes, Donald C.

    1997-01-01

    Growth within the Valley and Ridge, Blue Ridge, and Piedmont physiographic provinces of Virginia has focused concern on the allocation of surface-water flow and increased demands on ground-water resources. Potential surface-water yield was determined from statistical analysis of the base-flow characteristics of streams. Base-flow characteristics may also provide a relative indication of the potential ground-water yield for areas that lack sufficient specific-capacity or well-yield data; however, other factors need to be considered, such as geologic structure, lithology, precipitation, relief, and the degree of hydraulic interconnection between the regolith and bedrock.

  2. Statistical mapping of zones of focused groundwater/surface-water exchange using fiber-optic distributed temperature sensing

    USGS Publications Warehouse

    Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.

    2013-01-01

    Fiber-optic distributed temperature sensing (FO-DTS) increasingly is used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS involved identification of zones of focused GWSWE based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of uncertainty associated with mapping zones of focused GSWSE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.

  3. A new approach for remediation of As-contaminated soil: ball mill-based technique.

    PubMed

    Shin, Yeon-Jun; Park, Sang-Min; Yoo, Jong-Chan; Jeon, Chil-Sung; Lee, Seung-Woo; Baek, Kitae

    2016-02-01

    In this study, a physical ball mill process, instead of chemical extraction using toxic chemical agents, was applied to remove arsenic (As) from contaminated soil. A statistical analysis was carried out to establish the optimal conditions for ball mill processing. As a result of the statistical analysis, approximately 70% of the As was removed from the soil under the following conditions: operating time of 5 min, media size of 1.0 cm, rotational velocity of 10 rpm, and soil loading of 5%. A significant amount of As remained in the ground fine soil after ball mill processing, while more than 90% of the soil retained its original properties and could be reused or recycled. As a result, the ball mill process could remove the metals bound strongly to the soil surface by surface grinding, and could be applied as a pretreatment before chemical extraction to reduce the load.

  4. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among the analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models, or methods. These differences can be quantitatively described mainly from three aspects, i.e., multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, theories of the Gaussian distribution were used to correct the multiple surface reflectance datasets based on the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained over two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences among surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and the corresponding consistency analysis and evaluation.
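    One simple reading of a Gaussian-distribution correction against a baseline is moment matching: shift and scale each dataset so its mean and standard deviation equal those of the small-scale baseline. The reflectance samples below are invented, and this affine form is an illustrative simplification of the paper's procedure.

```python
from statistics import mean, stdev

def match_to_baseline(values, baseline):
    """Affine-correct `values` so their mean and std match the baseline's."""
    m, s = mean(values), stdev(values)
    mb, sb = mean(baseline), stdev(baseline)
    return [(v - m) / s * sb + mb for v in values]

# Hypothetical surface-reflectance samples from two spatial scales:
corrected = match_to_baseline([0.30, 0.40, 0.50], baseline=[0.10, 0.20, 0.30])
# corrected -> [0.10, 0.20, 0.30]: now carries the baseline's mean and std.
```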

  5. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in same time period, and processed by same algorithms, models or methods. These differences can be mainly quantitatively described from three aspects, i.e. multiple remote sensing observations, crop parameters estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural application with multiple remotely sensed observations from different sources. The new method was constructed on the basis of physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were involved to extract statistical characteristics of multiple surface reflectance datasets, and further quantitatively analyse spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at small spatial scale as the baseline data, theories of Gaussian distribution were selected for multiple surface reflectance datasets correction based on the above obtained physical characteristics and mathematical distribution properties, and their spatial variations. This proposed method was verified by two sets of multiple satellite images, which were obtained in two experimental fields located in Inner Mongolia and Beijing, China with different degrees of homogeneity of underlying surfaces. 
Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provide database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding consistency analysis evaluation. PMID:25405760

  6. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks.

    PubMed

    Ding, Xiangyan; Li, Feilong; Zhao, Youxuan; Xu, Yongmei; Hu, Ning; Cao, Peng; Deng, Mingxi

    2018-04-23

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, which is mainly represented by a second harmonic with even more distinct third/quadruple harmonics. Based on statistical analysis from the numerous results of random micro-crack models, it is clearly found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of micro-crack zone, and the excitation frequency. This study theoretically reveals that nonlinear Rayleigh surface waves are feasible for use in quantitatively identifying the physical characteristics of surface micro-cracks in structures.
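    The acoustic nonlinearity parameter tracked in such studies is commonly estimated, up to constants, from the fundamental and second-harmonic spectral amplitudes as a relative parameter β′ = A₂/A₁². The harmonic amplitudes below are invented, not simulation outputs from the paper.

```python
def relative_beta(a1, a2):
    """Relative acoustic nonlinearity parameter beta' = A2 / A1**2,
    proportional to the absolute beta at fixed frequency and distance."""
    return a2 / a1 ** 2

# Hypothetical harmonic amplitudes for sparse vs. denser micro-crack fields:
b_low = relative_beta(1.0, 0.02)   # sparse micro-cracks
b_high = relative_beta(1.0, 0.08)  # denser micro-cracks -> larger beta'
```

    A roughly linear growth of β′ with crack density, crack-zone size, and excitation frequency is the trend the statistical analysis in the paper reports.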

  7. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks

    PubMed Central

    Ding, Xiangyan; Li, Feilong; Xu, Yongmei; Cao, Peng; Deng, Mingxi

    2018-01-01

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results reveal a significant ultrasonic nonlinear effect caused by the surface micro-cracks, represented mainly by a second harmonic, with distinct third and fourth harmonics also present. Statistical analysis of the numerous random micro-crack models clearly shows that the acoustic nonlinearity parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of the micro-crack zone, and the excitation frequency. This study theoretically demonstrates that nonlinear Rayleigh surface waves are feasible for quantitatively identifying the physical characteristics of surface micro-cracks in structures. PMID:29690580

  8. The Soil Moisture Dependence of TRMM Microwave Imager Rainfall Estimates

    NASA Astrophysics Data System (ADS)

    Seyyedi, H.; Anagnostou, E. N.

    2011-12-01

    This study presents an in-depth analysis of the dependence of overland rainfall estimates from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) on soil moisture conditions at the land surface. TMI retrievals are verified against rainfall fields derived from a high-resolution rain-gauge network (MESONET) covering Oklahoma. Soil moisture (SOM) patterns are extracted from data recorded over 2000-2007 at 30-minute temporal resolution. The area is divided into wet and dry regions based on normalized SOM (Nsom) values. A statistical comparison between the two groups is conducted using ground station measurements and the corresponding passive microwave retrievals from TMI overpasses at the respective MESONET station locations and times. The zero-order error statistics show that the Probability of Detection (POD) is higher for the wet regions (higher Nsom values) than for the dry regions. The False Alarm Ratio (FAR) and volumetric FAR are lower for the wet regions, as is the volumetric missed rain. Analysis of the MESONET-to-TMI ratio values shows that TMI tends to overestimate surface rainfall intensities below 12 mm/h; however, the magnitude of the overestimation is lower over the wet regions than over the dry regions.
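The zero-order skill scores mentioned above come from a 2x2 rain/no-rain contingency table of satellite detections against gauge observations. A minimal sketch with hypothetical counts (not MESONET data):

```python
def detection_scores(hits, misses, false_alarms):
    # Probability of Detection and False Alarm Ratio from counts in a
    # 2x2 satellite-vs-gauge rain/no-rain contingency table.
    pod = hits / (hits + misses)                # fraction of gauge rain events detected
    far = false_alarms / (hits + false_alarms)  # fraction of detections that were wrong
    return pod, far

# Hypothetical counts for a wet-soil subset of TMI overpasses.
pod_wet, far_wet = detection_scores(hits=80, misses=20, false_alarms=10)
```

The study's comparison amounts to computing such scores separately for the wet and dry Nsom strata and contrasting them.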

  9. Combined acute ecotoxicity of malathion and deltamethrin to Daphnia magna (Crustacea, Cladocera): comparison of different data analysis approaches.

    PubMed

    Toumi, Héla; Boumaiza, Moncef; Millet, Maurice; Radetski, Claudemir Marcos; Camara, Baba Issa; Felten, Vincent; Masfaraud, Jean-François; Férard, Jean-François

    2018-04-19

    We studied the combined acute effect (i.e., after 48 h) of deltamethrin (a pyrethroid insecticide) and malathion (an organophosphate insecticide) on Daphnia magna. Two approaches were used to examine the potential interaction effects of eight mixtures of deltamethrin and malathion: (i) calculation of the mixture toxicity index (MTI) and safety factor index (SFI) and (ii) response surface methodology coupled with an isobole-based statistical model (using a generalized linear model). According to the MTI and SFI calculations, one tested mixture was found to be additive while the two other tested mixtures were found to be non-additive (MTI) or antagonistic (SFI); these differences between index responses are due only to differences in the terminology associated with the two indexes. Through the response surface approach and isobologram analysis, we concluded that the binary mixtures of deltamethrin and malathion exert a significant antagonistic effect on D. magna immobilization after 48 h of exposure. The index approaches and the response surface approach with isobologram analysis are complementary: calculation of the mixture toxicity index and safety factor index identifies the type of interaction for each tested mixture individually, while the response surface approach with isobologram analysis integrates all the data to provide a global outcome about the type of interactive effect. Only the response surface approach and isobologram analysis allowed statistical assessment of the ecotoxicological interaction. Nevertheless, we recommend the use of both approaches (i) to identify the combined effects of contaminants and (ii) to improve risk assessment and environmental management.
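A common ingredient of such mixture indexes is the toxic unit: each component's concentration divided by its single-compound EC50. A minimal sketch under the concentration-addition reference model follows; all numbers are hypothetical and this is not the authors' MTI/SFI computation:

```python
def toxic_unit_sum(concentrations, ec50s):
    # Sum of toxic units c_i / EC50_i, evaluated at the mixture's
    # observed EC50 composition. Under concentration addition, a sum
    # of 1 indicates additivity; > 1 suggests antagonism, < 1 synergism.
    return sum(c / e for c, e in zip(concentrations, ec50s))

# Hypothetical mixture composition at its EC50 and single-compound EC50s (ug/L).
tu = toxic_unit_sum([0.4, 1.2], [0.5, 2.0])
```

Here the sum exceeds 1, which under this convention would point toward antagonism, consistent in spirit with the study's isobologram conclusion.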

  10. Nanoparticle surface characterization and clustering through concentration-dependent surface adsorption modeling.

    PubMed

    Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E

    2014-09-23

    Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
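The concentration-dependent polynomial model can be illustrated by fitting an adsorption descriptor as a polynomial in log concentration. The numbers below are synthetic stand-ins; the actual BSAI descriptors and data are in the paper:

```python
import numpy as np

# Hypothetical adsorption measurements at several nanoparticle concentrations.
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])   # assumed units, e.g. mg/L
log_k = np.array([1.9, 1.6, 1.4, 1.1, 0.7])  # hypothetical log adsorption coefficient

# Concentration-dependent model: quadratic polynomial in log10(concentration).
coeffs = np.polyfit(np.log10(conc), log_k, deg=2)
fitted = np.polyval(coeffs, np.log10(conc))
residual = np.max(np.abs(fitted - log_k))
```

Extrapolating such a fit toward zero concentration is one way to motivate the infinite-dilution model the abstract mentions.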

  11. Initial Adsorption of Fe on an Ethanol-Saturated Si(111)7 × 7 Surface: Statistical Analysis in Scanning Tunneling Microscopy

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; Hattori, Ken

    2018-03-01

    We studied the initial stage of iron deposition on an ethanol-saturated Si(111)7 × 7 surface at room temperature using scanning tunneling microscopy (STM). The statistical analysis of the Si adatom height at empty states for Si(111)-C2H5OH before and after the Fe deposition showed different types of adatoms: type B (before the deposition) and type B' (after the deposition) assigned to bare adatoms, type D and type D' to C2H5O-terminated adatoms, and type E' to adatoms with Fe. The analysis of the height distribution revealed the protection of the molecule termination for the Fe capture at the initial stage. The analysis also indicated the preferential capture of a single Fe atom to a bare center-adatom rather than a bare corner-adatom which remain after the C2H5OH saturation, but no selectivity was observed in faulted and unfaulted half unit-cells. This is the first STM-based report proving that a remaining bare adatom, but not a molecule-terminated adatom, captures a metal.
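The height-distribution analysis described above can be illustrated by separating two adatom populations with a threshold between their histogram peaks. The heights below are synthetic; the actual type assignments (B/B', D/D', E') come from the STM data:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical apparent adatom heights (nm) for two populations.
bare_like = rng.normal(0.10, 0.01, size=300)        # stand-in for bare adatoms
terminated_like = rng.normal(0.06, 0.01, size=200)  # stand-in for molecule-terminated
heights = np.concatenate([bare_like, terminated_like])

# Classify by a threshold midway between the two histogram peaks.
threshold = 0.08
frac_bare = np.mean(heights > threshold)
```

Comparing such fractions before and after metal deposition is the kind of bookkeeping that reveals which adatom type captures the deposited atoms.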

  12. A Description of the Building Materials Data Base for Portland, Maine.

    DTIC Science & Technology

    1986-06-01

    Keywords: acid precipitation; data bases; damage assessment; environmental protection; damage from acid deposition; Portland, Maine; damage to buildings; statistical analysis. The data base describes the types and amounts of building surface materials exposed to acid deposition; a stratified, systematic, unaligned random sampling approach was used.

  13. Statistical analysis of 239-240Pu and 241Am contamination of soil and vegetation on NAEG study sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, R.O.; Eberhardt, L.L.; Fowler, E.B.

    Reported here are results of the statistical design and analysis work conducted during Calendar Year 1974 for the Nevada Applied Ecology Group (NAEG) at plutonium study sites on the Nevada Test Site (NTS) and the Tonopah Test Range (TTR). Estimates of 239-240Pu inventory in surface soil (0 to 5-cm depth) are given for each of the NAEG intensive study sites, together with activity maps based on FIDLER surveys showing the field areas to which these estimates apply. There is preliminary evidence to suggest that the plutonium present in surface soil may be covered by a thin (less than 2.5 cm) layer of soil whose alpha activity is considerably less than that directly below. Computer-drawn 239-240Pu concentration contours and three-dimensional surfaces in soil and vegetation are given for Area 13 and GMX as a first attempt at estimating the geographical distribution of 239-240Pu at these sites. (CH)

  14. Elucidating distinct ion channel populations on the surface of hippocampal neurons via single-particle tracking recurrence analysis

    NASA Astrophysics Data System (ADS)

    Sikora, Grzegorz; Wyłomańska, Agnieszka; Gajda, Janusz; Solé, Laura; Akin, Elizabeth J.; Tamkun, Michael M.; Krapf, Diego

    2017-12-01

    Protein and lipid nanodomains are prevalent on the surface of mammalian cells. In particular, it has recently been recognized that ion channels assemble into surface nanoclusters in the soma of cultured neurons. However, the interactions of these molecules with surface nanodomains display a considerable degree of heterogeneity. Here, we investigate this heterogeneity and develop statistical tools based on the recurrence of individual trajectories to identify subpopulations of ion channels on the neuronal surface. We specifically study the dynamics of the K+ channel Kv1.4 and the Na+ channel Nav1.6 on the surface of cultured hippocampal neurons at the single-molecule level. We find that both of these molecules are expressed in two different forms with distinct kinetics with regard to surface interactions, emphasizing the complex proteomic landscape of the neuronal surface. Further, the tools presented in this work provide new methods for the analysis of membrane nanodomains, transient confinement, and identification of populations within single-particle trajectories.
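A toy version of a recurrence statistic for single-particle trajectories counts revisits to previously visited grid cells; confined particles revisit often, freely moving ones rarely. This is illustrative only, with made-up trajectories; the paper's estimators are more sophisticated:

```python
import numpy as np

def recurrence_fraction(xy, cell=1.0):
    # Fraction of time steps that land in an already-visited grid cell.
    # High values suggest confinement; low values suggest free motion.
    cells = np.floor(np.asarray(xy) / cell).astype(int)
    seen, revisits = set(), 0
    for c in map(tuple, cells):
        if c in seen:
            revisits += 1
        seen.add(c)
    return revisits / len(cells)

confined = np.zeros((100, 2))                        # particle stuck in one cell
ballistic = np.column_stack([np.arange(100, dtype=float),
                             np.zeros(100)])         # particle moving steadily
r_conf = recurrence_fraction(confined)
r_ball = recurrence_fraction(ballistic)
```

Thresholding such a statistic per trajectory is one simple way to split a channel population into interacting and non-interacting subpopulations.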

  15. Space-time measurements of oceanic sea states

    NASA Astrophysics Data System (ADS)

    Fedele, Francesco; Benetazzo, Alvise; Gallego, Guillermo; Shih, Ping-Chang; Yezzi, Anthony; Barbariol, Francesco; Ardhuin, Fabrice

    2013-10-01

    Stereo video techniques are effective for estimating the space-time wave dynamics over an area of the ocean. Indeed, a stereo camera view allows retrieval of both spatial and temporal data whose statistical content is richer than that of time series retrieved from point wave probes. We present an application of the Wave Acquisition Stereo System (WASS) for the analysis of offshore video measurements of gravity waves in the Northern Adriatic Sea and near the southern seashore of the Crimean peninsula, in the Black Sea. We use classical epipolar techniques to reconstruct the sea surface from the stereo pairs sequentially in time, viz. as a sequence of spatial snapshots. We also present a variational approach that exploits the entire image data set to provide a global space-time imaging of the sea surface, viz. simultaneous reconstruction of several spatial snapshots of the surface that guarantees continuity of the sea surface in both space and time. Analysis of the WASS measurements shows that the sea surface can be accurately estimated in space and time together, yielding associated directional spectra and point wave statistics that agree well with probabilistic models. In particular, WASS stereo imaging is able to capture typical features of the wave surface, especially the crest-to-trough asymmetry due to second-order nonlinearities, and the observed shapes of large waves are well described by theoretical models based on the theory of quasi-determinism (Boccotti, 2000). Further, we investigate space-time extremes of the observed stationary sea states, viz. the largest surface wave heights expected over a given area during the sea state duration. The WASS analysis provides the first experimental proof that a space-time extreme is generally larger than that observed in time via point measurements, in agreement with predictions based on stochastic theories for global maxima of Gaussian fields.

  16. Evaluation of gloss changes of two denture acrylic resin materials in four different beverages.

    PubMed

    Keyf, Filiz; Etikan, Ilker

    2004-03-01

    A primary disadvantage of the materials used in the construction of complete and removable partial dentures is that their esthetic, physical, and mechanical properties change rapidly with time in the oral environment. For esthetics, color stability is one of the criteria that needs careful attention. Color may provide important information on the serviceability of these materials, and color change affects their gloss. The objective of the present study was to determine the gloss changes resulting from immersion in four different beverages for one heat-polymerized denture base resin and one cold-polymerized denture base repair resin. Thirty-six samples were fabricated for each material. Each sample had a smooth polished and a rough unpolished surface. Gloss measurements were made with a glossmeter before testing. Four beverages (tea, coffee, cola, and cherry juice) were used for testing, and two angles of illumination (20 and 60 degrees) were used for the gloss measurements. The samples were immersed in water, tea, coffee, cola, and cherry juice solutions. The gloss of the samples was measured again with the glossmeter at the end of the 45th and 135th days of testing. The arithmetic mean and standard deviation of each sample were calculated and compared statistically using the Wilcoxon test (within times), the Kruskal-Wallis analysis of variance, and the Mann-Whitney U-test with Bonferroni correction (when the difference between samples was significant), with p ≤ 0.05 considered significant. The results of this study revealed that gloss changes occurred after testing in both the heat-polymerized denture base resin and the cold-polymerized denture base repair resin. The significance of the gloss changes exhibited by each sample, kept for different lengths of time in the same solution, was compared using the Wilcoxon test; the results were statistically significant (p ≤ 0.05). According to the Kruskal-Wallis analysis of variance, the difference between measurements at the two angles of illumination was statistically significant (p ≤ 0.05). According to the Mann-Whitney U-test, the difference between two polished surfaces or two unpolished surfaces was statistically insignificant (p > 0.05), but the difference between smooth polished and rough unpolished surfaces was statistically significant (p ≤ 0.05). The gloss of both resins was affected by the tested agents, and all four beverages produced noticeable gloss changes: cherry juice demonstrated the least change, while tea exhibited the greatest.
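The battery of nonparametric tests used here is available in scipy.stats. A minimal sketch on synthetic gloss readings (not the study's data; the separation between groups is deliberately exaggerated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gloss_20 = rng.normal(40.0, 5.0, size=36)  # hypothetical gloss units at 20 degrees
gloss_60 = rng.normal(55.0, 5.0, size=36)  # hypothetical gloss units at 60 degrees

h_stat, p_kw = stats.kruskal(gloss_20, gloss_60)       # Kruskal-Wallis ANOVA
u_stat, p_mw = stats.mannwhitneyu(gloss_20, gloss_60)  # Mann-Whitney U-test
p_mw_bonf = min(1.0, p_mw * 3)  # Bonferroni correction for, say, 3 comparisons
```

Pairwise comparisons multiply the raw p-value by the number of comparisons, which is all the Bonferroni correction in the study does.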

  17. Studying ventricular abnormalities in mild cognitive impairment with hyperbolic Ricci flow and tensor-based morphometry.

    PubMed

    Shi, Jie; Stonnington, Cynthia M; Thompson, Paul M; Chen, Kewei; Gutman, Boris; Reschke, Cole; Baxter, Leslie C; Reiman, Eric M; Caselli, Richard J; Wang, Yalin

    2015-01-01

    Mild Cognitive Impairment (MCI) is a transitional stage between normal aging and dementia, and people with MCI are at high risk of progression to dementia. MCI is attracting increasing attention, as it offers an opportunity to target the disease process during an early symptomatic stage. Structural magnetic resonance imaging (MRI) measures have been the mainstay of Alzheimer's disease (AD) imaging research; however, ventricular morphometry analysis remains challenging because of the ventricles' complicated topological structure. Here we describe a novel ventricular morphometry system based on the hyperbolic Ricci flow method and tensor-based morphometry (TBM) statistics. Unlike prior ventricular surface parameterization methods, hyperbolic conformal parameterization is angle-preserving and does not have any singularities. Our system generates a one-to-one diffeomorphic mapping between ventricular surfaces with consistent boundary matching conditions. The TBM statistics encode a great deal of surface deformation information that could be inaccessible or overlooked by other methods. We applied our system to the baseline MRI scans of a set of MCI subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI: 71 MCI converters vs. 62 MCI stable). Although the combined ventricular area and volume features did not differ between the two groups, our fine-grained surface analysis revealed significant differences in the ventricular regions close to the temporal lobe and posterior cingulate, structures that are affected early in AD. Significant correlations were also detected between ventricular morphometry, neuropsychological measures, and a previously described imaging index based on fluorodeoxyglucose positron emission tomography (FDG-PET) scans. This novel ventricular morphometry method may offer a new and more sensitive approach to study preclinical and early symptomatic stage AD. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. STUDYING VENTRICULAR ABNORMALITIES IN MILD COGNITIVE IMPAIRMENT WITH HYPERBOLIC RICCI FLOW AND TENSOR-BASED MORPHOMETRY

    PubMed Central

    Shi, Jie; Stonnington, Cynthia M.; Thompson, Paul M.; Chen, Kewei; Gutman, Boris; Reschke, Cole; Baxter, Leslie C.; Reiman, Eric M.; Caselli, Richard J.; Wang, Yalin

    2014-01-01

    Mild Cognitive Impairment (MCI) is a transitional stage between normal aging and dementia, and people with MCI are at high risk of progression to dementia. MCI is attracting increasing attention, as it offers an opportunity to target the disease process during an early symptomatic stage. Structural magnetic resonance imaging (MRI) measures have been the mainstay of Alzheimer’s disease (AD) imaging research; however, ventricular morphometry analysis remains challenging because of the ventricles’ complicated topological structure. Here we describe a novel ventricular morphometry system based on the hyperbolic Ricci flow method and tensor-based morphometry (TBM) statistics. Unlike prior ventricular surface parameterization methods, hyperbolic conformal parameterization is angle-preserving and does not have any singularities. Our system generates a one-to-one diffeomorphic mapping between ventricular surfaces with consistent boundary matching conditions. The TBM statistics encode a great deal of surface deformation information that could be inaccessible or overlooked by other methods. We applied our system to the baseline MRI scans of a set of MCI subjects from the Alzheimer’s Disease Neuroimaging Initiative (ADNI: 71 MCI converters vs. 62 MCI stable). Although the combined ventricular area and volume features did not differ between the two groups, our fine-grained surface analysis revealed significant differences in the ventricular regions close to the temporal lobe and posterior cingulate, structures that are affected early in AD. Significant correlations were also detected between ventricular morphometry, neuropsychological measures, and a previously described imaging index based on fluorodeoxyglucose positron emission tomography (FDG-PET) scans. This novel ventricular morphometry method may offer a new and more sensitive approach to study preclinical and early symptomatic stage AD. PMID:25285374

  19. Multivariate Statistical Analysis of Water Quality data in Indian River Lagoon, Florida

    NASA Astrophysics Data System (ADS)

    Sayemuzzaman, M.; Ye, M.

    2015-12-01

    The Indian River Lagoon, part of the longest barrier-island complex in the United States, is a region of particular concern to environmental scientists because of the rapid rate of human development throughout the region and its geographical position between the colder temperate zone and the warmer sub-tropical zone. Surface water quality analysis in this region therefore continually yields new information. In the present study, multivariate statistical procedures were applied to analyze the spatial and temporal water quality in the Indian River Lagoon over the period 1998-2013. Twelve parameters were analyzed at twelve key water monitoring stations in and beside the lagoon using monthly datasets (27,648 observations in total). The dataset was treated using cluster analysis (CA), principal component analysis (PCA), and non-parametric trend analysis. The CA was used to group the twelve monitoring stations into four clusters, with stations having similar surrounding characteristics falling in the same group. The PCA was then applied within each group to find the important water quality parameters. The first five principal components (PC1-PC5) were retained, explaining 75% to 85% of the cumulative variance in each cluster group. Nutrient species (phosphorus and nitrogen), salinity, specific conductivity, and erosion factors (TSS, turbidity) were the major variables involved in the construction of the PCs. Statistically significant positive or negative trends and abrupt trend shifts were detected for the important water quality parameters at each individual station by applying the Mann-Kendall trend test and the Sequential Mann-Kendall (SQMK) test. Land use and land cover change patterns, local anthropogenic activities, and climate extremes such as drought might be associated with these trends. This study presents a multivariate statistical assessment aimed at better characterizing surface water quality, so that effective pollution control and management of the surface waters can be undertaken.
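The Mann-Kendall test used for trend detection is built on a simple pairwise statistic. A minimal sketch of the S statistic follows (full use requires its variance, tie corrections, and the SQMK sequential variant; the series below is hypothetical):

```python
import numpy as np

def mann_kendall_s(series):
    # Mann-Kendall S: number of later-minus-earlier pairs that increase,
    # minus the number that decrease. S >> 0 suggests an upward trend.
    x = np.asarray(series, dtype=float)
    s = 0
    for i in range(len(x) - 1):
        s += int(np.sign(x[i + 1:] - x[i]).sum())
    return s

# Hypothetical monthly water-quality values with an upward drift.
s_up = mann_kendall_s([1.0, 1.2, 1.1, 1.5, 1.7, 1.9])
```

S is then standardized by its variance to obtain the usual Z score on which significance at each station is judged.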

  20. Investigation of quartz grain surface textures by atomic force microscopy for forensic analysis.

    PubMed

    Konopinski, D I; Hudziak, S; Morgan, R M; Bull, P A; Kenyon, A J

    2012-11-30

    This paper presents a study of quartz sand grain surface textures using atomic force microscopy (AFM) to image the grain surface. Until now, scanning electron microscopy (SEM) has been the primary technique used in forensic surface texture analysis of quartz sand grains as a means of establishing the provenance of the grains for forensic reconstructions. The ability to independently corroborate the grain type classifications is desirable and adds weight to the findings of SEM analysis of the textures of quartz grains identified in forensic soil/sediment samples. AFM offers a quantitative means of analysis that complements SEM examination, and it is a non-destructive technique that requires no sample preparation prior to scanning. It therefore has great potential for forensic analysis, where sample preservation is highly valuable. By taking quantitative topography scans, it is possible to produce 3D representations of microscopic surface textures and diagnostic features for examination. Furthermore, various empirical measures can be obtained from the topography scans, including arithmetic average roughness, root-mean-square surface roughness, skewness, kurtosis, and multiple Gaussian fits to height distributions. These empirical measures, combined with qualitative examination of the surfaces, can help discriminate between grain types and provide independent analysis that corroborates the morphological grain typing assigned from surface textures using SEM. The findings from this study also demonstrate that quartz sand grain surfaces exhibit a statistically self-similar fractal nature that remains unchanged across scales. This indicates the potential for a further quantitative measure that could be utilised to discriminate quartz grains based on their provenance in forensic investigations. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
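The empirical measures listed, arithmetic average roughness, RMS roughness, skewness, and kurtosis, are moments of the height distribution. A minimal sketch on a synthetic height map (not AFM data):

```python
import numpy as np

def roughness_metrics(heights):
    # Standard amplitude parameters computed about the mean plane.
    h = heights - heights.mean()
    ra = np.mean(np.abs(h))       # arithmetic average roughness
    rq = np.sqrt(np.mean(h**2))   # root-mean-square roughness
    rsk = np.mean(h**3) / rq**3   # skewness of the height distribution
    rku = np.mean(h**4) / rq**4   # kurtosis of the height distribution
    return ra, rq, rsk, rku

rng = np.random.default_rng(2)
ra, rq, rsk, rku = roughness_metrics(rng.normal(0.0, 1.0, size=(64, 64)))
```

For a Gaussian surface like this synthetic one, skewness is near 0 and kurtosis near 3; departures from those values are exactly what make these metrics discriminating for real grain surfaces.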

  1. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE PAGES

    Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...

    2014-12-02

    Atomic level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  2. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi

    2014-12-01

    Atomic level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  3. Grain boundary oxidation and an analysis of the effects of oxidation on fatigue crack nucleation life

    NASA Technical Reports Server (NTRS)

    Oshida, Y.; Liu, H. W.

    1988-01-01

    The effects of preoxidation on subsequent fatigue life were studied. Surface oxidation and grain boundary oxidation of a nickel-base superalloy (TAZ-8A) were studied at 600 to 1000 C for 10 to 1000 hours in air. Surface oxides were identified and the kinetics of surface oxidation were discussed. Grain boundary oxide penetration and morphology were studied. Pancake-type grain boundary oxide penetrates deeper and grows larger than cone-type grain boundary oxide; it is therefore more detrimental to fatigue life. The oxide penetration depth, a_m, is related to oxidation temperature, T, and exposure time, t, by an empirical relation of the Arrhenius type. The effects of T and t on the statistical variation of a_m were analyzed according to the Weibull distribution function. Once the oxide is cracked, it serves as a fatigue crack nucleus. The statistical variation of the remaining fatigue life, after the formation of an oxide crack of critical length, is related directly to the statistical variation of grain boundary oxide penetration depth.
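The empirical Arrhenius-type depth relation and the Weibull treatment of its scatter can be sketched as follows. All constants are made-up illustrative values; the paper fits its own relation and parameters:

```python
import numpy as np

def penetration_depth(t_hours, temp_k, a0=50.0, q_over_r=1.5e4):
    # Arrhenius-type relation: depth grows with exposure time and is
    # thermally activated (a0 and q_over_r are hypothetical constants).
    return a0 * np.sqrt(t_hours) * np.exp(-q_over_r / temp_k)

def weibull_exceedance(a, eta, beta):
    # P(depth > a) for a two-parameter Weibull distribution (scale eta,
    # shape beta), modeling the statistical scatter of penetration depth.
    return np.exp(-(a / eta) ** beta)

d_short = penetration_depth(10.0, 1073.0)    # 10 h at ~800 C
d_long = penetration_depth(1000.0, 1073.0)   # 1000 h at ~800 C
d_hot = penetration_depth(10.0, 1273.0)      # 10 h at ~1000 C
```

The exceedance probability is the quantity of interest for fatigue: it bounds how often an oxide deep enough to nucleate a critical crack will occur.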

  4. Grain boundary oxidation and an analysis of the effects of pre-oxidation on subsequent fatigue life

    NASA Technical Reports Server (NTRS)

    Oshida, Y.; Liu, H. W.

    1986-01-01

    The effects of preoxidation on subsequent fatigue life were studied. Surface oxidation and grain boundary oxidation of a nickel-base superalloy (TAZ-8A) were studied at 600 to 1000 C for 10 to 1000 hours in air. Surface oxides were identified and the kinetics of surface oxidation were discussed. Grain boundary oxide penetration and morphology were studied. Pancake-type grain boundary oxide penetrates deeper and grows larger than cone-type grain boundary oxide; it is therefore more detrimental to fatigue life. The oxide penetration depth, a_m, is related to oxidation temperature, T, and exposure time, t, by an empirical relation of the Arrhenius type. The effects of T and t on the statistical variation of a_m were analyzed according to the Weibull distribution function. Once the oxide is cracked, it serves as a fatigue crack nucleus. The statistical variation of the remaining fatigue life, after the formation of an oxide crack of critical length, is related directly to the statistical variation of grain boundary oxide penetration depth.

  5. Effect of in-office bleaching agents on physical properties of dental composite resins.

    PubMed

    Mourouzis, Petros; Koulaouzidou, Elisabeth A; Helvatjoglu-Antoniades, Maria

    2013-04-01

    The physical properties of dental restorative materials have a crucial effect on the longevity of restorations and on meeting the esthetic demands of patients, but they may be compromised by bleaching treatments. The purpose of this study was to evaluate the effects of in-office bleaching agents on the physical properties of three composite resin restorative materials. The bleaching agents used were hydrogen peroxide and carbamide peroxide at high concentrations. Specimens of each material were prepared, cured, and polished. Measurements of color difference, microhardness, and surface roughness were recorded before and after bleaching, and the data were examined statistically by analysis of variance (ANOVA) and the Tukey HSD post-hoc test at P < .05. The measurements showed that the hue and chroma of the silorane-based composite resin were altered after the bleaching procedure (P < .05). No statistically significant differences were found in the microhardness or surface roughness of any of the composite resins tested (P > .05). In summary, the silorane-based composite resin showed some color alteration after bleaching, while the bleaching procedure did not alter the microhardness or surface roughness of any of the composite resins tested.
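The ANOVA comparison can be sketched with scipy.stats.f_oneway on synthetic microhardness readings. The numbers are hypothetical and chosen so the group difference is obvious; they are not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical microhardness readings for three composites after bleaching.
resin_a = rng.normal(50.0, 3.0, size=15)
resin_b = rng.normal(60.0, 3.0, size=15)
resin_c = rng.normal(70.0, 3.0, size=15)

f_stat, p_value = stats.f_oneway(resin_a, resin_b, resin_c)
```

When this omnibus test is significant, a post-hoc procedure such as Tukey HSD is then used to identify which pairs of materials differ.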

  6. Principal component analysis in construction of 3D human knee joint models using a statistical shape model method.

    PubMed

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2015-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the three-dimensional (3D) joint surface model has been reported in the literature. In this study, we constructed a SSM database using 152 human computed tomography (CT) knee joint models, including the femur, tibia and patella and analysed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 s using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus, it may have a broad application in computer-assisted knee surgeries that require 3D surface models of the knee.
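At the core of an SSM is principal component analysis of aligned, flattened vertex coordinates. A minimal sketch via SVD follows, using random stand-in shapes; real use requires CT segmentation and point correspondence across subjects:

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in training set: 10 aligned shapes, each 10 vertices x 3 coords, flattened.
shapes = rng.normal(size=(10, 30))

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
_, _, vt = np.linalg.svd(centered, full_matrices=False)
modes = vt                    # principal modes of shape variation
weights = centered @ modes.T  # per-shape mode coefficients

# Any training shape is recovered exactly from mean + weighted modes.
recon = mean_shape + weights[0] @ modes
```

Fitting a new knee then reduces to searching for the few mode weights whose projected silhouette best matches the 2D fluoroscopic images, which is why the prediction runs in seconds.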

  7. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.

  8. Multiscale analysis of replication technique efficiency for 3D roughness characterization of manufactured surfaces

    NASA Astrophysics Data System (ADS)

    Jolivet, S.; Mezghani, S.; El Mansori, M.

    2016-09-01

    The replication of topography has generally been restricted to optimizing material processing technologies in terms of statistical and single-scale features such as roughness. By contrast, manufactured surface topography is highly complex, irregular, and multiscale. In this work, we demonstrate the use of multiscale analysis on replicates of surface finish to assess the precise control of the finished replica. Five commercial resins used for surface replication were compared. The topography of five standard surfaces representative of common finishing processes was acquired both directly and by a replication technique. The surfaces were then characterized using the ISO 25178 standard and a multiscale decomposition based on a continuous wavelet transform, to compare the roughness transfer quality at different scales. Additionally, the force modulation mode of an atomic force microscope was used to compare the resins’ stiffness properties. The results showed that less stiff resins are able to replicate the surface finish over a larger wavelength band. The method was then tested for non-destructive quality control of automotive gear tooth surfaces.
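The scale-by-scale comparison idea can be illustrated with a toy 1-D profile. This sketch uses simple moving-average band-pass filters in place of the paper's ISO 25178 / continuous-wavelet toolchain, and the profiles and window sizes are invented for illustration:

```python
import numpy as np

def band_rms(profile, short, long):
    """RMS roughness of the band between two moving-average cutoffs."""
    k_s = np.ones(short) / short
    k_l = np.ones(long) / long
    band = np.convolve(profile, k_s, "same") - np.convolve(profile, k_l, "same")
    return np.sqrt(np.mean(band**2))

rng = np.random.default_rng(6)
x = np.linspace(0, 1, 2000)
original = np.sin(40 * np.pi * x) + 0.2 * rng.normal(size=x.size)
replica = 0.9 * original + 0.05 * rng.normal(size=x.size)  # imperfect copy

scales = [(3, 15), (15, 75), (75, 375)]   # (short, long) window pairs
transfer = [band_rms(replica, s, l) / band_rms(original, s, l)
            for s, l in scales]
print(transfer)  # per-scale roughness transfer ratio, ~0.9 if faithful
```

A single-scale roughness parameter would collapse these three ratios into one number; keeping them separate is what reveals the wavelength band over which a resin replicates faithfully.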

  9. Spatial-temporal analysis of building surface temperatures in Hung Hom

    NASA Astrophysics Data System (ADS)

    Zeng, Ying; Shen, Yueqian

    2015-12-01

    This thesis presents a spatial-temporal analysis of building surface temperatures in Hung Hom. Observations were collected from Aug 2013 to Oct 2013 at a 30-min interval using iButton sensors (N=20) covering twelve locations in Hung Hom, and thermal images were captured at PolyU from 05 Aug 2013 to 06 Aug 2013. A linear regression model relating the iButton and thermal records was established to calibrate the temperature data. A 3D modeling system was developed on the Visual Studio 2010 platform using the ArcEngine 10.0 component, a Microsoft Access 2010 database, and the C# programming language; the system supports data processing, spatial analysis, compound queries, and 3D rendering of face temperatures. Statistical analyses showed that building face azimuths have a statistically significant relationship with sun azimuths at peak time, and that seasonal changes in building temperature correspond to variations in sun angle and sun azimuth. Building materials were found to have a significant effect on building surface temperatures: buildings with lower-albedo materials tend to have higher temperatures, and materials with larger thermal conductivity show pronounced diurnal variations. Geographically, the peripheral faces of the campus have higher daytime temperatures than the inner faces, and buildings in the southeast are cooler than those in the west. Furthermore, weekday-weekend comparison shows that human activity has a strong relationship with building surface temperatures.
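The calibration step, regressing contact (iButton) readings on thermal-image readings, is an ordinary least-squares fit. A minimal sketch with invented temperature values (the slope, intercept, and noise are illustrative, not the thesis's data):

```python
import numpy as np

# Hypothetical paired readings in deg C: thermal camera vs. contact sensor.
iButton = np.array([24.1, 26.5, 28.9, 31.2, 33.8, 36.0])
thermal = 1.08 * iButton - 1.9 + np.array([0.1, -0.1, 0.05, 0.0, -0.05, 0.02])

# Fit a line mapping thermal-image values back to contact temperatures.
slope, intercept = np.polyfit(thermal, iButton, 1)
calibrated = slope * thermal + intercept
rmse = np.sqrt(np.mean((calibrated - iButton) ** 2))
print(round(rmse, 3))  # residual error after calibration, deg C
```

Once fitted, the same slope/intercept pair is applied to every pixel of a thermal image to express it on the contact-sensor scale.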

  10. Acid base properties of cyanobacterial surfaces I: Influences of growth phase and nitrogen metabolism on cell surface reactivity

    NASA Astrophysics Data System (ADS)

    Lalonde, S. V.; Smith, D. S.; Owttrim, G. W.; Konhauser, K. O.

    2008-03-01

    Significant efforts have been made to elucidate the chemical properties of bacterial surfaces for the purposes of refining surface complexation models that can account for their metal sorptive behavior under diverse conditions. However, the influence of culturing conditions on the surface chemical parameters modeled from potentiometric titration of bacterial surfaces has received little attention. While culture age and metabolic pathway have been considered as factors potentially influencing cell surface reactivity, statistical treatments have been incomplete and variability has remained unconfirmed. In this study, we employ potentiometric titrations to evaluate variations in bacterial surface ligand distributions using live cells of the sheathless cyanobacterium Anabaena sp. strain PCC 7120, grown under a variety of batch culture conditions. We evaluate the ability of a single set of modeled parameters, describing acid-base surface properties averaged over all culture conditions tested, to account accurately for the ligand distributions modeled for each individual culture condition. In addition to considering growth phase, we assess the role of the various assimilatory nitrogen metabolisms available to this organism as potential determinants of surface reactivity. We observe statistically significant variability in site distribution between the majority of conditions assessed. By employing post hoc Tukey-Kramer analysis for all possible pair-wise condition comparisons, we conclude that the average parameters are inadequate for the accurate chemical description of this cyanobacterial surface. It was determined that for this Gram-negative bacterium in batch culture, ligand distributions were influenced to a greater extent by nitrogen assimilation pathway than by growth phase.

  11. Statistical shape modeling of human cochlea: alignment and principal component analysis

    NASA Astrophysics Data System (ADS)

    Poznyakovskiy, Anton A.; Zahnert, Thomas; Fischer, Björn; Lasurashvili, Nikoloz; Kalaidzidis, Yannis; Mürbe, Dirk

    2013-02-01

    The modeling of the cochlear labyrinth in living subjects is hampered by the insufficient resolution of available clinical imaging methods, which is usually coarser than 125 μm. This is too crude to record the position of the basilar membrane and, as a result, to distinguish even the scala tympani from the other scalae. This problem can be avoided by means of atlas-based segmentation: specimens can endure higher radiation loads and thus provide better-resolved images, and the resulting surface can be used as the seed for atlas-based segmentation. To serve this purpose, we have developed a statistical shape model (SSM) of the human scala tympani based on segmentations obtained from 10 μCT image stacks. After segmentation, we aligned the resulting surfaces using Procrustes alignment. The algorithm was slightly modified to accommodate individual models whose nodes do not necessarily correspond to salient features and vary in number between models; correspondence was established by mutual proximity between nodes. Rather than the standard Euclidean norm, we applied an alternative logarithmic norm to improve outlier treatment, with the minimization done using the BFGS method. We also split the surface nodes along an octree to reduce computational cost. Subsequently, we performed principal component analysis of the training set with the Jacobi eigenvalue algorithm. We expect the resulting method not only to help acquire a better understanding of interindividual variations in cochlear anatomy, but also to be a step towards individual models for pre-operative diagnostics prior to cochlear implant insertion.
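The alignment step can be sketched with standard Procrustes analysis: two node clouds of the same shape differing by rotation, scale, and translation are brought into a common frame. This uses the ordinary Euclidean formulation (the paper's logarithmic-norm and mutual-proximity refinements sit on top of this idea), and the point cloud is synthetic:

```python
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(2)
ref = rng.normal(size=(30, 3))          # toy surface node cloud

theta = 0.7                             # rotate about z, scale, translate
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
moved = 2.5 * ref @ R.T + np.array([4.0, -1.0, 0.5])

m1, m2, disparity = procrustes(ref, moved)
print(disparity)  # ~0: the similarity transform is fully removed
```

With the training surfaces in a common frame, the per-node coordinates become comparable across specimens and PCA of the aligned set yields the shape modes.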

  12. Modeling and experiments of the adhesion force distribution between particles and a surface.

    PubMed

    You, Siming; Wan, Man Pun

    2014-06-17

    Due to the existence of surface roughness in real surfaces, the adhesion force between particles and the surface where the particles are deposited exhibits certain statistical distributions. Despite the importance of adhesion force distribution in a variety of applications, the current understanding of modeling adhesion force distribution is still limited. In this work, an adhesion force distribution model based on integrating the root-mean-square (RMS) roughness distribution (i.e., the variation of RMS roughness on the surface in terms of location) into recently proposed mean adhesion force models was proposed. The integration was accomplished by statistical analysis and Monte Carlo simulation. A series of centrifuge experiments were conducted to measure the adhesion force distributions between polystyrene particles (146.1 ± 1.99 μm) and various substrates (stainless steel, aluminum and plastic, respectively). The proposed model was validated against the measured adhesion force distributions from this work and another previous study. Based on the proposed model, the effect of RMS roughness distribution on the adhesion force distribution of particles on a rough surface was explored, showing that both the median and standard deviation of adhesion force distribution could be affected by the RMS roughness distribution. The proposed model could predict both van der Waals force and capillary force distributions and consider the multiscale roughness feature, greatly extending the current capability of adhesion force distribution prediction.
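The paper's core idea, pushing a distribution of local RMS roughness through a mean adhesion-force model via Monte Carlo sampling, can be sketched as follows. The force law below (a smooth-sphere van der Waals term attenuated by roughness) is a simplified stand-in, not the model from the paper, and all parameter values are order-of-magnitude assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
A_H = 1e-19     # Hamaker constant (J), typical order of magnitude
z0 = 4e-10      # minimum separation distance (m)
r_p = 73e-6     # particle radius (m), ~146 um diameter particle

# Sample the location-dependent RMS roughness over the surface (m);
# a lognormal spread is an assumption for illustration.
rms = rng.lognormal(mean=np.log(5e-9), sigma=0.4, size=100_000)

# Illustrative mean-force model: vdW force reduced as roughness grows.
force = (A_H * r_p / (6 * z0**2)) / (1 + rms / z0) ** 2

median_f = np.median(force)   # distribution statistics, not one value
spread = np.std(force)
print(median_f > 0 and spread > 0)
```

The output is a full force distribution rather than a single adhesion value, which is what the centrifuge experiments measure and what the model is validated against.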

  13. Residual stress in glass: indentation crack and fractography approaches.

    PubMed

    Anunmana, Chuchai; Anusavice, Kenneth J; Mecholsky, John J

    2009-11-01

    To test the hypothesis that the indentation crack technique can determine surface residual stresses that are not statistically significantly different from those determined from the analytical procedure using surface cracks, the four-point flexure test, and fracture surface analysis. Soda-lime-silica glass bar specimens (4 mm × 2.3 mm × 28 mm) were prepared and annealed at 650 °C for 30 min before testing. The fracture toughness values of the glass bars were determined from 12 specimens based on induced surface cracks, four-point flexure, and fractographic analysis. To determine the residual stress from the indentation technique, 18 specimens were indented under a 19.6 N load using a Vickers microhardness indenter. Crack lengths were measured within 1 min and 24 h after indentation, and the measured crack lengths were compared with the mean crack lengths of annealed specimens. Residual stress was calculated from an equation developed for the indentation technique. All specimens were fractured in a four-point flexure fixture and the residual stress was calculated from the strength and measured crack sizes on the fracture surfaces. The results show that there was no significant difference between the residual stresses calculated from the two techniques. However, the differences in mean residual stresses calculated within 1 min compared with those calculated after 24 h were statistically significant (p=0.003). This study compared the indentation technique with the fractographic analysis method for determining the residual stress in the surface of soda-lime-silica glass. The indentation method may be useful for estimating residual stress in glass.

  14. Residual stress in glass: indentation crack and fractography approaches

    PubMed Central

    Anunmana, Chuchai; Anusavice, Kenneth J.; Mecholsky, John J.

    2009-01-01

    Objective To test the hypothesis that the indentation crack technique can determine surface residual stresses that are not statistically significantly different from those determined from the analytical procedure using surface cracks, the four-point flexure test, and fracture surface analysis. Methods Soda-lime-silica glass bar specimens (4 mm × 2.3 mm × 28 mm) were prepared and annealed at 650 °C for 30 min before testing. The fracture toughness values of the glass bars were determined from 12 specimens based on induced surface cracks, four-point flexure, and fractographic analysis. To determine the residual stress from the indentation technique, 18 specimens were indented under 19.6 N load using a Vickers microhardness indenter. Crack lengths were measured within 1 min and 24 h after indentation, and the measured crack lengths were compared with the mean crack lengths of annealed specimens. Residual stress was calculated from an equation developed for the indentation technique. All specimens were fractured in a four-point flexure fixture and the residual stress was calculated from the strength and measured crack sizes on the fracture surfaces. Results The results show that there was no significant difference between the residual stresses calculated from the two techniques. However, the differences in mean residual stresses calculated within 1 min compared with those calculated after 24 h were statistically significant (p=0.003). Significance This study compared the indentation technique with the fractographic analysis method for determining the residual stress in the surface of soda-lime silica glass. The indentation method may be useful for estimating residual stress in glass. PMID:19671475

  15. Multi-Scale Surface Descriptors

    PubMed Central

    Cipriano, Gregory; Phillips, George N.; Gleicher, Michael

    2010-01-01

    Local shape descriptors compactly characterize regions of a surface, and have been applied to tasks in visualization, shape matching, and analysis. Classically, curvature has been used as a shape descriptor; however, this differential property characterizes only an infinitesimal neighborhood. In this paper, we provide shape descriptors for surface meshes designed to be multi-scale, that is, capable of characterizing regions of varying size. These descriptors capture statistically the shape of a neighborhood around a central point by fitting a quadratic surface. They therefore mimic differential curvature, are efficient to compute, and encode anisotropy. We show how simple variants of mesh operations can be used to compute the descriptors without resorting to expensive parameterizations, and additionally provide a statistical approximation for reduced computational cost. We show how these descriptors apply to a number of uses in visualization, analysis, and matching of surfaces, particularly to tasks in protein surface analysis. PMID:19834190

  16. Adhesive properties and adhesive joints strength of graphite/epoxy composites

    NASA Astrophysics Data System (ADS)

    Rudawska, Anna; Stančeková, Dana; Cubonova, Nadezda; Vitenko, Tetiana; Müller, Miroslav; Valášek, Petr

    2017-05-01

    The article presents the results of experimental research on the strength of adhesive joints of graphite/epoxy composites and on the surface free energy of the composite surfaces. Two types of graphite/epoxy composite of different thickness, both used in aircraft structures, were tested. Single-lap adhesive joints of the epoxy composites were considered. Adhesive properties were characterized by surface free energy, determined with the Owens-Wendt method. A two-component epoxy adhesive was used to prepare the adhesive joints, and a Zwick/Roell 100 testing machine was used to determine their shear strength. The strength tests showed that the highest value was obtained for adhesive joints of the graphite/epoxy composite of smaller thickness (0.48 mm). Statistical analysis showed statistically significant differences between the strength values at the 0.95 confidence level, but no statistically significant differences in the average values of surface free energy (0.95 confidence level). In each of the results, the dispersive component of surface free energy was much greater than the polar component.

  17. Laplace-Beltrami Eigenvalues and Topological Features of Eigenfunctions for Statistical Shape Analysis

    PubMed Central

    Reuter, Martin; Wolter, Franz-Erich; Shenton, Martha; Niethammer, Marc

    2009-01-01

    This paper proposes the use of the surface-based Laplace-Beltrami and the volumetric Laplace eigenvalues and -functions as shape descriptors for the comparison and analysis of shapes. These spectral measures are isometry invariant and therefore allow for shape comparisons with minimal shape pre-processing. In particular, no registration, mapping, or remeshing is necessary. The discriminatory power of the 2D surface and 3D solid methods is demonstrated on a population of female caudate nuclei (a subcortical gray matter structure of the brain, involved in memory function, emotion processing, and learning) of normal control subjects and of subjects with schizotypal personality disorder. The behavior and properties of the Laplace-Beltrami eigenvalues and -functions are discussed extensively for both the Dirichlet and Neumann boundary conditions, showing advantages of the Neumann vs. the Dirichlet spectra in 3D. Furthermore, topological analyses employing the Morse-Smale complex (on the surfaces) and the Reeb graph (in the solids) are performed on selected eigenfunctions, yielding shape descriptors that are capable of localizing geometric properties and detecting shape differences by indirectly registering topological features such as critical points, level sets and integral lines of the gradient field across subjects. The use of these topological features of the Laplace-Beltrami eigenfunctions in 2D and 3D for statistical shape analysis is novel. PMID:20161035
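The isometry-invariance property that makes spectra usable without registration can be illustrated on a discrete stand-in: the eigenvalues of a graph Laplacian are unchanged when the vertices are relabeled. This toy cycle graph plays the role of the Laplace-Beltrami operator on a mesh (the actual method discretizes the operator on the surface itself):

```python
import numpy as np

def laplacian_spectrum(adj):
    """Sorted eigenvalues of the graph Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return np.sort(np.linalg.eigvalsh(lap))

n = 12
adj = np.zeros((n, n))
for i in range(n):                    # cycle graph C_12
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

perm = np.random.default_rng(7).permutation(n)  # relabel the vertices
adj_perm = adj[np.ix_(perm, perm)]

s1 = laplacian_spectrum(adj)
s2 = laplacian_spectrum(adj_perm)
print(np.allclose(s1, s2))  # identical spectra: descriptor is invariant
```

Because the sorted spectrum ignores vertex labeling (and, on a surface, isometric deformation), two shapes can be compared by comparing their eigenvalue sequences directly.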

  18. Automotive System for Remote Surface Classification.

    PubMed

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-04-01

    In this paper we discuss a novel approach to road surface recognition based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in the fusion of sonar and polarimetric radar data, the extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. Features are extracted from the backscattered signals, and principal component analysis and supervised classification are then applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested on a large number of real surfaces in different weather conditions, with an average correct-classification accuracy of 95%. The results demonstrate that the proposed system architecture and statistical methods allow reliable discrimination of various road surfaces in real conditions.

  19. Automotive System for Remote Surface Classification

    PubMed Central

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-01-01

    In this paper we discuss a novel approach to road surface recognition based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in the fusion of sonar and polarimetric radar data, the extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. Features are extracted from the backscattered signals, and principal component analysis and supervised classification are then applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested on a large number of real surfaces in different weather conditions, with an average correct-classification accuracy of 95%. The results demonstrate that the proposed system architecture and statistical methods allow reliable discrimination of various road surfaces in real conditions. PMID:28368297

  20. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from the laboratory to the field. The additivity model assumes that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of the individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the multi-rate parameters for individual grain size fractions. These statistical properties were then used to analyze the additivity model's prediction of rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to overall U(VI) desorption. The results indicated that the additivity model provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model: U(VI) desorption in the individual grain size fractions has to be simulated in order to apply it. An approximate additivity model for directly scaling the rate constants was subsequently proposed and evaluated, and was found to predict the experimental results well within statistical uncertainty. This study also found that the gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to U(VI) desorption in the sediment.
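The additivity idea itself, predicting a composite-sediment property as a mass-weighted linear sum over grain-size fractions, is a one-line computation. The fraction masses and per-fraction desorption rates below are illustrative numbers, not the study's measurements:

```python
import numpy as np

# Hypothetical mass fractions for four grain-size classes
# (e.g. <0.5, 0.5-2, 2-8, >8 mm) and per-fraction desorption rates.
mass_frac = np.array([0.25, 0.35, 0.25, 0.15])
rate_frac = np.array([4.0e-3, 2.5e-3, 1.2e-3, 0.2e-3])  # 1/s, illustrative

# Additivity prediction: linear, mass-weighted sum over fractions.
composite_rate = np.dot(mass_frac, rate_frac)
print(composite_rate)
```

The study's finding is that this weighted sum works for the desorption *extent* but not directly for the rate constants, which is why the approximate rate-scaling model had to be introduced.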

  1. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    NASA Astrophysics Data System (ADS)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis, but it has rarely been used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high-frequency noise, only the first few terms are used in the expansion and the higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly to reconstructing the surfaces. Motivated by this observation, we present an LB-based method that filters out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the rate of false negatives and hence improves statistical power. The sparse shape model is then applied to investigate the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by the increased statistical power.
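The selection step described above, keeping significant basis terms rather than truncating at a fixed frequency, can be sketched with soft-thresholding: for an orthonormal basis (as LB-eigenfunctions are), the lasso solution reduces to soft-thresholding the expansion coefficients. The coefficient pattern, noise level, and penalty below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 256
true_coef = np.zeros(n)
true_coef[[0, 3, 17, 90]] = [5.0, -4.0, 3.0, 2.5]  # a few active terms,
                                                   # not all low-frequency
coef_hat = true_coef + rng.normal(0, 0.3, n)       # noisy estimates

lam = 1.0  # sparsity penalty
sparse = np.sign(coef_hat) * np.maximum(np.abs(coef_hat) - lam, 0.0)

kept = np.flatnonzero(sparse)
print(kept)  # the genuinely large coefficients survive, regardless of index
```

Note that index 90 survives while many lower indices are zeroed out, which is exactly the behavior that plain low-pass truncation cannot reproduce.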

  2. Color stability of silorane-based composites submitted to accelerated artificial ageing--an in situ study.

    PubMed

    Pires-de-Souza, Fernanda de Carvalho Panzeri; Garcia, Lucas da Fonseca Roberti; Roselino, Lourenço de Moraes Rego; Naves, Lucas Zago

    2011-07-01

    To assess the in situ color stability, surface degradation, and tooth/restoration interface degradation of a silorane-based composite (P90, 3M ESPE) after accelerated artificial ageing (AAA), in comparison with other dimethacrylate monomer-based composites (Z250/Z350, 3M ESPE and Esthet-X, Dentsply). Class V cavities (25 mm² × 2 mm deep) were prepared in 48 bovine incisors, which were randomly allocated into 4 groups of 12 specimens each, according to the type of restorative material used. After polishing, 10 specimens were submitted to initial color readings (Easyshade, Vita) and 2 to analysis by scanning electron microscopy (SEM). Afterwards, the teeth were submitted to AAA for 384 h, which corresponds to 1 year of clinical use, after which new color readings and microscopic images were obtained. The color data were submitted to statistical analysis (1-way ANOVA, Tukey, p<0.05). All the composites showed color alteration above the clinically acceptable level (ΔE ≥ 3.3), and the silorane-based composite showed the highest ΔE (18.6), with a statistically significant difference from the other composites (p<0.05). The SEM images showed small alterations for the dimethacrylate-based composites after AAA, and extensive degradation with rupture at the matrix/particle interface for the silorane-based composite. It may be concluded that the silorane-based composite underwent greater color alteration and greater surface and tooth/restoration interface degradation after AAA.

  3. An oilspill trajectory analysis model with a variable wind deflection angle

    USGS Publications Warehouse

    Samuels, W.B.; Huang, N.E.; Amstutz, D.E.

    1982-01-01

    The oilspill trajectory movement algorithm consists of a vector sum of the surface drift component due to wind and the surface current component. In the U.S. Geological Survey oilspill trajectory analysis model, the surface drift component is assumed to be 3.5% of the wind speed and is rotated 20 degrees clockwise to account for Coriolis effects in the Northern Hemisphere. Field and laboratory data suggest, however, that the deflection angle of the surface drift current can be highly variable. An empirical formula, based on field observations and theoretical arguments relating wind speed to deflection angle, was used to calculate a new deflection angle at each time step in the model. Comparisons of oilspill contact probabilities to coastal areas calculated for constant and variable deflection angles showed that the model is insensitive to this changing angle at low wind speeds. At high wind speeds, some statistically significant differences in contact probabilities did appear. © 1982.
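The trajectory step described above, surface drift at 3.5% of wind speed rotated clockwise by a deflection angle and summed with the current, can be sketched directly. The wind and current vectors are illustrative values; 20 degrees is the USGS model's constant angle:

```python
import numpy as np

def drift(wind_uv, current_uv, deflection_deg=20.0, factor=0.035):
    """Vector sum of wind-driven drift (rotated clockwise) and current."""
    th = np.deg2rad(-deflection_deg)        # negative angle = clockwise
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    return factor * rot @ np.asarray(wind_uv) + np.asarray(current_uv)

wind = [10.0, 0.0]      # m/s, eastward wind (x east, y north)
current = [0.0, 0.05]   # m/s, weak northward current
v = drift(wind, current)
print(v)  # eastward drift deflected toward the south (clockwise)
```

In the variable-angle version of the model, `deflection_deg` would be recomputed from the empirical wind-speed formula at each time step instead of held at 20 degrees.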

  4. Impact of Assimilating Surface Velocity Observations on the Model Sea Surface Height Using the NCOM-4DVAR

    DTIC Science & Technology

    2016-09-26

    statistical analysis is done by not only examining the SSH forecast error across the entire domain, but also by concentrating on the area most densely covered...over (b) entire GoM domain and (d) GLAD region only. Statistics shown for FR (thin black), SSH1 (thick black), and VEL (gray) experiment 96-h SSH...coefficient. To statistically FIG. 9. Sea surface height (m) for AVISO (a) 1 Aug, (b) 20 Aug, (c) 10 Sep, and (d) 30 Sep; for SSH1 experiment (e) 1

  5. Statistical theory and applications of lock-in carrierographic image pixel brightness dependence on multi-crystalline Si solar cell efficiency and photovoltage

    NASA Astrophysics Data System (ADS)

    Mandelis, Andreas; Zhang, Yu; Melnikov, Alexander

    2012-09-01

    A solar cell lock-in carrierographic image generation theory based on the concept of non-equilibrium radiation chemical potential was developed. An optoelectronic diode expression was derived linking the emitted radiative recombination photon flux (current density), the solar conversion efficiency, and the external load resistance via the closed- and/or open-circuit photovoltage. The expression was shown to have a structure similar to the conventional electrical photovoltaic I-V equation, thereby allowing the carrierographic image to be used in a quantitative statistical analysis of the pixel brightness distribution, the outcome being the non-contacting measurement of the mean values of these important parameters averaged over the entire illuminated solar cell surface. This is the optoelectronic equivalent of the electrical (contacting) measurement method using an external resistor circuit and the outputs of the solar cell electrode grid, the latter acting as an averaging distribution network over the surface. The statistical theory was confirmed using multi-crystalline Si solar cells.

  6. Application of short-data methods on extreme surge levels

    NASA Astrophysics Data System (ADS)

    Feng, X.

    2014-12-01

    Tropical cyclone-induced storm surges are among the most destructive natural hazards that impact the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short; the limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets shorter than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. Verified water levels from gauges along the Southwest and Southeast Florida Coast, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically for short data sets. Incorporating the influence of global warming, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced extreme surge activity is observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict extreme surge levels under warming sea surface temperatures: it indicates that with 1 °C of sea surface temperature warming from the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The statistical approaches considered for extreme surge estimation from short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and hurricane and storm surge forecasting and warning systems.
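The 'classical' GEV approach on a short record can be sketched as follows: fit annual surge maxima and read the 100-year return level off the fitted quantile function. The 18-year synthetic record and its parameters are invented stand-ins for a real gauge series:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual surge maxima (metres) from a known GEV, ~18-yr record.
rng = np.random.default_rng(5)
annual_max = genextreme.rvs(c=-0.1, loc=1.0, scale=0.3,
                            size=18, random_state=rng)

# Fit the GEV by maximum likelihood, then take the 100-year return
# level as the quantile exceeded with annual probability 1/100.
c, loc, scale = genextreme.fit(annual_max)
rl_100 = genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)
print(rl_100 > np.median(annual_max))  # return level sits in the far tail
```

With so few years of data the fitted parameters carry wide confidence intervals, which is precisely the short-record uncertainty the abstract's specialized methods are designed to address; the non-stationary variant additionally lets `loc` (and possibly `scale`) depend on sea surface temperature.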

  7. Effect of artificial aging and surface treatment on bond strengths to dental zirconia.

    PubMed

    Perdigão, J; Fernandes, S D; Pinto, A M; Oliveira, F A

    2013-01-01

    The objective of this project was to study the influence of artificial aging and surface treatment on the microtensile bond strengths (μTBS) between zirconia and a phosphate monomer-based self-adhesive cement. Thirty zirconia disks (IPS e.max ZirCAD, Ivoclar Vivadent) were randomly assigned to two aging regimens: AR, used as received, which served as a control, and AG, artificially aged to simulate low-temperature degradation. Subsequently, the disks of each aging regimen were assigned to three surface treatments: NT, no surface treatment; CO, surface silicatization with CoJet sand (3M ESPE); and ZP, zirconia surface treated with Z-Prime Plus (Bisco Inc). Thirty disks were made of Filtek Z250 (3M ESPE) composite resin and luted to the zirconia disks using RelyX Unicem (3M ESPE). The specimens were sectioned with a diamond blade in the X and Y directions to obtain bonded beams with a cross-section of 1.0 ± 0.2 mm. The beams were tested in tensile mode in a universal testing machine at a speed of 0.5 mm/min to measure μTBS. Representative beams were examined fractographically under the SEM. Statistical analysis was carried out with two-way analysis of variance and the Dunnett T3 post hoc test at a significance level of 95%. The mean μTBS for the three AR subgroups (AR-NT, AR-CO, and AR-ZP) were significantly higher than those of the corresponding AG groups (p<0.0001). Both AR-CO and AR-ZP resulted in statistically significantly higher mean bond strengths than group AR-NT (p<0.006 and p<0.0001, respectively). Both AG-CO and AG-ZP resulted in statistically significantly higher mean bond strengths than group AG-NT (both at p<0.0001). Overall, AG decreased the mean μTBS. Under the SEM, mixed failures showed residual cement attached to the zirconia side of the beams. CO resulted in a characteristic roughness of the zirconia surface. AR-ZP was the only group for which the amount of residual cement occupied at least 50% of the interface in mixed failures.

  8. Leads Detection Using Mixture Statistical Distribution Based CRF Algorithm from Sentinel-1 Dual Polarization SAR Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting

    2017-04-01

    Synthetic Aperture Radar (SAR) is significantly important for polar remote sensing since it provides continuous observations day and night and in all weather. SAR can be used for extracting surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd Chinese National Antarctic Research Expedition (CHINARE) cruise set sail into the Antarctic sea ice zone. An accurate spatial distribution of leads in the sea ice zone is essential for route planning of ship navigation. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Fields (CRF) model, and leads characteristics are modeled by statistical distributions in SAR imagery. In the proposed algorithm, a mixture statistical distribution based CRF is developed by considering the contextual information and the statistical characteristics of sea ice to improve leads detection in Sentinel-1A dual polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probability estimated from statistical distributions. For parameter estimation, the Method of Logarithmic Cumulants (MoLC) is exploited for the single statistical distributions, and an iterative Expectation Maximization (EM) algorithm calculates the parameters of the mixture statistical distribution based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted for the initial leads detection. Post-processing procedures, including an aspect-ratio constraint and spatial smoothing, are utilized to improve the visual result.
The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a pixel spacing of 40 meters near the Prydz Bay area, East Antarctica. The main contributions are as follows: 1) a mixture statistical distribution based CRF algorithm is developed for leads detection from Sentinel-1A dual polarization images; 2) the proposed mixture statistical distribution based CRF method is assessed against a single distribution based CRF algorithm; 3) preferred parameter sets, including the statistical distributions, the aspect-ratio threshold, and the spatial smoothing window size, are provided. In the future, the proposed algorithm will be extended to operational processing of the Sentinel series data sets owing to its low computational cost and high accuracy in leads detection.
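    The EM step for such a mixture model can be illustrated with a two-component Gaussian mixture; the backscatter values in dB are invented, and Gaussians stand in for the SAR-specific distributions the paper actually mixes:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical backscatter samples: dark open-water leads vs brighter sea ice
# (dB values made up for illustration).
x = np.concatenate([rng.normal(-25, 1.5, 400), rng.normal(-14, 2.0, 600)])

# Two-component Gaussian mixture fitted by EM.
w = np.array([0.5, 0.5])
mu = np.array([x.min(), x.max()])
var = np.array([x.var(), x.var()])
for _ in range(100):
    # E-step: posterior responsibility of each component for each pixel
    pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances
    n = r.sum(axis=0)
    w = n / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n

lead_component = int(np.argmin(mu))   # the darker component ~ open-water leads
```

    In the CRF setting, the resulting per-pixel posteriors would feed the unary potentials before graph-cut inference.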

  9. Statistical crystallography of surface micelle spacing

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1992-01-01

    The aggregation of the recently reported surface micelles of block polyelectrolytes is analyzed using techniques of statistical crystallography. A polygonal lattice (Voronoi mosaic) connects center-to-center points, yielding statistical agreement with crystallographic predictions; Aboav-Weaire's law and Lewis's law are verified. This protocol supplements the standard analysis of surface micelles leading to aggregation number determination and, when compared to numerical simulations, allows further insight into the random partitioning of surface films. In particular, agreement with Lewis's law has been linked to the geometric packing requirements of filling two-dimensional space which compete with (or balance) physical forces such as interfacial tension, electrostatic repulsion, and van der Waals attraction.
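    The crystallographic check can be sketched with SciPy's Voronoi tessellation; the point pattern is random rather than a digitized micelle image, and Lewis's law then predicts mean cell area increasing linearly with the number of sides:

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(2)
pts = rng.random((400, 2))           # stand-in for micelle centre positions
vor = Voronoi(pts)

sides, areas = [], []
for region_idx in vor.point_region:
    region = vor.regions[region_idx]
    if -1 in region or len(region) == 0:
        continue                     # skip unbounded edge cells
    poly = vor.vertices[region]
    # keep only cells well inside the unit square to avoid edge distortion
    if poly.min() < 0.05 or poly.max() > 0.95:
        continue
    x, y = poly[:, 0], poly[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    sides.append(len(region))
    areas.append(area)

sides, areas = np.array(sides), np.array(areas)
# Lewis's law: mean cell area grows linearly with the number of sides n,
# so the fitted slope should be positive.
slope = np.polyfit(sides, areas, 1)[0]
```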

  10. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
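    The transition-state-theory part of such a rate-constant calculation reduces to the Eyring form; the partition-function ratio below is an illustrative assumption, not a SurfKin value:

```python
import math

kB = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(T, Ea, q_ratio=1.0):
    """Transition-state-theory rate constant (1/s) for a surface elementary step.

    Ea is the activation energy in J/mol; q_ratio stands in for the ratio of
    partition functions of the transition state and the adsorbed reactant
    (set to 1 here as an illustrative assumption)."""
    return (kB * T / h) * q_ratio * math.exp(-Ea / (R * T))
```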

  11. Identifying drought response of semi-arid aeolian systems using near-surface luminescence profiles and changepoint analysis, Nebraska Sandhills.

    NASA Astrophysics Data System (ADS)

    Buckland, Catherine; Bailey, Richard; Thomas, David

    2017-04-01

    Two billion people living in drylands are affected by land degradation. Sediment erosion by wind and water removes fertile soil and destabilises landscapes. Vegetation disturbance is a key driver of dryland erosion caused by both natural and human forcings: drought, fire, land use, and grazing pressure. A quantified understanding of vegetation cover sensitivities and the resultant surface change under these forcing factors is needed if the vegetation and landscape responses to future climate change and human pressure are to be better predicted. Using quartz luminescence dating and statistical changepoint analysis (Killick & Eckley, 2014), this study demonstrates the ability to identify step-changes in the depositional age of near-surface sediments. Lx/Tx luminescence profiles coupled with statistical analysis show the use of near-surface sediments in providing a high-resolution record of recent system response and aeolian system thresholds. This research determines how the environment has recorded and retained sedimentary evidence of drought response and land use disturbances over the last two hundred years, across both individual landforms and the wider Nebraska Sandhills. Identifying surface deposition and comparing it with records of climate, fire and land use changes allows us to assess the sensitivity and stability of the surface sediment to a range of forcing factors. Killick, R and Eckley, IA. (2014) "changepoint: An R Package for Changepoint Analysis." Journal of Statistical Software, (58) 1-19.
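    The changepoint idea (here in its simplest at-most-one-change form, not the full set of methods in the cited R package) can be sketched as follows; the step signal is synthetic, a stand-in for an Lx/Tx profile with depth:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical Lx/Tx-style signal down a profile: a step marks renewed deposition.
signal = np.concatenate([rng.normal(1.0, 0.1, 60), rng.normal(2.0, 0.1, 40)])

def single_changepoint(x):
    """Index minimising the summed within-segment squared error
    (AMOC-style mean-change detection)."""
    n = len(x)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 2):
        cost = (((x[:k] - x[:k].mean()) ** 2).sum()
                + ((x[k:] - x[k:].mean()) ** 2).sum())
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

k = single_changepoint(signal)
```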

  12. Relationship Between Column-Density and Surface Mixing Ratio: Statistical Analysis of O3 and NO2 Data from the July 2011 Maryland DISCOVER-AQ Mission

    NASA Technical Reports Server (NTRS)

    Flynn, Clare; Pickering, Kenneth E.; Crawford, James H.; Lamsol, Lok; Krotkov, Nickolay; Herman, Jay; Weinheimer, Andrew; Chen, Gao; Liu, Xiong; Szykman, James

    2014-01-01

    To investigate the ability of column (or partial column) information to represent surface air quality, results of linear regression analyses between surface mixing ratio data and column abundances for O3 and NO2 are presented for the July 2011 Maryland deployment of the DISCOVER-AQ mission. Data collected by the P-3B aircraft, ground-based Pandora spectrometers, Aura/OMI satellite instrument, and simulations for July 2011 from the CMAQ air quality model during this deployment provide a large and varied data set, allowing this problem to be approached from multiple perspectives. O3 columns typically exhibited a statistically significant and high degree of correlation with surface data (R² > 0.64) in the P-3B data set, a moderate degree of correlation (0.16 < R² < 0.64) in the CMAQ data set, and a low degree of correlation (R² < 0.16) in the Pandora and OMI data sets. NO2 columns typically exhibited a low to moderate degree of correlation with surface data in each data set. The results of linear regression analyses for O3 exhibited smaller errors relative to the observations than NO2 regressions. These results suggest that O3 partial column observations from future satellite instruments with sufficient sensitivity to the lower troposphere can be meaningful for surface air quality analysis.
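    The regression step can be sketched as follows; the column and surface values are synthetic, with an assumed linear relation plus noise, and the correlation bands are the ones quoted above:

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(4)
# Hypothetical paired samples: column abundance vs surface mixing ratio
# (units arbitrary; the linear relation and noise level are assumptions).
column = rng.normal(40, 5, 200)
surface = 1.5 * column + rng.normal(0, 4, 200)

res = linregress(column, surface)
r2 = res.rvalue ** 2

# Degree-of-correlation bands used in the study
degree = "high" if r2 > 0.64 else "moderate" if r2 > 0.16 else "low"
```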

  13. Trend-surface analysis of morphometric parameters: A case study in southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Grohmann, Carlos Henrique

    2005-10-01

    Trend-surface analysis was carried out on data from the morphometric parameters isobase and hydraulic gradient. The study area, located on the eastern border of the Quadrilátero Ferrífero, southeastern Brazil, presents four main geomorphological units: one characterized by fluvial dissection, two of mountainous relief with a scarp of hundreds of meters between them, and a flat plateau in the central portion of the fluvially dissected terrain. Morphometric maps were evaluated in GRASS-GIS and statistics were computed in the R statistical language, using the spatial package. Analysis of variance (ANOVA) was performed to test the significance of each surface and the significance of increasing the polynomial degree. The best results were achieved with a sixth-order surface for isobase and a second-order surface for hydraulic gradient. The shapes and orientations of residual-map contours for selected trends were compared with structures inferred from several morphometric maps, and a good correlation is present.
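    Fitting polynomial trend surfaces of increasing degree and testing the improvement with an F statistic (the ANOVA step) can be sketched as follows; the coordinates and the quadratic "morphometric" signal are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical morphometric samples at scattered points: quadratic trend + noise.
x, y = rng.random(300), rng.random(300)
z = 2 + x - 3 * y + 4 * x * y + x ** 2 + rng.normal(0, 0.05, 300)

def design(x, y, degree):
    """Columns x^i * y^j with i + j <= degree (full polynomial trend surface)."""
    return np.column_stack([x ** i * y ** j
                            for i in range(degree + 1)
                            for j in range(degree + 1 - i)])

def fit_rss(degree):
    A = design(x, y, degree)
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return ((z - A @ coef) ** 2).sum(), A.shape[1]

# F-test for the significance of raising the polynomial degree from 1 to 2
rss1, p1 = fit_rss(1)
rss2, p2 = fit_rss(2)
F = ((rss1 - rss2) / (p2 - p1)) / (rss2 / (len(z) - p2))
```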

  14. In situ evaluation of a new silorane-based composite resin's bioadhesion properties.

    PubMed

    Claro-Pereira, Diogo; Sampaio-Maia, Benedita; Ferreira, Carla; Rodrigues, Andreia; Melo, Luís F; Vasconcelos, Mário R

    2011-12-01

    The aim of the present study was to compare, in situ, the initial dental plaque formation on a recently developed silorane-based composite resin, Filtek Silorane, and on a widely used methacrylate-based composite resin, Synergy D6, and to relate possible differences to surface free energy, hydrophobicity, and type of organic matrix. Discs of Filtek Silorane and Synergy D6 were prepared and polished equally in order to attain the same surface roughness. Water, formamide, and 1-bromonaphthalene contact angles were determined, and the surface free energy and hydrophobicity of the materials were calculated. Two discs of each material were mounted in individual oral splints and exposed to the oral cavity of 20 participants for 4 h. After this period the microbial adhesion to both materials' surfaces was measured by two different approaches: DAPI staining and plate counting. Statistical analysis was performed using non-parametric tests. The surface roughness (Ra parameter) was similar between the two materials and lower than 0.2 μm. Mean water and formamide contact angles were significantly higher for Filtek Silorane, which presented significantly lower surface free energy and a greater degree of hydrophobicity in comparison to Synergy D6. The bioadhesion potential evaluated by either DAPI staining or plate counting did not differ between the two materials. In contrast to previous in vitro studies, the present in situ study found no statistically significant differences with respect to bacterial adhesion between Filtek Silorane and Synergy D6, despite the differences found for surface free energy and hydrophobicity. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  15. Variation of Water Quality Parameters with Siltation Depth for River Ichamati Along International Border with Bangladesh Using Multivariate Statistical Techniques

    NASA Astrophysics Data System (ADS)

    Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.

    2014-12-01

    Rivers are considered among the main sources of freshwater worldwide; hence the analysis and maintenance of this water resource are globally regarded as matters of major concern. This paper deals with the assessment of the surface water quality of the Ichamati river using multivariate statistical techniques. Eight distinct surface water quality observation stations were located and samples were collected. Multivariate statistical techniques were then applied to the physico-chemical parameters and siltation depths of the collected samples. Cluster analysis is used to determine the relations between surface water quality and the siltation depth of the river Ichamati. Multiple regression and mathematical equation modeling have been used to characterize the surface water quality of the Ichamati river on the basis of physico-chemical parameters. It was found that the water quality of the downstream river differed from that of the upstream. The analysis of the water quality parameters of the Ichamati river clearly indicates a high pollution load on the river water, which can be attributed to agricultural discharge, tidal effect, and soil erosion. The results further reveal that water quality degrades as the depth of siltation increases.
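    The cluster-analysis step can be sketched with hierarchical (Ward) clustering; the eight station vectors below are invented stand-ins for standardized physico-chemical parameters plus siltation depth:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(12)
# Hypothetical standardised water-quality vectors (4 variables) for 8 stations:
# an upstream-like group and a downstream-like group, made up for illustration.
upstream = rng.normal(0.0, 0.3, size=(4, 4))
downstream = rng.normal(2.0, 0.3, size=(4, 4))
stations = np.vstack([upstream, downstream])

# Ward linkage on the station vectors, cut into two clusters
Z = linkage(stations, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```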

  16. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity on the resulting statistical model. Thirteen statistically significant effects were observed to influence rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces were used to further understand the physical phenomena behind several of the statistically significant results.
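    The D-optimal idea of picking 36 informative runs from the 729-run candidate space can be sketched with a naive greedy selection; real D-optimal software uses exchange algorithms, and the quadratic model terms below match the linear, bilinear, and curvilinear effects named above:

```python
import itertools
import numpy as np

# Full 3-level factorial in 6 factors: 3**6 = 729 candidate runs.
candidates = np.array(list(itertools.product([-1, 0, 1], repeat=6)), dtype=float)

def model_matrix(X):
    """Quadratic response-surface model: intercept, linear, bilinear, curvilinear."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(6)]
    cols += [X[:, i] * X[:, j] for i in range(6) for j in range(i + 1, 6)]
    cols += [X[:, i] ** 2 for i in range(6)]
    return np.column_stack(cols)

M = model_matrix(candidates)
p = M.shape[1]                       # 1 + 6 + 15 + 6 = 28 model terms

# Greedy sketch of D-optimality: grow the design one run at a time, always
# adding the candidate that most increases det(X'X) (via its leverage).
rng = np.random.default_rng(6)
chosen = list(rng.choice(len(M), size=p, replace=False))
while len(chosen) < 36:
    XtX = M[chosen].T @ M[chosen]
    best, best_gain = None, -np.inf
    for i in range(len(M)):
        if i in chosen:
            continue
        x = M[i]
        gain = x @ np.linalg.solve(XtX + 1e-9 * np.eye(p), x)
        if gain > best_gain:
            best, best_gain = i, gain
    chosen.append(best)
```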

  17. Factorial-based response-surface modeling with confidence intervals for optimizing thermal-optical transmission analysis of atmospheric black carbon.

    PubMed

    Conny, J M; Norris, G A; Gould, T R

    2009-03-09

    Thermal-optical transmission (TOT) analysis measures black carbon (BC) in atmospheric aerosol on a fibrous filter. The method pyrolyzes organic carbon (OC) and employs laser light absorption to distinguish BC from the pyrolyzed OC; however, the instrument does not necessarily separate the two physically. In addition, a comprehensive temperature protocol for the analysis based on the Beer-Lambert law remains elusive. Here, empirical response-surface modeling was used to show how the temperature protocol in TOT analysis can be modified to distinguish pyrolyzed OC from BC based on the Beer-Lambert law. We determined the apparent specific absorption cross sections for pyrolyzed OC (σ_char) and BC (σ_BC), which accounted for individual absorption enhancement effects within the filter. Response-surface models of these cross sections were derived from a three-factor central-composite factorial experimental design: the temperature and duration of the high-temperature step in the helium phase, and the heating increase in the helium-oxygen phase. The response surface for σ_BC, which varied with instrument conditions, revealed a ridge indicating the correct conditions for OC pyrolysis in helium. The intersection of the σ_BC and σ_char surfaces indicated the conditions where the cross sections were equivalent, satisfying an important assumption upon which the method relies. Surfaces of the 95% confidence intervals defined a confidence region for a range of pyrolysis conditions. Analyses of wintertime samples from Seattle, WA revealed a temperature between 830 °C and 850 °C as most suitable for the helium high-temperature step lasting 150 s. However, a temperature as low as 750 °C could not be rejected statistically.
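    The coded points of a three-factor central composite design can be generated as follows; the axial distance uses the rotatable choice alpha = (2^3)^(1/4), and the number of centre replicates is an arbitrary illustration:

```python
import itertools
import numpy as np

# Coded design points for a three-factor central composite design (CCD):
# 8 factorial corners, 6 axial (star) points at +/- alpha, and centre replicates.
alpha = 8 ** 0.25                            # rotatable choice for k = 3 factors
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([sgn * alpha * np.eye(3)[i]
                  for i in range(3) for sgn in (-1, 1)])
centre = np.zeros((4, 3))                    # 4 centre replicates (arbitrary here)
design = np.vstack([corners, axial, centre])
```

    A quadratic response surface fitted to responses measured at these 18 runs is what yields surfaces like those for σ_BC and σ_char.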

  18. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    A statistical analysis and a correlation between the pore-size distribution and the fracture strength distribution, using the theory of extreme-value statistics, are presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
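    The linearized Weibull plot mentioned above can be sketched on synthetic strengths (a true Weibull modulus of 10 and a scale of 700 MPa are assumed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical bend-strength data (MPa); two-parameter Weibull by construction.
m_true, s0 = 10.0, 700.0
strengths = np.sort(s0 * rng.weibull(m_true, 50))

# Linearised Weibull plot: ln(-ln(1 - F)) vs ln(strength) is a straight line
# of slope m (the Weibull modulus) if a two-parameter Weibull holds; curvature
# in this plot is the nonlinear trend the abstract refers to.
F = (np.arange(1, 51) - 0.5) / 50            # median-rank-style plotting positions
xw = np.log(strengths)
yw = np.log(-np.log(1.0 - F))
m_est = np.polyfit(xw, yw, 1)[0]
```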

  19. Characterization of platelet adhesion under flow using microscopic image sequence analysis.

    PubMed

    Machin, M; Santomaso, A; Cozzi, M R; Battiston, M; Mazzuccato, M; De Marco, L; Canu, P

    2005-07-01

    A method for quantitative analysis of platelet deposition under flow is discussed here. The model system is based upon perfusion of blood platelets over an adhesive substrate immobilized on a glass coverslip acting as the lower surface of a rectangular flow chamber. The perfusion apparatus is mounted onto an inverted microscope equipped with epifluorescent illumination and an intensified CCD video camera. Characterization is based on information obtained from a specific image analysis method applied to continuous sequences of microscopic images. Platelet recognition across the sequence of images is based on a time-dependent, bidimensional, Gaussian-like pdf. Once a platelet is located, the variation of its position and shape as a function of time (i.e., the platelet history) can be determined. Analyzing the history, we can establish whether the platelet is moving on the surface, the frequency of this movement, and the distance traveled before it resumes the velocity of a non-interacting cell. We can therefore determine how long the adhesion lasts, which is correlated with the resistance of the platelet-substrate bond. This algorithm enables the dynamic quantification of trajectories, as well as residence times and arrest and release frequencies, for a high number of platelets at the same time. Statistically significant conclusions on platelet-surface interactions can then be obtained. An image analysis tool of this kind can dramatically help the investigation and characterization of the thrombogenic properties of artificial surfaces such as those used in artificial organs and biomedical devices.

  20. An Analysis of Effects of Variable Factors on Weapon Performance

    DTIC Science & Technology

    1993-03-01

    ALTERNATIVE ANALYSIS A. CATEGORICAL DATA ANALYSIS Statistical methodology for categorical data analysis traces its roots to the work of Francis Galton in the...choice of statistical tests. This thesis examines an analysis performed by Surface Warfare Development Group (SWDG). The SWDG analysis is shown to be...incorrect due to the misapplication of testing methods. A corrected analysis is presented and recommendations suggested for changes to the testing

  1. Multi-resolutional shape features via non-Euclidean wavelets: Applications to statistical analysis of cortical thickness

    PubMed Central

    Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.

    2014-01-01

    Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
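    The flavor of a multi-scale, per-vertex descriptor on a mesh can be sketched with heat-kernel filters built from a graph Laplacian; the paper uses proper spectral graph wavelets rather than heat kernels, and the tiny path graph below stands in for a cortical mesh:

```python
import numpy as np

# Tiny path-graph "mesh" and a signal (stand-in for cortical thickness values).
n = 20
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A               # combinatorial graph Laplacian

signal = np.zeros(n)
signal[10] = 1.0                             # an impulse, for illustration

# Multi-scale descriptor per vertex: the signal filtered by heat kernels
# exp(-tL) at several scales t, built from the Laplacian eigendecomposition.
evals, evecs = np.linalg.eigh(L)

def heat_filter(t):
    return evecs @ (np.exp(-t * evals) * (evecs.T @ signal))

descriptor = np.column_stack([heat_filter(t) for t in (0.5, 2.0, 8.0)])
```

    Each row of `descriptor` summarizes one vertex's local context at three resolutions; group analysis would then compare these rows across subjects.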

  2. Electromagnetic wave scattering from rough terrain

    NASA Astrophysics Data System (ADS)

    Papa, R. J.; Lennon, J. F.; Taylor, R. L.

    1980-09-01

    This report presents two aspects of a program designed to calculate electromagnetic scattering from rough terrain: (1) the use of statistical estimation techniques to determine topographic parameters and (2) the results of a single-roughness-scale scattering calculation based on those parameters, including comparison with experimental data. In the statistical part of the present calculation, digitized topographic maps are used to generate data bases for the required scattering cells. The application of estimation theory to the data leads to the specification of statistical parameters for each cell. The estimated parameters are then used in a hypothesis test to decide on a probability density function (PDF) that represents the height distribution in the cell. Initially, the formulation uses a single observation of the multivariate data. A subsequent approach involves multiple observations of the heights on a bivariate basis, and further refinements are being considered. The electromagnetic scattering analysis, the second topic, calculates the amount of specular and diffuse multipath power reaching a monopulse receiver from a pulsed beacon positioned over a rough Earth. The program allows for spatial inhomogeneities and multiple specular reflection points. The analysis of shadowing by the rough surface has been extended to the case where the surface heights are distributed exponentially. The calculated loss of boresight pointing accuracy attributable to diffuse multipath is then compared with the experimental results. The extent of the specular region, the use of localized height variations, and the effect of the azimuthal variation in power pattern are all assessed.
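    The PDF hypothesis-testing step can be sketched with a Kolmogorov-Smirnov comparison of fitted candidates; the heights are synthetic and exponential by construction, and testing against fitted parameters biases the p-values, so this is a model-ranking sketch rather than a rigorous test:

```python
import numpy as np
from scipy.stats import kstest, norm, expon

rng = np.random.default_rng(8)
# Hypothetical terrain-cell heights (m); exponentially distributed, echoing the
# report's extension of the shadowing analysis to exponential heights.
heights = expon(scale=12.0).rvs(size=500, random_state=rng)

# Fit both candidate PDFs and keep the one the KS statistic favors.
cand = {
    "normal": norm(*norm.fit(heights)),
    "exponential": expon(*expon.fit(heights)),
}
pvals = {name: kstest(heights, dist.cdf).pvalue for name, dist in cand.items()}
chosen = max(pvals, key=pvals.get)
```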

  3. Statistical analysis of the 70 meter antenna surface distortions

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.; Chuang, K. L.

    1987-01-01

    Statistical analysis of surface distortions of the 70 meter NASA/JPL antenna, located at Goldstone, was performed. The purpose of this analysis is to verify whether deviations due to gravity loading can be treated as quasi-random variables with normal distribution. Histograms of the RF pathlength error distribution for several antenna elevation positions were generated. The results indicate that the deviations from the ideal antenna surface are not normally distributed. The observed density distribution for all antenna elevation angles is taller and narrower than the normal density, which results in large positive values of kurtosis and a significant amount of skewness. The skewness of the distribution changes from positive to negative as the antenna elevation changes from zenith to horizon.
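    Computing the skewness and excess kurtosis that summarize such a distortion histogram is direct with SciPy; the "pathlength errors" below are synthetic, built to be taller-peaked and right-skewed relative to a normal:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(9)
# Hypothetical RF pathlength errors: a narrow core plus a one-sided tail,
# mimicking a taller-than-normal, skewed distribution.
errors = np.concatenate([rng.normal(0, 0.2, 900), rng.exponential(1.0, 100)])

g1 = skew(errors)       # > 0: right tail (the sign would flip zenith -> horizon)
g2 = kurtosis(errors)   # excess kurtosis > 0: taller/narrower than the normal
```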

  4. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    NASA Astrophysics Data System (ADS)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While such morphologic information offers clear visual clues to geologic processes and properties, it is less easily communicated quantitatively. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis, including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield calderas are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
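    The Zahn and Roskies tangent-angle shape function can be sketched on a synthetic closed contour (an ellipse standing in for a digitized crater rim):

```python
import numpy as np

# A closed contour sampled as x,y points: an ellipse as a crater-rim stand-in.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.column_stack([2.0 * np.cos(t), 1.0 * np.sin(t)])

# Zahn & Roskies shape function: tangent angle, minus the perfect-circle trend,
# as a function of normalised arc length; invariant to translation and scale.
d = np.diff(np.vstack([contour, contour[:1]]), axis=0)
seg = np.hypot(d[:, 0], d[:, 1])
s = np.cumsum(seg) / seg.sum()               # normalised arc length in (0, 1]
theta = np.unwrap(np.arctan2(d[:, 1], d[:, 0]))
zr = theta - theta[0] - 2 * np.pi * s        # zero everywhere for a circle
```

    The `zr` samples (or their Fourier coefficients) form the multivariate shape descriptor that a CVA model would then classify.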

  5. Effect of citric acid, tetracycline, and doxycycline on instrumented periodontally involved root surfaces: A SEM study

    PubMed Central

    Chahal, Gurparkash Singh; Chhina, Kamalpreet; Chhabra, Vipin; Bhatnagar, Rakhi; Chahal, Amna

    2014-01-01

    Background: A surface smear layer consisting of organic and inorganic material is formed on the root surface following mechanical instrumentation and may inhibit the formation of new connective tissue attachment to the root surface. Modification of the tooth surface by root conditioning has resulted in improved connective tissue attachment and has advanced the goal of reconstructive periodontal treatment. Aim: The aim of this study was to compare the effects of citric acid, tetracycline, and doxycycline on instrumented periodontally involved root surfaces in vitro using a scanning electron microscope. Settings and Design: A total of 45 dentin samples obtained from 15 extracted, scaled, and root-planed teeth were divided into three groups. Materials and Methods: The root conditioning agents were applied with cotton pellets using a passive burnishing technique for 5 minutes. The samples were then examined by scanning electron microscopy. Statistical Analysis Used: The statistical analysis was carried out using the Statistical Package for Social Sciences (SPSS Inc., Chicago, IL, version 15.0 for Windows). For all quantitative variables, means and standard deviations were calculated and compared. For more than two groups, ANOVA was applied; for multiple comparisons, post hoc tests with Bonferroni correction were used. Results: Upon statistical analysis, the root conditioning agents used in this study were found to be effective in removing the smear layer, uncovering and widening the dentin tubules, and unmasking the dentin collagen matrix. Conclusion: Tetracycline HCl was found to be the best root conditioner among the three agents used. PMID:24744541

  6. [Optimization of prokaryotic expression conditions of Leptospira interrogans trigeminy genus-specific protein antigen based on surface response analysis].

    PubMed

    Wang, Jiang; Luo, Dongjiao; Sun, Aihua; Yan, Jie

    2008-07-01

    Lipoproteins LipL32 and LipL21 and the transmembrane protein OMPL1 have been confirmed as superficial genus-specific antigens of Leptospira interrogans, which can be used as antigens for developing a universal genetic engineering vaccine. In order to obtain high expression of an artificial fusion gene lipL32/1-lipL21-ompL1/2, we optimized the prokaryotic expression conditions. We used surface response analysis based on a central composite design (CCD) to optimize the culture conditions for expression of the new antigen protein in recombinant Escherichia coli DE3. The culture conditions included initial pH, induction start time, post-induction time, isopropyl beta-D-thiogalactopyranoside (IPTG) concentration, and temperature. The maximal production of antigen protein was 37.78 mg/l. The optimal culture conditions for high expression of the recombinant fusion protein were determined: initial pH 7.9, induction start time 2.5 h, post-induction time 5.38 h, 0.20 mM IPTG, and a post-induction temperature of 31 degrees C. Surface response analysis based on the CCD increased the target production. This statistical method reduced the number of experiments required for optimization and enabled rapid identification and integration of the key culture condition parameters for optimizing recombinant protein expression.

  7. Proximal caries detection: Sirona Sidexis versus Kodak Ektaspeed Plus.

    PubMed

    Khan, Emad A; Tyndall, Donald A; Ludlow, John B; Caplan, Daniel

    2005-01-01

    This study compared the accuracy of intraoral film and a charge-coupled device (CCD) receptor for proximal caries detection. Four observers evaluated images of the proximal surfaces of 40 extracted posterior teeth. The presence or absence of caries was scored using a five-point confidence scale. The actual status of each surface was determined from ground-section histology. Responses were evaluated by means of receiver operating characteristic (ROC) analysis. Areas under the ROC curves (Az) were assessed through a paired t-test. The performance of the CCD-based intraoral sensor was not statistically different from that of Ektaspeed Plus film in detecting proximal caries.
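
The ROC analysis above reduces to a rank comparison: Az equals the probability that a carious surface receives a higher confidence score than a sound one (ties count half). A minimal sketch with invented five-point confidence ratings:

```python
# Az via the rank (Mann-Whitney) formulation of the area under the ROC curve.
def roc_auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5          # ties contribute half
    return wins / (len(scores_pos) * len(scores_neg))

carious = [5, 4, 4, 3, 5]   # invented observer scores on carious surfaces
sound   = [1, 2, 3, 2, 1]   # invented observer scores on sound surfaces

Az = roc_auc(carious, sound)
```

Per-observer Az values for film and sensor would then be compared with a paired t-test, as the study describes.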

  8. On the application of the Principal Component Analysis for an efficient climate downscaling of surface wind fields

    NASA Astrophysics Data System (ADS)

    Chavez, Roberto; Lozano, Sergio; Correia, Pedro; Sanz-Rodrigo, Javier; Probst, Oliver

    2013-04-01

    With the purpose of efficiently and reliably generating long-term wind resource maps for the wind energy industry, the application and verification of a statistical methodology for the climate downscaling of surface wind fields is presented in this work. The procedure is based on the combination of the Monte Carlo and Principal Component Analysis (PCA) statistical methods. First, the Monte Carlo method is used to create a large number of daily-based annual time series, so-called climate representative years, by stratified sampling of a 33-year-long time series corresponding to the available period of the NCAR/NCEP global reanalysis data set (R-2). Second, the representative years are evaluated such that the best set is chosen according to its capability to recreate the temporal and spatial Sea Level Pressure (SLP) fields of the R-2 data set. The measure of this correspondence is based on the Euclidean distance between the Empirical Orthogonal Function (EOF) spaces generated by the PCA decomposition of the SLP fields from the long-term and the representative-year data sets. The methodology was verified by comparing the selected 365-day period against a 9-year period of wind fields generated by dynamically downscaling the Global Forecast System data with the mesoscale model SKIRON for the Iberian Peninsula. These results showed that, compared to the traditional method of dynamically downscaling a random 365-day period, the error in the average wind velocity of the PCA-selected representative year was reduced by almost 30%. Moreover, the Mean Absolute Errors (MAE) in the monthly and daily wind profiles were also reduced by almost 25% across all SKIRON grid points. The methodology yielded maximum errors in mean wind speed of 0.8 m/s and maximum MAE in the monthly curves of 0.7 m/s.
    Beyond these aggregate figures, this work shows the spatial distribution of the errors across the Iberian domain and additional wind statistics such as velocity and directional frequency. Additional repetitions were performed to prove the reliability and robustness of this kind of statistical-dynamical downscaling method.
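
The selection step can be sketched as follows, with random stand-ins for the SLP fields: Monte Carlo candidate years are scored by the Euclidean distance between their leading EOFs and those of the full record, and the best-scoring candidate is kept. The array dimensions and the `score` function are illustrative assumptions, not the paper's exact metric:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_grid = 33 * 365, 50                 # long-term record: days x grid points
slp = rng.normal(size=(n_days, n_grid))       # stand-in for R-2 SLP anomalies

def leading_eofs(fields, k=3):
    """EOFs = right singular vectors of the centered field matrix (PCA)."""
    anom = fields - fields.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[:k]                             # k spatial patterns of length n_grid

target = leading_eofs(slp)                    # EOFs of the full 33-year record

def score(candidate_days):
    """Distance between EOF spaces; smaller = more representative year."""
    eofs = leading_eofs(slp[candidate_days])
    # an EOF's sign is arbitrary, so compare each pattern up to sign
    return sum(min(np.linalg.norm(e - t), np.linalg.norm(e + t))
               for e, t in zip(eofs, target))

# Monte Carlo: draw candidate 365-day "years" and keep the best-scoring one.
candidates = [rng.choice(n_days, size=365, replace=False) for _ in range(20)]
best = min(candidates, key=score)
```

In the paper the candidates are stratified samples of whole days (preserving seasonality) rather than uniform draws, but the scoring logic is the same.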

  9. Surface topography study of prepared 3D printed moulds via 3D printer for silicone elastomer based nasal prosthesis

    NASA Astrophysics Data System (ADS)

    Abdullah, Abdul Manaf; Din, Tengku Noor Daimah Tengku; Mohamad, Dasmawati; Rahim, Tuan Noraihan Azila Tuan; Akil, Hazizan Md; Rajion, Zainul Ahmad

    2016-12-01

    Conventional prosthesis fabrication depends highly on the manual skill of the laboratory technologist. Developments in 3D printing technology offer great help in fabricating affordable and fast, yet esthetically acceptable, prostheses. This study was conducted to discover the potential of 3D printed moulds for indirect silicone elastomer based nasal prosthesis fabrication. Moulds were designed using computer aided design (CAD) software (Solidworks, USA) and converted into the standard tessellation language (STL) file format. Three moulds with layer thicknesses of 0.1, 0.2 and 0.3 mm were printed utilizing a polymer filament based 3D printer (Makerbot Replicator 2X, Makerbot, USA). One additional mould was printed utilizing a liquid resin based 3D printer (Objet 30 Scholar, Stratasys, USA) as control. The printed moulds were then used to fabricate maxillofacial silicone specimens (n = 10 per mould). A surface profilometer (Surfcom Flex, Accretech, Japan), digital microscope (KH77000, Hirox, USA) and scanning electron microscope (Quanta FEG 450, Fei, USA) were used to measure the surface roughness as well as the topological properties of the fabricated silicone. One-way ANOVA was employed to compare the surface roughness of the fabricated silicone elastomer. The results demonstrated significant differences in the surface roughness of the fabricated silicone (p<0.01). Further post hoc analysis also revealed significant differences between silicone specimens fabricated using the different 3D printed moulds (p<0.01). A 3D printed mould was successfully prepared and characterized. With surface topography that could be further enhanced, and given its inexpensive and rapid mould fabrication, the polymer filament based 3D printer shows potential for indirect silicone elastomer based nasal prosthesis fabrication.

  10. Mass-Spectrometry-Based Proteomics Reveals Organ-Specific Expression Patterns To Be Used as Forensic Evidence.

    PubMed

    Dammeier, Sascha; Nahnsen, Sven; Veit, Johannes; Wehner, Frank; Ueffing, Marius; Kohlbacher, Oliver

    2016-01-04

    Standard forensic procedures to examine bullets after an exchange of fire include a mechanical or ballistic reconstruction of the event. While it is routine to identify which projectile hit a subject by DNA analysis of biological material on the surface of the projectile, it is rather difficult to determine which projectile caused the lethal injury, often the crucial point with regard to legal proceedings. Under fundamental law, it is the duty of the public authority to make every endeavor to solve every homicide case. To improve forensic examinations, we present a forensic proteomic method to investigate biological material from a projectile's surface and determine the tissues traversed by it. To obtain a range of relevant samples, different major bovine organs were penetrated with projectiles experimentally. After tryptic "on-surface" digestion, mass-spectrometry-based proteome analysis, and statistical data analysis, we were able to achieve a cross-validated organ classification accuracy of >99%. Different types of anticipated external variables exhibited no prominent influence on the findings. In addition, shooting experiments were performed to validate the results. Finally, we show that these concepts could be applied to a real case of murder to substantially improve the forensic reconstruction.

  11. Decomposition of Ag-based soldering alloys used in space maintainers after intra-oral exposure. A retrieval analysis study.

    PubMed

    Soteriou, Despo; Ntasi, Argyro; Papagiannoulis, Lisa; Eliades, Theodore; Zinelis, Spiros

    2014-02-01

    The aim of this study was to evaluate the elemental alterations of Ag soldering alloys used in space maintainers after intra-oral exposure. Twenty devices were fabricated using two different soldering alloys: US (Dentaurum Universal Silver Solder, n = 10) and OS (Leone Orthodontic Solder, n = 10). All devices were manufactured by the same technician. Surface morphology and elemental quantitative analysis of the soldering alloys before and after intra-oral placement in patients were determined by scanning electron microscopy and energy-dispersive X-ray microanalysis (SEM/EDX). Statistical analysis was performed with the t-test, Mann-Whitney test and Pearson's correlation. For all tests a 95% confidence level was used (α = 0.05). Both soldering alloys demonstrated a substantial increase in surface roughness after intra-oral aging. Statistical analysis illustrated a significant decrease in the Cu and Zn content after treatment. OS demonstrated higher Cu release than US (p < 0.05). The remaining relative concentrations of Cu and Zn after treatment did not show any correlation (p > 0.05) with intra-oral exposure time, apart from Zn in OS (r = 0.840, p = 0.04). Both soldering alloys demonstrated a significant Cu and Zn reduction after intra-oral exposure that may raise biocompatibility concerns.

  12. Study of deformation evolution during failure of rock specimens using laser-based vibration measurements

    NASA Astrophysics Data System (ADS)

    Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.

    2017-12-01

    The aim of this paper is to analyze experimental data on the dynamic response of a marble specimen in uniaxial compression. To do so, we apply methods of mathematical statistics. The lateral surface velocity evolution obtained with a laser Doppler vibrometer serves as the data for analysis. The registered data were regarded as a time series that reflects the deformation evolution of the specimen loaded up to failure. The revealed changes in statistical parameters were considered as precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects states of dynamic chaos and self-organized criticality.
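
The precursor idea above rests on a simple statistic: the lag-1 autocorrelation of the velocity series rises as the signal becomes persistent. The two series below are synthetic illustrations (white noise vs. an AR(1)-like persistent signal), not vibrometer data:

```python
import random

def lag1_autocorr(x):
    """Sample autocorrelation of a series at lag 1."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

random.seed(1)
white_noise = [random.gauss(0, 1) for _ in range(2000)]   # uncorrelated baseline

# persistent AR(1)-like signal, the kind of behavior expected near failure
persistent, prev = [], 0.0
for _ in range(2000):
    prev = 0.9 * prev + random.gauss(0, 1)
    persistent.append(prev)

r_noise = lag1_autocorr(white_noise)        # near 0
r_persistent = lag1_autocorr(persistent)    # near 0.9
```

Tracking such a statistic in a sliding window over the measured series is one way to turn the qualitative "precursor" observation into a monitored quantity.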

  13. Lineament and polygon patterns on Europa

    NASA Technical Reports Server (NTRS)

    Pieri, D. C.

    1981-01-01

    A classification scheme is presented for the lineaments and associated polygonal patterns observed on the surface of Europa, and the frequency distribution of the polygons is discussed in terms of stress-relief fracturing of the surface. The lineaments are divided on the basis of albedo, morphology, orientation and characteristic geometry into eight groups based on Voyager 2 images taken at a best resolution of 4 km. The lineaments in turn define a system of polygons varying in size from small reticulate patterns at the limit of resolution to individual polygons of 1,000,000 sq km. Preliminary analysis of polygon side frequency distributions reveals a class of polygons with statistics similar to those found in complex terrestrial terrains, particularly in areas of well-oriented stresses, a class with similar statistics around the antijovian point, and a class with a distribution similar to those seen in terrestrial tensional fracture patterns. Speculations concerning the processes giving rise to the lineament patterns are presented.

  14. Improved Bond Equations for Fiber-Reinforced Polymer Bars in Concrete.

    PubMed

    Pour, Sadaf Moallemi; Alam, M Shahria; Milani, Abbas S

    2016-08-30

    This paper explores a set of new equations to predict the bond strength between fiber-reinforced polymer (FRP) rebar and concrete. The proposed equations are based on a comprehensive statistical analysis of existing experimental results in the literature. Namely, the parameters most influential on the bond behavior of FRP-reinforced concrete were first identified by applying a factorial analysis to part of the available database. Then the database, which contains 250 pullout tests, was divided into four groups based on the concrete compressive strength and the rebar surface. Afterward, nonlinear regression analysis was performed for each study group in order to determine the bond equations. The results show that the proposed equations predict bond strengths more accurately than other previously reported models.
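
The fitting step can be illustrated with a hypothetical bond law: a power-law model tau = a·sqrt(f_c)·(c/d)^b is fit to invented pullout data by taking logarithms, which makes the problem linear least squares. The functional form and coefficients are placeholders, not the paper's equations:

```python
import numpy as np

rng = np.random.default_rng(2)
f_c = rng.uniform(25, 60, 40)        # concrete compressive strength (MPa)
c_d = rng.uniform(1.0, 4.0, 40)      # cover-to-diameter ratio
# invented "true" bond law (a = 0.8, b = 0.35) plus multiplicative scatter
tau = 0.8 * np.sqrt(f_c) * c_d**0.35 * rng.lognormal(0.0, 0.05, 40)

# taking logs linearizes the model:
# log(tau) - 0.5*log(f_c) = log(a) + b*log(c/d)
y = np.log(tau) - 0.5 * np.log(f_c)
X = np.column_stack([np.ones_like(c_d), np.log(c_d)])
(log_a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
a = np.exp(log_a)
```

With separate fits per group (as in the paper's four strength/surface groups), each group gets its own coefficient set.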

  15. Correlative bacteriologic and micro-computed tomographic analysis of mandibular molar mesial canals prepared by self-adjusting file, reciproc, and twisted file systems.

    PubMed

    Siqueira, José F; Alves, Flávio R F; Versiani, Marco A; Rôças, Isabela N; Almeida, Bernardo M; Neves, Mônica A S; Sousa-Neto, Manoel D

    2013-08-01

    This ex vivo study evaluated the disinfecting and shaping ability of 3 protocols used in the preparation of mesial root canals of mandibular molars by means of correlative bacteriologic and micro-computed tomographic (μCT) analysis. The mesial canals of extracted mandibular molars were contaminated with Enterococcus faecalis for 30 days and assigned to 3 groups based on their anatomic configuration as determined by μCT analysis, according to the preparation technique (Self-Adjusting File [ReDent-Nova, Ra'anana, Israel], Reciproc [VDW, Munich, Germany], and Twisted File [SybronEndo, Orange, CA]). In all groups, 2.5% NaOCl was the irrigant. Canal samples were taken before (S1) and after instrumentation (S2), and bacterial quantification was performed using culture. Next, mesial roots were subjected to additional μCT analysis in order to evaluate the shaping of the canals. All instrumentation protocols promoted a highly significant intracanal bacterial reduction (P < .001). Intergroup quantitative and qualitative comparisons disclosed no significant differences between groups (P > .05). As for shaping, no statistical difference was observed between the techniques regarding the mean percentage of volume increase, the surface area increase, the unprepared surface area, and the relative unprepared surface area (P > .05). Correlative analysis showed no statistically significant relationship between bacterial reduction and the mean percentage increase of the analyzed parameters (P > .05). The 3 instrumentation systems have similar disinfecting and shaping performance in the preparation of mesial canals of mandibular molars. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  16. An Analysis of LANDSAT-4 Thematic Mapper Geometric Properties

    NASA Technical Reports Server (NTRS)

    Walker, R. E.; Zobrist, A. L.; Bryant, N. A.; Gokhman, B.; Friedman, S. Z.; Logan, T. L.

    1984-01-01

    LANDSAT Thematic Mapper P-data of Washington, D.C., Harrisburg, PA, and Salton Sea, CA are analyzed to determine the magnitudes and causes of error in the geometric conformity of the data to known Earth surface geometry. Several tests of data geometry are performed. Intraband and interband correlation and registration are investigated, exclusive of map-based ground truth. The magnitudes and statistical trends of pixel offsets between a single band's mirror scans (due to processing procedures) are computed, and the inter-band integrity of registration is analyzed. A line-to-line correlation analysis is included.

  17. Predicting water-surface fluctuation of continental lakes: A RS and GIS based approach in Central Mexico

    USGS Publications Warehouse

    Mendoza, M.E.; Bocco, G.; Bravo, M.; Lopez, Granados E.; Osterkamp, W.R.

    2006-01-01

    Changes in the water-surface area occupied by Cuitzeo Lake, Mexico, during the 1974-2001 period are analysed in this study. The research is based on remote sensing and geographic information system techniques, as well as statistical analysis. High-resolution satellite image data were used to analyse the 1974-2000 period, and very low-resolution satellite image data were used for the 1997-2001 period. The long-term analysis (1974-2000) indicated that there were temporal changes in the surface area of Cuitzeo Lake and that these changes were related to precipitation and temperatures that occurred in the previous year. Short-term monitoring (1997-2001) showed that the Cuitzeo Lake surface is lowering. Field observations also demonstrated that yearly desiccation is recurrent, particularly in the western section of the lake. Results suggested that this behaviour was probably due to a drought period in the basin that began in the mid 1990s. Regression models constructed from long-term data showed that fluctuations of the lake level can be estimated from the monthly mean precipitation and temperatures of the previous year. © Springer Science + Business Media, Inc. 2006.
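
The regression idea above, lake area modeled on the *previous* year's precipitation and temperature, can be sketched with ordinary least squares on invented data. The coefficients and noise level are illustrative, not the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_years = 27                                  # stand-in for the 1974-2000 record
precip = rng.normal(700, 80, n_years)         # mm, year t-1 (invented)
temp = rng.normal(16, 1.0, n_years)           # deg C, year t-1 (invented)
# invented relationship: area rises with lagged precipitation, falls with lagged temperature
area = 200 + 0.1 * precip - 5.0 * temp + rng.normal(0, 2, n_years)

# ordinary least squares: area ~ intercept + precip(t-1) + temp(t-1)
X = np.column_stack([np.ones(n_years), precip, temp])
coef, *_ = np.linalg.lstsq(X, area, rcond=None)
```

The one-year lag is implemented simply by pairing each year's area with the prior year's climate values before building `X`.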

  18. Roughness of human enamel surface submitted to different prophylaxis methods.

    PubMed

    Castanho, Gisela Muassab; Arana-Chavez, Victor E; Fava, Marcelo

    2008-01-01

    The purpose of this in vitro study was to evaluate alterations in the surface roughness and micromorphology of human enamel submitted to three prophylaxis methods. Sixty-nine caries-free molars with exposed labial surfaces were divided into three groups. Group I was treated with a rotary instrument set at low speed, a rubber cup and a mixture of water and pumice; group II with a rotary instrument set at low speed, a rubber cup and the prophylaxis paste Herjos-F (Vigodent S/A Indústria e Comércio, Rio de Janeiro, Brazil); and group III with the sodium bicarbonate spray Profi II Ceramic (Dabi Atlante Indústrias Médico Odontológicas Ltda, Ribeirão Preto, Brazil). All procedures were performed by the same operator for 10 s, and samples were rinsed and stored in distilled water. Pre- and post-treatment surface evaluation was completed using a surface profilometer (Perthometer S8P, Mahr Perthen, Germany) in 54 samples. In addition, the remaining samples were coated with gold and examined in a scanning electron microscope (SEM). The results of this study were statistically analyzed with the paired t-test (Student), the Kruskal-Wallis test and Dunn's test (5%). The sodium bicarbonate spray led to significantly rougher surfaces than the pumice paste. The prophylaxis paste showed no statistically significant difference when compared with the other methods. Based on SEM analysis, the sodium bicarbonate spray produced an irregular surface with granular material and erosions. Based on this study, it can be concluded that enamel surface roughness increased when teeth were treated with the sodium bicarbonate spray, compared with teeth treated with the pumice paste.

  19. On the costs and benefits of emotional labor: a meta-analysis of three decades of research.

    PubMed

    Hülsheger, Ute R; Schewe, Anna F

    2011-07-01

    This article provides a quantitative review of the link of emotional labor (emotion-rule dissonance, surface acting, and deep acting) with well-being and performance outcomes. The meta-analysis is based on 494 individual correlations drawn from a final sample of 95 independent studies. Results revealed substantial relationships of emotion-rule dissonance and surface acting with indicators of impaired well-being (ρs between .39 and .48) and job attitudes (ρs between -.24 and -.40) and a small negative relationship with performance outcomes (ρs between -.20 and -.05). Overall, deep acting displayed weak relationships with indicators of impaired well-being and job attitudes but positive relationships with emotional performance and customer satisfaction (ρs of .18 and .37, respectively). A meta-analytic regression analysis provides information on the unique contribution of emotion-rule dissonance, surface acting, and deep acting in statistically predicting well-being and performance outcomes. Furthermore, a mediation analysis confirms theoretical models of emotional labor which suggest that surface acting partially mediates the relationship of emotion-rule dissonance with well-being. Implications for future research as well as pragmatic ramifications for organizational practices are discussed in conclusion.
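
The core pooling step in this kind of correlational meta-analysis is a sample-size-weighted mean correlation (Hunter-Schmidt style). A bare-bones sketch with invented (r, n) pairs, not the 494 correlations analyzed above:

```python
# Each study contributes a correlation r and a sample size n;
# the pooled estimate weights each r by its n.
studies = [(0.42, 120), (0.39, 250), (0.51, 80), (0.35, 310)]  # invented

total_n = sum(n for _, n in studies)
r_bar = sum(r * n for r, n in studies) / total_n
```

Published meta-analyses additionally correct for artifacts such as measurement unreliability before pooling, which is why reported values are ρ (corrected) rather than raw r̄.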

  20. Delineation of marine ecosystem zones in the northern Arabian Sea during winter

    NASA Astrophysics Data System (ADS)

    Shalin, Saleem; Samuelsen, Annette; Korosov, Anton; Menon, Nandini; Backeberg, Björn C.; Pettersson, Lasse H.

    2018-03-01

    The spatial and temporal variability of marine autotrophic abundance, expressed as chlorophyll concentration, is monitored from space and used to delineate the surface signature of marine ecosystem zones with distinct optical characteristics. An objective zoning method is presented and applied to satellite-derived Chlorophyll a (Chl a) data from the northern Arabian Sea (50-75° E and 15-30° N) during the winter months (November-March). Principal component analysis (PCA) and cluster analysis (CA) were used to statistically delineate the Chl a into zones with similar surface distribution patterns and temporal variability. The PCA identifies the principal components of variability and the CA splits these into zones based on similar characteristics. Based on the temporal variability of the Chl a pattern within the study area, the statistical clustering revealed six distinct ecological zones. The obtained zones are related to the Longhurst provinces to evaluate how they compare to established ecological provinces. The Chl a variability within each zone was then compared with the variability of oceanic and atmospheric properties, viz. mixed-layer depth (MLD), wind speed, sea-surface temperature (SST), photosynthetically active radiation (PAR), nitrate and dust optical thickness (DOT), the latter as an indication of atmospheric input of iron to the ocean. The analysis showed that in all zones, peak values of Chl a coincided with low SST and deep MLD. The rate of decrease in SST and the deepening of the MLD are observed to trigger the algal bloom events in the first four zones. Lagged cross-correlation analysis shows that peak Chl a follows peak MLD and the SST minima. The MLD time lag is shorter than the SST lag by 8 days, indicating that the cool surface conditions might have enhanced mixing, leading to increased primary production in the study area. An analysis of monthly climatological nitrate values showed increased concentrations associated with the deepening of the mixed layer.
The input of iron seems to be important in both the open-ocean and coastal areas of the northern and north-western parts of the northern Arabian Sea, where the seasonal variability of the Chl a pattern closely follows the variability of iron deposition.
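
The lagged cross-correlation analysis described above can be sketched as follows: Chl a is correlated against MLD shifted by k samples, and the reported lag is the one maximizing the correlation. The two series here are synthetic sinusoids in which Chl a is built to trail MLD by exactly 3 samples:

```python
import math

def pearson(x, y):
    """Sample Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = math.sqrt(sum((a - mx) ** 2 for a in x))
    dy = math.sqrt(sum((b - my) ** 2 for b in y))
    return num / (dx * dy)

mld = [math.sin(2 * math.pi * t / 40) for t in range(200)]  # mixed-layer depth proxy
chl = [mld[t - 3] for t in range(200)]                      # Chl a trails MLD by 3 samples

def lagged_corr(lead, trail, k):
    """Correlation of trail[t] with lead[t - k]."""
    n = len(lead)
    return pearson(lead[:n - k], trail[k:])

best_lag = max(range(10), key=lambda k: lagged_corr(mld, chl, k))  # recovers lag 3
```

With 8-day composites, a `best_lag` of k samples corresponds to a lead time of 8k days, which is how the MLD/SST lag difference above is expressed.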

  1. Investigation of Magnetotelluric Source Effect Based on Twenty Years of Telluric and Geomagnetic Observation

    NASA Astrophysics Data System (ADS)

    Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.

    2016-12-01

    The magnetotelluric method is widely applied for the investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric recordings in two perpendicular orientations. In practical exploration, the accurate estimation of Z necessitates the application of robust statistical methods for two reasons: 1) the geomagnetic and telluric time series are contaminated by man-made noise components, and 2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics mitigate both effects when Z is estimated for subsurface investigations. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. Twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. A detailed investigation of the time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results related to long-term periodic modulations, up to the solar-cycle scale, and to occasional deviations of the electromagnetic impedance and hence of the reconstructed equivalent ionospheric source effects.

  2. Esophageal wall dose-surface maps do not improve the predictive performance of a multivariable NTCP model for acute esophageal toxicity in advanced stage NSCLC patients treated with intensity-modulated (chemo-)radiotherapy.

    PubMed

    Dankers, Frank; Wijsman, Robin; Troost, Esther G C; Monshouwer, René; Bussink, Johan; Hoffmann, Aswin L

    2017-05-07

    In our previous work, a multivariable normal-tissue complication probability (NTCP) model for acute esophageal toxicity (AET) Grade  ⩾2 after highly conformal (chemo-)radiotherapy for non-small cell lung cancer (NSCLC) was developed using multivariable logistic regression analysis incorporating clinical parameters and mean esophageal dose (MED). Since the esophagus is a tubular organ, spatial information of the esophageal wall dose distribution may be important in predicting AET. We investigated whether the incorporation of esophageal wall dose-surface data with spatial information improves the predictive power of our established NTCP model. For 149 NSCLC patients treated with highly conformal radiation therapy, esophageal wall dose-surface histograms (DSHs) and polar dose-surface maps (DSMs) were generated. DSMs were used to generate new DSHs and dose-length-histograms that incorporate spatial information of the dose-surface distribution. From these histograms dose parameters were derived and univariate logistic regression analysis showed that they correlated significantly with AET. Following our previous work, new multivariable NTCP models were developed using the most significant dose histogram parameters based on univariate analysis (19 in total). However, the 19 new models incorporating esophageal wall dose-surface data with spatial information did not show improved predictive performance (area under the curve, AUC range 0.79-0.84) over the established multivariable NTCP model based on conventional dose-volume data (AUC  =  0.84). For prediction of AET, based on the proposed multivariable statistical approach, spatial information of the esophageal wall dose distribution is of no added value and it is sufficient to only consider MED as a predictive dosimetric parameter.
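
The modeling machinery above, logistic regression on dose predictors scored by AUC, can be sketched in a minimal univariable form. The dose-response curve, cohort, and coefficients below are invented, not the authors' 149-patient model:

```python
import math
import random

# Invented cohort: mean esophageal dose (MED) and a binary toxicity outcome
# drawn from an assumed logistic dose-response curve.
random.seed(4)
dataset = []
for _ in range(200):
    med = random.uniform(5, 45)                      # MED in Gy (invented)
    x = (med - 25.0) / 10.0                          # standardized dose
    p_true = 1 / (1 + math.exp(-(1.5 * x - 0.25)))   # invented dose-response
    dataset.append((x, 1 if random.random() < p_true else 0))

# NTCP-style logistic regression fit by batch gradient descent.
w = b = 0.0
for _ in range(2000):
    gw = gb = 0.0
    for x, y in dataset:
        p = 1 / (1 + math.exp(-(w * x + b)))
        gw += (p - y) * x
        gb += p - y
    w -= 0.1 * gw / len(dataset)
    b -= 0.1 * gb / len(dataset)

# Predictive performance: area under the ROC curve of the fitted model.
pos = [w * x + b for x, y in dataset if y == 1]
neg = [w * x + b for x, y in dataset if y == 0]
wins = sum(1.0 if sp > sn else 0.5 if sp == sn else 0.0
           for sp in pos for sn in neg)
auc = wins / (len(pos) * len(neg))
```

The multivariable case adds more predictor columns (clinical factors, DSH/DSM-derived parameters) to the same likelihood; competing models are then compared by AUC as in the abstract.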

  3. Esophageal wall dose-surface maps do not improve the predictive performance of a multivariable NTCP model for acute esophageal toxicity in advanced stage NSCLC patients treated with intensity-modulated (chemo-)radiotherapy

    NASA Astrophysics Data System (ADS)

    Dankers, Frank; Wijsman, Robin; Troost, Esther G. C.; Monshouwer, René; Bussink, Johan; Hoffmann, Aswin L.

    2017-05-01

    In our previous work, a multivariable normal-tissue complication probability (NTCP) model for acute esophageal toxicity (AET) Grade  ⩾2 after highly conformal (chemo-)radiotherapy for non-small cell lung cancer (NSCLC) was developed using multivariable logistic regression analysis incorporating clinical parameters and mean esophageal dose (MED). Since the esophagus is a tubular organ, spatial information of the esophageal wall dose distribution may be important in predicting AET. We investigated whether the incorporation of esophageal wall dose-surface data with spatial information improves the predictive power of our established NTCP model. For 149 NSCLC patients treated with highly conformal radiation therapy, esophageal wall dose-surface histograms (DSHs) and polar dose-surface maps (DSMs) were generated. DSMs were used to generate new DSHs and dose-length-histograms that incorporate spatial information of the dose-surface distribution. From these histograms dose parameters were derived and univariate logistic regression analysis showed that they correlated significantly with AET. Following our previous work, new multivariable NTCP models were developed using the most significant dose histogram parameters based on univariate analysis (19 in total). However, the 19 new models incorporating esophageal wall dose-surface data with spatial information did not show improved predictive performance (area under the curve, AUC range 0.79-0.84) over the established multivariable NTCP model based on conventional dose-volume data (AUC  =  0.84). For prediction of AET, based on the proposed multivariable statistical approach, spatial information of the esophageal wall dose distribution is of no added value and it is sufficient to only consider MED as a predictive dosimetric parameter.

  4. An Adaptive Buddy Check for Observational Quality Control

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of the surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to the prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
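
The adaptive idea above can be reduced to a toy: an observation is flagged only if it deviates from its "buddies" by more than a multiple of a standard deviation that is re-estimated from the surrounding data, so tolerances widen in highly variable (storm-like) regions. This is a deliberately simplified sketch, not the paper's maximum-likelihood covariance estimation, and all values are invented:

```python
def buddy_check(obs, tolerance_factor=3.0):
    """Flag each observation against the mean and spread of its buddies."""
    flags = []
    for i, v in enumerate(obs):
        buddies = [x for j, x in enumerate(obs) if j != i]
        m = sum(buddies) / len(buddies)
        var = sum((x - m) ** 2 for x in buddies) / (len(buddies) - 1)
        flags.append(abs(v - m) > tolerance_factor * max(var, 1e-9) ** 0.5)
    return flags

calm  = [10.1, 10.0, 9.9, 10.2, 10.0, 25.0]   # one gross outlier in a quiet field
storm = [10.0, 18.0, 25.0, 33.0, 40.0, 25.0]  # large but mutually consistent values

# the same reading of 25.0 is rejected in the calm field but kept in the storm
flag_calm  = buddy_check(calm)[-1]    # flagged: buddies are tight around 10
flag_storm = buddy_check(storm)[-1]   # accepted: local variability is large
```

This is exactly the behavior highlighted in the abstract: during the December 1999 storm, locally estimated variability keeps extreme but genuine observations in the analysis.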

  5. A statistical approach to discriminate between non-fallers, rare fallers and frequent fallers in older adults based on posturographic data.

    PubMed

    Maranesi, E; Merlo, A; Fioretti, S; Zemp, D D; Campanini, I; Quadri, P

    2016-02-01

    Identification of future non-fallers, infrequent fallers and frequent fallers among older people would permit focusing the delivery of prevention programs on selected individuals. Posturographic parameters have been proven to differentiate between non-fallers and frequent fallers, but not between the first group and infrequent fallers. In this study, postural stability with eyes open and closed, on both a firm and a compliant surface, and while performing a cognitive task was assessed in a consecutive sample of 130 cognitively able elderly people, mean age 77 (7) years, categorized as non-fallers (N=67), infrequent fallers (one or two falls, N=45) and frequent fallers (more than two falls, N=18) according to their last-year fall history. Principal Component Analysis was used to select the most significant features from a set of 17 posturographic parameters. Next, variables derived from the principal component analysis were used to test, in each task, differences between the three groups. One parameter based on a combination of a set of Centre of Pressure anterior-posterior variables obtained from the eyes-open on a compliant surface task was statistically different among all groups, thus distinguishing infrequent fallers from both non-fallers (P<0.05) and frequent fallers (P<0.05). For the first time, a method based on posturographic data to retrospectively discriminate infrequent fallers was obtained. The joint use of both the eyes-open on a compliant surface condition and this new parameter could be used, in a future study, to improve the performance of protocols and to verify the ability of this method to identify new fallers in elderly people without cognitive impairment. Copyright © 2015 Elsevier Ltd. All rights reserved.
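
The reduction step above, collapsing 17 correlated posturographic parameters into a few principal-component scores per subject, can be sketched with PCA via SVD. The measurement matrix here is randomly generated with a built-in low-rank structure, not the study's centre-of-pressure data:

```python
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_params = 130, 17
latent = rng.normal(size=(n_subjects, 3))        # 3 hidden "balance" factors (invented)
loadings = rng.normal(size=(3, n_params))
X = latent @ loadings + 0.1 * rng.normal(size=(n_subjects, n_params))

Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize each parameter
U, S, Vt = np.linalg.svd(Z, full_matrices=False) # PCA of the standardized data
explained = S**2 / np.sum(S**2)                  # variance fraction per component

scores = Z @ Vt[:3].T                            # each subject summarized by 3 PC scores
```

Group comparisons (non-fallers vs. infrequent vs. frequent fallers) are then run on the low-dimensional `scores` instead of the raw 17 parameters.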

  6. A T1 and DTI fused 3D corpus callosum analysis in pre- vs. post-season contact sports players

    NASA Astrophysics Data System (ADS)

    Lao, Yi; Law, Meng; Shi, Jie; Gajawelli, Niharika; Haas, Lauren; Wang, Yalin; Leporé, Natasha

    2015-01-01

    Sports related traumatic brain injury (TBI) is a worldwide public health issue, and damage to the corpus callosum (CC) has been considered an important indicator of TBI. However, contact sports players suffer repeated hits to the head during the course of a season even in the absence of diagnosed concussion, and less is known about the effect of these hits on callosal anatomy. In addition, T1-weighted and diffusion tensor brain magnetic resonance images (DTI) have typically been analyzed separately, but a joint analysis of both types of data may increase statistical power and give a more complete understanding of the anatomical correlates of subclinical concussions in these athletes. Here, for the first time, we fuse T1 surface-based morphometry and a new DTI analysis on 3D surface representations of the CC into a single statistical analysis of these subjects. Our combined method increases the power to detect differences between pre- and post-season contact sports players. Alterations are found in the ventral genu, isthmus, and splenium of the CC. Our findings may inform future health assessments of contact sports players. The method presented here is also the first truly multimodal diffusion and T1-weighted analysis of the CC, and may be useful for detecting anatomical changes in the corpus callosum in other multimodal datasets.

  7. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    Accurate simulation of interior aerodynamic noise is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces has been shown to be the key factor governing car interior aerodynamic noise at high frequencies and high speeds. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are applied to the model to obtain valid results for car interior noise analysis. This model is a solid foundation for further optimization of car interior noise control. Once the subsystems whose power contributions are most sensitive for car interior noise are identified by a comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their acoustic and damping characteristics. Subsequent vehicle testing shows that the interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with the unsteady aerodynamic pressure calculation on body surfaces and improved sound/damping material properties. A reduction of more than 2 dB is achieved at centre frequencies above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.

  8. INTERFRAGMENTARY SURFACE AREA AS AN INDEX OF COMMINUTION SEVERITY IN CORTICAL BONE IMPACT

    PubMed Central

    Beardsley, Christina L.; Anderson, Donald D.; Marsh, J. Lawrence; Brown, Thomas D.

    2008-01-01

    Summary A monotonic relationship is expected between energy absorption and fracture surface area generation for brittle solids, based on fracture mechanics principles. It was hypothesized that this relationship is demonstrable in bone, to the point that on a continuous scale, comminuted fractures created with specific levels of energy delivery could be discriminated from one another. Using bovine cortical bone segments in conjunction with digital image analysis of CT fracture data, the surface area freed by controlled impact fracture events was measured. The results demonstrated a statistically significant (p<0.0001) difference in measured de novo surface area between three specimen groups, over a range of input energies from 0.423 to 0.702 J/g. Local material properties were also incorporated into these measurements via CT Hounsfield intensities. This study confirms that comminution severity of bone fractures can indeed be measured on a continuous scale, based on energy absorption. This lays a foundation for similar assessments in human injuries. PMID:15885492

  9. Quality improvement of diagnosis of the electromyography data based on statistical characteristics of the measured signals

    NASA Astrophysics Data System (ADS)

    Selivanova, Karina G.; Avrunin, Oleg G.; Zlepko, Sergii M.; Romanyuk, Sergii O.; Zabolotna, Natalia I.; Kotyra, Andrzej; Komada, Paweł; Smailova, Saule

    2016-09-01

    Research and systematization of motor disorders, taking into account clinical and neurophysiologic phenomena, is an important and topical problem in neurology. The article describes a technique for decomposing surface electromyography (EMG) signals using Principal Component Analysis. The decomposition is achieved by a set of algorithms specially developed for EMG analysis. The accuracy was verified by calculating the Mahalanobis distance and the probability of error.

  10. Impact of satellite-based data on FGGE general circulation statistics

    NASA Technical Reports Server (NTRS)

    Salstein, David A.; Rosen, Richard D.; Baker, Wayman E.; Kalnay, Eugenia

    1987-01-01

    The NASA Goddard Laboratory for Atmospheres (GLA) analysis/forecast system was run in two different parallel modes in order to evaluate the influence that data from satellites and other FGGE observation platforms can have on analyses of large scale circulation; in the first mode, data from all observation systems were used, while in the second only conventional upper air and surface reports were used. The GLA model was also integrated for the same period without insertion of any data; an independent objective analysis based only on rawinsonde and pilot balloon data is also performed. A small decrease in the vigor of the general circulation is noted to follow from the inclusion of satellite observations.

  11. Reanalysis Intercomparison on a Surface Wind Statistical Downscaling Exercise over Northeastern North America.

    NASA Astrophysics Data System (ADS)

    Lucio-Eceiza, Etor E.; Fidel González-Rouco, J.; Navarro, Jorge; García-Bustamante, Elena; Beltrami, Hugo; Rojas-Labanda, Cristina

    2017-04-01

    The area of North Eastern North America is located in a privileged position for the study of wind behaviour, as it lies within the track of many of the extratropical cyclones that travel across that half of the continent. During the winter season, cyclonic activity and wind intensity are higher in the region, offering a great opportunity to analyse the relationships of the surface wind field with various large-scale configurations. The analysis of the wind behaviour is conducted via a statistical downscaling method based on Canonical Correlation Analysis (CCA). This methodology exploits the relationships between the main modes of circulation over the North Atlantic and Pacific sectors and the behaviour of an observational surface wind database. For this exercise, various predictor variables have been selected (surface wind, SLP, geopotential height at 850 and 500 hPa, and the thermal thickness between these two levels), obtained from all the global reanalysis products available to date. Our predictand field consists of an observational surface wind dataset with 525 sites distributed over North Eastern North America, spanning a period of about 60 years (1953-2010). These data have previously been subjected to an exhaustive quality control process. A sensitivity analysis of the methodology to different parameter configurations has been carried out, covering the reanalysis product, window size, predictor variables, number of retained EOFs and CCA modes, and cross-validation subset (to test the robustness of the method). An evaluation of the predictive skill of the wind estimations has also been conducted. Overall, the methodology offers a good representation of the wind variability, which is very consistent across all the reanalysis products. The wind directly obtained from the reanalyses offers better temporal correlation but a larger range and, in many cases, a worse representation of the local variability.
The long observational period has also permitted the study of intra to multidecadal variability as the statistical relationship obtained by this method also allows for the reconstruction of the regional wind behaviour back to the mid 19th century. For this task we have used two 20th century reanalysis products as well as two additional instrumental sea level pressure datasets.
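
    A minimal sketch of the EOF/CCA downscaling chain described above, on synthetic fields (the grid sizes, station count, truncations and the calibration split are placeholders, not the study's configuration): EOF-reduce predictor and predictand, couple them with CCA over a calibration period, and predict the withheld period.

    ```python
    # Illustrative sketch only; all fields are synthetic stand-ins.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(1)
    n_t = 240                                        # time steps
    slp = rng.normal(size=(n_t, 500))                # large-scale field (e.g. SLP grid)
    wind = (slp[:, :50] @ rng.normal(size=(50, 25))) * 0.1 \
           + rng.normal(size=(n_t, 25))              # 25 surface wind stations

    pcs_x = PCA(n_components=10).fit_transform(slp)  # leading EOFs of the predictor
    pcs_y = PCA(n_components=10).fit_transform(wind) # leading EOFs of the predictand
    cca = CCA(n_components=3).fit(pcs_x[:180], pcs_y[:180])   # calibration period
    pcs_y_hat = cca.predict(pcs_x[180:])                       # validation period

    r = np.corrcoef(pcs_y_hat[:, 0], pcs_y[180:, 0])[0, 1]
    print(f"validation correlation of the leading predictand mode: {r:.2f}")
    ```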

  12. The Relationship Between Surface Curvature and Abdominal Aortic Aneurysm Wall Stress.

    PubMed

    de Galarreta, Sergio Ruiz; Cazón, Aitor; Antón, Raúl; Finol, Ender A

    2017-08-01

    The maximum diameter (MD) criterion is the most important factor when predicting risk of rupture of abdominal aortic aneurysms (AAAs). Elevated wall stress has also been linked to a high risk of aneurysm rupture, yet computing AAA wall stress remains uncommon in clinical practice. The purpose of this study is to assess whether other characteristics of AAA geometry are statistically correlated with wall stress. Using in-house segmentation and meshing algorithms, 30 patient-specific AAA models were generated for finite element analysis (FEA). These models were subsequently used to estimate wall stress and maximum diameter and to evaluate the spatial distributions of wall thickness, cross-sectional diameter, mean curvature, and Gaussian curvature. Data analysis consisted of statistical correlations of the aforementioned geometry metrics with wall stress for the 30 AAA inner and outer wall surfaces. In addition, a linear regression analysis was performed on all the AAA wall surfaces to quantify the relationship of the geometric indices with wall stress. These analyses indicated that while all the geometry metrics have statistically significant correlations with wall stress, the local mean curvature (LMC) exhibits the highest average Pearson's correlation coefficient for both inner and outer wall surfaces. The linear regression analysis revealed coefficients of determination for the outer and inner wall surfaces of 0.712 and 0.516, respectively, with LMC having the largest effect on the linear regression equation with wall stress. This work underscores the importance of evaluating AAA mean wall curvature as a potential surrogate for wall stress.
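
    The correlation-plus-regression step above can be illustrated on synthetic nodal data (the curvature and stress values are stand-ins, not FEA results): correlate a curvature metric with wall stress, then fit a linear regression.

    ```python
    # Minimal illustration on synthetic per-node data.
    import numpy as np
    from scipy.stats import pearsonr, linregress

    rng = np.random.default_rng(2)
    lmc = rng.normal(size=2000)                            # curvature metric per node
    stress = 3.0 * lmc + rng.normal(scale=2.0, size=2000)  # synthetic wall stress

    r, p = pearsonr(lmc, stress)
    fit = linregress(lmc, stress)
    print(f"Pearson r={r:.2f} (p={p:.1e}), R^2={fit.rvalue**2:.2f}")
    ```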

  13. Wood Products Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.

  14. Optimization of hole generation in Ti/CFRP stacks

    NASA Astrophysics Data System (ADS)

    Ivanov, Y. N.; Pashkov, A. E.; Chashhin, N. S.

    2018-03-01

    The article aims to describe methods for improving the surface quality and hole accuracy in Ti/CFRP stacks by optimizing cutting methods and drill geometry. The research is based on the fundamentals of machine building, theory of probability, mathematical statistics, and experiment planning and manufacturing process optimization theories. Statistical processing of experiment data was carried out by means of Statistica 6 and Microsoft Excel 2010. Surface geometry in Ti stacks was analyzed using a Taylor Hobson Form Talysurf i200 Series Profilometer, and in CFRP stacks - using a Bruker ContourGT-Kl Optical Microscope. Hole shapes and sizes were analyzed using a Carl Zeiss CONTURA G2 Measuring machine, temperatures in cutting zones were recorded with a FLIR SC7000 Series Infrared Camera. Models of multivariate analysis of variance were developed. They show effects of drilling modes on surface quality and accuracy of holes in Ti/CFRP stacks. The task of multicriteria drilling process optimization was solved. Optimal cutting technologies which improve performance were developed. Methods for assessing thermal tool and material expansion effects on the accuracy of holes in Ti/CFRP/Ti stacks were developed.

  15. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
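
    A hedged sketch of the running Mann-Whitney idea, assuming (as one plausible reading of the abstract) that two adjacent windows are compared at each time step; U is converted to Z using its large-sample null mean and variance (no ties).

    ```python
    # Sketch of a running Mann-Whitney Z series; window scheme is an assumption.
    import numpy as np
    from scipy.stats import mannwhitneyu

    def running_mw_z(x, w):
        """Z statistic comparing window [i-w, i) with window [i, i+w) at each i."""
        n = len(x)
        mu = w * w / 2.0                             # E[U] under H0
        sigma = np.sqrt(w * w * (2 * w + 1) / 12.0)  # SD[U] under H0, no ties
        z = np.full(n, np.nan)
        for i in range(w, n - w):
            u = mannwhitneyu(x[i - w:i], x[i:i + w],
                             alternative="two-sided").statistic
            z[i] = (u - mu) / sigma
        return z

    # Synthetic series with a step change at t = 100
    x = np.r_[np.zeros(100), np.ones(100)] + \
        np.random.default_rng(3).normal(0, 0.3, 200)
    z = running_mw_z(x, 30)
    print(f"|Z| at the change point: {abs(z[100]):.1f}")
    ```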

  16. Morphological Properties of Siloxane-Hydrogel Contact Lens Surfaces.

    PubMed

    Stach, Sebastian; Ţălu, Ştefan; Trabattoni, Silvia; Tavazzi, Silvia; Głuchaczka, Alicja; Siek, Patrycja; Zając, Joanna; Giovanzana, Stefano

    2017-04-01

    The aim of this study was to quantitatively characterize the micromorphology of contact lens (CL) surfaces using atomic force microscopy (AFM) and multifractal analysis. AFM and multifractal analysis were used to characterize the topography of new and worn siloxane-hydrogel CLs made of Filcon V (FDA group I). CL surface roughness was studied by AFM in intermittent-contact mode, in air, on square areas of 25 and 100 μm², using a Nanoscope V MultiMode (Bruker). Detailed characterization of the surface topography was obtained using statistical parameters of 3-D (three-dimensional) surface roughness, in accordance with ISO 25178-2:2012. Before wear, the surface was found to be characterized by out-of-plane, sharp structures, whereas after 8 h of wear two typical morphologies were observed. One morphology (sharp type) has a similar aspect to the unworn CLs, and the other (smooth type) is characterized by troughs and bumpy structures. The analysis of the AFM images revealed a multifractal geometry. The generalized dimension Dq and the singularity spectrum f(α) provided quantitative values that characterize the local scale properties of CL surface geometry at the nanometer scale. Surface statistical parameters deduced by multifractal analysis can be used to assess CL micromorphology and can be used by manufacturers in developing CLs with improved surface characteristics. These parameters can also be used in understanding the tribological interactions of the back surface of the CL with the corneal surface and of the front surface of the CL with the under-surface of the eyelid (friction, wear, and micro-elastohydrodynamic lubrication at a nanometer scale).
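
    One of the multifractal quantities mentioned above, the generalized dimension Dq, can be estimated by box counting on a 2D measure. The routine below is a rough, generic sketch (the parameters and pipeline are illustrative, not the study's ISO 25178-2 based analysis); as a sanity check, a uniform measure on a plane has Dq = 2 for every q.

    ```python
    # Generic box-counting estimate of the generalized dimension D_q.
    import numpy as np

    def generalized_dimension(z, q, box_sizes):
        """Estimate D_q of a non-negative 2D measure z via partition sums."""
        z = z / z.sum()
        pts = []
        for eps in box_sizes:
            n = z.shape[0] // eps
            boxes = z[:n * eps, :n * eps].reshape(n, eps, n, eps).sum(axis=(1, 3))
            p = boxes[boxes > 0]
            if q == 1:                                 # information dimension limit
                pts.append((np.log(eps), (p * np.log(p)).sum()))
            else:
                pts.append((np.log(eps), np.log((p ** q).sum()) / (q - 1)))
        x, y = np.array(pts).T
        return np.polyfit(x, y, 1)[0]                  # slope = D_q

    # Sanity check: uniform 2D measure has D_q = 2
    d2 = generalized_dimension(np.ones((256, 256)), 2.0, [2, 4, 8, 16])
    print(f"D_2 of a uniform 2D measure: {d2:.2f}")
    ```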

  17. Forensic age estimation by morphometric analysis of the manubrium from 3D MR images.

    PubMed

    Martínez Vera, Naira P; Höller, Johannes; Widek, Thomas; Neumayer, Bernhard; Ehammer, Thomas; Urschler, Martin

    2017-08-01

    Forensic age estimation research based on skeletal structures focuses on patterns of growth and development using different bones. In this work, our aim was to study growth-related evolution of the manubrium in living adolescents and young adults using magnetic resonance imaging (MRI), an image acquisition modality that does not involve ionizing radiation. In a first step, individual manubrium and subject features were correlated with age, which confirmed a statistically significant change of manubrium volume (M_vol: p < 0.01, adjusted R² = 0.50) and surface area (M_sur: p < 0.01, adjusted R² = 0.53) over the studied age range. Additionally, shapes of the manubria were for the first time investigated using principal component analysis. The decomposition of the data into principal components allowed the contribution of each component to total shape variation to be analysed. With 13 principal components, ~96% of shape variation could be described (M_shp: p < 0.01, adjusted R² = 0.60). Multiple linear regression analysis modelled the relationship between the statistically best correlated variables and age. Models including manubrium shape and volume or surface area divided by the height of the subject (Y ~ M_shp + M_sur/S_h: p < 0.01, adjusted R² = 0.71; Y ~ M_shp + M_vol/S_h: p < 0.01, adjusted R² = 0.72) presented a standard error of estimate of two years. In order to estimate the accuracy of these two manubrium-based age estimation models, cross-validation experiments predicting age on held-out test sets were performed. The median absolute difference between predicted and known chronological age was 1.18 years for the best performing model (Y ~ M_shp + M_sur/S_h: p < 0.01, R²_p = 0.67). In conclusion, despite limitations in determining legal majority age, manubrium morphometry analysis presented statistically significant results for skeletal age estimation, which indicates that this bone structure may be considered a new candidate in multi-factorial MRI-based age estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Improvements to an earth observing statistical performance model with applications to LWIR spectral variability

    NASA Astrophysics Data System (ADS)

    Zhao, Runchen; Ientilucci, Emmett J.

    2017-05-01

    Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands. Spectral remote sensing systems can be used, for example, to identify targets without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using only the mean surface reflectance as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature/emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently lacks a LWIR TES algorithm. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.

  19. The Research of Multiple Attenuation Based on Feedback Iteration and Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Xu, X.; Tong, S.; Wang, L.

    2017-12-01

    Multiple suppression is a difficult problem in seismic data processing. The traditional technology for multiple attenuation is based on the principle of minimum output energy of the seismic signal; this criterion relies on second-order statistics and cannot attenuate multiples when the primaries and multiples are non-orthogonal. In order to solve this problem, we combine the feedback iteration method based on the wave equation with an improved independent component analysis (ICA) based on higher-order statistics to suppress multiples. We first use the iterative feedback method to predict the free-surface multiples of each order. Then, in order to match the predicted multiples to the true multiples in amplitude and phase, we design an expanded pseudo multi-channel matching filtering method that yields a more accurate matching result. Finally, we apply an improved FastICA algorithm, based on the criterion of maximum non-Gaussianity of the output signal, to the matched multiples, obtaining a better separation of the primaries and the multiples. The advantage of our method is that it requires no prior information for the prediction of the multiples and achieves better separation. The method has been applied to several synthetic datasets generated by the finite-difference modeling technique and to the Sigsbee2B model multiple data; the primaries and multiples are non-orthogonal in these models. The experiments show that after three to four iterations we obtain accurate multiple predictions. Using our matching method and FastICA adaptive multiple subtraction, we can effectively preserve the primary energy in the seismic records while suppressing the free-surface multiples, especially those related to the middle and deep areas.
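
    The ICA step can be illustrated with a toy separation (synthetic traces, not seismic records): FastICA separates two non-orthogonal mixtures using a non-Gaussianity (higher-order statistics) criterion, which is the property the abstract relies on.

    ```python
    # Conceptual sketch only; "primary" and "multiple" are synthetic stand-ins.
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 1, 2000)
    primary = np.sign(np.sin(40 * t))      # stand-in "primary" source
    multiple = np.sin(55 * t) ** 3         # stand-in "multiple" source

    # Two recorded traces = non-orthogonal mixtures of the two sources
    mix = np.c_[primary + 0.6 * multiple, 0.4 * primary + multiple]
    rec = FastICA(n_components=2, random_state=0).fit_transform(mix)

    # Each recovered component matches one source up to sign and scale
    match = max(abs(np.corrcoef(rec[:, 0], primary)[0, 1]),
                abs(np.corrcoef(rec[:, 1], primary)[0, 1]))
    print(f"best |corr| with the primary source: {match:.2f}")
    ```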

  20. Brushing-Induced Surface Roughness of Two Nickel Based Alloys and a Titanium Based Alloy: A Comparative Study - In Vitro Study

    PubMed Central

    Acharya, B L Guruprasanna; Nadiger, Ramesh; Shetty, Bharathraj; Gururaj, G; Kumar, K Naveen; Darshan, D D

    2014-01-01

    Background: Alloys with high nickel content have been increasingly used in dentistry. Alloys have high corrosion rates when exposed to chemical or physical forces that are common intra-orally. Titanium is the most biocompatible material in present use for crowns, fixed partial dentures and implants, but paradoxically the self-protective oxide film on titanium can be affected by excessive use of the most common preventive agents in dentistry. Therefore, this study was undertaken in order to draw attention to the potential effect of prophylactic brushing in a saline medium. Materials and Methods: Forty-five wax patterns of equal dimensions, 10 mm × 10 mm × 2 mm, were cast in titanium (Grade II) and nickel-chromium: 15 were used for preparing cast titanium samples and 30 for preparing cast nickel-chromium samples, and all were polished. These samples were divided into three groups of 15 samples each. Each group was brushed for 48 h, clinically simulating 2 years of brushing, in a saline toothpaste medium. The surface roughness of the samples was evaluated using a profilometer, scanning electron microscopy and energy dispersive spectroscopy, and the results were subjected to statistical analysis. Results: The Rz and Ra surface roughness values were calculated. A significant difference in surface roughness was present in the titanium samples compared to the machine-readable cataloguing and Wirolloy (nickel-chromium) samples after the study. To test the difference in the values of all samples before and after brushing, Student's paired t-test was carried out. Results showed a significant change in the Rz and Ra values of the titanium samples. Conclusion: The present findings suggest that prophylactic brushing with fluoridated toothpaste has an effect on the surface roughness of titanium and also, to a certain extent, on nickel-chromium. 
    Therefore, careful consideration must be given to the selection of toothbrushes and of toothpastes with medium abrasives in patients with these restorations. How to cite the article: Acharya BL, Nadiger R, Shetty B, Gururaj G, Kumar KN, Darshan DD. Brushing induced surface roughness of two nickel based alloys and a titanium based alloy: A comparative study - In vitro study. J Int Oral Health 2014;6(3):36-49. PMID:25083031

  1. Fast, Statistical Model of Surface Roughness for Ion-Solid Interaction Simulations and Efficient Code Coupling

    NASA Astrophysics Data System (ADS)

    Drobny, Jon; Curreli, Davide; Ruzic, David; Lasa, Ane; Green, David; Canik, John; Younkin, Tim; Blondel, Sophie; Wirth, Brian

    2017-10-01

    Surface roughness greatly impacts material erosion, and thus plays an important role in Plasma-Surface Interactions. Developing strategies for efficiently introducing rough surfaces into ion-solid interaction codes will be an important step towards whole-device modeling of plasma devices and future fusion reactors such as ITER. Fractal TRIDYN (F-TRIDYN) is an upgraded version of the Monte Carlo, BCA program TRIDYN developed for this purpose that includes an explicit fractal model of surface roughness and extended input and output options for file-based code coupling. Code coupling with both plasma and material codes has been achieved and allows for multi-scale, whole-device modeling of plasma experiments. These code coupling results will be presented. F-TRIDYN has been further upgraded with an alternative, statistical model of surface roughness. The statistical model is significantly faster than and compares favorably to the fractal model. Additionally, the statistical model compares well to alternative computational surface roughness models and experiments. Theoretical links between the fractal and statistical models are made, and further connections to experimental measurements of surface roughness are explored. This work was supported by the PSI-SciDAC Project funded by the U.S. Department of Energy through contract DOE-DE-SC0008658.

  2. A simple hydrodynamic model of tornado-like vortices

    NASA Astrophysics Data System (ADS)

    Kurgansky, M. V.

    2015-05-01

    Based on similarity arguments, a simple fluid dynamic model of tornado-like vortices is offered that, with account for "vortex breakdown" at a certain height above the ground, relates the maximal azimuthal velocity in the vortex, reachable near the ground surface, to the convective available potential energy (CAPE) stored in the environmental atmosphere under pre-tornado conditions. The relative proportion of the helicity (kinetic energy) destruction (dissipation) in the "vortex breakdown" zone and, accordingly, within the surface boundary layer beneath the vortex is evaluated. These considerations form the basis of the dynamic-statistical analysis of the relationship between the tornado intensity and the CAPE budget in the surrounding atmosphere.

  3. Detection of semi-volatile organic compounds in permeable ...

    EPA Pesticide Factsheets

    Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis, namely the analysis of semivolatile organic compounds (SVOCs), to determine whether hydrocarbons were present in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of 10% or less; and 22 chemicals were detected at a frequency of 10% or greater (detection frequencies ranging from 10% to 66.5%). Fundamental and exploratory statistical analyses were performed on this latter group by surface type. The statistical analyses were limited due to the low frequency of detections and to sample dilutions, which affected detection limits. The infiltrate data through the three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally Spearman Rank order non-parame

  4. Analysis of measured data of human body based on error correcting frequency

    NASA Astrophysics Data System (ADS)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry is the measurement of the human body surface, and the measured data are the basis for analysis and study of the human body, for establishing and modifying garment sizes, and for designing and operating online clothing stores. In this paper, several groups of measured data are obtained, and data errors are analysed via error frequency and the analysis-of-variance method from mathematical statistics. The paper also determines the accuracy of the measured data, identifies body parts that are difficult to measure, examines the causes of data errors, and summarizes the key points for minimizing them. This analysis of measured data based on error frequency provides reference material to support the development of the garment industry.

  5. Improved Bond Equations for Fiber-Reinforced Polymer Bars in Concrete

    PubMed Central

    Pour, Sadaf Moallemi; Alam, M. Shahria; Milani, Abbas S.

    2016-01-01

    This paper explores a set of new equations to predict the bond strength between fiber reinforced polymer (FRP) rebar and concrete. The proposed equations are based on a comprehensive statistical analysis of existing experimental results in the literature. Namely, the parameters with the greatest effect on the bond behavior of FRP-reinforced concrete were first identified by applying a factorial analysis to part of the available database. Then the database, which contains 250 pullout tests, was divided into four groups based on concrete compressive strength and rebar surface. Afterward, nonlinear regression analysis was performed for each group in order to determine the bond equations. The results show that the proposed equations predict bond strengths more accurately than previously reported models. PMID:28773859
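
    The nonlinear regression step above can be sketched as a least-squares fit of a power-law bond-strength model. The model form, coefficients and data below are hypothetical; they are not the equations proposed in the paper.

    ```python
    # Hedged illustration: nonlinear least squares on synthetic pullout data.
    import numpy as np
    from scipy.optimize import curve_fit

    def bond_model(X, a, b, c):
        fc, d = X                       # concrete strength (MPa), bar diameter (mm)
        return a * fc**b / d**c         # hypothetical power-law form

    rng = np.random.default_rng(5)
    fc = rng.uniform(20, 60, 250)       # 250 synthetic "pullout tests"
    d = rng.uniform(10, 25, 250)
    tau = 1.2 * fc**0.5 / d**0.3 * rng.lognormal(0.0, 0.05, 250)

    popt, _ = curve_fit(bond_model, (fc, d), tau, p0=(1.0, 0.5, 0.5))
    print("fitted (a, b, c):", np.round(popt, 2))
    ```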

  6. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their inability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. Through intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with rare, common, or both rare and common variants has the correct type 1 error rates. We also evaluate the power of the SFPCA-based statistic and of 22 existing statistics, and find that the SFPCA-based statistic has much higher power than the existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.

  7. Rough surface reconstruction for ultrasonic NDE simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Wonjae; Shi, Fan; Lowe, Michael J. S.

    2014-02-18

    The reflection of ultrasound from rough surfaces is an important topic for the NDE of safety-critical components, such as pressure-containing components in power stations. The specular reflection from the rough surface of a defect is normally lower than it would be from a flat surface, so it is typical to apply a safety factor in order that justification cases for inspection planning are conservative. Studying the statistics of the rough surfaces that might be expected in candidate defects, according to materials and loading, and the reflections from them can be useful in developing arguments for realistic safety factors. This paper presents a study of real rough crack surfaces that are representative of potential defects in pressure-containing power plant components. Two-dimensional (area) height values of the roughness have been measured and their statistics analysed. Then a means to reconstruct model cases with similar statistics, so as to enable the creation of multiple realistic realizations of the surfaces, has been investigated using random field theory. Rough surfaces are reconstructed based on a real surface, and results for these two-dimensional descriptions of the original surface have been compared with those from the conventional model based on a one-dimensional correlation coefficient function. In addition, ultrasonic reflections from them are simulated using a finite element method.
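    The reconstruction idea — generating realizations of a Gaussian random field with a prescribed RMS height and correlation length — can be sketched by spectral filtering of white noise (a minimal illustration; the paper's procedure is driven by measured crack statistics):

```python
import numpy as np

def rough_surface(n, dx, sigma, lam, seed=0):
    """Generate an n x n Gaussian random rough surface with RMS height
    `sigma` and Gaussian correlation length `lam` via spectral filtering."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n, d=dx) * 2 * np.pi
    kxx, kyy = np.meshgrid(k, k, indexing="ij")
    # Spectrum corresponding to a Gaussian correlation C(r) ~ exp(-r^2/lam^2)
    psd = np.exp(-(kxx**2 + kyy**2) * lam**2 / 4.0)
    noise = np.fft.fft2(rng.standard_normal((n, n)))
    h = np.real(np.fft.ifft2(noise * np.sqrt(psd)))
    return h * (sigma / h.std())   # normalize to the target RMS height

h = rough_surface(256, 0.1, sigma=0.05, lam=1.0)  # e.g. mm units
print(h.shape, round(h.std(), 3))
```

    Each call with a new seed gives a new statistically equivalent realization, which is the property the paper exploits to build multiple model crack surfaces.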

  8. Biocompatibility Analysis of an Electrically-Activated Silver-Based Antibacterial Surface System for Medical Device Applications

    DTIC Science & Technology

    2012-12-16

    sterilizing without causing toxicity in vivo. 1 Introduction As reported to the Centers for Disease Control and Pre- vention (CDC) between 2006 and...Owings MF. National Hospital Discharge Survey. Advance Data from Vital and Health Statistics. United States: Centers for Disease Control and Prevention...10.1007/s10856-012-4730-3. 19. Shirwaiker RA, Wysk RA, Kariyawasam S, Carrion H, Voigt RC. Micro-scale fabrication and characterization of a silver–polymer

  9. Small angle scattering polarization biopsy: a comparative analysis of various skin diseases

    NASA Astrophysics Data System (ADS)

    Zimnyakov, D. A.; Alonova, M. V.; Yermolenko, S. B.; Ivashko, P. V.; Reshetnikova, E. M.; Galkina, E. M.; Utz, S. R.

    2013-12-01

    An approach to differentiating the morphological features of normal and pathological human epidermis, based on statistical analysis of the local polarization states of laser light forward scattered by in-vitro tissue samples, is discussed. The eccentricity and azimuth angle of local polarization ellipses, retrieved for various positions of the focused laser beam on the tissue surface, and the coefficient of collimated transmittance are considered as diagnostic parameters for differentiation. Experimental data obtained with psoriasis, discoid lupus erythematosus, alopecia, lichen planus, scabies, demodex, and normal skin samples are presented.

  10. Multivariate statistical techniques for the evaluation of surface water quality of the Himalayan foothills streams, Pakistan

    NASA Astrophysics Data System (ADS)

    Malik, Riffat Naseem; Hashmi, Muhammad Zaffar

    2017-10-01

    The Himalayan foothills streams of Pakistan play an important role in drinking water supply and irrigation of farmlands; thus, their water quality is closely related to public health. Multivariate techniques were applied to examine spatial and seasonal trends and sources of metal contamination in the Himalayan foothills streams, Pakistan. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A Fast Sequential Atomic Absorption Spectrophotometer (Varian FSAA-240) was used to measure metal concentrations. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help the relevant authorities manage the Himalayan foothills streams.
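    The cluster-analysis step — grouping sampling sites into impaired, moderately impaired and least impaired classes — can be sketched with Ward hierarchical clustering on a synthetic water-quality matrix (the data and parameter columns below are illustrative, not from the study):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Synthetic matrix: rows = sampling sites, columns = standardized parameters
# (e.g. EC, nitrate, Fe, Pb); three contamination levels are built in.
sites = np.vstack([
    rng.normal(0.0, 0.3, (6, 4)),   # least impaired
    rng.normal(2.0, 0.3, (6, 4)),   # moderately impaired
    rng.normal(5.0, 0.3, (6, 4)),   # impaired
])

Z = linkage(sites, method="ward")               # agglomerative clustering
labels = fcluster(Z, t=3, criterion="maxclust") # cut the tree at 3 clusters
print(sorted(set(labels)))
```

    In practice the parameters would be standardized first so that high-variance variables (e.g. conductivity) do not dominate the distance metric.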

  11. Assessment and statistics of surgically induced astigmatism.

    PubMed

    Naeser, Kristian

    2008-05-01

    The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications, older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. Descriptions of a cylinder surface using polar values and using differential geometry were compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arc of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, and Zernike(12) and Fourier analyses(8), are correlated to the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet, Forskningsinitiativet for Arhus Amt, Alcon Denmark, and Desirée and Niels Ydes Fond.
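    A minimal sketch of the polar-value idea follows: a net astigmatism (cylinder, axis) is converted into two orthonormal components over a chosen meridian, which, unlike raw (cylinder, axis) pairs, can be added and averaged as vectors. The sign and naming conventions below are simplified assumptions; the publications define them precisely.

```python
import math

def to_polar_values(cyl, axis_deg, phi_deg=0.0):
    """Convert a net astigmatism (cylinder power, axis) into two
    orthonormal components over meridian phi and phi + 45 degrees.
    Simplified sketch of the polar-value concept."""
    a = math.radians(axis_deg - phi_deg)
    kp0 = cyl * math.cos(2 * a)    # net curvital power over phi
    kp45 = cyl * math.sin(2 * a)   # net torsional power over phi
    return kp0, kp45

def from_polar_values(kp0, kp45, phi_deg=0.0):
    """Recover (cylinder, axis) from the two polar-value components."""
    cyl = math.hypot(kp0, kp45)
    axis = (math.degrees(math.atan2(kp45, kp0)) / 2.0 + phi_deg) % 180
    return cyl, axis

kp0, kp45 = to_polar_values(2.5, 30.0)   # 2.5 D cylinder at 30 degrees
print(from_polar_values(kp0, kp45))
```

    Because the components live in an ordinary vector space, group means and variances of SIA can be computed component-wise and converted back at the end, which is what makes the system suitable for multivariate statistics.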

  12. Application of image recognition algorithms for statistical description of nano- and microstructured surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mărăscu, V.; Dinescu, G.; Faculty of Physics, University of Bucharest, 405 Atomistilor Street, Bucharest-Magurele

    In this paper we propose a statistical approach for describing the self-assembly of sub-micron polystyrene beads on silicon surfaces, as well as the evolution of surface topography due to plasma treatments. Image recognition algorithms are used in conjunction with Scanning Electron Microscopy (SEM) imaging of the surfaces. In a first step, greyscale images of the surface covered by the polystyrene beads are obtained. An adaptive thresholding method is then applied to obtain binary images. The next step consists of automatic identification of the polystyrene bead dimensions using the Hough transform algorithm, according to bead radius. In order to analyze the uniformity of the self-assembled polystyrene beads, the squared modulus of the 2-dimensional Fast Fourier Transform (2-D FFT) is applied. By combining these algorithms we obtain a powerful and fast statistical tool for the analysis of micro- and nanomaterials whose surface features appear regularly distributed under SEM examination.
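    The thresholding and FFT steps can be sketched as follows on a synthetic bead image; for brevity a global Otsu threshold stands in for the adaptive thresholding, and the Hough-transform step is omitted:

```python
import numpy as np

def otsu_threshold(img):
    """Global Otsu threshold for a greyscale image with values in 0-255."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0
        m1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

rng = np.random.default_rng(2)
# Synthetic SEM-like frame: bright circular "beads" on a dark background
yy, xx = np.mgrid[0:128, 0:128]
img = rng.normal(40, 5, (128, 128))
for cx, cy in [(32, 32), (96, 32), (32, 96), (96, 96)]:
    img[(xx - cx) ** 2 + (yy - cy) ** 2 < 100] = rng.normal(200, 5)
img = img.clip(0, 255)

t = otsu_threshold(img)
binary = img > t
# Squared modulus of the 2-D FFT: regular bead spacing shows up as peaks
power = np.abs(np.fft.fftshift(np.fft.fft2(binary))) ** 2
print(t, int(binary.sum()))
```

    For a truly periodic bead lattice the power spectrum would show sharp satellite peaks around the DC component; their sharpness is a natural uniformity metric.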

  13. The influence of polishing techniques on pre-polymerized CAD\\CAM acrylic resin denture bases

    PubMed Central

    Alammari, Manal Rahma

    2017-01-01

    Background Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. CAD/CAM systems have recently become commercially available for the fabrication of complete dentures and are considered an alternative to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. Objective The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. Methods This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD/CAM denture base resin. Sixty specimens were prepared and divided into three groups of twenty each. Each group was divided according to polishing technique into mechanical polishing (Mech P) and chemical polishing (Chem P) subgroups of ten specimens each; surface roughness and wettability were investigated. Data were analyzed with SPSS version 22, using one-way ANOVA and the Pearson coefficient. Results One-way analysis of variance (ANOVA) and post hoc tests used to compare surface roughness values between the three groups revealed a statistically significant difference between them (p<0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness values in both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD/CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM resin showed the smallest contact angle in both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). 
Conclusion Mechanical polishing produced lower surface roughness of the CAD/CAM denture base resin, with a smoother surface compared to chemical polishing, and is considered the most effective polishing technique. CAD/CAM denture base material should be considered as the material of choice for complete denture construction in the near future, especially for older dental patients with altered salivary function, because of its wettability. PMID:29238483
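    The group comparison described above can be sketched with a one-way ANOVA on simulated roughness values; the group means loosely follow the reported figures, but the data below are synthetic and the Group II mean is an illustrative assumption:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
# Simulated Ra values for three denture-base groups after mechanical
# polishing, n = 10 per group (matching the study's subgroup size).
heat_cured = rng.normal(2.44, 0.07, 10)     # Group I, reported mean/SD
thermoplastic = rng.normal(1.80, 0.15, 10)  # Group II, illustrative mean
cad_cam = rng.normal(1.08, 0.23, 10)        # Group III, reported mean/SD

F, p = f_oneway(heat_cured, thermoplastic, cad_cam)
print(round(F, 1), p < 0.001)
```

    A significant F only says the three means are not all equal; as in the study, post hoc pairwise tests are then needed to locate which groups differ.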

  14. The influence of polishing techniques on pre-polymerized CAD\\CAM acrylic resin denture bases.

    PubMed

    Alammari, Manal Rahma

    2017-10-01

    Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. CAD/CAM systems have recently become commercially available for the fabrication of complete dentures and are considered an alternative to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD/CAM denture base resin. Sixty specimens were prepared and divided into three groups of twenty each. Each group was divided according to polishing technique into mechanical polishing (Mech P) and chemical polishing (Chem P) subgroups of ten specimens each; surface roughness and wettability were investigated. Data were analyzed with SPSS version 22, using one-way ANOVA and the Pearson coefficient. One-way analysis of variance (ANOVA) and post hoc tests used to compare surface roughness values between the three groups revealed a statistically significant difference between them (p<0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness values in both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD/CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM resin showed the smallest contact angle in both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). 
Mechanical polishing produced lower surface roughness of the CAD/CAM denture base resin, with a smoother surface compared to chemical polishing, and is considered the most effective polishing technique. CAD/CAM denture base material should be considered as the material of choice for complete denture construction in the near future, especially for older dental patients with altered salivary function, because of its wettability.

  15. Performance analysis of cutting graphite-epoxy composite using a 90,000psi abrasive waterjet

    NASA Astrophysics Data System (ADS)

    Choppali, Aiswarya

    Graphite-epoxy composites are widely used in many aerospace and structural applications because of their properties, which include lighter weight, higher strength-to-weight ratio and greater design flexibility. However, the inherent anisotropy of these composites makes it difficult to machine them using conventional methods. To overcome the major issues that arise with conventional machining, such as fiber pull-out, delamination, heat generation and high tooling costs, an effort is herein made to study abrasive waterjet machining of composites. An abrasive waterjet is used to cut 1" thick graphite-epoxy composites based on baseline data obtained from cutting ¼" thick material. The objective of this project is to study the roughness of the cut surface, with a focus on demonstrating the benefits of using higher pressures for cutting composites. The effects of the major cutting parameters (jet pressure, traverse speed, abrasive feed rate and cutting head size) are studied at different levels. Statistical analysis of the experimental data provides an understanding of the effect of the process parameters on surface roughness. Additionally, the effect of these parameters on the taper angle of the cut is studied. The data is analyzed to obtain a set of process parameters that optimizes the cutting of 1" thick graphite-epoxy composite, and the statistical analysis is used to validate the experimental data. Costs involved in the cutting process are investigated in terms of abrasive consumed, to better understand and illustrate the practical benefits of using higher pressures. It is demonstrated that, as pressure increases, ultra-high-pressure waterjets produce a better surface quality at a faster traverse rate with lower costs.

  16. Water quality and quantity assessment of pervious pavements performance in experimental car park areas.

    PubMed

    Sañudo-Fontaneda, Luis A; Charlesworth, Susanne M; Castro-Fresno, Daniel; Andres-Valeri, Valerio C A; Rodriguez-Hernandez, Jorge

    2014-01-01

    Pervious pavements have become one of the most used sustainable urban drainage system (SUDS) techniques in car parks. This research paper presents the results of monitoring water quality from several experimental car park areas designed and constructed in Spain with bays made of interlocking concrete block pavement, porous asphalt, polymer-modified porous concrete and reinforced grass with plastic and concrete cells. Moreover, two different sub-base materials were used (limestone aggregates and basic oxygen furnace slag). This study therefore encompasses the majority of the materials used as permeable surfaces and sub-base layers all over the world. Effluent from the test bays was monitored for dissolved oxygen, pH, electric conductivity, total suspended solids, turbidity and total petroleum hydrocarbons in order to analyze the behaviour shown by each combination of surface and sub-base materials. In addition, permeability tests were undertaken in all car parks using the 'Laboratorio Caminos Santander' permeameter and the Cantabrian Portable Infiltrometer. All results are presented together with the influence of surface and sub-base materials on water quality indicators using bivariate correlation statistical analysis at a confidence level of 95%. The polymer-modified porous concrete surface course in combination with limestone aggregate sub-base presented the best performance.
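    The bivariate correlation analysis at the 95% confidence level can be sketched as follows; the effluent series below are synthetic stand-ins, not the monitored data:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
# Illustrative effluent series from one test bay: turbidity is expected
# to track total suspended solids (TSS). Values are synthetic.
tss = rng.uniform(5, 50, 30)                  # mg/L
turbidity = 0.8 * tss + rng.normal(0, 2, 30)  # NTU

r, p = pearsonr(tss, turbidity)
significant = p < 0.05   # 95% confidence level, as in the study
print(round(r, 2), significant)
```

    Repeating this for every pair of water-quality indicators and every surface/sub-base combination yields the correlation matrix on which the paper's comparison rests.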

  17. Estimation of the Ocean Skin Temperature using the NASA GEOS Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Akella, Santha; Todling, Ricardo; Suarez, Max

    2016-01-01

    This report documents the status of the development of a sea surface temperature (SST) analysis for the Goddard Earth Observing System (GEOS) Version-5 atmospheric data assimilation system (ADAS). Its implementation is part of the steps being taken toward the development of an integrated earth system analysis. Currently, GEOS-ADAS SST is a bulk ocean temperature (from ocean boundary conditions), and is almost identical to the skin sea surface temperature. Here we describe changes to the atmosphere-ocean interface layer of the GEOS-atmospheric general circulation model (AGCM) to include near surface diurnal warming and cool-skin effects. We also added SST relevant Advanced Very High Resolution Radiometer (AVHRR) observations to the GEOS-ADAS observing system. We provide a detailed description of our analysis of these observations, along with the modifications to the interface between the GEOS atmospheric general circulation model, gridpoint statistical interpolation-based atmospheric analysis and the community radiative transfer model. Our experiments (with and without these changes) show improved assimilation of satellite radiance observations. We obtained a closer fit to withheld, in-situ buoys measuring near-surface SST. Evaluation of forecast skill scores corroborate improvements seen in the observation fits. Along with a discussion of our results, we also include directions for future work.

  18. Effect of denture cleaning on abrasion resistance and surface topography of polymerized CAD CAM acrylic resin denture base

    PubMed Central

    Shinawi, Lana Ahmed

    2017-01-01

    Background The application of computer-aided design computer-aided manufacturing (CAD CAM) technology in the fabrication of complete dentures, offers numerous advantages as it provides optimum fit and eliminates polymerization shrinkage of the acrylic base. Additionally, the porosity and surface roughness of CAD CAM resins is less compared to conventionally processed resins which leads to a decrease in the adhesion of bacteria on the denture base, which is associated with many conditions including halitosis and aspiration pneumonia in elderly denture wearers. Aim To evaluate the influence of tooth brushing with dentifrices on CAD CAM resin blocks in terms of abrasion resistance, surface roughness and scanning electron photomicrography. Methods This experimental study was carried out at the Faculty of Dentistry of King Abdulaziz University during 2016. A total of 40 rectangular shaped polymerized CAD CAM resin samples were subjected to 40.000 and 60.000 brushing strokes under a 200-gram vertical load simulating three years of tooth brushing strokes using commercially available denture cleaning dentifrice. Data were analyzed by SPSS version 20, using descriptive statistics and ANOVA. Results ANOVA test revealed a statistical significant weight loss of CAD CAM acrylic resin denture base specimens following 40.000 and 60.000 brushing strokes as well as a statistical significant change (p=0.0.5) in the surface roughness following brushing. The CAD CAM resin samples SEM baseline imaging revealed a relatively smooth homogenous surface, but following 40,000 and 60,000 brushing strokes, imaging displayed the presence of small scratches on the surface. Conclusion CAD CAM resin displayed a homogenous surface initially with low surface roughness that was significantly affected following simulating three years of manual brushing, but despite the significant weight loss, the findings are within the clinically acceptable limits. PMID:28713496

  19. Effect of denture cleaning on abrasion resistance and surface topography of polymerized CAD CAM acrylic resin denture base.

    PubMed

    Shinawi, Lana Ahmed

    2017-05-01

    The application of computer-aided design computer-aided manufacturing (CAD CAM) technology in the fabrication of complete dentures, offers numerous advantages as it provides optimum fit and eliminates polymerization shrinkage of the acrylic base. Additionally, the porosity and surface roughness of CAD CAM resins is less compared to conventionally processed resins which leads to a decrease in the adhesion of bacteria on the denture base, which is associated with many conditions including halitosis and aspiration pneumonia in elderly denture wearers. To evaluate the influence of tooth brushing with dentifrices on CAD CAM resin blocks in terms of abrasion resistance, surface roughness and scanning electron photomicrography. This experimental study was carried out at the Faculty of Dentistry of King Abdulaziz University during 2016. A total of 40 rectangular shaped polymerized CAD CAM resin samples were subjected to 40.000 and 60.000 brushing strokes under a 200-gram vertical load simulating three years of tooth brushing strokes using commercially available denture cleaning dentifrice. Data were analyzed by SPSS version 20, using descriptive statistics and ANOVA. ANOVA test revealed a statistical significant weight loss of CAD CAM acrylic resin denture base specimens following 40.000 and 60.000 brushing strokes as well as a statistical significant change (p=0.0.5) in the surface roughness following brushing. The CAD CAM resin samples SEM baseline imaging revealed a relatively smooth homogenous surface, but following 40,000 and 60,000 brushing strokes, imaging displayed the presence of small scratches on the surface. CAD CAM resin displayed a homogenous surface initially with low surface roughness that was significantly affected following simulating three years of manual brushing, but despite the significant weight loss, the findings are within the clinically acceptable limits.

  20. VCSEL-based fiber optic link for avionics: implementation and performance analyses

    NASA Astrophysics Data System (ADS)

    Shi, Jieqin; Zhang, Chunxi; Duan, Jingyuan; Wen, Huaitao

    2006-11-01

    A Gb/s fiber optic link with built-in test (BIT) capability, based on vertical-cavity surface-emitting laser (VCSEL) sources for next-generation military avionics buses, is presented in this paper. To accurately predict link performance, statistical methods and bit error rate (BER) measurements have been examined. The results show that the 1 Gb/s fiber optic link meets the BER requirement and that link margin values can reach up to 13 dB. The analysis shows that the suggested photonic network may provide a high-performance, low-cost interconnection alternative for future military avionics.
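    One common statistical ingredient of such BER predictions is the Gaussian-noise relation between the Q-factor and the bit error rate; whether the paper uses exactly this relation is an assumption on our part, but it is the standard form:

```python
import math

def ber_from_q(q):
    """Gaussian-noise approximation linking Q-factor to bit error rate:
    BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

# A Q of about 7 corresponds to the classic 1e-12 BER target for Gb/s links.
print(f"{ber_from_q(7.03):.1e}")
```

    Link margin then expresses how far the received optical power sits above the level at which Q (and hence BER) would just meet the requirement.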

  1. Tolerancing aspheres based on manufacturing knowledge

    NASA Astrophysics Data System (ADS)

    Wickenhagen, S.; Kokot, S.; Fuchs, U.

    2017-10-01

    A standard way of tolerancing optical elements or systems is to perform a Monte Carlo based analysis within a common optical design software package. Although different weightings and distributions may be assumed, these analyses all rely on statistics, which usually means several hundreds or thousands of simulated systems for reliable results. Thus, employing such methods for small batch sizes is unreliable, especially when aspheric surfaces are involved. The extensive database of asphericon was used to investigate the correlation between the given tolerance values and measured data sets. The resulting probability distributions of the measured data were analyzed, aiming for a robust optical tolerancing process.
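    A conventional Monte Carlo tolerance analysis of the kind the paper contrasts with its measurement-driven approach can be sketched as follows; the tolerances, merit function and spec limit are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 10000  # number of simulated systems, as in a typical Monte Carlo run

# Hypothetical perturbations for one asphere; distributions are assumed,
# which is exactly the weakness the paper addresses with measured data.
radius_err = rng.normal(0.0, 0.01, N)          # radius error (mm)
irregularity = np.abs(rng.normal(0.0, 0.1, N)) # surface irregularity (fringes)
decenter = rng.uniform(-0.02, 0.02, N)         # decenter (mm)

# Toy merit degradation: RMS wavefront error grows with each perturbation.
wfe = np.sqrt((radius_err / 0.02) ** 2 + (irregularity / 0.3) ** 2
              + (decenter / 0.03) ** 2) * 0.05  # waves RMS, illustrative
yield_frac = np.mean(wfe < 0.07)  # fraction meeting a 0.07-wave spec
print(round(float(yield_frac), 2))
```

    The paper's point is that for small batches the assumed input distributions dominate the result, so replacing them with empirical distributions from manufacturing data makes the predicted yield far more trustworthy.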

  2. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
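    The RSM step — fitting a parametric model to distributed test data — can be sketched with a least-squares quadratic fit to one factor (a minimal single-factor illustration, not the D-optimal multi-factor plans used in the work):

```python
import numpy as np

rng = np.random.default_rng(6)
# Distributed design: one simulated response at each of many conditions x,
# rather than many repeats at a few clustered conditions.
x = np.linspace(-1, 1, 25)
y_true = 1.0 + 0.5 * x - 2.0 * x**2       # underlying response surface
y = y_true + rng.normal(0, 0.1, x.size)   # noisy "test" data

# Quadratic response-surface model fit by ordinary least squares.
X = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))
```

    With the same total number of points, the distributed design constrains the curvature term far better than a clustered design would, which is the core of the paper's argument.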

  3. Early-type galaxies in the Antlia cluster: catalogue and isophotal analysis

    NASA Astrophysics Data System (ADS)

    Calderón, Juan P.; Bassino, Lilia P.; Cellone, Sergio A.; Gómez, Matías

    2018-06-01

    We present a statistical isophotal analysis of 138 early-type galaxies in the Antlia cluster, located at a distance of ˜ 35 Mpc. The observational material consists of CCD images of four 36 × 36 arcmin2 fields obtained with the MOSAIC II camera at the Blanco 4-m telescope at Cerro Tololo Interamerican Observatory. Our present work supersedes previous Antlia studies in the sense that the covered area is four times larger, the limiting magnitude is MB ˜ -9.6 mag, and the surface photometry parameters of each galaxy are derived from Sérsic model fits extrapolated to infinity. In a companion previous study we focused on the scaling relations obtained by means of surface photometry, and now we present the data, on which the previous paper is based, the parameters of the isophotal fits as well as an isophotal analysis. For each galaxy, we derive isophotal shape parameters along the semimajor axis and search for correlations within different radial bins. Through extensive statistical tests, we also analyse the behaviour of these values against photometric and global parameters of the galaxies themselves. While some galaxies do display radial gradients in their ellipticity (ɛ) and/or their Fourier coefficients, differences in mean values between adjacent regions are not statistically significant. Regarding Fourier coefficients, dwarf galaxies usually display gradients between all adjacent regions, while non-dwarfs tend to show this behaviour just between the two outermost regions. Globally, there is no obvious correlation between Fourier coefficients and luminosity for the whole magnitude range (-12 ≳ MV ≳ -22); however, dwarfs display much higher dispersions at all radii.

  4. A global estimate of the Earth's magnetic crustal thickness

    NASA Astrophysics Data System (ADS)

    Vervelidou, Foteini; Thébault, Erwan

    2014-05-01

    The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm, so the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the Spherical Harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in Spherical Harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) with the recent lithospheric field models derived from CHAMP and airborne measurements, and we developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness and a power-law exponent that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 Spherical Harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate the long wavelengths of the magnetic crustal thickness on a global scale, which are not accessible directly from the magnetic measurements due to the masking effect of the core field. 
We then discuss these results and compare them to the results we obtained by conducting a similar spectral analysis, but this time in the cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to crustal thickness global maps derived by means of additional geophysical data (Purucker et al., 2002).

  5. Near-Surface Transport Pathways in the North Atlantic Ocean: Looking for Throughput from the Subtropical to the Subpolar Gyre

    NASA Astrophysics Data System (ADS)

    Rypina, I. I.; Pratt, L. J.; Lozier, M.

    2011-12-01

    Motivated by discrepancies between Eulerian transport estimates and the behavior of Lagrangian surface drifters, near-surface transport pathways and processes in the North Atlantic are studied using a combination of data, altimetric surface heights, statistical analysis of trajectories, and dynamical systems techniques. Particular attention is paid to the issue of the subtropical-to-subpolar intergyre fluid exchange. The velocity field used in this study is composed of a steady drifter-derived background flow, upon which a time-dependent altimeter-based perturbation is superimposed. This analysis suggests that most of the fluid entering the subpolar gyre from the subtropical gyre within two years comes from a narrow region lying inshore of the Gulf Stream core, whereas fluid on the offshore side of the Gulf Stream is largely prevented from doing so by the Gulf Stream core, which acts as a strong transport barrier, in agreement with past studies. The transport barrier near the Gulf Stream core is robust and persistent from 1992 until 2008. The qualitative behavior is found to be largely independent of the Ekman drift.

  6. Materials Approach to Dissecting Surface Responses in the Attachment Stages of Biofouling Organisms

    DTIC Science & Technology

    2016-04-25

    their settlement behavior in regards to the coating surfaces. 5) Multivariate statistical analysis was used to examine the effect (if any) of the...applied to glass rods and were deployed in the field to evaluate settlement preferences. Canonical Analysis of Principal Coordinates were applied to...the influence of coating surface properties on the patterns in settlement observed in the field in the extension of this work over the coming year

  7. Capillary fluctuations of surface steps: An atomistic simulation study for the model Cu(111) system

    NASA Astrophysics Data System (ADS)

    Freitas, Rodrigo; Frolov, Timofey; Asta, Mark

    2017-10-01

    Molecular dynamics (MD) simulations are employed to investigate the capillary fluctuations of steps on the surface of a model metal system. The fluctuation spectrum, characterized by the wave number (k) dependence of the mean squared capillary-wave amplitudes and associated relaxation times, is calculated for 〈110〉 and 〈112〉 steps on the {111} surface of elemental copper near the melting temperature of the classical potential model considered. Step stiffnesses are derived from the MD results, yielding values from the largest system sizes of (37 ± 1) meV/Å for the different line orientations, implying that the stiffness is isotropic within the statistical precision of the calculations. The fluctuation lifetimes are found to vary by approximately four orders of magnitude over the range of wave numbers investigated, displaying a k dependence consistent with kinetics governed by step-edge mediated diffusion. The values for step stiffness derived from these simulations are compared to step free energies for the same system and temperature obtained in a recent MD-based thermodynamic-integration (TI) study [Freitas, Frolov, and Asta, Phys. Rev. B 95, 155444 (2017), 10.1103/PhysRevB.95.155444]. Results from the capillary-fluctuation analysis and TI calculations yield statistically significant differences that are discussed within the framework of statistical-mechanical theories for configurational contributions to step free energies.
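
    The equipartition relation underlying this kind of capillary-fluctuation analysis can be sketched numerically: each Fourier mode of a step of length L satisfies ⟨|A(k)|²⟩ = k_B T / (L β̃ k²), so the stiffness β̃ follows from the measured spectrum. Below is a minimal Python illustration (not the authors' code) that synthesizes mode amplitudes for an assumed stiffness and recovers it; all numerical values are hypothetical.

```python
import numpy as np

# Capillary-fluctuation sketch: synthesize mean-squared capillary-wave
# amplitudes obeying equipartition, <|A(k)|^2> = kB*T / (L * beta * k^2),
# then invert the spectrum to recover the step stiffness beta.
rng = np.random.default_rng(0)

kB_T = 0.1          # thermal energy (arbitrary units, hypothetical)
L = 100.0           # step length (same length units as 1/k)
beta_true = 0.037   # "true" stiffness used to generate the data

k = 2 * np.pi * np.arange(1, 21) / L        # allowed wave numbers
var_true = kB_T / (L * beta_true * k**2)    # equipartition variance per mode

# Mean-squared amplitudes averaged over many independent configurations
n_samples = 5000
A2 = var_true * rng.chisquare(n_samples, size=k.size) / n_samples

# Invert the spectrum mode by mode and average the stiffness estimates
beta_est = np.mean(kB_T / (L * k**2 * A2))
print(f"estimated stiffness: {beta_est:.4f} (true: {beta_true})")
```

In practice the averaging is often replaced by a weighted fit of 1/⟨|A(k)|²⟩ against k², which also exposes deviations from the 1/k² form at short wavelengths.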

  8. Bayesian statistics as a new tool for spectral analysis - I. Application for the determination of basic parameters of massive stars

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2015-11-01

    Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
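
    The grid-based Bayesian idea, combining all available lines at once rather than iterating on a few diagnostics, can be sketched as follows. This is a toy Python illustration with a made-up linear line-response model and hypothetical parameter ranges, not the authors' pipeline; real work would compare observed line profiles against synthetic spectra.

```python
import numpy as np

# Toy grid-posterior sketch: multiply per-line Gaussian likelihoods over a
# (Teff, logg) model grid and take the maximum a posteriori (MAP) point.
rng = np.random.default_rng(1)

teff_grid = np.linspace(15000, 30000, 61)   # K (hypothetical range for B stars)
logg_grid = np.linspace(3.0, 4.5, 31)       # dex

n_lines = 40
a = rng.uniform(0.5, 1.5, n_lines)          # hypothetical per-line sensitivities
b = rng.uniform(-2.0, 2.0, n_lines)
sigma = 0.05                                # per-line measurement noise

def line_depths(teff, logg):
    """Hypothetical model: each line depth responds linearly to both parameters."""
    return a * (teff / 20000.0) + b * (logg - 3.75)

true_teff, true_logg = 22000.0, 4.0
obs = line_depths(true_teff, true_logg) + rng.normal(0, sigma, n_lines)

# Log-posterior on the grid (flat priors): sum of per-line Gaussian log-likelihoods
T, G = np.meshgrid(teff_grid, logg_grid, indexing="ij")
logpost = np.zeros(T.shape)
for i in range(T.shape[0]):
    for j in range(T.shape[1]):
        r = obs - line_depths(T[i, j], G[i, j])
        logpost[i, j] = -0.5 * np.sum((r / sigma) ** 2)

i, j = np.unravel_index(np.argmax(logpost), logpost.shape)
print(f"MAP estimate: Teff = {teff_grid[i]:.0f} K, logg = {logg_grid[j]:.2f}")
```

Because every line contributes to the posterior, the joint estimate is constrained more tightly than an iteration over a few diagnostic lines would allow.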

  9. Exploring Remote Sensing Products Online with Giovanni for Studying Urbanization

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina; Kempler, Steve

    2012-01-01

    Recently, a large number of MODIS land products at multiple spatial resolutions have been integrated into the online system, Giovanni, to support studies on land cover and land use changes focused on the Northern Eurasia and Monsoon Asia regions. Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center (GES-DISC) that provides a simple and intuitive way to visualize, analyze, and access Earth science remotely sensed and modeled data. The customized Giovanni Web portals (Giovanni-NEESPI and Giovanni-MAIRS) were created to integrate land, atmospheric, cryospheric, and social products, enabling researchers to quickly explore and perform basic analyses of land surface changes and their relationships to climate at global and regional scales. This presentation documents the MODIS land surface products in the Giovanni system. As examples, images and statistical analysis results on land surface and local climate changes associated with urbanization over the Yangtze River Delta region, China, obtained using data in Giovanni, are shown.

  10. Optimization of photocatalytic degradation of palm oil mill effluent in UV/ZnO system based on response surface methodology.

    PubMed

    Ng, Kim Hoong; Cheng, Yoke Wang; Khan, Maksudur R; Cheng, Chin Kui

    2016-12-15

    This paper reports on the optimization of palm oil mill effluent (POME) degradation in a UV-activated-ZnO system based on central composite design (CCD) in response surface methodology (RSM). Three potential factors, viz. O2 flowrate (A), ZnO loading (B), and initial concentration of POME (C), were evaluated for significance using a 2³ full factorial design before the optimization process. All three main factors were found to be significant, with contributions of 58.27% (A), 15.96% (B), and 13.85% (C), respectively, to the POME degradation. In addition, the factor interactions AB, AC, and BC contributed 4.02%, 3.12%, and 1.01%, respectively, to the POME degradation. Subsequently, all three factors were subjected to statistical central composite design (CCD) analysis. Quadratic models were developed and rigorously checked, and a 3D response surface was subsequently generated. Two successive validation experiments were carried out, and the degradations achieved were 55.25% and 55.33%, compared with a predicted value of 52.45%. Copyright © 2016 Elsevier Ltd. All rights reserved.
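
    The RSM step, fitting a full quadratic model to designed-experiment data and locating its optimum, can be sketched as follows. This toy Python example uses two coded factors and a hypothetical response, not the paper's three-factor POME data.

```python
import numpy as np

# RSM sketch: fit y = b0 + b1*A + b2*B + b11*A^2 + b22*B^2 + b12*A*B by
# least squares to a small central composite design, then locate the
# predicted optimum on a grid of coded factor levels.
rng = np.random.default_rng(2)

# Coded factor levels of a two-factor central composite design
A = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0, 0])
B = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0, 0])

# Hypothetical true surface with optimum at (A, B) = (0.5, -0.25)
y = 55 - 4 * (A - 0.5) ** 2 - 6 * (B + 0.25) ** 2 + rng.normal(0, 0.1, A.size)

X = np.column_stack([np.ones_like(A), A, B, A**2, B**2, A * B])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate the fitted surface on a grid and report the predicted optimum
g = np.linspace(-1.5, 1.5, 121)
GA, GB = np.meshgrid(g, g, indexing="ij")
pred = (coef[0] + coef[1] * GA + coef[2] * GB
        + coef[3] * GA**2 + coef[4] * GB**2 + coef[5] * GA * GB)
i, j = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum at A = {GA[i, j]:.2f}, B = {GB[i, j]:.2f}")
```

Validation runs at the predicted optimum, as in the paper, then check how closely the achieved degradation matches the model's prediction.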

  11. Statistical representative elementary volumes of porous media determined using greyscale analysis of 3D tomograms

    NASA Astrophysics Data System (ADS)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-09-01

    Digital rock physics has traditionally required segmenting volume images for quantitative analysis, but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation-based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area, and diffusive tortuosity, using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel-level porosity estimate using a Gaussian mixture model. A simple model assumption allowed us to formulate a two-point correlation function for surface area estimates using Bayes' theorem. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed on binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e., resolution dependence and intersample and intrasample variance. Although we cannot make any claims about the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
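
    The segmentation-free mapping from grey level to voxel porosity via a Gaussian mixture model can be sketched with a simple two-component EM fit. The greyscale histogram below is synthetic and the phase parameters are hypothetical, not the Hod chalk data.

```python
import numpy as np

# Greyscale-to-porosity sketch: model the histogram as a two-component
# Gaussian mixture (pore phase and solid phase), fit it with EM, and use the
# posterior probability of the pore component as a per-voxel porosity estimate.
rng = np.random.default_rng(3)

# Synthetic volume: 30% pore voxels (dark) and 70% solid voxels (bright),
# with overlap mimicking partial-volume voxels
g = np.concatenate([rng.normal(60, 25, 30000), rng.normal(140, 25, 70000)])

# EM for a 1D two-component Gaussian mixture
w = np.array([0.5, 0.5]); mu = np.array([40.0, 160.0]); sd = np.array([30.0, 30.0])
for _ in range(100):
    # E-step: responsibilities of each component for each voxel
    pdf = np.exp(-0.5 * ((g[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations
    w = r.mean(axis=0)
    mu = (r * g[:, None]).sum(axis=0) / r.sum(axis=0)
    sd = np.sqrt((r * (g[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))

pore = np.argmin(mu)                      # darker component = pore phase
voxel_porosity = r[:, pore]               # per-voxel porosity estimate in [0, 1]
print(f"estimated bulk porosity: {voxel_porosity.mean():.3f}")
```

Unlike a binary segmentation threshold, the soft assignment retains the partial-volume information that the abstract argues is essential for marginally resolved samples.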

  12. Towards a Holistic Cortical Thickness Descriptor: Heat Kernel-Based Grey Matter Morphology Signatures.

    PubMed

    Wang, Gang; Wang, Yalin

    2017-02-15

    In this paper, we propose a heat kernel based regional shape descriptor that may be capable of better exploiting volumetric morphological information than other available methods, thereby improving statistical power on brain magnetic resonance imaging (MRI) analysis. The mechanism of our analysis is driven by the graph spectrum and the heat kernel theory, to capture the volumetric geometry information in the constructed tetrahedral meshes. In order to capture profound brain grey matter shape changes, we first use the volumetric Laplace-Beltrami operator to determine the point pair correspondence between white-grey matter and CSF-grey matter boundary surfaces by computing the streamlines in a tetrahedral mesh. Secondly, we propose multi-scale grey matter morphology signatures to describe the transition probability by random walk between the point pairs, which reflects the inherent geometric characteristics. Thirdly, a point distribution model is applied to reduce the dimensionality of the grey matter morphology signatures and generate the internal structure features. With the sparse linear discriminant analysis, we select a concise morphology feature set with improved classification accuracies. In our experiments, the proposed work outperformed the cortical thickness features computed by FreeSurfer software in the classification of Alzheimer's disease and its prodromal stage, i.e., mild cognitive impairment, on publicly available data from the Alzheimer's Disease Neuroimaging Initiative. The multi-scale and physics based volumetric structure feature may bring stronger statistical power than some traditional methods for MRI-based grey matter morphology analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Statistical characterisation of COSMO Sky-Med X-SAR retrieved precipitation fields by scale-invariance analysis

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Mascaro, Giuseppe; Hellies, Matteo; Baldini, Luca; Roberto, Nicoletta

    2013-04-01

    COSMO Sky-Med (CSK) is an important programme of the Italian Space Agency aimed at supporting environmental monitoring and the management of exogenous, endogenous, and anthropogenic risks through X-band Synthetic Aperture Radar (X-SAR) on board four satellites forming a constellation. Most typical SAR applications focus on land or ocean observation. However, X-band SAR can detect precipitation, which produces a specific signature caused by the combination of attenuation of surface returns induced by precipitation and enhancement of backscattering determined by the hydrometeors in the SAR resolution volume. Within the CSK programme, we conducted an intercomparison between the statistical properties of precipitation fields derived from the CSK SARs and those derived from the CNR Polar 55C (C-band) ground-based weather radar located in Rome (Italy). This contribution presents the main results of this research, which aimed at a robust characterisation of rainfall statistical properties across different scales by means of scale-invariance analysis and multifractal theory. The analysis was performed on a dataset of more than two years of precipitation observations collected by the CNR Polar 55C radar and on rainfall fields derived from available images collected by the CSK satellites during intense rainfall events. Scale-invariance laws and multifractal properties were detected in the most intense rainfall events derived from the CNR Polar 55C radar for spatial scales from 4 km to 64 km. The analysis of X-SAR retrieved rainfall fields, although based on few images, led to similar results and confirmed the existence of scale invariance and multifractal properties for scales larger than 4 km. These outcomes encourage investigating SAR methodologies for the future development of meteo-hydrological forecasting models based on multifractal theory.
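
    The moment-scaling test at the core of such a scale-invariance analysis can be sketched numerically: for a multifractal field, the q-th moments of the field aggregated to scale s follow a power law in s, with exponent K(q). The Python sketch below uses a synthetic 1D lognormal cascade in place of a radar rainfall field; all parameters are illustrative.

```python
import numpy as np

# Scale-invariance sketch: aggregate a multifractal field over increasing
# scales and check that log(moment) vs log(scale) is a straight line, whose
# slope gives the moment-scaling exponent K(q).
rng = np.random.default_rng(4)

def cascade(n_levels, sigma, rng):
    """One realization of a 1D multiplicative lognormal cascade."""
    field = np.ones(1)
    for _ in range(n_levels):
        field = np.repeat(field, 2) * rng.lognormal(0.0, sigma, field.size * 2)
    return field / field.mean()

scales = np.array([1, 2, 4, 8, 16, 32, 64])   # aggregation factors (grid cells)
q = 2.0
moments = np.zeros(scales.size)
n_real = 200
for _ in range(n_real):
    f = cascade(10, 0.3, rng)                 # 2**10 = 1024 cells per realization
    for m, s in enumerate(scales):
        agg = f.reshape(-1, s).mean(axis=1)   # average over blocks of size s
        moments[m] += np.mean(agg ** q)
moments /= n_real

# Straight-line fit in log-log coordinates indicates scale invariance;
# the slope equals -K(q) for the aggregated field
slope, intercept = np.polyfit(np.log(scales), np.log(moments), 1)
resid = np.log(moments) - (slope * np.log(scales) + intercept)
print(f"K({q:g}) estimate: {-slope:.3f}")
```

Repeating the fit for several values of q traces out the K(q) curve; a nonlinear K(q) is the signature of multifractality, as opposed to simple (mono)scaling.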

  14. Biological Surface Adsorption Index of Nanomaterials: Modelling Surface Interactions of Nanomaterials with Biomolecules.

    PubMed

    Chen, Ran; Riviere, Jim E

    2017-01-01

    Quantitative analysis of the interactions between nanomaterials and their surrounding environment is crucial for safety evaluation in the application of nanotechnology as well as for its development and standardization. In this chapter, we demonstrate the importance of the adsorption of surrounding molecules onto the surface of nanomaterials, which forms a biocorona and thus impacts the bio-identity and fate of those materials. We illustrate the key factors, including various physical forces, that determine the interactions occurring at bio-nano interfaces. We further discuss the mathematical endeavors to explain and predict the adsorption phenomena, and propose a new statistics-based surface adsorption model, the Biological Surface Adsorption Index (BSAI), to quantitatively analyze the interaction profile of surface adsorption of a large group of small organic molecules onto nanomaterials with varying surface physicochemical properties, first employing five descriptors representing the surface energy profile of the nanomaterials, then further incorporating traditional semi-empirical adsorption models to address concentration effects of solutes. These advancements in surface adsorption modelling represent promising progress in the application of quantitative predictive models to biological applications, nanomedicine, and the environmental safety assessment of nanomaterials.

  15. Assessing soil quality indicator under different land use and soil erosion using multivariate statistical techniques.

    PubMed

    Nosrati, Kazem

    2013-04-01

    Soil degradation associated with soil erosion and land use is a critical problem in Iran, and there is insufficient scientific information for assessing soil quality indicators. In this study, factor analysis (FA) and discriminant analysis (DA) were used to identify the most sensitive indicators of soil quality for evaluating land use and soil erosion within the Hiv catchment in Iran, and the results were subsequently compared with a soil quality assessment using expert opinion based on the soil surface factors (SSF) form of the Bureau of Land Management (BLM) method. To this end, 19 soil physical, chemical, and biochemical properties were measured at 56 different sampling sites covering three land use/soil erosion categories (rangeland/surface erosion, orchard/surface erosion, and rangeland/stream bank erosion). FA identified four factors that explained 82 % of the variation in soil properties. Three factors showed significant differences among the three land use/soil erosion categories. The results indicated that, based upon backward-mode DA, dehydrogenase, silt, and manganese allowed more than 80 % of the samples to be correctly assigned to their land use and erosional status. Canonical scores of the discriminant functions were significantly correlated with the six soil surface indices derived from the BLM method. Stepwise linear regression revealed that the soil surface indices of soil movement, surface litter, pedestalling, and the sum of SSF were also positively related to dehydrogenase and silt. This suggests that dehydrogenase and silt are the most sensitive indicators of land use and soil erosion.

  16. Separate modal analysis for tumor detection with a digital image elasto tomography (DIET) breast cancer screening system.

    PubMed

    Kashif, Amer S; Lotz, Thomas F; Heeren, Adrianus M W; Chase, James G

    2013-11-01

    It is estimated that every year, 1 × 10⁶ women are diagnosed with breast cancer, and more than 410,000 die annually worldwide. Digital Image Elasto Tomography (DIET) is a new noninvasive breast cancer screening modality that induces mechanical vibrations in the breast and images its surface motion with digital cameras to detect changes in stiffness. This research develops a new automated approach for diagnosing breast cancer using DIET based on a modal analysis model. The first and second natural frequencies of silicone phantom breasts are analyzed. A separate modal analysis is performed for each region of the phantom to estimate the modal parameters using imaged motion data over several input frequencies. Statistical methods are used to assess the likelihood of a frequency shift, which can indicate tumor location. Phantoms with 5, 10, and 20 mm stiff inclusions are tested, as well as a homogeneous (healthy) phantom. Inclusions are located at four locations at different depths. The second natural frequency proves to be a reliable metric with the potential to clearly distinguish lesion-like inclusions of different stiffness, as well as to provide an approximate location for the tumor-like inclusions. The 10 and 20 mm inclusions are always detected regardless of depth. The 5 mm inclusions are only detected near the surface. The homogeneous phantom always yields a negative result, as expected. Detection is based on a statistical likelihood analysis to determine the presence of a significantly different frequency response over the phantom, which is a novel approach to this problem. The overall results show promise and justify proof-of-concept trials with human subjects.
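
    The frequency-shift detection step can be sketched with a simple robust outlier test on per-region natural-frequency estimates. This is a stand-in for the paper's statistical likelihood analysis, using synthetic frequencies and a hypothetical shift magnitude.

```python
import numpy as np

# Frequency-shift sketch: estimate a natural frequency per surface region,
# then flag regions whose value deviates significantly from the robust bulk
# statistics (median / MAD), which can indicate a stiff inclusion.
rng = np.random.default_rng(7)

n_regions = 60
freqs = rng.normal(80.0, 0.5, n_regions)   # healthy regions, Hz (hypothetical)
freqs[:4] += 4.0                           # regions near a stiff inclusion shift up

med = np.median(freqs)
mad = np.median(np.abs(freqs - med))       # robust spread estimate
z = (freqs - med) / (1.4826 * mad)         # MAD-based z-scores
flagged = np.flatnonzero(z > 3.5)
print(f"flagged regions: {flagged.tolist()}")
```

Because the median and MAD are insensitive to the few shifted regions, the test remains calibrated even when a tumor-like inclusion perturbs part of the surface.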

  17. GIS, geostatistics, metadata banking, and tree-based models for data analysis and mapping in environmental monitoring and epidemiology.

    PubMed

    Schröder, Winfried

    2006-05-01

    By the example of environmental monitoring, some applications of geographic information systems (GIS), geostatistics, metadata banking, and Classification and Regression Trees (CART) are presented. These tools are recommended for mapping statistically estimated hot spots of vectors and pathogens. GIS were introduced as tools for spatially modelling the real world. The modelling can be done by mapping objects according to the spatial information content of data. Additionally, this can be supported by geostatistical and multivariate statistical modelling. This is demonstrated by the example of modelling marine habitats of benthic communities and of terrestrial ecoregions. Such ecoregionalisations may be used to predict phenomena based on the statistical relation between measurements of an interesting phenomenon such as, e.g., the incidence of medically relevant species and correlated characteristics of the ecoregions. The combination of meteorological data and data on plant phenology can enhance the spatial resolution of the information on climate change. To this end, meteorological and phenological data have to be correlated. To enable this, both data sets which are from disparate monitoring networks have to be spatially connected by means of geostatistical estimation. This is demonstrated by the example of transformation of site-specific data on plant phenology into surface data. The analysis allows for spatial comparison of the phenology during the two periods 1961-1990 and 1991-2002 covering whole Germany. The changes in both plant phenology and air temperature were proved to be statistically significant. Thus, they can be combined by GIS overlay technique to enhance the spatial resolution of the information on the climate change and use them for the prediction of vector incidences at the regional scale. The localisation of such risk hot spots can be done by geometrically merging surface data on promoting factors. 
This is demonstrated by the example of the transfer of heavy metals through soils. The predicted hot spots of heavy metal transfer can be validated empirically with measurement data, which can be queried from a metadata base linked with a geographic information system. A corresponding strategy for the detection of vector hot spots in medical epidemiology is recommended. Data on incidences and habitats of the Anophelinae in the marsh regions of Lower Saxony (Germany) were used to calculate a habitat model by CART, which, together with climate data and data on ecoregions, can be further used for the prediction of habitats of medically relevant vector species. In the future, this approach should be supported by an internet-based information system consisting of three components: a metadata questionnaire, a metadata base, and a GIS to link metadata, surface data, and measurement data on incidences and habitats of medically relevant species with related data on climate, phenology, and ecoregional characteristics.

  18. A Microstructural Approach Toward the Quantification of Anomaly Bond Coat Surface Geometry Change in NiCoCrAlY Plasma-Sprayed Bond Coat

    NASA Astrophysics Data System (ADS)

    Shahbeigi-Roodposhti, Peiman; Jordan, Eric; Shahbazmohamadi, Sina

    2017-12-01

    The three-dimensional behavior of NiCoCrAlY bond coat surface geometry change (known as rumpling) was characterized during 120 h of thermal cycling. The proposed scanning electron microscope (SEM)-based 3D imaging method allows the change in both height and width to be recorded at the same location during the heat treatment. Statistical analysis using both profile information [two dimensions (2D)] and surface information [three dimensions (3D)] demonstrated the typical character of rumpling: an increase in height and a decrease in width. However, it also revealed an anomalous height reduction between 40 and 80 cycles. This behavior was further investigated by analyzing the bearing area ratio curve of the surface and was attributed to the filling of voids and valleys by the growth of thermally grown oxide.

  19. The surface abundance and stratigraphy of lunar rocks from data about their albedo

    NASA Technical Reports Server (NTRS)

    Shevchenko, V. V.

    1977-01-01

    The data of ground-based studies and surveys of the lunar surface by the Zond and Apollo spacecraft have been used to construct an albedo map covering 80 percent of the lunar sphere. Statistical analysis of the distribution of areas with various albedos reveals several types of lunar surface. Comparison of albedo data for maria and continental areas with the results of geochemical orbital surveys allows the identification of these surface types with known types of lunar rock. The aluminum/silicon and magnesium/silicon ratios measured by the geochemical experiments on the Apollo 15 and Apollo 16 spacecraft were used as an indication of the chemical composition of the rock. The relationship of the relative aluminum content to the age of crystalline rocks allows a direct dependence to be constructed between the mean albedo of areas and the age of the rocks of which they are composed.

  20. Remote sensing based water-use efficiency evaluation in sub-surface irrigated wine grape vines

    NASA Astrophysics Data System (ADS)

    Zúñiga, Carlos Espinoza; Khot, Lav R.; Jacoby, Pete; Sankaran, Sindhuja

    2016-05-01

    Increased water demands have forced the agriculture industry to investigate better irrigation management strategies in crop production. Efficient irrigation systems, improved irrigation scheduling, and the selection of crop varieties with better water-use efficiency can aid in conserving water. In an ongoing experiment carried out in the Red Mountain American Viticultural Area near Benton City, Washington, subsurface drip irrigation treatments at 30, 60, and 90 cm depth, and at 15, 30, and 60% of evapotranspiration demand, were applied using pulse and continuous irrigation. These treatments were compared to continuous surface irrigation applied at 100% of evapotranspiration demand. Thermal infrared and multispectral images were acquired with an unmanned aerial vehicle during the growing season. The results indicated no difference in yield among treatments (p < 0.05); however, there was a statistically significant difference in leaf temperature between surface and subsurface irrigation (p < 0.05). The normalized vegetation index obtained from the analysis of the multispectral images showed statistically significant differences among treatments when surface and subsurface irrigation methods were compared. Similar differences in vegetation index values were observed when irrigation rates were compared. These results show the applicability of aerial thermal infrared and multispectral images for characterizing plant responses to different irrigation treatments and the use of such information in irrigation scheduling or in high-throughput selection of water-use-efficient crop varieties in plant breeding.

  1. On testing for spatial correspondence between maps of human brain structure and function.

    PubMed

    Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin

    2018-06-01

    A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks, and gyral-based anatomical landmarks. We provide open-access code implementing the methods presented for two commonly used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
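
    The core of such a spatial permutation ("spin") test can be sketched in a few lines: rotate one map's spherical coordinates at random, reassign values by nearest neighbour, and recompute the correlation to build a null distribution. This is a minimal illustration with synthetic maps on random sphere points, not the authors' released implementation.

```python
import numpy as np

# Spin-test sketch: the null distribution of map-to-map correlation is built
# by applying uniform random 3D rotations to one map's spherical coordinates,
# which preserves its spatial autocorrelation structure.
rng = np.random.default_rng(5)

# Random points on the unit sphere standing in for surface vertices
v = rng.normal(size=(2000, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Two maps sharing a smooth spatial component (hence truly correlated)
base = v[:, 2]                            # smooth pattern: z-coordinate
map1 = base + 0.5 * rng.normal(size=v.shape[0])
map2 = base + 0.5 * rng.normal(size=v.shape[0])
obs_r = np.corrcoef(map1, map2)[0, 1]

def random_rotation(rng):
    """Random 3D rotation from the QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))           # sign fix for (near-)uniform sampling
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]                # ensure a proper rotation
    return q

null_r = []
for _ in range(200):
    rot_v = v @ random_rotation(rng).T
    # Nearest-neighbour reassignment: for each vertex, take the value of map2
    # at the closest rotated vertex (largest dot product on the sphere)
    idx = np.argmax(v @ rot_v.T, axis=1)
    null_r.append(np.corrcoef(map1, map2[idx])[0, 1])

p = (1 + np.sum(np.array(null_r) >= obs_r)) / (1 + len(null_r))
print(f"observed r = {obs_r:.2f}, spin-test p = {p:.3f}")
```

Because rotation preserves each map's smoothness, this null is far stricter than shuffling vertex values independently, which is the point of the spatial permutation framework.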

  2. Carbonaceous particulate matter on the lung surface from adults living in São Paulo, Brazil.

    PubMed

    Padovan, Michele Galhardoni; Whitehouse, Abigail; Gouveia, Nelson; Habermann, Mateus; Grigg, Jonathan

    2017-01-01

    We therefore sought to identify the exposures associated with lung surface carbon in long-term residents of São Paulo, Brazil. Lung surface carbon was analyzed in 72 autopsy specimens by image analysis. Smoking history, PM10 measured nearest to the home, distance to a main road, and distance-weighted traffic density were used as exposure variables. Data are summarized as median (IQR) and compared by the Mann-Whitney test, with correlations assessed by Spearman's correlation. There was no association between lung surface carbon and age or gender. There was no statistically significant difference in lung surface carbon between smokers and non-smokers, 6.74 cm² (3.47 to 10.02) versus 5.20 cm² (2.29 to 7.54), and there was no significant association between lung surface carbon and exposure to environmental PM or markers of traffic exposure. In summary, we found no statistically significant difference in lung surface carbon between smokers and non-smokers, and no statistically significant association between lung surface carbon and environmental exposure variables. These results suggest that lung surface carbon in long-term residents of São Paulo may predominantly derive from environmental PM, but the most appropriate environmental exposure marker remains unclear.

  3. Surface Roughness of the Moon Derived from Multi-frequency Radar Data

    NASA Astrophysics Data System (ADS)

    Fa, W.

    2011-12-01

    Surface roughness of the Moon provides important information concerning both significant questions about lunar surface processes and engineering constraints for human outposts and rover trafficability. Impact-related phenomena change the morphology and roughness of the lunar surface, and therefore surface roughness provides clues to the formation and modification mechanisms of impact craters. Since the Apollo era, lunar surface roughness has been studied using different approaches, such as direct estimation from lunar surface digital topographic relief and indirect analysis of Earth-based radar echo strengths. Submillimeter-scale roughness at the Apollo landing sites has been studied by computer stereophotogrammetry analysis of Apollo Lunar Surface Closeup Camera (ALSCC) pictures, whereas roughness at meter to kilometer scales has been studied using laser altimeter data from recent missions. Although these studies showed that lunar surface roughness is scale dependent and can be described by fractal statistics, roughness at the centimeter scale has not yet been studied. In this study, lunar surface roughness at the centimeter scale is investigated using Earth-based 70 cm Arecibo radar data and miniature synthetic aperture radar (Mini-SAR) data at S- and X-band (wavelengths of 12.6 cm and 4.12 cm). Both observations and theoretical modeling show that radar echo strengths are mostly dominated by scattering from the surface and shallow buried rocks. Given the different penetration depths of radar waves at these frequencies (< 30 m for the 70 cm wavelength, < 3 m at S-band, and < 1 m at X-band), radar echo strengths at S- and X-band will yield surface roughness directly, whereas the radar echo at 70 cm will give an upper limit on lunar surface roughness. The integral equation method is used to model radar scattering from the rough lunar surface, for which the dielectric constant of the regolith and the surface roughness are the two dominant factors. 
The complex dielectric constant of the regolith is first estimated globally using the regolith composition and the relation among the dielectric constant, bulk density, and regolith composition. The statistical properties of lunar surface roughness are described by the root mean square (RMS) height and the correlation length, which represent the vertical and horizontal scales of the roughness. The correlation length and its scale dependence are studied using topography data from laser altimeter observations of recent lunar missions. With these two parameters known, surface roughness (RMS slope) can be estimated by minimizing the difference between the observed and modeled radar echo strengths. Surface roughness of several regions over Oceanus Procellarum and the southeastern highlands on the lunar nearside is studied, and preliminary results show that maria are smoother than highlands at the 70 cm scale, whereas the opposite holds at the 12 and 4 cm scales. Surface roughness of young craters is in general higher than that of maria and highlands, indicating a large rock population produced during the impact process.

  4. Statistical Distribution Analysis of Lineated Bands on Europa

    NASA Astrophysics Data System (ADS)

    Chen, T.; Phillips, C. B.; Pappalardo, R. T.

    2016-12-01

    Europa's surface is covered with intriguing linear and disrupted features, including lineated bands that range in scale and size. Previous studies have shown the possibility of an icy shell at the surface that may be concealing a liquid ocean with the potential to harbor life (Pappalardo et al., 1999). Utilizing the high-resolution imaging data from the Galileo spacecraft, we examined bands through a morphometric and morphologic approach. Greeley et al. (2000) and Prockter et al. (2002) defined bands as wide, hummocky to lineated features that have a distinctive surface texture and albedo compared to their surrounding terrain. We took morphometric measurements of lineated bands to find correlations in properties such as size, location, and orientation, and to shed light on formation models. We will present our measurements of over 100 bands on Europa that were mapped on the USGS Europa Global Mosaic Base Map (2002). We also conducted a statistical analysis to understand the distribution of lineated bands globally, and whether the widths of the bands differ by location. Our preliminary statistical distribution analysis, combined with the morphometric measurements, supports a uniform ice shell thickness for Europa rather than one that varies geographically. References: Greeley, Ronald, et al. "Geologic mapping of Europa." Journal of Geophysical Research: Planets 105.E9 (2000): 22559-22578.; Pappalardo, R. T., et al. "Does Europa have a subsurface ocean? Evaluation of the geological evidence." Journal of Geophysical Research: Planets 104.E10 (1999): 24015-24055.; Prockter, Louise M., et al. "Morphology of Europan bands at high resolution: A mid-ocean ridge-type rift mechanism." Journal of Geophysical Research: Planets 107.E5 (2002).; U.S. Geological Survey, 2002, Controlled photomosaic map of Europa, Je 15M CMN: U.S. 
Geological Survey Geologic Investigations Series I-2757, available at http://pubs.usgs.gov/imap/i2757/

  5. Comparison of innovative molecular approaches and standard spore assays for assessment of surface cleanliness.

    PubMed

    Cooper, Moogega; La Duc, Myron T; Probst, Alexander; Vaishampayan, Parag; Stam, Christina; Benardini, James N; Piceno, Yvette M; Andersen, Gary L; Venkateswaran, Kasthuri

    2011-08-01

    A bacterial spore assay and a molecular DNA microarray method were compared for their ability to assess relative cleanliness in the context of bacterial abundance and diversity on spacecraft surfaces. Colony counts derived from the NASA standard spore assay were extremely low for spacecraft surfaces. However, the PhyloChip generation 3 (G3) DNA microarray resolved the genetic signatures of a highly diverse suite of microorganisms in the very same sample set. Samples completely devoid of cultivable spores were shown to harbor the DNA of more than 100 distinct microbial phylotypes. Furthermore, samples with higher numbers of cultivable spores did not necessarily give rise to a greater microbial diversity upon analysis with the DNA microarray. The findings of this study clearly demonstrated that there is not a statistically significant correlation between the cultivable spore counts obtained from a sample and the degree of bacterial diversity present. Based on these results, it can be stated that validated state-of-the-art molecular techniques, such as DNA microarrays, can be utilized in parallel with classical culture-based methods to further describe the cleanliness of spacecraft surfaces.

  6. Statistical analysis of QC data and estimation of fuel rod behaviour

    NASA Astrophysics Data System (ADS)

    Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.

    1991-02-01

    The behaviour of fuel rods while in reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (worst case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
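The probabilistic procedure described above can be illustrated with a small sketch: fabrication parameters are drawn from normal distributions truncated at their specified tolerance limits and propagated through a response surface, and the resulting quantile is compared with the worst-case tolerance corner. All parameter values and the quadratic response surface below are hypothetical stand-ins, not the paper's data.

```python
import random

random.seed(0)

# Hypothetical fabrication parameters: (nominal, std dev, tolerance half-width).
PARAMS = {
    "pellet_diameter": (8.05, 0.004, 0.012),   # mm
    "pellet_density":  (95.0, 0.30,  1.00),    # % of theoretical density
    "clad_inner_diam": (8.22, 0.005, 0.015),   # mm
}

def sample(nominal, sigma, tol):
    """Draw from a normal distribution truncated at the tolerance limits."""
    while True:
        x = random.gauss(nominal, sigma)
        if abs(x - nominal) <= tol:
            return x

def response_surface(d_pellet, density, d_clad):
    """Stand-in quadratic response surface for a fuel rod output quantity
    (e.g. an internal pressure figure, arbitrary units); coefficients are illustrative."""
    gap = d_clad - d_pellet
    return 100.0 + 400.0 * (0.17 - gap) + 2.0 * (density - 95.0) + 5000.0 * (0.17 - gap) ** 2

# Worst-case dataset: superimpose unfavorable tolerance limits.
worst = response_surface(PARAMS["pellet_diameter"][0] + PARAMS["pellet_diameter"][2],
                         PARAMS["pellet_density"][0] + PARAMS["pellet_density"][2],
                         PARAMS["clad_inner_diam"][0] - PARAMS["clad_inner_diam"][2])

# Monte Carlo: propagate the actual distributions through the response surface.
results = sorted(
    response_surface(sample(*PARAMS["pellet_diameter"]),
                     sample(*PARAMS["pellet_density"]),
                     sample(*PARAMS["clad_inner_diam"]))
    for _ in range(20000)
)
p95 = results[int(0.95 * len(results))]

print(f"worst-case corner: {worst:.1f}, Monte Carlo 95th percentile: {p95:.1f}")
```

The gap between the Monte Carlo quantile and the worst-case corner is exactly the "known, defined conservatism" the abstract refers to: the quantile carries a stated probability level, while the corner value does not.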

  7. Point-based and model-based geolocation analysis of airborne laser scanning data

    NASA Astrophysics Data System (ADS)

    Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet

    2017-01-01

    Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large-size ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. Point-based analysis was performed using checkpoints on flat areas. Model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation were used as accuracy indicators, combined with their dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts, determined and partially improved by merging the overlapping strips and by comparison of the ALS and TLS data, were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.

  8. An Assessment of Actual and Potential Building Climate Zone Change and Variability From the Last 30 Years Through 2100 Using NASA's MERRA and CMIP5 Simulations

    NASA Technical Reports Server (NTRS)

    Stackhouse, Paul W., Jr.; Chandler, William S.; Hoell, James M.; Westberg, David; Zhang, Taiping

    2015-01-01

    Background: In the US, residential and commercial building infrastructure combined consumes about 40% of total energy usage and emits about 39% of total CO2 emissions (DOE/EIA "Annual Energy Outlook 2013"). Building codes, as used by local and state enforcement entities, are typically tied to the dominant climate within an enforcement jurisdiction, classified according to various climate zones. These climate zones are based upon a 30-year average of local surface observations and are developed by DOE and ASHRAE. Establishing the current variability and potential changes to future building climate zones is very important for increasing the energy efficiency of buildings and reducing energy costs and emissions in the future. Objectives: This paper demonstrates the usefulness of using NASA's Modern Era Retrospective-analysis for Research and Applications (MERRA) atmospheric data assimilation to derive the DOE/ASHRAE building climate zone maps and then using MERRA to define the last 30 years of variability in climate zones for the continental US. An atmospheric assimilation is a global atmospheric model constrained by satellite, atmospheric, and surface in situ measurements. Using MERRA as a baseline, we then evaluate the latest Coupled Model Intercomparison Project Phase 5 (CMIP5) climate model runs to assess potential variability in future climate zones under various assumptions. Methods: We derive DOE/ASHRAE building climate zones using surface and temperature data products from MERRA. We assess these zones using the uncertainties derived by comparison to surface measurements. Using statistical tests, we evaluate the variability of the climate zones in time and assess areas in the continental US for statistically significant trends by region. CMIP5 produced a database of over two dozen detailed climate model runs under various greenhouse gas forcing assumptions. We evaluate the variation in building climate zones for 3 different decades using an ensemble and quartile statistics to provide an assessment of potential building climate zone changes relative to the uncertainties demonstrated using MERRA. Findings and Conclusions: The results show that there is a statistically significant increase in the area covered by warmer climate zones and a tendency for a reduction of area in colder climate zones in some limited regions. The CMIP analysis shows that models vary from relatively little building climate zone change, for the least sensitive and most conservative assumptions, to a warming of at most 3 zones for certain areas, particularly the north central US, by the end of the 21st century.
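The zone-derivation step can be sketched in miniature: annual heating and cooling degree days are computed from daily mean temperatures and mapped to a zone number. The synthetic temperature series and the zone cutoffs below are illustrative placeholders, not MERRA data and not the actual ASHRAE 169 thresholds.

```python
import numpy as np

rng = np.random.default_rng(5)

def degree_days(daily_mean_c, base_c=18.0):
    """Annual heating and cooling degree days from daily mean temperatures (deg C)."""
    hdd = np.maximum(base_c - daily_mean_c, 0.0).sum()
    cdd = np.maximum(daily_mean_c - base_c, 0.0).sum()
    return hdd, cdd

def thermal_zone(hdd):
    """Map annual HDD to a coarse zone number using illustrative cutoffs
    (NOT the real ASHRAE Standard 169 breakpoints)."""
    cutoffs = [1000, 2000, 3000, 4000, 5000]   # hypothetical HDD breakpoints
    return 1 + sum(hdd > c for c in cutoffs)

# Synthetic year of daily mean temperatures for one grid cell
# (a stand-in for a MERRA near-surface temperature time series).
days = np.arange(365)
daily_mean = 12.0 - 14.0 * np.cos(2 * np.pi * days / 365) + rng.normal(0, 2, 365)

hdd, cdd = degree_days(daily_mean)
print(f"HDD {hdd:.0f}, CDD {cdd:.0f}, zone {thermal_zone(hdd)}")
```

Repeating this per grid cell and per year would give exactly the kind of zone map whose year-to-year variability the paper evaluates.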

  9. Using time-of-flight secondary ion mass spectrometry and multivariate statistical analysis to detect and image octabenzyl-polyhedral oligomeric silsesquioxane in polycarbonate

    NASA Astrophysics Data System (ADS)

    Smentkowski, V. S.; Duong, H. M.; Tamaki, R.; Keenan, M. R.; Ohlhausen, J. A. Tony; Kotula, P. G.

    2006-11-01

    Silsesquioxane, with an empirical formula of RSiO3/2, has the potential to combine the mechanical properties of plastics with the oxidative stability of ceramics in one material [D.W. Scott, J. Am. Chem. Soc. 68 (1946) 356; K.J. Shea, D.A. Loy, Acc. Chem. Res. 34 (2001) 707; K.-M. Kim, D.-K. Keum, Y. Chujo, Macromolecules 36 (2003) 867; M.J. Abad, L. Barral, D.P. Fasce, R.J.J. William, Macromolecules 36 (2003) 3128]. The high sensitivity, surface specificity, and ability to detect and image high mass additives make time-of-flight secondary ion mass spectrometry (ToF-SIMS) a powerful surface analytical instrument for the characterization of polymer composite surfaces in an analytical laboratory [J.C. Vickerman, D. Briggs (Eds.), ToF-SIMS Surface Analysis by Mass Spectrometry, Surface Spectra/IMPublications, UK, 2001; X. Vanden Eynde, P. Bertand, Surf. Interface Anal. 27 (1999) 157; P.M. Thompson, Anal. Chem. 63 (1991) 2447; S.J. Simko, S.R. Bryan, D.P. Griffis, R.W. Murray, R.W. Linton, Anal. Chem. 57 (1985) 1198; S. Affrossman, S.A. O'Neill, M. Stamm, Macromolecules 31 (1998) 6280]. In this paper, we compare ToF-SIMS spectra of control samples with spectra generated from polymer nano-composites based on octabenzyl-polyhedral oligomeric silsesquioxane (BnPOSS) as well as spectra (and images) generated from multivariate statistical analysis (MVSA) of the entire spectral image. We will demonstrate that ToF-SIMS is able to detect and image low concentrations of BnPOSS in polycarbonate. We emphasize the use of MVSA tools for converting the massive amount of data contained in a ToF-SIMS spectral image into a smaller number of useful chemical components (spectra and images) that fully describe the ToF-SIMS measurement.

  10. Evaluation of Surface Roughness and Tensile Strength of Base Metal Alloys Used for Crown and Bridge on Recasting (Recycling).

    PubMed

    Agrawal, Amit; Hashmi, Syed W; Rao, Yogesh; Garg, Akanksha

    2015-07-01

    Dental casting alloys play a prominent role in the restoration of the partial dentition. Casting alloys have to survive long term in the mouth and must therefore combine suitable structure, wear resistance and biologic compatibility. According to the ADA system, casting alloys are divided into three groups (wt%): high noble, noble and predominantly base metal alloys. The aim was to evaluate mechanical properties such as tensile strength and surface roughness of new and recast base metal (nickel-chromium) alloys. Recasting of base metal alloy derived from sprues and buttons was done to make it reusable. A total of 200 test specimens were fabricated using a specially fabricated metal jig and divided into two groups: 100 specimens of new alloy and 100 specimens of recast alloy, which were tested for tensile strength on a universal testing machine and for surface roughness on a surface roughness tester. Tensile strength of the new alloy showed no statistically significant difference (p-value>0.05) from the recast alloy, whereas the new alloy had a statistically significant difference in surface roughness (maximum and average surface roughness; p-value<0.01) compared to the recast alloy. Within the limitations of the study, it is concluded that tensile strength is not affected by recasting of nickel-chromium alloy, whereas surface roughness increases markedly.

  11. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  12. Laser-diagnostic mapping of temperature and soot statistics in a 2-m diameter turbulent pool fire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kearney, Sean P.; Grasser, Thomas W.

    We present spatial profiles of temperature and soot-volume-fraction statistics from a sooting 2-m base diameter turbulent pool fire, burning a 10%-toluene / 90%-methanol fuel mixture. Dual-pump coherent anti-Stokes Raman scattering and laser-induced incandescence are utilized to obtain radial profiles of temperature and soot probability density functions (pdf) as well as estimates of temperature/soot joint statistics at three vertical heights above the surface of the methanol/toluene fuel pool. Results are presented both in the fuel vapor-dome region at ¼ base diameter and in the actively burning region at ½ and ¾ diameters above the fuel surface. The spatial evolution of the soot and temperature pdfs is discussed and profiles of the temperature and soot mean and rms statistics are provided. Joint temperature/soot statistics are presented as spatially resolved conditional averages across the fire plume, and in terms of a joint pdf obtained by including measurements from multiple spatial locations.

  13. Laser-diagnostic mapping of temperature and soot statistics in a 2-m diameter turbulent pool fire

    DOE PAGES

    Kearney, Sean P.; Grasser, Thomas W.

    2017-08-10

    We present spatial profiles of temperature and soot-volume-fraction statistics from a sooting 2-m base diameter turbulent pool fire, burning a 10%-toluene / 90%-methanol fuel mixture. Dual-pump coherent anti-Stokes Raman scattering and laser-induced incandescence are utilized to obtain radial profiles of temperature and soot probability density functions (pdf) as well as estimates of temperature/soot joint statistics at three vertical heights above the surface of the methanol/toluene fuel pool. Results are presented both in the fuel vapor-dome region at ¼ base diameter and in the actively burning region at ½ and ¾ diameters above the fuel surface. The spatial evolution of the soot and temperature pdfs is discussed and profiles of the temperature and soot mean and rms statistics are provided. Joint temperature/soot statistics are presented as spatially resolved conditional averages across the fire plume, and in terms of a joint pdf obtained by including measurements from multiple spatial locations.

  14. K-SPAN: A lexical database of Korean surface phonetic forms and phonological neighborhood density statistics.

    PubMed

    Holliday, Jeffrey J; Turnbull, Rory; Eychenne, Julien

    2017-10-01

    This article presents K-SPAN (Korean Surface Phonetics and Neighborhoods), a database of surface phonetic forms and several measures of phonological neighborhood density for 63,836 Korean words. Currently publicly available Korean corpora are limited by the fact that they only provide orthographic representations in Hangeul, which is problematic since phonetic forms in Korean cannot be reliably predicted from orthographic forms. We describe the method used to derive the surface phonetic forms from a publicly available orthographic corpus of Korean, and report on several statistics calculated using this database; namely, segment unigram frequencies, which are compared to previously reported results, along with segment-based and syllable-based neighborhood density statistics for three types of representation: an "orthographic" form, which is a quasi-phonological representation, a "conservative" form, which maintains all known contrasts, and a "modern" form, which represents the pronunciation of contemporary Seoul Korean. These representations are rendered in an ASCII-encoded scheme, which allows users to query the corpus without having to read Korean orthography, and permits the calculation of a wide range of phonological measures.
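Neighborhood density of the kind reported here is conventionally defined as the number of words exactly one segment edit (substitution, insertion, or deletion) away from a target word. A minimal sketch of that computation, using a toy ASCII-encoded lexicon rather than actual K-SPAN entries:

```python
def edit_distance_leq1(a, b):
    """True if segment strings a and b differ by at most one substitution,
    insertion, or deletion (a one-pass check, no full edit-distance matrix)."""
    if abs(len(a) - len(b)) > 1:
        return False
    if len(a) > len(b):
        a, b = b, a                  # make a the shorter (or equal-length) string
    i = j = diffs = 0
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            i += 1; j += 1
        else:
            diffs += 1
            if diffs > 1:
                return False
            if len(a) == len(b):
                i += 1; j += 1       # substitution
            else:
                j += 1               # skip one segment of the longer string
    return True                      # any trailing segment of b is one insertion

def neighborhood_density(word, lexicon):
    """Number of lexicon entries exactly one segment edit away from `word`."""
    return sum(1 for w in lexicon if w != word and edit_distance_leq1(word, w))

# Toy ASCII-encoded lexicon (hypothetical segment strings, not real K-SPAN entries).
lexicon = ["kan", "kam", "kan:", "pan", "ka", "tan", "tank"]
print(neighborhood_density("kan", lexicon))
```

Here each ASCII character stands in for one segment, which is exactly what an ASCII-encoded scheme like the one described makes possible.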

  15. Estimation of Surface Air Temperature from MODIS 1km Resolution Land Surface Temperature Over Northern China

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina

    2010-01-01

    Surface air temperature is a critical variable to describe the energy and water cycle of the Earth-atmosphere system and is a key input element for hydrology and land surface models. It is a very important variable in agricultural applications and climate change studies. This is a preliminary study to examine statistical relationships between ground meteorological station measurements of daily maximum/minimum surface air temperature and satellite remotely sensed land surface temperature from MODIS over the dry and semiarid regions of northern China. Studies were conducted for both MODIS-Terra and MODIS-Aqua by using year 2009 data. Results indicate that the relationships between surface air temperature and remotely sensed land surface temperature are statistically significant. The relationship between the maximum air temperature and daytime land surface temperature depends significantly on land surface type and vegetation index, but the minimum air temperature and nighttime land surface temperature have little dependence on the surface conditions. Based on the linear regression relationship between surface air temperature and MODIS land surface temperature, surface maximum and minimum air temperatures are estimated from 1 km MODIS land surface temperature under clear sky conditions. The statistical errors (sigma) of the estimated daily maximum (minimum) air temperature are about 3.8 C (3.7 C).
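The estimation step can be sketched as an ordinary least-squares fit of station air temperature against satellite LST, with the residual standard deviation playing the role of the reported sigma. The data below are synthetic, and the regression coefficients and noise level are illustrative, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: daytime MODIS LST (deg C) and station daily maximum air
# temperature; the true slope/intercept and noise level here are illustrative.
lst_day = rng.uniform(-10, 40, size=500)
tmax = 0.8 * lst_day + 2.0 + rng.normal(0, 3.8, size=500)

# Fit Tmax = a * LST + b by ordinary least squares.
a, b = np.polyfit(lst_day, tmax, 1)

# Residual standard deviation = statistical error of the estimated Tmax.
residuals = tmax - (a * lst_day + b)
sigma = residuals.std(ddof=2)

print(f"Tmax = {a:.2f} * LST + {b:.2f}, sigma = {sigma:.1f} C")
```

In the study such a fit would be made per surface type or vegetation-index class for the daytime case, since that relationship depends on surface conditions.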

  16. Standardization of a Volumetric Displacement Measurement for Two-Body Abrasion Scratch Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Street, K. W. Jr.; Kobrick, R. L.; Klaus, D. M.

    2011-01-01

    A limitation has been identified in the existing test standards used for making controlled, two-body abrasion scratch measurements based solely on the width of the resultant score on the surface of the material. A new, more robust method is proposed for analyzing a surface scratch that takes into account the full three-dimensional profile of the displaced material. To accomplish this, a set of four volume-displacement metrics was systematically defined by normalizing the overall surface profile to denote statistically the area of relevance, termed the Zone of Interaction. From this baseline, depth of the trough and height of the plowed material are factored into the overall deformation assessment. Proof-of-concept data were collected and analyzed to demonstrate the performance of this proposed methodology. This technique takes advantage of advanced imaging capabilities that allow resolution of the scratched surface to be quantified in greater detail than was previously achievable. When reviewing existing data analysis techniques for conducting two-body abrasive scratch tests, it was found that the ASTM International Standard G 171 specified a generic metric based only on visually determined scratch width as a way to compare abraded materials. A limitation to this method was identified in that the scratch width is based on optical surface measurements, manually defined by approximating the boundaries, but does not consider the three-dimensional volume of material that was displaced. With large, potentially irregular deformations occurring on softer materials, it becomes unclear where to systematically determine the scratch width. Specifically, surface scratches on different samples may look the same from a top view, resulting in an identical scratch width measurement, but may vary in actual penetration depth and/or plowing deformation. 
Therefore, two different scratch profiles would be measured as having identical abrasion properties, although they differ significantly.

  17. Statistical shape model-based reconstruction of a scaled, patient-specific surface model of the pelvis from a single standard AP x-ray radiograph

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng Guoyan

    2010-04-15

    Purpose: The aim of this article is to investigate the feasibility of using a statistical shape model (SSM)-based reconstruction technique to derive a scaled, patient-specific surface model of the pelvis from a single standard anteroposterior (AP) x-ray radiograph and the feasibility of estimating the scale of the reconstructed surface model by performing a surface-based 3D/3D matching. Methods: Data sets of 14 pelvises (one plastic bone, 12 cadavers, and one patient) were used to validate the single-image based reconstruction technique. This reconstruction technique is based on a hybrid 2D/3D deformable registration process combining a landmark-to-ray registration with a SSM-based 2D/3D reconstruction. The landmark-to-ray registration was used to find an initial scale and an initial rigid transformation between the x-ray image and the SSM. The estimated scale and rigid transformation were used to initialize the SSM-based 2D/3D reconstruction. The optimal reconstruction was then achieved in three stages by iteratively matching the projections of the apparent contours extracted from a 3D model derived from the SSM to the image contours extracted from the x-ray radiograph: Iterative affine registration, statistical instantiation, and iterative regularized shape deformation. The image contours are first detected by using a semiautomatic segmentation tool based on the Livewire algorithm and then approximated by a set of sparse dominant points that are adaptively sampled from the detected contours. The unknown scales of the reconstructed models were estimated by performing a surface-based 3D/3D matching between the reconstructed models and the associated ground truth models that were derived from a CT-based reconstruction method. Such a matching also allowed for computing the errors between the reconstructed models and the associated ground truth models. 
    Results: The technique could reconstruct the surface models of all 14 pelvises directly from the landmark-based initialization. Depending on the surface-based matching techniques, the reconstruction errors were slightly different. When a surface-based iterative affine registration was used, an average reconstruction error of 1.6 mm was observed. This error increased to 1.9 mm when a surface-based iterative scaled rigid registration was used. Conclusions: It is feasible to reconstruct a scaled, patient-specific surface model of the pelvis from a single standard AP x-ray radiograph using the present approach. The unknown scale of the reconstructed model can be estimated by performing a surface-based 3D/3D matching.
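The scale-estimating surface-based matching can be sketched, for the simplified case of known point correspondences, with the closed-form similarity (scaled rigid) alignment of Umeyama; the paper's iterative surface registration is replaced here by this one-shot solution on synthetic data:

```python
import numpy as np

def similarity_procrustes(source, target):
    """Closed-form scaled rigid (similarity) alignment of matched 3-D point sets
    (Umeyama's method): returns scale s, rotation R, translation t minimizing
    sum ||s * R @ x + t - y||^2 over corresponding points x, y."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    xs, yt = source - mu_s, target - mu_t
    cov = yt.T @ xs / len(source)          # cross-covariance of the point sets
    U, S, Vt = np.linalg.svd(cov)
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1.0                     # guard against reflections
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / xs.var(axis=0).sum()
    t = mu_t - s * (R @ mu_s)
    return s, R, t

# Recover a known scale from a synthetic "reconstructed model" vs. "ground truth".
rng = np.random.default_rng(2)
model = rng.normal(size=(100, 3))
truth = 1.25 * model + np.array([5.0, -2.0, 1.0])   # scaled and shifted copy
s, R, t = similarity_procrustes(model, truth)
print(f"recovered scale: {s:.3f}")
```

In the paper the correspondences are themselves found iteratively on the surfaces; this sketch only shows why a surface-based 3D/3D match determines the unknown scale once correspondences exist.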

  18. Do European Standard Disinfectant tests truly simulate in-use microbial and organic soiling conditions on food preparation surfaces?

    PubMed

    Meyer, B; Morin, V N; Rödger, H-J; Holah, J; Bird, C

    2010-04-01

    The results from European standard disinfectant tests are used as one basis to approve the use of disinfectants in Europe. The design of these laboratory-based tests should thus simulate as closely as possible the practical conditions and challenges that the disinfectants would encounter in use. No evidence is available that the organic and microbial loading in these tests simulates actual levels in the food service sector. Total organic carbon (TOC) and total viable count (TVC) were determined on 17 visibly clean and 45 visibly dirty surfaces in two restaurants and the food preparation surfaces of a large retail store. These values were compared to reference values recovered from surfaces soiled with the organic and microbial loading specified by the standard conditions of the European surface test for bactericidal efficacy, EN 13697. The TOC reference values for clean and dirty conditions were higher than the data from practice, but cannot be regarded as statistical outliers. This was, however, considered a conservative assessment, as an additional nine TOC samples from visibly dirty surfaces were discarded from the analysis because their loading made them impossible to process. Similarly, the recovery of test organisms from surfaces contaminated according to EN 13697 was higher than the TVC from visibly dirty surfaces in practice, though they could not be regarded as statistical outliers of the whole data field. No correlation was found between TVC and TOC in the sampled data, which re-emphasizes the potential presence of micro-organisms on visibly clean surfaces and thus the need for the same degree of disinfection as for visibly dirty surfaces. The organic soil and the microbial burden used in EN disinfectant standards represent a realistic worst-case scenario for disinfectants used in the food service and food-processing areas.

  19. Quantitative analysis of osteoblast behavior on microgrooved hydroxyapatite and titanium substrata.

    PubMed

    Lu, Xiong; Leng, Yang

    2003-09-01

    The effects of implant surface topography and chemistry on osteoblast behavior have been a research focus because of their potential importance in orthopedic and dental applications. This work focused on the topographic effects of hydroxyapatite (HA) and titanium (Ti) surfaces that had identical micropatterns, to determine whether there was synergistic interaction between surface chemistry and surface topography. Surface microgrooves with six different groove widths (4, 8, 16, 24, 30, and 38 μm) and three different groove depths (2, 4, and 10 μm) were made on single crystalline silicon wafers using microfabrication techniques. Ti and HA thin films were coated on the microgrooves by radio-frequency magnetron sputtering. After that, human osteoblast-like cells were seeded and cultured on the microgrooved surfaces for up to 7 days. The cells' behavior was examined using scanning electron microscopy after the cells were fixed and dehydrated. Statistical analysis was based on quantitative data for orientation angle, evaluating contact guidance, and form index, describing cell shape or cell morphology changes. Contact guidance and cell shape changes were observed on the HA and Ti microgrooves. No difference in orientation angle between HA and Ti microgrooves was found, which might suggest that surface chemistry was not a significant influence on cell guidance. However, the form index analysis indicated an interaction between topographic effects and surface chemistry. Thus, conclusions about surface topographic effects on cell behavior drawn from one type of material cannot simply be applied to another type of material. Copyright 2003 Wiley Periodicals, Inc. J Biomed Mater Res 66A: 677-687, 2003

  20. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. Copyright © 2015 Elsevier Ltd. All rights reserved.
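The before-after logic can be sketched as follows: posterior crash-rate draws for treated and reference segments yield a ratio-of-ratios effect estimate. For brevity this uses a conjugate Gamma-Poisson model in place of the study's negative binomial model and data-augmentation algorithm, and the crash counts are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical wet-weather crash counts (illustrative, not the study's data).
pfc_before, pfc_after = 46, 41   # PFC-overlaid segments, before/after construction
ref_before, ref_after = 52, 50   # comparable non-PFC reference segments

def posterior_rate(count, exposure=1.0, a0=0.5, b0=0.01, n=100_000):
    """Posterior draws of a Poisson crash rate under a Gamma(a0, b0) prior
    (conjugate update: Gamma(a0 + count, rate b0 + exposure))."""
    return rng.gamma(a0 + count, 1.0 / (b0 + exposure), size=n)

# Ratio-of-ratios: the change on PFC segments relative to the change on
# reference segments; values below 1 would indicate a safety benefit of PFC.
theta = (posterior_rate(pfc_after) / posterior_rate(pfc_before)) / \
        (posterior_rate(ref_after) / posterior_rate(ref_before))

print(f"posterior mean effect: {theta.mean():.2f}")
print(f"P(effect < 1): {(theta < 1).mean():.2f}")
```

With counts like these the posterior for the effect straddles 1, which is the shape of result the study reports: no clear wet-weather crash reduction attributable to PFC.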

  1. Documentation and analysis of traumatic injuries in clinical forensic medicine involving structured light three-dimensional surface scanning versus photography.

    PubMed

    Shamata, Awatif; Thompson, Tim

    2018-05-10

    Non-contact three-dimensional (3D) surface scanning has been applied in forensic medicine and has been shown to mitigate shortcomings of traditional documentation methods. The aim of this paper is to assess the efficiency of structured light 3D surface scanning in recording traumatic injuries of live cases in clinical forensic medicine. The work was conducted at the Medico-Legal Centre in Benghazi, Libya. A structured light 3D surface scanner and an ordinary digital camera with a close-up lens were used to record the injuries and to produce 3D and two-dimensional (2D) documents of the same traumas. Two different types of comparison were performed. Firstly, the 3D wound documents were compared to the 2D documents based on subjective visual assessment. Secondly, 3D wound measurements were compared to conventional measurements to determine whether there was a statistically significant difference between them; for this, the Friedman test was used. The study established that the 3D wound documents had extra features over the 2D documents. Moreover, the 3D scanning method was able to overcome the main deficiencies of digital photography. No statistically significant difference was found between the 3D and conventional wound measurements, and Spearman's correlation established a strong, positive correlation between the 3D and conventional measurement methods. Although the 3D surface scanning of the injuries of live subjects faced some difficulties, the 3D results were appreciated and the validity of 3D measurements based on structured light 3D scanning was established. Further work will be conducted in forensic pathology to scan open injuries with depth information. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
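The agreement check between the two measurement methods can be sketched with Spearman's rank correlation, computed directly as the Pearson correlation of ranks; the wound measurements below are hypothetical, not the study's data:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks.
    (Argsort-of-argsort ranking; ties would need average ranks instead.)"""
    rank = lambda v: np.argsort(np.argsort(np.asarray(v)))
    return np.corrcoef(rank(x), rank(y))[0, 1]

# Hypothetical wound-length measurements (cm) by the two methods on 8 injuries.
conventional = [2.1, 3.4, 1.8, 5.0, 2.7, 4.2, 3.0, 1.5]
scan_3d      = [2.0, 3.5, 1.9, 4.9, 2.8, 4.1, 3.1, 1.4]

print(f"Spearman rho: {spearman_rho(conventional, scan_3d):.2f}")
```

A rho near 1, together with a non-significant Friedman test across the methods, is the pattern of agreement the abstract describes.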

  2. A parallel efficient partitioning algorithm for the statistical model of dynamic sea clutter at low grazing angle

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Wu, Zhensen; Linghu, Longxiang

    2017-10-01

    The study of sea clutter characteristics is very important for radar signal processing, the detection of targets on the sea surface, and remote sensing. The sea state is complex at low grazing angle (LGA), and simulation is difficult because of the large irradiated area and the great number of simulation facets. A practical and efficient model to obtain the radar clutter of a dynamic sea under different sea conditions is proposed, based on the physical mechanism of interaction between electromagnetic waves and sea waves. The classical analysis method for sea clutter is based on amplitude and spectrum distributions, treating the clutter as a random process, which leaves the physical mechanism equivocal. To compute the electromagnetic field from the sea surface, a modified phase from the facets is considered, and the backscattering coefficient is calculated by Wu's improved two-scale model, which can solve the statistical sea backscattering problem at grazing angles below 5 degrees, considering the effects of the surface-slope joint probability density, the shadowing function, the skewness of sea waves, and the curvature of the surface on backscattering from the ocean surface. We make the assumption that the scattering contribution of each facet is independent; the total field is then the superposition of all facets in the receiving direction. Such data characteristics are very well suited to computation on GPU threads, so we can make the best use of GPU resources. We have achieved a speedup of 155-fold for S band and 162-fold for Ku/X band on the Tesla K80 GPU as compared with an Intel® Core™ CPU. In this paper, we mainly study high-resolution data with a time resolution of one millisecond, so we may have on the order of 1000 time points, and we analyze the amplitude probability density distribution of the radar clutter.
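The independent-facet superposition assumption can be sketched as a coherent sum of per-facet contributions, each with an amplitude from its backscattering coefficient and a phase from the two-way path length. All values below are illustrative, and the vectorized sum merely stands in for the per-thread GPU evaluation; this is not Wu's two-scale model itself:

```python
import numpy as np

rng = np.random.default_rng(3)

# Radar parameters (illustrative).
c = 3e8
freq = 10e9                        # X band
k = 2 * np.pi * freq / c           # free-space wavenumber

# Hypothetical facet ensemble: per-facet backscattering coefficient sigma0,
# facet area, and slant range (these would come from the sea-surface model).
n_facets = 5000
sigma0 = rng.exponential(1e-3, n_facets)
area = np.full(n_facets, 1.0)                  # m^2
ranges = 1e4 + rng.uniform(0, 10, n_facets)    # m

# Coherent superposition: amplitude sqrt(sigma0 * area), phase from the
# two-way path 2*R. Each term is what one GPU thread would evaluate.
field = np.sum(np.sqrt(sigma0 * area) * np.exp(-2j * k * ranges))
power = np.abs(field) ** 2
print(f"clutter power sample: {power:.3e}")
```

Repeating this sum for each millisecond snapshot of the evolving surface yields the clutter time series whose amplitude probability density the paper analyzes.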

  3. The Southampton-York Natural Scenes (SYNS) dataset: Statistics of surface attitude

    PubMed Central

    Adams, Wendy J.; Elder, James H.; Graf, Erich W.; Leyland, Julian; Lugtigheid, Arthur J.; Muryy, Alexander

    2016-01-01

    Recovering 3D scenes from 2D images is an under-constrained task; optimal estimation depends upon knowledge of the underlying scene statistics. Here we introduce the Southampton-York Natural Scenes dataset (SYNS: https://syns.soton.ac.uk), which provides comprehensive scene statistics useful for understanding biological vision and for improving machine vision systems. In order to capture the diversity of environments that humans encounter, scenes were surveyed at random locations within 25 indoor and outdoor categories. Each survey includes (i) spherical LiDAR range data, (ii) high-dynamic-range spherical imagery, and (iii) a panorama of stereo image pairs. We envisage many uses for the dataset and present one example: an analysis of surface attitude statistics, conditioned on scene category and viewing elevation. Surface normals were estimated using a novel adaptive scale selection algorithm. Across categories, surface attitude below the horizon is dominated by the ground plane (0° tilt). Near the horizon, probability density is elevated at 90°/270° tilt due to vertical surfaces (trees, walls). Above the horizon, probability density is elevated near 0° slant due to overhead structure such as ceilings and leaf canopies. These structural regularities represent potentially useful prior assumptions for human and machine observers, and may predict human biases in perceived surface attitude. PMID:27782103
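A minimal sketch of how slant and tilt can be derived from estimated surface normals, under an assumed convention (slant as the angle between the normal and the viewing direction, tilt as the in-plane orientation of the normal's projection); the normals and axes here are illustrative, not the SYNS processing pipeline.

```python
import numpy as np

# Convert unit surface normals to slant/tilt angles in degrees.
# Assumed convention: viewing direction along +z; tilt measured in the
# x-y plane. Real analyses must match the dataset's own convention.
def slant_tilt(normals, view=np.array([0.0, 0.0, 1.0])):
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    slant = np.degrees(np.arccos(np.clip(normals @ view, -1.0, 1.0)))
    tilt = np.degrees(np.arctan2(normals[:, 1], normals[:, 0])) % 360.0
    return slant, tilt

# A ground-plane patch (normal along the view axis) and a vertical wall.
n = np.array([[0.0, 0.0, 1.0],   # ground plane: slant ~0 deg
              [1.0, 0.0, 0.0]])  # vertical surface: slant ~90 deg
slant, tilt = slant_tilt(n)
print(slant)
```

Histogramming such angles per scene category and viewing elevation gives the attitude distributions the abstract describes.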

  4. Statistical prediction of September Arctic Sea Ice minimum based on stable teleconnections with global climate and oceanic patterns

    NASA Astrophysics Data System (ADS)

    Ionita, M.; Grosfeld, K.; Scholz, P.; Lohmann, G.

    2016-12-01

    Sea ice in both polar regions is an important indicator of global climate change and its polar amplification. Consequently, there is broad interest in information on sea ice, its coverage, variability and long-term change. Knowledge of sea ice requires high-quality data on ice extent, thickness and dynamics. However, its predictability depends on various climate parameters and conditions. In order to provide insight into the potential development of a monthly/seasonal signal, we developed a robust statistical model based on ocean heat content, sea surface temperature and atmospheric variables to estimate the September minimum sea ice extent for each year. Although previous statistical attempts at monthly/seasonal forecasts of the September sea ice minimum show relatively reduced skill, here it is shown that more than 97% (r = 0.98) of the September sea ice extent can be predicted three months in advance from previous months' conditions via a multiple linear regression model based on global sea surface temperature (SST), mean sea level pressure (SLP), air temperature at 850 hPa (TT850), surface winds and sea ice extent persistence. The statistical model is based on the identification of regions with stable teleconnections between the predictors (climatological parameters) and the predictand (here, sea ice extent). The results based on our statistical model contribute to the sea ice prediction network for the Sea Ice Outlook report (https://www.arcus.org/sipn) and could provide a tool for identifying regions and climate parameters that are important for sea ice development in the Arctic and for detecting sensitive and critical regions in global coupled climate models with a focus on sea ice formation.
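The regression step can be sketched as an ordinary least-squares fit; the synthetic predictors below merely stand in for the SST, SLP, TT850, wind, and persistence series extracted from the stable-teleconnection regions.

```python
import numpy as np

# Minimal multiple-linear-regression sketch with synthetic stand-ins for
# the five predictor families named in the abstract; the coefficients and
# noise level are arbitrary assumptions for illustration.
rng = np.random.default_rng(42)
n_years = 35
X = rng.normal(size=(n_years, 5))                 # predictor anomalies
true_beta = np.array([1.2, -0.8, 0.5, 0.3, 1.5])  # invented "teleconnections"
y = X @ true_beta + rng.normal(scale=0.1, size=n_years)  # September SIE anomaly

# Fit with an intercept column, then evaluate in-sample skill.
A = np.column_stack([np.ones(n_years), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r = np.corrcoef(y, y_hat)[0, 1]
print(f"in-sample correlation r = {r:.3f}")
```

In practice the skill claim (r = 0.98) rests on out-of-sample validation and on restricting predictors to regions whose correlation with the predictand is stable over time.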

  5. Dissociative adsorption of O2 on unreconstructed metal (100) surfaces: Pathways, energetics, and sticking kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Da-Jiang; Evans, James W.

    An accurate description of oxygen dissociation pathways and kinetics for various local adlayer environments is key for an understanding not just of the coverage dependence of oxygen sticking, but also of reactive steady states in oxidation reactions. Density functional theory analysis for M(100) surfaces with M = Pd, Rh, and Ni, where O prefers the fourfold hollow adsorption site, does not support the traditional Brundle-Behm-Barker picture of dissociative adsorption onto second-nearest-neighbor hollow sites with an additional blocking constraint. Rather, adsorption via neighboring vicinal bridge sites dominates, although other pathways can be active. The same conclusion also applies for M = Pt and Ir, where oxygen prefers the bridge adsorption site. Statistical mechanical analysis is performed based on kinetic Monte Carlo simulation of a multisite lattice-gas model consistent with our revised picture of adsorption. This analysis determines the coverage and temperature dependence of sticking for a realistic treatment of the oxygen adlayer structure.
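A toy lattice-gas sketch of dissociative adsorption onto pairs of hollow sites (a drastic simplification of the paper's multisite KMC model, with invented adsorption rules and no energetics):

```python
import numpy as np

# Toy model: an impinging O2 is assigned a random nearest-neighbour pair
# of fourfold hollow sites (a crude stand-in for the bridge-site pathway)
# and sticks only if both sites are empty. Periodic square lattice.
rng = np.random.default_rng(1)
L = 50
occ = np.zeros((L, L), dtype=bool)   # hollow-site occupation

attempts, successes = 20_000, 0
for _ in range(attempts):
    i, j = rng.integers(L, size=2)          # impact site
    di, dj = rng.choice([(0, 1), (1, 0)])   # random NN orientation
    i2, j2 = (i + di) % L, (j + dj) % L
    if not occ[i, j] and not occ[i2, j2]:
        occ[i, j] = occ[i2, j2] = True      # dissociate: fill both hollows
        successes += 1

coverage = occ.mean()
sticking = successes / attempts
print(f"final coverage = {coverage:.3f}, integrated sticking = {sticking:.3f}")
```

Tracking the success fraction in narrow coverage windows would give a (toy) coverage-dependent sticking curve; the paper's analysis additionally includes adlayer interactions, multiple site types, and temperature dependence.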

  6. Examination of Spectral Transformations on Spectral Mixture Analysis

    NASA Astrophysics Data System (ADS)

    Deng, Y.; Wu, C.

    2018-04-01

    While many spectral transformation techniques have been applied to spectral mixture analysis (SMA), few studies have examined their necessity and applicability. This paper explores the differences between spectrally transformed schemes and the untransformed scheme to determine which transformed schemes perform better in SMA. In particular, nine spectrally transformed schemes as well as the untransformed scheme were examined in two study areas. Each transformed scheme was tested 100 times using different endmember classes' spectra under the endmember model of vegetation-high albedo impervious surface area-low albedo impervious surface area-soil (V-ISAh-ISAl-S). The performance of each scheme was assessed based on the mean absolute error (MAE). A paired-samples t-test was applied to assess the significance of the difference in mean MAEs between transformed and untransformed schemes. Results demonstrated that only the NSMA scheme exceeded the untransformed scheme in both study areas. Some transformed schemes showed unstable performance, outperforming the untransformed scheme in one area but weakening the SMA result in the other.
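The paired-samples t-test on scheme MAEs can be sketched as follows; the MAE values are synthetic stand-ins for the 100 SMA runs per scheme.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical MAEs from 100 repeated SMA runs of a transformed scheme and
# the untransformed scheme on the same endmember draws (paired by run).
rng = np.random.default_rng(7)
mae_untransformed = rng.normal(loc=0.12, scale=0.01, size=100)
mae_transformed = mae_untransformed - rng.normal(loc=0.005, scale=0.005, size=100)

# Paired-samples t-test: is the mean MAE difference significant?
t, p = ttest_rel(mae_transformed, mae_untransformed)
print(f"t = {t:.2f}, p = {p:.2e}")
```

Pairing by run matters here: the two schemes share the same endmember draws, so the paired test removes run-to-run variability that an independent-samples test would leave in.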

  7. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  8. Multi-decadal evolution characteristics of global surface temperature anomaly data shown by observation and CMIP5 models

    NASA Astrophysics Data System (ADS)

    Zhu, X.

    2017-12-01

    Based on statistical analysis, the time series of global surface air temperature (SAT) anomalies from 1860-2014 has been characterized by three types of phase change, identified by dividing the temperature record into distinct stages. The representation of these three types of phase change in CMIP5 model simulations was then evaluated. The conclusions are as follows: the SAT record from 1860-2014 can be divided into six stages according to trend differences, and this subdivision is statistically significant. Based on trend analysis and the distribution of slopes between any two points (the "two points' slope") within each stage, the six stages can be summarized as three types of phase: warming, cooling, and hiatus. Between 1860 and 2014, the world experienced three warming phases (1860-1878, 1909-1942, 1975-2004), one cooling phase (1878-1909), and two hiatus phases (1942-1975, 2004-2014). Using this definition method, one can estimate whether the next year belongs to the previous phase; the temperature in 2015 was used as an example to validate the feasibility of this method. The CMIP5 models simulate the warming periods well; however, the characteristics shown by SAT during the cooling and hiatus periods are not reproduced by the CMIP5 models. As such, projections of future warming phases using the CMIP5 models are credible, but projections of cooling and hiatus events are unreliable.
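The "two points' slope" diagnostic can be sketched as below; the temperature series is synthetic, not the observed SAT record.

```python
import numpy as np

# Within a candidate stage, compute the slope between every pair of time
# points and inspect the distribution: a warming stage should show
# predominantly positive two-point slopes.
def pairwise_slopes(years, temps):
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    i, j = np.triu_indices(len(years), k=1)   # all pairs with j > i
    return (temps[j] - temps[i]) / (years[j] - years[i])

# Synthetic warming stage: assumed trend of 0.02 deg/yr plus noise.
years = np.arange(1975, 2005)
temps = 0.02 * (years - 1975) + np.random.default_rng(3).normal(0, 0.05, len(years))
slopes = pairwise_slopes(years, temps)
frac_positive = np.mean(slopes > 0)
print(f"fraction of positive two-point slopes: {frac_positive:.2f}")
```

A cooling stage would show the mirror-image distribution, and a hiatus a distribution centred near zero; classifying a stage by this distribution is the essence of the method.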

  9. 75 FR 72611 - Assessments, Large Bank Pricing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. \\12\\ The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...

  10. Advanced Gear Alloys for Ultra High Strength Applications

    NASA Technical Reports Server (NTRS)

    Shen, Tony; Krantz, Timothy; Sebastian, Jason

    2011-01-01

    Single tooth bending fatigue (STBF) test data of UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data of conventional gear steels (9310 and Pyrowear 53), using comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing statistical analysis of STBF test data for the four gear steels (C61, C64, 9310 and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly as compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the case of the higher hardness C64 material.

  11. Bag-breakup control of surface drag in hurricanes

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Zilitinkevich, Sergej; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil

    2016-04-01

    Air-sea interaction at extreme winds is of special interest now in connection with the problem of sea surface drag reduction at wind speeds exceeding 30-35 m/s. This phenomenon, predicted by Emanuel (1995) and confirmed by a number of field (e.g., Powell et al., 2003) and laboratory (Donelan et al., 2004) experiments, still awaits a physical explanation. Several papers attributed the drag reduction to spume droplets - spray torn off the crests of breaking waves (e.g., Kudryavtsev and Makin, 2011; Bao et al., 2011). The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced from a unit area of water surface in unit time. However, the mechanism of spume droplet formation is unknown and empirical estimates of the SSGF vary over six orders of magnitude; therefore, the production rate of large sea spray droplets is not adequately described and there are significant uncertainties in estimates of exchange processes in hurricanes. It also remains unknown what the air-sea interface looks like and how water is fragmented into spray at hurricane winds. Using high-speed video filming, we observed the mechanisms of spume droplet production at strong winds, investigated their statistics, and compared their efficiency. The experiments showed that the generation of spume droplets near the wave crest is caused by the following events: bursting of submerged bubbles, generation and breakup of "projections", and "bag breakup". 
Statistical analysis of these experiments showed that at hurricane winds the main mechanism of spray production is "bag breakup": the inflation and subsequent bursting of short-lived, sail-like pieces of the water-surface film ("bags"). On the basis of general principles of statistical physics (the canonical-ensemble model), we developed statistics of the bag-breakup events: their number and the statistical distribution of their geometrical parameters as functions of wind speed. Based on these statistics, we estimated the surface stress caused by bags as the average sum of the stresses caused by individual bags, depending on their geometrical parameters. The resulting stress is subject to counteracting effects of increasing wind speed - the increasing number of bags versus their decreasing sizes and lifetimes - and the balance yields a peaked dependence of the bag resistance on wind speed: the share of bag stress peaks at U10 of about 35 m/s and then decreases. This peaking of the surface stress associated with bag breakup explains the seemingly paradoxical non-monotonic wind dependence of the surface drag coefficient, which peaks at winds of about 35 m/s. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 14-05-91767, 16-55-52025, 15-35-20953); the experiments and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov and D. Sergeev were partially supported by FP7 Collaborative Project No. 612610.

  12. An evaluation of shear bond strength of self-etch adhesive on pre-etched enamel: an in vitro study.

    PubMed

    Rao, Bhadra; Reddy, Satti Narayana; Mujeeb, Abdul; Mehta, Kanchan; Saritha, G

    2013-11-01

    To determine the shear bond strength of the self-etch adhesive G-bond on pre-etched enamel. Thirty caries-free human mandibular premolars extracted for orthodontic purposes were used for the study. The occlusal surfaces of all the teeth were flattened with a diamond bur, and silicon carbide paper was used for surface smoothing. The thirty samples were randomly divided into three groups, and three different etch systems were used for the composite build-up: group 1 (G-bond self-etch adhesive system), group 2 (G-bond) and group 3 (Adper Single Bond). Light curing was applied for 10 seconds with an LED unit for the composite build-up on the occlusal surface of each tooth, 8 millimeters (mm) in diameter and 3 mm in thickness. The specimens in each group were tested in shear mode using a knife-edge testing apparatus in a universal testing machine at a crosshead speed of 1 mm/minute. Shear bond strength values in MPa were calculated from the peak load at failure divided by the specimen surface area. The mean shear bond strength of each group was calculated and statistical analysis was carried out using one-way analysis of variance (ANOVA). The mean bond strength was 15.5 MPa for group 1, 19.5 MPa for group 2 and 20.1 MPa for group 3. Group 1 showed statistically significantly lower bond strength than groups 2 and 3; no statistically significant difference was found between groups 2 and 3 (p < 0.05). The self-etch adhesive G-bond showed increased shear bond strength on pre-etched enamel.
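The one-way ANOVA step can be sketched with SciPy; the samples below are synthetic, merely centred on the group means reported above (15.5, 19.5 and 20.1 MPa).

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical shear-bond-strength samples (MPa) for the three groups of
# ten teeth; only the group means follow the abstract, the spread is an
# assumption for illustration.
rng = np.random.default_rng(5)
g1 = rng.normal(15.5, 1.5, size=10)  # group 1: G-bond self-etch
g2 = rng.normal(19.5, 1.5, size=10)  # group 2: G-bond
g3 = rng.normal(20.1, 1.5, size=10)  # group 3: Adper Single Bond

f, p = f_oneway(g1, g2, g3)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.4f}")
```

A significant omnibus F only says that at least one group mean differs; the study's group-1-versus-rest conclusion additionally needs a pairwise comparison (e.g. a post-hoc test).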

  13. Systematic Mapping and Statistical Analyses of Valley Landform and Vegetation Asymmetries Across Hydroclimatic Gradients

    NASA Astrophysics Data System (ADS)

    Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.

    2015-12-01

    Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries in a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior work indicates that reduced insolation on northern (pole-facing) aspects prolongs snowpack persistence and is associated with thicker, finer-grained soils that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north- and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random-forest-based statistical model to predict valley slope asymmetry from numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. 
When slope-related statistics are excluded, due to possible autocorrelation, valley slope asymmetry is most strongly predicted by asymmetries of insolation and drainage density, which generally supports a water-balance based conceptual model of valley asymmetry development. Surprisingly, vegetation asymmetries had relatively low predictive importance.
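The random-forest step can be sketched with scikit-learn; the three features below are invented stand-ins for the paper's several hundred asymmetry measures.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Schematic of the approach: predict valley slope asymmetry from other
# landscape asymmetries and read off feature importances. All data are
# synthetic; the coefficients below are assumptions for illustration.
rng = np.random.default_rng(11)
n = 500
insolation_asym = rng.normal(size=n)
drainage_density_asym = rng.normal(size=n)
vegetation_asym = rng.normal(size=n)
slope_asym = (0.8 * insolation_asym + 0.5 * drainage_density_asym
              + rng.normal(scale=0.2, size=n))

X = np.column_stack([insolation_asym, drainage_density_asym, vegetation_asym])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, slope_asym)
for name, imp in zip(["insolation", "drainage density", "vegetation"],
                     model.feature_importances_):
    print(f"{name:>16s} importance: {imp:.2f}")
```

Impurity-based importances like these are what supports statements such as "insolation and drainage density are the strongest predictors"; with correlated predictors (as in real terrain data), permutation importance is the more robust choice.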

  14. Sampling surface and subsurface particle-size distributions in wadable gravel- and cobble-bed streams for analyses in sediment transport, hydraulics, and streambed monitoring

    Treesearch

    Kristin Bunte; Steven R. Abt

    2001-01-01

    This document provides guidance for sampling surface and subsurface sediment from wadable gravel- and cobble-bed streams. After a short introduction to stream types and classifications in gravel-bed rivers, the document explains the field and laboratory measurement of particle sizes and the statistical analysis of particle-size distributions. Analysis of particle…

  15. Concept and Analysis of a Satellite for Space-Based Radio Detection of Ultra-High Energy Cosmic Rays

    NASA Astrophysics Data System (ADS)

    Romero-Wolf, Andrew; Gorham, P.; Booth, J.; Chen, P.; Duren, R. M.; Liewer, K.; Nam, J.; Saltzberg, D.; Schoorlemmer, H.; Wissel, S.; Zairfian, P.

    2014-01-01

    We present a concept for on-orbit radio detection of ultra-high energy cosmic rays (UHECRs) that has the potential to provide collection rates of ~100 events per year for energies above 10^20 eV. The synoptic wideband orbiting radio detector (SWORD) mission's high event statistics at these energies, combined with the pointing capabilities of a space-borne antenna array, could enable charged-particle astronomy. The detector concept is based on ANITA's successful detection of UHECRs, in which the geosynchrotron radio signal produced by the extensive air shower is reflected off the Earth's surface and detected in flight.

  16. Base-flow characteristics of streams in the Valley and Ridge, Blue Ridge, and Piedmont physiographic provinces of Virginia

    USGS Publications Warehouse

    Nelms, D.L.; Harlow, G.E.; Hayes, Donald C.

    1995-01-01

    Growth within the Valley and Ridge, Blue Ridge, and Piedmont Physiographic Provinces of Virginia has focused concern on the allocation of surface-water flow and increased demands on ground-water resources. The purpose of this report is to (1) describe the base-flow characteristics of streams, (2) identify regional differences in these flow characteristics, and (3) describe, if possible, the potential surface-water and ground-water yields of basins on the basis of the base-flow characteristics. Base-flow characteristics are presented for streams in the Valley and Ridge, Blue Ridge, and Piedmont Physiographic Provinces of Virginia. The provinces are separated into five regions: (1) Valley and Ridge, (2) Blue Ridge, (3) Piedmont/Blue Ridge transition, (4) Piedmont northern, and (5) Piedmont southern. Different flow statistics, representing streamflows composed predominantly of base flow, were determined for 217 continuous-record streamflow-gaging stations from historical mean daily discharge and for 192 partial-record streamflow-gaging stations by correlation of discharge measurements. The variability of base flow is represented by a duration ratio developed during this investigation. Effective recharge rates were also calculated. Median values for the different flow statistics range from 0.05 cubic foot per second per square mile for the 90-percent discharge on the streamflow-duration curve to 0.61 cubic foot per second per square mile for mean base flow. The 50-percent discharge on the streamflow-duration curve is an excellent estimator of mean base flow for the Piedmont/Blue Ridge transition and Piedmont southern regions, but tends to underestimate mean base flow for the remaining regions. The base-flow variability index ranges from 0.07 to 2.27, with a median value of 0.55. Effective recharge rates range from 0.07 to 33.07 inches per year, with a median value of 8.32 inches per year. 
Differences in the base-flow characteristics exist between regions. The median discharges for the Valley and Ridge, Blue Ridge, and Piedmont/Blue Ridge transition regions are higher than those for the Piedmont regions. Results from statistical analysis indicate that the regions can be ranked in terms of base-flow characteristics from highest to lowest as follows: (1) Piedmont/Blue Ridge transition, (2) Valley and Ridge and Blue Ridge, (3) Piedmont southern, and (4) Piedmont northern. The flow statistics are consistently higher and the values for base-flow variability are lower for basins within the Piedmont/Blue Ridge transition region relative to those from the other regions, whereas basins within the Piedmont northern region show the opposite pattern. The group rankings of the base-flow characteristics were used to designate the potential surface-water yield for the regions. In addition, an approach developed for this investigation assigns a rank for potential surface-water yield to a basin according to the quartiles in which the values of the base-flow characteristics fall. Both procedures indicate that the Valley and Ridge, Blue Ridge, and Piedmont/Blue Ridge transition regions have moderate-to-high potential surface-water yield and the Piedmont regions have low-to-moderate potential surface-water yield. To indicate potential ground-water yield from base-flow characteristics, aquifer properties for 51 streamflow-gaging stations with continuous records of streamflow data were determined by methods that use streamflow records and basin characteristics. Areal diffusivity ranges from 17,100 to 88,400 feet squared per day, with a median value of 38,400 feet squared per day. Areal transmissivity ranges from 63 to 830 feet squared per day, with a median value of 270 feet squared per day. 
Storage coefficients, which were estimated by dividing areal transmissivity by areal diffusivity, range from approximately 0.001 to 0.019 (dimensionless), with a median value of 0.007.

  17. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    PubMed

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid, yet on horizontal surfaces even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed three approaches - sigmoid fitting, independent statistical analysis, and dependent statistical analysis - which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data that are independent of user skill and operator subjectivity, which is also urgently needed for the evaluation of dynamic contact angle measurements. In this contribution we show that slightly modified procedures are also applicable to finding specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are measured dynamically by the sessile drop technique while the volume of the liquid is increased or decreased. The triple points, the time and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis because of the small distance covered and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawal of the liquid, are identifiable, which confirms the flatness and chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
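The sigmoid-fitting approach can be sketched as a logistic fit to a contact-angle series; the data and the functional form below are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a logistic (sigmoid) curve to a contact-angle-versus-time series and
# take the upper plateau as the specific angle. The series is synthetic,
# standing in for drop-shape-analysis output.
def sigmoid(t, lower, upper, t0, k):
    return lower + (upper - lower) / (1.0 + np.exp(-k * (t - t0)))

t = np.linspace(0, 10, 200)
rng = np.random.default_rng(2)
angle = sigmoid(t, 35.0, 62.0, 5.0, 1.5) + rng.normal(0, 0.3, t.size)

popt, _ = curve_fit(sigmoid, t, angle, p0=[30.0, 70.0, 5.0, 1.0])
lower, upper, t0, k = popt
print(f"plateau angle: {upper:.1f} deg")
```

Reading the specific angle off a fitted plateau, rather than off a hand-picked frame, is what removes the operator dependence the abstract emphasises.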

  18. Instantiation and registration of statistical shape models of the femur and pelvis using 3D ultrasound imaging.

    PubMed

    Barratt, Dean C; Chan, Carolyn S K; Edwards, Philip J; Penney, Graeme P; Slomczykowski, Mike; Carter, Timothy J; Hawkes, David J

    2008-06-01

    Statistical shape modelling potentially provides a powerful tool for generating patient-specific, 3D representations of bony anatomy for computer-aided orthopaedic surgery (CAOS) without the need for a preoperative CT scan. Furthermore, freehand 3D ultrasound (US) provides a non-invasive method for digitising bone surfaces in the operating theatre that enables a much greater region to be sampled compared with conventional direct-contact (i.e., pointer-based) digitisation techniques. In this paper, we describe how these approaches can be combined to simultaneously generate and register a patient-specific model of the femur and pelvis to the patient during surgery. In our implementation, a statistical deformation model (SDM) was constructed for the femur and pelvis by performing a principal component analysis on the B-spline control points that parameterise the freeform deformations required to non-rigidly register a training set of CT scans to a carefully segmented template CT scan. The segmented template bone surface, represented by a triangulated surface mesh, is instantiated and registered to a cloud of US-derived surface points using an iterative scheme in which the weights corresponding to the first five principal modes of variation of the SDM are optimised in addition to the rigid-body parameters. The accuracy of the method was evaluated using clinically realistic data obtained on three intact human cadavers (three whole pelves and six femurs). For each bone, a high-resolution CT scan and rigid-body registration transformation, calculated using bone-implanted fiducial markers, served as the gold standard bone geometry and registration transformation, respectively. After aligning the final instantiated model and CT-derived surfaces using the iterative closest point (ICP) algorithm, the average root-mean-square distance between the surfaces was 3.5 mm over the whole bone and 3.7 mm in the region of surgical interest. 
The corresponding distances after aligning the surfaces using the marker-based registration transformation were 4.6 and 4.5 mm, respectively. We conclude that despite limitations on the regions of bone accessible using US imaging, this technique has potential as a cost-effective and non-invasive method to enable surgical navigation during CAOS procedures, without the additional radiation dose associated with performing a preoperative CT scan or intraoperative fluoroscopic imaging. However, further development is required to investigate errors using error measures relevant to specific surgical procedures.
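The statistical-deformation-model construction and instantiation can be sketched with a plain PCA; the training matrix below is synthetic and stands in for the flattened B-spline control-point deformations.

```python
import numpy as np

# Build a toy statistical deformation model: PCA on stacked deformation
# vectors from a training set, then instantiate a new shape as
# mean + weighted principal modes. All numbers are invented.
rng = np.random.default_rng(4)
n_train, n_params = 20, 300            # e.g. flattened B-spline control points
modes_true = rng.normal(size=(3, n_params))
weights = rng.normal(size=(n_train, 3))
training = weights @ modes_true + rng.normal(scale=0.01, size=(n_train, n_params))

mean = training.mean(axis=0)
centered = training - mean
# PCA via SVD; rows of Vt are the principal modes of variation.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

k = 5                                  # first five modes, as in the paper
b = np.zeros(k)
b[0] = 2.0                             # hypothetical weight on mode 1
instance = mean + b @ Vt[:k]
print(instance.shape)  # -> (300,)
```

In the paper's scheme, the mode weights `b` are optimised jointly with the rigid-body pose so that the instantiated surface best fits the cloud of ultrasound-derived points.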

  19. Daytime sea fog retrieval based on GOCI data: a case study over the Yellow Sea.

    PubMed

    Yuan, Yibo; Qiu, Zhongfeng; Sun, Deyong; Wang, Shengqiang; Yue, Xiaoyuan

    2016-01-25

    In this paper, a new daytime sea fog detection algorithm has been developed using Geostationary Ocean Color Imager (GOCI) data. Spectral analysis revealed differences in spectral characteristics over different underlying surfaces, including land, sea, middle/high level clouds, stratus clouds and sea fog. Statistical analysis showed that the Rrc(412 nm) (Rayleigh-corrected reflectance) of sea fog pixels is approximately 0.1-0.6. Similarly, various band combinations could be used to separate the different surfaces. Therefore, three indices (SLDI, MCDI and BSI) were defined to discriminate land/sea, middle/high level clouds, and fog/stratus clouds, respectively, from which it was generally straightforward to extract fog pixels. The remote sensing algorithm was verified using coastal sounding data, which demonstrated its ability to detect sea fog. The algorithm was then used to monitor an 8-hour sea fog event, and the results were consistent with observational data from buoys deployed near the Sheyang coast (121°E, 34°N). The goal of this study was to establish a daytime sea fog detection algorithm based on GOCI data, which shows promise for detecting fog separately from stratus.
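The index-thresholding decision logic can be sketched as below; the index names and the Rrc(412 nm) range 0.1-0.6 follow the abstract, while the threshold values and the ordering of the tests are assumptions for illustration.

```python
# Toy per-pixel decision sketch of the index-thresholding idea: combine
# the three indices to label land, middle/high cloud, stratus, or sea fog.
# Threshold values (0.5 everywhere) are hypothetical placeholders.
def classify(rrc412, sldi, mcdi, bsi,
             sldi_land=0.5, mcdi_cloud=0.5, bsi_stratus=0.5):
    if sldi > sldi_land:
        return "land"
    if mcdi > mcdi_cloud:
        return "middle/high cloud"
    if not (0.1 <= rrc412 <= 0.6):     # fog pixels fall in this Rrc range
        return "clear sea"
    return "sea fog" if bsi <= bsi_stratus else "stratus"

print(classify(rrc412=0.35, sldi=0.1, mcdi=0.2, bsi=0.3))  # -> sea fog
```

Applying such a cascade pixel-by-pixel to a GOCI scene yields a fog mask that can then be compared against sounding and buoy observations.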

  20. Retention Forces between Titanium and Zirconia Components of Two-Part Implant Abutments with Different Techniques of Surface Modification.

    PubMed

    von Maltzahn, Nadine Freifrau; Holstermann, Jan; Kohorst, Philipp

    2016-08-01

    The adhesive connection between the titanium base and the zirconia coping of two-part abutments may be responsible for their failure rate; high mechanical stability between the two components is essential for long-term success. The aim of the present in vitro study was to evaluate the influence of different surface modification techniques and resin-based luting agents on the retention forces between the titanium and zirconia components of two-part implant abutments. A total of 120 abutments, each with a titanium base bonded to a zirconia coping, were investigated. Two resin-based luting agents (Panavia F 2.0 and RelyX Unicem) and six surface modifications were used to fix these components, resulting in 12 test groups (n = 10). The surfaces of the test specimens were mechanically pretreated with aluminium oxide blasting in combination with the application of two surface-activating primers (Alloy Primer, Clearfil Ceramic Primer) or with tribological conditioning (Rocatec), respectively. All specimens underwent 10,000 thermal cycles between 5°C and 55°C in a moist environment. A pull-off test was then conducted to determine the retention forces between the titanium and zirconia components, and statistical analysis was performed (two-way ANOVA). Finally, fracture surfaces were analyzed by light and scanning electron microscopy. No significant differences were found between Panavia F 2.0 and RelyX Unicem; however, the retention forces were significantly influenced by the surface modification technique used (p < 0.001). For both luting agents, the highest retention forces were found when the adhesion surfaces of both the titanium bases and the zirconia copings were pretreated with aluminium oxide blasting and the application of Clearfil Ceramic Primer. Surface modification techniques crucially influence the retention forces between titanium and zirconia components in two-part implant abutments. All adhesion surfaces should be pretreated by sandblasting. 
Moreover, a phosphate-based primer serves to enhance long-term retention of the components. © 2015 Wiley Periodicals, Inc.

  1. Repair bond strength in aged methacrylate- and silorane-based composites.

    PubMed

    Bacchi, Atais; Consani, Rafael Leonardo; Sinhoreti, Mario Alexandre; Feitosa, Victor Pinheiro; Cavalcante, Larissa Maria; Pfeifer, Carmem Silva; Schneider, Luis Felipe

    2013-10-01

To evaluate the tensile bond strength at repaired interfaces of aged dental composites, either dimethacrylate- or silorane-based, when subjected to different surface treatments. The composites used were Filtek P60 (methacrylate-based, 3M ESPE) and Filtek P90 (silorane-based, 3M ESPE), of which 50 slabs were stored for 6 months at 37°C. The adhesion surface was abraded with 600-grit silicon carbide paper and the slabs were repaired with the respective composite, according to the following surface treatment protocols: G1: no treatment; G2: adhesive application; G3: silane + adhesive; G4: sandblasting (Al2O3) + adhesive; G5: sandblasting (Al2O3) + silane + adhesive. After 24-h storage in distilled water at 37°C, tensile bond strength (TBS) was determined in a universal testing machine (Instron 4411) at a crosshead speed of 0.5 mm/min. The original data were submitted to two-way ANOVA and Tukey's test (α = 5%). The methacrylate-based composite presented a statistically significantly higher repair potential than did the silorane-based resin (p = 0.0002). Of the surface treatments for the silorane-based composite, aluminum-oxide air abrasion plus adhesive (18.5 ± 3.3 MPa) provided higher bond strength than adhesive application alone or the control group without surface treatment. For Filtek P60, the control without treatment presented lower repair strength than all other groups with surface treatments, which were statistically similar to each other. The interaction between the factors resin composite and surface treatment was significant (p = 0.002). For aged silorane-based materials, repairs were considered successful after sandblasting (Al2O3) and adhesive application. For the methacrylate resin, repair was successful with all surface treatments tested.

  2. Comparative evaluation of the effect of denture cleansers on the surface topography of denture base materials: An in-vitro study.

    PubMed

    Jeyapalan, Karthigeyan; Kumar, Jaya Krishna; Azhagarasan, N S

    2015-08-01

The aim was to evaluate and compare the effects of three chemically different, commercially available denture cleansing agents on the surface topography of two denture base materials. Three chemically different denture cleansers (sodium perborate, 1% sodium hypochlorite, 0.2% chlorhexidine gluconate) were used on two denture base materials (acrylic resin and cobalt-chromium alloy), and the changes were evaluated at three time intervals (56 h, 120 h, 240 h). Changes from baseline in surface roughness were recorded quantitatively using a surface profilometer, and qualitative surface analyses for all groups were performed by scanning electron microscopy (SEM). The values obtained were analyzed statistically using one-way ANOVA and the paired t-test. None of the three denture cleanser solutions produced statistically significant surface changes on the acrylic resin portions at 56 h, 120 h, or 240 h of immersion. However, on the alloy portions, changes were significant at the end of 120 h and 240 h. Of the three denture cleansers used in the study, none produced significant changes on the two denture base materials for short immersion durations, whereas changes appeared as the immersion periods increased.

  3. Impact of Early and Late Visual Deprivation on the Structure of the Corpus Callosum: A Study Combining Thickness Profile with Surface Tensor-Based Morphometry.

    PubMed

    Shi, Jie; Collignon, Olivier; Xu, Liang; Wang, Gang; Kang, Yue; Leporé, Franco; Lao, Yi; Joshi, Anand A; Leporé, Natasha; Wang, Yalin

    2015-07-01

Blindness represents a unique model to study how visual experience may shape the development of brain organization. Exploring how the structure of the corpus callosum (CC) reorganizes following visual deprivation is of particular interest due to its important functional implication in vision (e.g., via the splenium of the CC). Moreover, comparing early versus late visually deprived individuals has the potential to unravel the existence of a sensitive period for reshaping the CC structure. Here, we develop a novel framework to capture a complete set of shape differences in the CC between congenitally blind (CB), late blind (LB) and sighted control (SC) groups. The CCs were manually segmented from T1-weighted brain MRI and modeled by 3D tetrahedral meshes. We statistically compared the combination of local area and thickness at each point between subject groups. Differences in area are found using surface tensor-based morphometry; thickness is estimated by tracing the streamlines in the volumetric harmonic field. Group differences were assessed on this combined measure using Hotelling's T² test. Interestingly, we observed that the total callosal volume did not differ between the groups. However, our fine-grained analysis reveals significant differences mostly localized around the splenium areas between both blind groups and the sighted group (general effects of blindness) and, importantly, specific dissimilarities between the LB and CB groups, illustrating the existence of a sensitive period for reorganization. The new multivariate statistics also gave better effect sizes for detecting morphometric differences, relative to other statistics. They may boost statistical power for CC morphometric analyses.
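    The pointwise multivariate comparison in this record relies on a two-sample Hotelling's T² test on a combined (area, thickness) measure. The sketch below illustrates that test at a single surface point; it is a minimal illustration rather than the authors' pipeline, and the group sizes and (area, thickness) values are entirely hypothetical.

```python
import numpy as np
from scipy import stats

def hotelling_t2(x, y):
    """Two-sample Hotelling's T^2 test for a multivariate group difference.

    x, y : (n_i, p) arrays of p-dimensional observations per group.
    Returns the T^2 statistic and an F-based p-value.
    """
    n1, p = x.shape
    n2, _ = y.shape
    d = x.mean(axis=0) - y.mean(axis=0)
    # Pooled covariance estimate across the two groups
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False)
                + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s_pooled, d)
    # T^2 maps to an F distribution with (p, n1 + n2 - p - 1) dof
    f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_value = stats.f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value

rng = np.random.default_rng(0)
# Hypothetical (area, thickness) pairs at one mesh point, per group
blind = rng.normal([1.0, 2.0], 0.2, size=(20, 2))
sighted = rng.normal([1.3, 2.4], 0.2, size=(22, 2))
t2, p = hotelling_t2(blind, sighted)
```

    In the study this test would be repeated at every mesh point, followed by a multiple-comparisons correction.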

  4. Impact of Early and Late Visual Deprivation on the Structure of the Corpus Callosum: A Study Combining Thickness Profile with Surface Tensor-Based Morphometry

    PubMed Central

    Shi, Jie; Collignon, Olivier; Xu, Liang; Wang, Gang; Kang, Yue; Leporé, Franco; Lao, Yi; Joshi, Anand A.

    2015-01-01

    Blindness represents a unique model to study how visual experience may shape the development of brain organization. Exploring how the structure of the corpus callosum (CC) reorganizes following visual deprivation is of particular interest due to its important functional implication in vision (e.g., via the splenium of the CC). Moreover, comparing early versus late visually deprived individuals has the potential to unravel the existence of a sensitive period for reshaping the CC structure. Here, we develop a novel framework to capture a complete set of shape differences in the CC between congenitally blind (CB), late blind (LB) and sighted control (SC) groups. The CCs were manually segmented from T1-weighted brain MRI and modeled by 3D tetrahedral meshes. We statistically compared the combination of local area and thickness at each point between subject groups. Differences in area are found using surface tensor-based morphometry; thickness is estimated by tracing the streamlines in the volumetric harmonic field. Group differences were assessed on this combined measure using Hotelling's T² test. Interestingly, we observed that the total callosal volume did not differ between the groups. However, our fine-grained analysis reveals significant differences mostly localized around the splenium areas between both blind groups and the sighted group (general effects of blindness) and, importantly, specific dissimilarities between the LB and CB groups, illustrating the existence of a sensitive period for reorganization. The new multivariate statistics also gave better effect sizes for detecting morphometric differences, relative to other statistics. They may boost statistical power for CC morphometric analyses. PMID:25649876

  5. On the determination of certain astronomical, selenodesic, and gravitational parameters of the moon

    NASA Technical Reports Server (NTRS)

    Aleksashin, Y. P.; Ziman, Y. L.; Isavnina, I. V.; Krasikov, V. A.; Nepoklonov, B. V.; Rodionov, B. N.; Tischenko, A. P.

    1974-01-01

    A method was examined for joint construction of a selenocentric fundamental system which can be realized by a coordinate catalog of reference contour points uniformly positioned over the entire lunar surface, and determination of the parameters characterizing the gravitational field, rotation, and orbital motion of the moon. Characteristic of the problem formulation is the introduction of a new complex of iconometric measurements which can be made using pictures obtained from an artificial lunar satellite. The proposed method can be used to solve similar problems on any other planet for which surface images can be obtained from a spacecraft. Characteristic of the proposed technique for solving the problem is the joint statistical analysis of all forms of measurements: orbital iconometric, earth-based trajectory, and also a priori information on the parameters in question which is known from earth-based astronomical studies.

  6. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general, there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process: defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and consequently for some geographical regions.
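    The unit-merging step described above (testing whether Vs30 distributions differ between preliminary geological units) can be sketched with standard nonparametric tests. The sample sizes and lognormal Vs30 values below are hypothetical, and the paper does not name the specific tests it used, so the Kruskal-Wallis and Mann-Whitney choices here are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical lognormal Vs30 samples (m/s) for three preliminary geological units
vs30 = {
    "igneous_metamorphic": rng.lognormal(np.log(700), 0.25, 60),
    "neogene_pleistocene": rng.lognormal(np.log(400), 0.25, 50),
    "holocene":            rng.lognormal(np.log(250), 0.25, 51),
}

# Kruskal-Wallis: do Vs30 distributions differ across the units at all?
h_stat, p_all = stats.kruskal(*vs30.values())

# Pairwise Mann-Whitney tests guide which units could be merged:
# a large p-value suggests two units have indistinguishable Vs30 distributions
pairs = {}
names = list(vs30)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        _, p = stats.mannwhitneyu(vs30[names[i]], vs30[names[j]])
        pairs[(names[i], names[j])] = p
```

    Units whose pairwise p-value exceeds the chosen significance level would be merged into one map unit.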

  7. Antimicrobial and anti-adherence activity of various combinations of coffee-chicory solutions on Streptococcus mutans: An in-vitro study

    PubMed Central

    Sharma, Rama; Reddy, Vamsi Krishna L; Prashant, GM; Ojha, Vivek; Kumar, Naveen PG

    2014-01-01

    Context: Several studies have demonstrated the activity of natural plants against dental biofilm and caries development, but few studies on the antimicrobial activity of coffee-based solutions were found in the literature. Furthermore, no study was available on the antimicrobial effect of coffee solutions containing different percentages of chicory. Aims: To evaluate the antimicrobial activity of different combinations of coffee-chicory solutions and their anti-adherence effect on Streptococcus mutans on a glass surface. Materials and Methods: Test solutions were prepared. For antimicrobial activity testing, tubes containing test solution and culture medium were inoculated with a suspension of S. mutans, followed by plating on Brain Heart Infusion (BHI) agar. S. mutans adherence to glass in the presence of the different test solutions was also tested. The number of adhered bacteria (CFU/mL) was determined by the plating method. Statistical Analysis: Statistical significance was assessed using one-way ANOVA followed by Tukey's post hoc test; a P value < 0.05 was considered statistically significant. Results: Pure chicory showed a significantly lower bacterial count than all other groups. Groups IV and V showed a significant reduction in bacterial counts over the period of 4 h. Regarding the anti-adherence effect, groups I-IV showed significantly less adherence of bacteria to the glass surface. Conclusions: Chicory exerted an antibacterial effect against S. mutans, while coffee significantly reduced the adherence of S. mutans to the glass surface. PMID:25328299
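    The statistical analysis in this record (one-way ANOVA followed by Tukey's post hoc test) can be sketched as follows. The log CFU/mL values and group labels are hypothetical, and the `tukey_hsd` helper assumes SciPy ≥ 1.8.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical log10(CFU/mL) counts for three test solutions (n = 10 each)
pure_chicory = rng.normal(4.0, 0.3, 10)
coffee_chicory_mix = rng.normal(5.2, 0.3, 10)
pure_coffee = rng.normal(5.5, 0.3, 10)

# One-way ANOVA: is there any difference among group means?
f_stat, p_anova = stats.f_oneway(pure_chicory, coffee_chicory_mix, pure_coffee)

# Tukey's HSD post-hoc test: which pairs of groups differ? (SciPy >= 1.8)
tukey = stats.tukey_hsd(pure_chicory, coffee_chicory_mix, pure_coffee)
```

    `tukey.pvalue[i, j]` then gives the familywise-adjusted p-value for each pair of groups.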

  8. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading purposes, so an automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines and having distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. The evaluation parameters, both supervised and unsupervised, are also discussed. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Hydrochemical characteristics and water quality assessment of surface water and groundwater in Songnen plain, Northeast China.

    PubMed

    Zhang, Bing; Song, Xianfang; Zhang, Yinghua; Han, Dongmei; Tang, Changyuan; Yu, Yilei; Ma, Ying

    2012-05-15

    Water quality is a critical factor influencing human health and the quantity and quality of grain production in semi-humid and semi-arid areas. The Songnen plain is one of the major grain-producing bases in China, as well as one of the three major distribution regions of soda saline-alkali soil in the world. To assess water quality, surface water and groundwater were sampled and analyzed by fuzzy membership analysis and multivariate statistics. By fuzzy membership analysis, the surface water samples fell into classes I, IV and V, while the groundwater samples were grouped into classes I, II, III and V. The water samples were grouped into four categories according to the USDA irrigation water quality assessment diagrams, with most samples falling in categories C1-S1, C2-S2 and C3-S3. Three groups were generated from hierarchical cluster analysis, and four principal components were extracted from principal component analysis. The indicators selected for water quality assessment from the principal component analysis were Na, HCO3, NO3, Fe, Mn and EC. We conclude that surface water and shallow groundwater are suitable for irrigation, while the reservoir and the deep groundwater upstream are the resources for drinking. Water intended for drinking should be treated to remove the naturally occurring Fe and Mn ions, and control of the sodium and salinity hazard is required for irrigation. Integrated management of surface water and groundwater for drinking and irrigation is needed to solve these water issues. Copyright © 2012 Elsevier Ltd. All rights reserved.
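    The principal component step in this record can be sketched as a standardize-then-SVD computation. The sample matrix below is hypothetical (a correlation between Na and EC is injected purely for illustration), so the loadings do not correspond to the study's actual data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical water-quality matrix: rows = samples, columns = indicators
indicators = ["Na", "HCO3", "NO3", "Fe", "Mn", "EC"]
X = rng.normal(size=(40, len(indicators)))
X[:, 5] = 0.8 * X[:, 0] + 0.2 * rng.normal(size=40)  # EC tracks Na (salinity)

# Standardize each indicator, then extract principal components via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per component
loadings = Vt                     # rows: components, columns: indicators
scores = Z @ Vt.T                 # sample scores on each component
```

    Indicators with large absolute loadings on the leading components (here, the correlated Na/EC pair) are the ones retained for the assessment.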

  10. Hydrology team

    NASA Technical Reports Server (NTRS)

    Ragan, R.

    1982-01-01

    General problems faced by hydrologists when using historical records, real time data, statistical analysis, and system simulation in providing quantitative information on the temporal and spatial distribution of water are related to the limitations of these data. Major problem areas requiring multispectral imaging-based research to improve hydrology models involve: evapotranspiration rates and soil moisture dynamics for large areas; the three dimensional characteristics of bodies of water; flooding in wetlands; snow water equivalents; runoff and sediment yield from ungaged watersheds; storm rainfall; fluorescence and polarization of water and its contained substances; discriminating between sediment and chlorophyll in water; role of barrier island dynamics in coastal zone processes; the relationship between remotely measured surface roughness and hydraulic roughness of land surfaces and stream networks; and modeling the runoff process.

  11. Statistical analysis of the effect of temperature and inlet humidities on the parameters of a semiempirical model of the internal resistance of a polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.

  12. [Simulation and data analysis of stereological modeling based on virtual slices].

    PubMed

    Wang, Hao; Shen, Hong; Bai, Xiao-yan

    2008-05-01

    To establish a computer-assisted stereological model simulating the process of slice sectioning, and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as Win32 software based on MFC, using Microsoft Visual Studio as the IDE, to simulate the infinite process of sectioning and to analyze the data derived from the model. The linearity of the model's fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high pass rates (>94.5% and 92%) in the homogeneity and independence tests. The data on density, shape and size of the sections were tested to conform to a normal distribution. The output of the model and that from the image analysis system showed statistical correlation and consistency. The algorithm described here can be used for evaluating the stereological parameters of the structure of tissue slices.

  13. Discrimination surfaces with application to region-specific brain asymmetry analysis.

    PubMed

    Martos, Gabriel; de Carvalho, Miguel

    2018-05-20

    Discrimination surfaces are here introduced as a diagnostic tool for localizing brain regions where discrimination between diseased and nondiseased participants is higher. To estimate discrimination surfaces, we introduce a Mann-Whitney type of statistic for random fields and present large-sample results characterizing its asymptotic behavior. Simulation results demonstrate that our estimator accurately recovers the true surface and corresponding interval of maximal discrimination. The empirical analysis suggests that in the anterior region of the brain, schizophrenic patients tend to present lower local asymmetry scores in comparison with participants in the control group. Copyright © 2018 John Wiley & Sons, Ltd.
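    A pointwise Mann-Whitney statistic of the kind used for discrimination surfaces can be sketched on a one-dimensional grid of locations; the published method works on random fields with large-sample theory, so this is only a toy illustration with hypothetical asymmetry scores. Scaling the U statistic by n1*n2 gives an AUC-like discrimination score at each location (0.5 = no discrimination).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_control, n_patient, n_points = 25, 25, 100

# Hypothetical asymmetry scores on a 1-D grid of brain locations;
# patients differ from controls only on the first 30 points ("anterior" region)
control = rng.normal(0.0, 1.0, size=(n_control, n_points))
patient = rng.normal(0.0, 1.0, size=(n_patient, n_points))
patient[:, :30] -= 1.2  # lower local asymmetry scores in patients

# Pointwise Mann-Whitney U; U / (n1*n2) estimates P(control > patient)
auc = np.empty(n_points)
for k in range(n_points):
    u, _ = stats.mannwhitneyu(control[:, k], patient[:, k])
    auc[k] = u / (n_control * n_patient)

peak_location = np.argmax(auc)  # location of maximal discrimination
```

    The discrimination surface is the map of these scores; regions where they depart from 0.5 are where the groups are most separable.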

  14. Virtual reconstruction of glenoid bone defects using a statistical shape model.

    PubMed

    Plessers, Katrien; Vanden Berghe, Peter; Van Dijck, Christophe; Wirix-Speetjens, Roel; Debeer, Philippe; Jonkers, Ilse; Vander Sloten, Jos

    2018-01-01

    Description of the native shape of a glenoid helps surgeons to preoperatively plan the position of a shoulder implant. A statistical shape model (SSM) can be used to virtually reconstruct a glenoid bone defect and to predict the inclination, version, and center position of the native glenoid. An SSM-based reconstruction method has already been developed for acetabular bone reconstruction. The goal of this study was to evaluate the SSM-based method for the reconstruction of glenoid bone defects and the prediction of native anatomic parameters. First, an SSM was created on the basis of 66 healthy scapulae. Then, artificial bone defects were created in all scapulae and reconstructed using the SSM-based reconstruction method. For each bone defect, the reconstructed surface was compared with the original surface. Furthermore, the inclination, version, and glenoid center point of the reconstructed surface were compared with the original parameters of each scapula. For small glenoid bone defects, the healthy surface of the glenoid was reconstructed with a root mean square error of 1.2 ± 0.4 mm. Inclination, version, and glenoid center point were predicted with an accuracy of 2.4° ± 2.1°, 2.9° ± 2.2°, and 1.8 ± 0.8 mm, respectively. The SSM-based reconstruction method is able to accurately reconstruct the native glenoid surface and to predict the native anatomic parameters. Based on this outcome, statistical shape modeling can be considered a successful technique for use in the preoperative planning of shoulder arthroplasty. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
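    The SSM-based reconstruction idea (fit the model's modes of variation to the intact part of the surface, then predict the defect region) can be sketched with plain PCA and least squares. All shapes, mode counts and the defect mask below are synthetic assumptions, not the study's data or software.

```python
import numpy as np

rng = np.random.default_rng(5)
n_train, n_coords, n_modes = 66, 90, 5  # e.g. 30 3-D landmarks, flattened

# Hypothetical training set: mean shape plus a few orthonormal variation modes
mean_shape = rng.normal(size=n_coords)
true_modes = np.linalg.qr(rng.normal(size=(n_coords, n_modes)))[0]
weights = rng.normal(size=(n_train, n_modes)) * np.array([3, 2, 1.5, 1, 0.5])
train = mean_shape + weights @ true_modes.T + 0.05 * rng.normal(size=(n_train, n_coords))

# Build the SSM: principal modes of the centered training shapes
mu = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mu, full_matrices=False)
modes = Vt[:n_modes].T                    # (n_coords, n_modes)

# Simulate a "bone defect": the first 20 coordinates are missing
target = mean_shape + np.array([2, -1, 1, 0.5, -0.5]) @ true_modes.T
observed = np.arange(n_coords) >= 20

# Fit mode weights to the observed part only (least squares), then
# predict the full surface, including the missing region
b, *_ = np.linalg.lstsq(modes[observed], (target - mu)[observed], rcond=None)
reconstruction = mu + modes @ b

rmse_missing = np.sqrt(np.mean((reconstruction - target)[~observed] ** 2))
```

    The root-mean-square error on the defect region is the same accuracy measure the study reports for its glenoid reconstructions.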

  15. River-spring connectivity and hydrogeochemical interactions in a shallow fractured rock formation. The case study of Fuensanta river valley (Southern Spain)

    NASA Astrophysics Data System (ADS)

    Barberá, J. A.; Andreo, B.

    2017-04-01

In upland catchments, the hydrology and hydrochemistry of streams are largely influenced by groundwater inflows, at both regional and local scale. However, the reverse conditions (groundwater dynamics conditioned by surface water interferences), although less often described, may also occur. In this research, the local river-spring connectivity and induced hydrogeochemical interactions in intensely folded, fractured and layered Cretaceous marls and marly limestones (Fuensanta river valley, S Spain) are discussed based on field observations, tracer tests, and hydrodynamic and hydrochemical data. The differential flow measurements and tracing experiments performed in the Fuensanta river permitted us to quantify the surface water losses and to verify the river's direct hydraulic connection with the Fuensanta spring. The numerical simulations of tracer breakthrough curves suggest the existence of a groundwater flow system through well-connected master and tributary fractures, with fast and multi-source flow components. Furthermore, the multivariate statistical analysis conducted using chemical data from the sampled waters, the geochemical study of water-rock interactions, and the proposed water mixing approach allowed the spatial characterization of the chemistry of the springs and river/stream waters draining low-permeability Cretaceous formations. Results corroborated that the mixing of surface waters, as well as calcite dissolution and CO2 dissolution/exsolution, are the main geochemical processes constraining Fuensanta spring hydrochemistry. The estimated contribution of the tributary surface waters to the spring flow during the research period was approximately 26-53% (Fuensanta river) and 47-74% (Convento stream), with the first component predominating during high flow and the second during the dry season. The identification of secondary geochemical processes (dolomite and gypsum dissolution and dedolomitization) in Fuensanta spring waters provides evidence of the induced hydrogeochemical changes resulting from the allogenic recharge. This research highlights the usefulness of an integrated approach based on river and spring flow examination, dye tracing interpretation, and regression and multivariate statistical analysis using hydrochemical data for surface water-groundwater interaction assessment in fractured complex environments worldwide, whose implementation becomes critical for an appropriate groundwater policy.

  16. Surface roughness of glass ionomer cements indicated for uncooperative patients according to surface protection treatment

    PubMed Central

    Pacifici, Edoardo; Bossù, Maurizio; Giovannetti, Agostino; La Torre, Giuseppe; Guerra, Fabrizio; Polimeni, Antonella

    2013-01-01

    Summary Background Even today, use of Glass Ionomer Cements (GIC) as a restorative material is indicated for uncooperative patients. Aim The study aimed at estimating the surface roughness of different GICs with and without their proprietary surface coatings and at observing the interfaces between cement and coating through SEM. Materials and methods Forty specimens were obtained and divided into 4 groups: Fuji IX (IX), Fuji IX/G-Coat Plus (IXC), Vitremer (V), Vitremer/Finishing Gloss (VFG). Samples were obtained using silicone moulds to simulate class I restorations. All specimens were processed for profilometric evaluation. The statistical differences in surface roughness between groups were assessed using One-Way Analysis of Variance (One-Way ANOVA) (p<0.05). The Two-Way Analysis of Variance (Two-Way ANOVA) was used to evaluate the influence of two factors: restoration material and presence of coating. Coated restoration specimens (IXC and VFG) were sectioned perpendicular to the restoration surface and processed for SEM evaluation. Results No statistical differences in roughness were noticed between groups or factors. On microscopic observation, the interfaces between restoration material and coating were better for group IXC than for group VFG. Conclusions When specimens are obtained simulating normal clinical procedures, the presence of surface protection does not significantly improve the surface roughness of GICs. PMID:24611090

  17. Surface roughness of glass ionomer cements indicated for uncooperative patients according to surface protection treatment.

    PubMed

    Pacifici, Edoardo; Bossù, Maurizio; Giovannetti, Agostino; La Torre, Giuseppe; Guerra, Fabrizio; Polimeni, Antonella

    2013-01-01

    Even today, use of Glass Ionomer Cements (GIC) as a restorative material is indicated for uncooperative patients. The study aimed at estimating the surface roughness of different GICs with and without their proprietary surface coatings and at observing the interfaces between cement and coating through SEM. Forty specimens were obtained and divided into 4 groups: Fuji IX (IX), Fuji IX/G-Coat Plus (IXC), Vitremer (V), Vitremer/Finishing Gloss (VFG). Samples were obtained using silicone moulds to simulate class I restorations. All specimens were processed for profilometric evaluation. The statistical differences in surface roughness between groups were assessed using One-Way Analysis of Variance (One-Way ANOVA) (p<0.05). The Two-Way Analysis of Variance (Two-Way ANOVA) was used to evaluate the influence of two factors: restoration material and presence of coating. Coated restoration specimens (IXC and VFG) were sectioned perpendicular to the restoration surface and processed for SEM evaluation. No statistical differences in roughness were noticed between groups or factors. On microscopic observation, the interfaces between restoration material and coating were better for group IXC than for group VFG. When specimens are obtained simulating normal clinical procedures, the presence of surface protection does not significantly improve the surface roughness of GICs.

  18. Response surface methodology as an approach to determine optimal activities of lipase entrapped in sol-gel matrix using different vegetable oils.

    PubMed

    Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M

    2008-03-01

    The conditions for maximization of the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were assessed using a central composite experimental design comprising 13 assays, followed by response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects of pH and temperature, as well as the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for maximizing the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
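    The central composite design and response surface step in this record can be sketched as a quadratic least-squares fit in coded units, followed by solving for the stationary point. The 13-run design matches the abstract, but the response values and the optimum location below are hypothetical.

```python
import numpy as np

# Central composite design (coded units) for two factors, e.g. pH and temperature:
# 2^2 factorial + axial points (alpha = sqrt(2)) + center replicates = 13 runs
a = np.sqrt(2.0)
design = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
    [-a, 0], [a, 0], [0, -a], [0, a],        # axial points
    [0, 0], [0, 0], [0, 0], [0, 0], [0, 0],  # center replicates
])

# Hypothetical responses (% hydrolysis) from a true quadratic with its
# optimum at (0.5, -0.3) in coded units, plus measurement noise
rng = np.random.default_rng(6)
x1, x2 = design[:, 0], design[:, 1]
y = 80 - 5 * (x1 - 0.5) ** 2 - 4 * (x2 + 0.3) ** 2 + rng.normal(0, 0.3, len(design))

# Fit the full second-order model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve gradient = 0
b0, b1, b2, b11, b22, b12 = beta
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
```

    Negative quadratic coefficients confirm the stationary point is a maximum, so `opt` estimates the activity-maximizing (pH, temperature) combination in coded units.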

  19. Evaluation of Surface Roughness and Tensile Strength of Base Metal Alloys Used for Crown and Bridge on Recasting (Recycling)

    PubMed Central

    Hashmi, Syed W.; Rao, Yogesh; Garg, Akanksha

    2015-01-01

    Background Dental casting alloys play a prominent role in the restoration of the partial dentition. Casting alloys have to survive long term in the mouth and must therefore combine suitable structure, wear resistance and biologic compatibility. According to the ADA system, casting alloys are divided into three groups (wt%): high noble, noble, and predominantly base metal alloys. Aim To evaluate mechanical properties, namely tensile strength and surface roughness, of new and recast base metal (nickel-chromium) alloys. Materials and Methods Recasting of the base metal alloys derived from sprue and button was done to make them reusable. A total of 200 test specimens were fabricated using a specially fabricated metal jig and divided into two groups: 100 specimens of new alloy and 100 specimens of recast alloy, which were tested for tensile strength on a universal testing machine and for surface roughness on a surface roughness tester. Results Tensile strength of the new alloy showed no statistically significant difference (p-value>0.05) from the recast alloy, whereas the new alloy showed a statistically significant difference in surface roughness (maximum and average surface roughness) (p-value<0.01) compared with the recast alloy. Conclusion Within the limitations of the study, it is concluded that tensile strength is not affected by recasting of nickel-chromium alloy, whereas surface roughness increases markedly. PMID:26393194

  20. Estimation of potential impacts and natural resource damages of oil.

    PubMed

    McCay, Deborah French; Rowe, Jill Jennings; Whittier, Nicole; Sankaranarayanan, Sankar; Etkin, Dagmar Schmidt

    2004-02-27

    Methods were developed to estimate the potential impacts and natural resource damages resulting from oil spills using probabilistic modeling techniques. The oil fates model uses wind data, current data, and transport and weathering algorithms to calculate mass balance of fuel components in various environmental compartments (water surface, shoreline, water column, atmosphere, sediments, etc.), oil pathway over time (trajectory), surface distribution, shoreline oiling, and concentrations of the fuel components in water and sediments. Exposure of aquatic habitats and organisms to whole oil and toxic components is estimated in the biological model, followed by estimation of resulting acute mortality and ecological losses. Natural resource damages are based on estimated costs to restore equivalent resources and/or ecological services, using Habitat Equivalency Analysis (HEA) and Resource Equivalency Analysis (REA) methods. Oil spill modeling was performed for two spill sites in central San Francisco Bay, three spill sizes (20th, 50th, and 95th percentile volumes from tankers and larger freight vessels, based on an analysis of likely spill volumes given a spill has occurred) and four oil types (gasoline, diesel, heavy fuel oil, and crude oil). The scenarios were run in stochastic mode to determine the frequency distribution, mean and standard deviation of fates, impacts, and damages. This work is significant as it demonstrates a statistically quantifiable method for estimating potential impacts and financial consequences that may be used in ecological risk assessment and cost-benefit analyses. The statistically-defined spill volumes and consequences provide an objective measure of the magnitude, range and variability of impacts to wildlife, aquatic organisms and shorelines for potential spills of four oil/fuel types, each having distinct environmental fates and effects.

  1. Evaluation of mechanical and thermal properties of commonly used denture base resins.

    PubMed

    Phoenix, Rodney D; Mansueto, Michael A; Ackerman, Neal A; Jones, Robert E

    2004-03-01

    The purpose of this investigation was to evaluate and compare the mechanical and thermal properties of 6 commonly used polymethyl methacrylate denture base resins. Sorption, solubility, color stability, adaptation, flexural stiffness, and hardness were assessed to determine compliance with ADA Specification No. 12. Thermal assessments were performed using differential scanning calorimetry and dynamic mechanical analysis. Results were assessed using statistical and observational analyses. All materials satisfied ADA requirements for sorption, solubility, and color stability. Adaptation testing indicated that microwave-activated systems provided better adaptation to associated casts than conventional heat-activated resins. According to flexural testing results, microwaveable resins were relatively stiff, while rubber-modified resins were more flexible. Differential scanning calorimetry indicated that microwave-activated systems were more completely polymerized than conventional heat-activated materials. The microwaveable resins displayed better adaptation, greater stiffness, and greater surface hardness than other denture base resins included in this investigation. Elastomeric toughening agents yielded decreased stiffness, decreased surface hardness, and decreased glass transition temperatures.

  2. Random heteropolymers preserve protein function in foreign environments

    NASA Astrophysics Data System (ADS)

    Panganiban, Brian; Qiao, Baofu; Jiang, Tao; DelRe, Christopher; Obadia, Mona M.; Nguyen, Trung Dac; Smith, Anton A. A.; Hall, Aaron; Sit, Izaac; Crosby, Marquise G.; Dennis, Patrick B.; Drockenmuller, Eric; Olvera de la Cruz, Monica; Xu, Ting

    2018-03-01

    The successful incorporation of active proteins into synthetic polymers could lead to a new class of materials with functions found only in living systems. However, proteins rarely function under the conditions suitable for polymer processing. On the basis of an analysis of trends in protein sequences and characteristic chemical patterns on protein surfaces, we designed four-monomer random heteropolymers to mimic intrinsically disordered proteins for protein solubilization and stabilization in non-native environments. The heteropolymers, with optimized composition and statistical monomer distribution, enable cell-free synthesis of membrane proteins with proper protein folding for transport and enzyme-containing plastics for toxin bioremediation. Controlling the statistical monomer distribution in a heteropolymer, rather than the specific monomer sequence, affords a new strategy to interface with biological systems for protein-based biomaterials.

  3. General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies

    PubMed Central

    Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong

    2013-01-01

    We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results, via regression coefficients and standard errors, from different studies. In the analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variant tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or infeasible, the proposed method avoids this difficulty by instead calculating score statistics, which require fitting only the null model for each study, and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted from study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods can incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis that directly pools individual-level genotype data. We conduct extensive simulations to evaluate the performance of our methods under varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
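
The aggregation step described above can be sketched for a burden-type test: each study contributes a per-variant score vector U_k and a covariance (LD) matrix V_k, the meta-analysis sums them, and a single chi-square statistic is formed. The study inputs and weights below are hypothetical, and this is a simplified sketch of the general idea, not the authors' software.

```python
import math
import numpy as np

def meta_burden_test(scores, covs, weights):
    """Combine per-study score vectors U_k and covariance matrices V_k,
    then form the burden statistic Q = (w'U)^2 / (w'Vw) ~ chi-square(1)."""
    U = np.sum(scores, axis=0)          # pooled score vector
    V = np.sum(covs, axis=0)            # pooled covariance (LD) matrix
    q = float(weights @ U) ** 2 / float(weights @ V @ weights)
    p = math.erfc(math.sqrt(q / 2.0))   # chi-square(1 df) survival function
    return q, p

# Two hypothetical studies, three rare variants each.
U1 = np.array([2.0, 1.5, 0.5]); V1 = np.diag([1.0, 1.0, 0.5])
U2 = np.array([1.0, 2.5, 1.0]); V2 = np.diag([2.0, 1.5, 1.0])
w = np.ones(3)                          # equal burden weights

q, p = meta_burden_test([U1, U2], [V1, V2], w)
print(f"Q = {q:.2f}, p = {p:.4f}")
```

Note that only the summary statistics (U_k, V_k) cross study boundaries, which is what lets the method match pooled individual-level analysis without sharing genotype data.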

  4. Sulcal depth-based cortical shape analysis in normal healthy control and schizophrenia groups

    NASA Astrophysics Data System (ADS)

    Lyu, Ilwoo; Kang, Hakmook; Woodward, Neil D.; Landman, Bennett A.

    2018-03-01

    Sulcal depth is an important marker of brain anatomy in neuroscience and neurological function. Previously, sulcal depth has been explored at the region-of-interest (ROI) level to increase statistical sensitivity to group differences. In this paper, we present a fully automated method that enables inference on ROI properties from a sulcal-region-focused perspective, consisting of two main components: 1) sulcal depth computation and 2) sulcal curve-based refined ROIs. In conventional statistical analysis, average sulcal depth measurements are employed in several ROIs of the cortical surface. However, taking the average sulcal depth over the full ROI blurs the measurements, which may reduce sensitivity to sulcal depth changes in neurological and psychiatric disorders. To overcome this blurring effect, we focus on the sulcal fundic regions of each ROI by filtering out gyral regions. Consequently, the proposed method is more sensitive to group differences than a traditional ROI approach. In our experiment, we applied the method to a cortical morphological analysis of sulcal depth reduction in schizophrenia, with a comparison to a normal healthy control group. We show that the proposed method is more sensitive to abnormalities of sulcal depth in schizophrenia; sulcal depth is significantly smaller in most cortical lobes in schizophrenia compared to healthy controls (p < 0.05).
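
The ROI-refinement idea, restricting the average to deep (fundic) vertices instead of the whole ROI, can be sketched as below. The per-vertex depth values, the ROI mask, and the 70th-percentile cutoff are hypothetical choices; the paper derives its refined ROIs from sulcal curves rather than a simple depth quantile.

```python
import numpy as np

def refined_roi_depth(depth, roi_mask, quantile=0.7):
    """Average sulcal depth over only the deepest vertices of an ROI,
    so shallow gyral vertices do not dilute the measurement."""
    vals = depth[roi_mask]
    cutoff = np.quantile(vals, quantile)   # keep the deepest 30% of vertices
    return vals[vals >= cutoff].mean()

# Hypothetical per-vertex depths for one ROI (gyral ~0.1-0.5, fundic ~2).
depth = np.array([0.2, 0.5, 2.1, 2.4, 0.1, 1.9, 0.3, 2.2])
roi = np.ones(depth.size, dtype=bool)

fundic_avg = refined_roi_depth(depth, roi)   # ~2.23, fundic vertices only
whole_avg = depth[roi].mean()                # ~1.21, diluted by gyral vertices
print(round(fundic_avg, 2), round(whole_avg, 2))
```

The gap between the two averages illustrates the blurring effect the paper aims to remove: a group difference confined to sulcal fundi is easier to detect in the refined measure.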

  5. Ion induced electron emission statistics under Agm- cluster bombardment of Ag

    NASA Astrophysics Data System (ADS)

    Breuers, A.; Penning, R.; Wucher, A.

    2018-05-01

    The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and indirectly by analysis of the electron emission statistics on the other. Successful measurements of the electron emission spectra enable a deeper understanding of the ion induced kinetic electron emission process, with particular emphasis on the effect of projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison with computer simulations performed for silver atoms and clusters impinging onto a silver surface.

  6. A web-based system for neural network based classification in temporomandibular joint osteoarthritis.

    PubMed

    de Dumast, Priscille; Mirabel, Clément; Cevidanes, Lucia; Ruellas, Antonio; Yatabe, Marilia; Ioshida, Marcos; Ribera, Nina Tubau; Michoud, Loic; Gomes, Liliane; Huang, Chao; Zhu, Hongtu; Muniz, Luciana; Shoukri, Brandon; Paniagua, Beatriz; Styner, Martin; Pieper, Steve; Budin, Francois; Vimort, Jean-Baptiste; Pascal, Laura; Prieto, Juan Carlos

    2018-07-01

    The purpose of this study is to describe the methodological innovations of a web-based system for storage, integration and computation of biomedical data, using a training imaging dataset to remotely compute a deep neural network classifier of temporomandibular joint osteoarthritis (TMJOA). This study's imaging dataset consisted of three-dimensional (3D) surface meshes of mandibular condyles constructed from cone beam computed tomography (CBCT) scans. The training dataset consisted of 259 condyles, 105 from control subjects and 154 from patients with a diagnosis of TMJOA. For the image analysis classification, 34 right and left condyles from 17 patients (39.9 ± 11.7 years), who had experienced signs and symptoms of the disease for less than 5 years, were included as the testing dataset. For the integrative statistical model of clinical, biological and imaging markers, the sample consisted of the same 17 test OA subjects and 17 age- and sex-matched control subjects (39.4 ± 15.4 years), who did not show any sign or symptom of OA. For these 34 subjects, a standardized clinical questionnaire was administered, and blood and saliva samples were collected. The technological methodologies in this study include a deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA), and a flexible web-based system for data storage, computation and integration (DSCI) of high dimensional imaging, clinical, and biological data. The DSCI system trained and tested the neural network, indicating 5 stages of structural degenerative changes in condylar morphology in the TMJ with 91% close agreement between the clinician consensus and the SVA classifier. The DSCI system remotely ran a novel statistical analysis, the Multivariate Functional Shape Data Analysis, which computed high dimensional correlations between 3D shape coordinates, clinical pain levels and levels of biological markers, and then graphically displayed the computation results. 
The findings of this study demonstrate a comprehensive phenotypic characterization of TMJ health and disease at clinical, imaging and biological levels, using novel flexible and versatile open-source tools for a web-based system that provides advanced shape statistical analysis and a neural network based classification of temporomandibular joint osteoarthritis. Published by Elsevier Ltd.

  7. Metallurgical characterization of orthodontic brackets produced by Metal Injection Molding (MIM).

    PubMed

    Zinelis, Spiros; Annousaki, Olga; Makou, Margarita; Eliades, Theodore

    2005-11-01

    The aim of this study was to investigate the bonding base surface morphology, alloy type, microstructure, and hardness of four types of orthodontic brackets produced by Metal Injection Molding technology (Discovery, Extremo, Freedom, and Topic). The bonding base morphology of the brackets was evaluated by scanning electron microscopy (SEM). Brackets from each manufacturer were embedded in epoxy resin and, after metallographic grinding, polishing, and coating, were analyzed by x-ray energy-dispersive spectroscopic (EDS) microanalysis to assess their elemental composition. The brackets were then subjected to metallographic etching to reveal their metallurgical structure. The same specimen surfaces were repolished and used for Vickers microhardness measurements. The results were statistically analyzed with one-way analysis of variance and the Student-Newman-Keuls multiple comparison test at the 0.05 level of significance. SEM observations showed great variability in base morphology design among the brackets tested. The x-ray EDS analysis demonstrated that each bracket was manufactured from a different ferrous or Co-based alloy. Metallographic analysis showed a large grain size for the Discovery, Freedom, and Topic brackets and a much finer grain size for the Extremo bracket. Vickers hardness showed great variation among the brackets (Topic: 287 +/- 16, Freedom: 248 +/- 13, Discovery: 214 +/- 12, and Extremo: 154 +/- 9). The results of this study showed significant differences in base morphology, composition, microstructure, and microhardness among the brackets tested, which may have significant clinical implications.

  8. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  9. Aggregation and Disaggregation of Senile Plaques in Alzheimer Disease

    NASA Astrophysics Data System (ADS)

    Cruz, L.; Urbanc, B.; Buldyrev, S. V.; Christie, R.; Gomez-Isla, T.; Havlin, S.; McNamara, M.; Stanley, H. E.; Hyman, B. T.

    1997-07-01

    We quantitatively analyzed, using laser scanning confocal microscopy, the three-dimensional structure of individual senile plaques in Alzheimer disease. We carried out the quantitative analysis using statistical methods to gain insights about the processes that govern Aβ peptide deposition. Our results show that plaques are complex porous structures with characteristic pore sizes. We interpret plaque morphology in the context of a new dynamical model based on competing aggregation and disaggregation processes in kinetic steady-state equilibrium with an additional diffusion process allowing Aβ deposits to diffuse over the surface of plaques.

  10. Group for High Resolution Sea Surface Temperature (GHRSST) Analysis Fields Inter-Comparisons. Part 2. Near Real Time Web-based Level 4 SST Quality Monitor (L4-SQUAM)

    DTIC Science & Technology

    2012-01-01

    Reynolds, ...; Banzon, Viva; Beggs, Helen; Cayula, Jean-Francois; Chao, Yi; Grumbine, Robert; Maturi, Eileen; Harris, Andy; Mittaz, Jonathan; John... [Only figure-caption fragments of the abstract were recovered: histograms of level-4 SST differences matched to the OSTIA grid, with annotated ΔT statistics showing significant differences; a dotted gray line shows an ideal Gaussian fit, X~N(Median, RSD).]

  11. Effect of biofilm formation, and biocorrosion on denture base fractures.

    PubMed

    Sahin, Cem; Ergin, Alper; Ayyildiz, Simel; Cosgun, Erdal; Uzun, Gulay

    2013-05-01

    The aim of this study was to investigate the destructive effects of biofilm formation and/or the biocorrosive activity of 6 different oral microorganisms. Three different heat-polymerized acrylic resins (Ivocap Plus, Lucitone 550, QC 20) were used to prepare three different types of samples. Type "A" samples with a "V"-type notch were used to measure fracture strength, type "B" to evaluate the surfaces with scanning electron microscopy, and type "C" for a quantitative biofilm assay. Development and quantification of biofilm-covered surfaces on denture base materials were accomplished by SEM and quantitative biofilm assay. According to normality assumptions, ANOVA or Kruskal-Wallis tests were selected for statistical analysis (α=0.05). Significant differences were obtained among the adhesion potentials of the 6 different microorganisms, and there were significant differences among their adhesion onto the 3 different denture base materials. Compared with the control groups, after contamination with the microorganisms the three-point bending test values of the denture base materials decreased significantly (P<.05); microorganisms diffused over at least 52% of the denture base surface. The highest median quantitative biofilm value among all the denture base materials was obtained with P. aeruginosa on Lucitone 550. The type of denture base material did not significantly alter the diffusion potential of the microorganisms (P>.05). All the tested microorganisms had a destructive effect on the structure and composition of the denture base materials.

  12. Effect of biofilm formation, and biocorrosion on denture base fractures

    PubMed Central

    Ergin, Alper; Ayyildiz, Simel; Cosgun, Erdal; Uzun, Gulay

    2013-01-01

    PURPOSE The aim of this study was to investigate the destructive effects of biofilm formation and/or the biocorrosive activity of 6 different oral microorganisms. MATERIALS AND METHODS Three different heat-polymerized acrylic resins (Ivocap Plus, Lucitone 550, QC 20) were used to prepare three different types of samples. Type "A" samples with a "V"-type notch were used to measure fracture strength, type "B" to evaluate the surfaces with scanning electron microscopy, and type "C" for a quantitative biofilm assay. Development and quantification of biofilm-covered surfaces on denture base materials were accomplished by SEM and quantitative biofilm assay. According to normality assumptions, ANOVA or Kruskal-Wallis tests were selected for statistical analysis (α=0.05). RESULTS Significant differences were obtained among the adhesion potentials of the 6 different microorganisms, and there were significant differences among their adhesion onto the 3 different denture base materials. Compared with the control groups, after contamination with the microorganisms the three-point bending test values of the denture base materials decreased significantly (P<.05); microorganisms diffused over at least 52% of the denture base surface. The highest median quantitative biofilm value among all the denture base materials was obtained with P. aeruginosa on Lucitone 550. The type of denture base material did not significantly alter the diffusion potential of the microorganisms (P>.05). CONCLUSION All the tested microorganisms had a destructive effect on the structure and composition of the denture base materials. PMID:23755339

  13. VARIATION OF KOC IN SURFACE SEDIMENTS FROM NARRAGANSETT BAY AND LONG ISLAND SOUND: ANALYSIS OF THE ROLE OF OTHER PARTICULATE CHARACTERISTICS

    EPA Science Inventory

    In the first part of this investigation, we examined whether differences in the KOC values of three nonpolar organic chemicals (lindane, fluoranthene, and a tetrachlorinated biphenyl (PCB)) from five sites along the New England coast were statistically significant. Although no statistical di...

  14. Bacterial adhesion on conventional and self-ligating metallic brackets after surface treatment with plasma-polymerized hexamethyldisiloxane.

    PubMed

    Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves

    2017-01-01

    Plasma-polymerized film deposition was used to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the Plasma-Enhanced Chemical Vapor Deposition (PECVD) radio frequency technique. The samples were divided into two groups according to the kind of bracket and into two subgroups after surface treatment. Scanning Electron Microscopy (SEM) analysis was performed to assess bacterial adhesion on the sample surfaces (slot and wing regions) and film layer integrity. Surface roughness was assessed by Confocal Interferometry (CI) and surface wettability by goniometry. For bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. The values obtained for surface roughness were analyzed using the Mann-Whitney test, while biofilm adhesion was assessed by Kruskal-Wallis and SNK tests. Statistically significant differences (p < 0.05) in surface roughness and bacterial adhesion reduction were observed on conventional brackets after surface treatment and between conventional and self-ligating brackets; no statistically significant differences were observed between the self-ligating groups (p > 0.05). Plasma-polymerized film deposition was effective in reducing surface roughness and bacterial adhesion only on conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets despite the absence of film.

  15. Relationships between aerodynamic roughness and land use and land cover in Baltimore, Maryland

    USGS Publications Warehouse

    Nicholas, F.W.; Lewis, J.E.

    1980-01-01

    Urbanization changes the radiative, thermal, hydrologic, and aerodynamic properties of the Earth's surface. Knowledge of these surface characteristics, therefore, is essential to urban climate analysis. Aerodynamic or surface roughness of urban areas is not well documented, however, because of practical constraints on measuring the wind profile in the presence of large buildings. Using an empirical method designed by Lettau and an analysis of variance of surface roughness values calculated for 324 land use and land cover samples averaging 0.8 hectare (ha) in Baltimore, Md., a strong statistical relation was found between aerodynamic roughness and urban land use and land cover types. Assessment of three land use and land cover systems indicates that some of these types have significantly different surface roughness characteristics. The tests further indicate that statistically significant differences exist in estimated surface roughness values when categories (classes) from different land use and land cover classification systems are used as surrogates. A Level III extension of the U.S. Geological Survey Level II land use and land cover classification system provided the most reliable results. An evaluation of the physical association between the aerodynamic properties of land use and land cover and the surface climate, by numerical simulation of the surface energy balance, indicates that changes in surface roughness within the range of values typical of the Level III categories induce important changes in the surface climate.
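
Lettau's empirical estimate referenced above relates roughness length to obstacle geometry as z0 = 0.5 h (s/S), with h the average obstacle height, s the average silhouette area presented to the wind, and S the specific lot area per obstacle. A minimal sketch, with hypothetical values for a residential block:

```python
def lettau_roughness(h, silhouette_area, lot_area):
    """Lettau's empirical aerodynamic roughness length:
    z0 = 0.5 * h * (s / S), returned in the same units as h."""
    return 0.5 * h * (silhouette_area / lot_area)

# Hypothetical residential block: 8 m tall houses presenting 80 m^2
# silhouettes to the wind, with 800 m^2 of lot area per house.
z0 = lettau_roughness(8.0, 80.0, 800.0)
print(f"z0 = {z0:.2f} m")  # -> z0 = 0.40 m
```

Applied per land use and land cover sample, such estimates yield the per-class roughness values whose variance the study then analyzes.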

  16. Occupational Exposure to Cobalt and Tungsten in the Swedish Hard Metal Industry: Air Concentrations of Particle Mass, Number, and Surface Area

    PubMed Central

    Bryngelsson, Ing-Liss; Pettersson, Carin; Husby, Bente; Arvidsson, Helena; Westberg, Håkan

    2016-01-01

    Exposure to cobalt in the hard metal industry entails severe adverse health effects, including lung cancer and hard metal fibrosis. The main aim of this study was to determine exposure air concentration levels of cobalt and tungsten for risk assessment and dose–response analysis in our medical investigations in a Swedish hard metal plant. We also present mass-based, particle surface area, and particle number air concentrations from stationary sampling and investigate the possibility of using these data as proxies for exposure measures in our study. Personal full-shift exposure measurements were performed for inhalable and total dust, cobalt, and tungsten, including personal real-time continuous monitoring of dust. Stationary measurements of inhalable and total dust, PM2.5, and PM10 were also performed, and cobalt and tungsten levels were determined, as were air concentrations of particle number and particle surface area of fine particles. The personal exposure levels of inhalable dust were consistently low (AM 0.15mg m−3, range <0.023–3.0mg m−3) and below the present Swedish occupational exposure limit (OEL) of 10mg m−3. The cobalt levels were low as well (AM 0.0030mg m−3, range 0.000028–0.056mg m−3), and only 6% of the samples exceeded the Swedish OEL of 0.02mg m−3. In continuous personal monitoring of dust exposure, the peaks ranged from 0.001 to 83mg m−3 by work task. Stationary measurements showed lower average levels for both inhalable and total dust and for cobalt. The particle number concentration of fine particles (AM 3000 p·cm−3) showed the highest levels at the departments of powder production, pressing, and storage, and similar results were found for the particle surface area concentrations (AM 7.6 µm2·cm−3). Correlating cobalt mass-based exposure measurements with stationary mass-based, particle surface area, and particle number concentrations of cobalt, by rank and department, showed significant correlations for all measures except particle number. 
Linear regression analysis of the same data showed statistically significant regression coefficients only for the mass-based aerosol measures. Similar results were seen for rank correlation in the stationary rig, and linear regression analysis implied significant correlation for mass-based and particle surface area measures. The mass-based air concentration levels of cobalt and tungsten in the hard metal plant in our study were low compared to Swedish OELs. Particle number and particle surface area concentrations were in the same order of magnitude as for other industrial settings. Regression analysis implied the use of stationary determined mass-based and particle surface area aerosol concentration as proxies for various exposure measures in our study. PMID:27143598

  17. Utilization of an Enhanced Canonical Correlation Analysis (ECCA) to Predict Daily Precipitation and Temperature in a Semi-Arid Environment

    NASA Astrophysics Data System (ADS)

    Lopez, S. R.; Hogue, T. S.

    2011-12-01

    Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typically on the order of 50,000 km2) and fail to capture climate variability at the ground level due to localized surface influences (i.e., topography, marine layer, land cover, etc.). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3, and MRI-CGCM2 3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange, and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA obtains the optimal predictand sets for all models within each spatial domain (county), as governed by daily and monthly overall statistics. Results show that all models maintain mean annual and monthly behavior within each county and that daily statistics are improved. The level of improvement depends strongly on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. Utilization of the entire historical period also leads to better statistical representation of observed daily precipitation. 
The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and assess future water resources in the Southern California region.
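
The gauge-screening step, PCA used to identify redundant gauges that carry the same signal, can be sketched as follows. The synthetic "coastal" and "inland" signals, the gauge count, and the noise level are hypothetical stand-ins for the Southern California records.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 200
coastal = rng.gamma(2.0, 1.5, n_days)   # shared "coastal" precipitation signal
inland = rng.gamma(2.0, 1.0, n_days)    # shared "inland" precipitation signal

# Six hypothetical gauges: three track the coastal signal, three the inland.
gauges = np.column_stack(
    [coastal + 0.1 * rng.normal(size=n_days) for _ in range(3)]
    + [inland + 0.1 * rng.normal(size=n_days) for _ in range(3)]
)

# PCA via SVD of the centered (time x gauge) matrix; rows of Vt are
# principal components, so columns give each gauge's loadings.
Xc = gauges - gauges.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

# Cluster gauges by whichever leading component dominates their loading;
# gauges sharing a signal fall into the same cluster (and are redundant).
cluster = np.argmax(np.abs(Vt[:2]), axis=0)
print(cluster)
```

One representative gauge per cluster can then be retained, reducing dimensionality before the predictor-predictand testing.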

  18. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still involve a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase, with both single-sided and double-sided impacts considered, is formulated. The modeling results indicate that the improved stationary statistical theory not only matches the accuracy of multipactor threshold calculation of the nonstationary statistical theory, but also achieves high calculation efficiency concurrently. By using this improved stationary statistical theory, the total time consumed in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It is also shown that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines and is more significant for higher-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.

  19. Iterative Assessment of Statistically-Oriented and Standard Algorithms for Determining Muscle Onset with Intramuscular Electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-12-01

    The onset of muscle activity, as measured by electromyography (EMG), is a commonly applied metric in biomechanics. Intramuscular EMG is often used to examine deep musculature, and there are currently no studies examining the effectiveness of algorithms for intramuscular EMG onset. The present study examines standard surface EMG onset algorithms (linear envelope, Teager-Kaiser Energy Operator, and sample entropy) and novel algorithms (time series mean-variance analysis, sequential/batch processing with parametric and nonparametric methods, and Bayesian changepoint analysis). Thirteen male and five female subjects had intramuscular EMG collected during isolated biceps brachii and vastus lateralis contractions, resulting in 103 trials. EMG onset was visually determined twice by 3 blinded reviewers. Since the reliability of visual onset determination was high (ICC(1,1): 0.92), the mean of the 6 visual assessments was contrasted with the algorithmic approaches. Poorly performing algorithms were eliminated stepwise via (1) root mean square error analysis, (2) algorithm failure to identify onset or premature onset, (3) linear regression analysis, and (4) Bland-Altman plots. The top-performing algorithms were all based on Bayesian changepoint analysis of rectified EMG and were statistically indistinguishable from visual analysis. Bayesian changepoint analysis has the potential to produce more reliable, accurate, and objective intramuscular EMG onset results than standard methodologies.
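
The changepoint idea behind the top-performing algorithms can be illustrated with a stripped-down, single-changepoint sketch on rectified EMG: under a uniform prior over the changepoint location and a mean-shift model, the most probable onset is the split that minimizes the pooled within-segment variance. This illustrates the principle only; it is not the Bayesian changepoint implementation evaluated in the study.

```python
import numpy as np

def onset_changepoint(emg):
    """Return the sample index where the rectified signal's mean shifts,
    chosen as the split minimizing the pooled within-segment variance."""
    x = np.abs(emg)                         # rectify the raw EMG
    n = len(x)
    best_tau, best_sse = None, np.inf
    for tau in range(2, n - 2):             # leave room for both segments
        sse = np.var(x[:tau]) * tau + np.var(x[tau:]) * (n - tau)
        if sse < best_sse:
            best_tau, best_sse = tau, sse
    return best_tau

# Synthetic trial: 300 samples of baseline noise, then a burst of activity.
rng = np.random.default_rng(7)
signal = np.concatenate([
    rng.normal(0.0, 0.05, 300),             # quiescent baseline
    rng.normal(0.0, 0.5, 200),              # active muscle
])
tau = onset_changepoint(signal)
print(tau)  # near sample 300
```

A full Bayesian treatment would place priors on the segment parameters and report a posterior distribution over the onset sample rather than a single argmax.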

  20. Validation of Methods to Predict Vibration of a Panel in the Near Field of a Hot Supersonic Rocket Plume

    NASA Technical Reports Server (NTRS)

    Bremner, P. G.; Blelloch, P. A.; Hutchings, A.; Shah, P.; Streett, C. L.; Larsen, C. E.

    2011-01-01

    This paper describes the measurement and analysis of surface fluctuating pressure level (FPL) data and vibration data from a plume impingement aero-acoustic and vibration (PIAAV) test to validate NASA's physics-based modeling methods for prediction of panel vibration in the near field of a hot supersonic rocket plume. For this test (reported more fully in a companion paper by Osterholt & Knox at the 26th Aerospace Testing Seminar, 2011) the flexible panel was located 2.4 nozzle diameters from the plume centerline and 4.3 nozzle diameters downstream from the nozzle exit. The FPL loading is analyzed in terms of its auto-spectrum, cross-spectrum, spatial correlation parameters, and statistical properties. The panel vibration data are used to estimate the in-situ damping under plume FPL loading conditions and to validate both finite element analysis (FEA) and statistical energy analysis (SEA) methods for prediction of panel response. An assessment is also made of the effects of non-linearity in the panel elasticity.

  1. Monte Carlo investigation of thrust imbalance of solid rocket motor pairs

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.; Foster, W. A., Jr.

    1976-01-01

    The Monte Carlo method of statistical analysis is used to investigate the theoretical thrust imbalance of pairs of solid rocket motors (SRMs) firing in parallel. Sets of the significant variables are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs using a simplified, but comprehensive, model of the internal ballistics. The treatment of burning surface geometry allows for the variations in the ovality and alignment of the motor case and mandrel as well as those arising from differences in the basic size dimensions and propellant properties. The analysis is used to predict the thrust-time characteristics of 130 randomly selected pairs of Titan IIIC SRMs. A statistical comparison of the results with test data for 20 pairs shows the theory underpredicts the standard deviation in maximum thrust imbalance by 20% with variability in burning times matched within 2%. The range in thrust imbalance of Space Shuttle type SRM pairs is also estimated using applicable tolerances and variabilities and a correction factor based on the Titan IIIC analysis.
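
    The Monte Carlo procedure can be sketched as follows. The thrust model below is a deliberately toy stand-in for the paper's internal-ballistics model, and the coefficient, nominal values, and tolerances are all hypothetical.

```python
import random
import statistics

def motor_thrust(burn_rate, throat_area):
    """Toy thrust model (hypothetical stand-in for a full internal-ballistics
    model): thrust grows with burn rate and shrinks with throat area."""
    return 1.5e3 * burn_rate / throat_area  # N, arbitrary scaling

random.seed(42)
imbalances = []
for _ in range(1000):  # 1000 randomly sampled motor pairs
    pair = [motor_thrust(random.gauss(10.0, 0.1),    # burn rate, ~1% scatter
                         random.gauss(0.8, 0.004))   # throat area, ~0.5% scatter
            for _ in range(2)]
    imbalances.append(abs(pair[0] - pair[1]))

print(round(statistics.mean(imbalances)), round(statistics.stdev(imbalances)))
```

    The paper's analysis works the same way in outline: sample the significant variables for each motor of a pair, run the ballistic model, and accumulate the imbalance statistics over many pairs.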

  2. Enamel alteration following tooth bleaching and remineralization.

    PubMed

    Coceska, Emilija; Gjorgievska, Elizabeta; Coleman, Nichola J; Gabric, Dragana; Slipper, Ian J; Stevanovic, Marija; Nicholson, John W

    2016-06-01

    The purpose of this study was to compare the effects on the enamel surface of professional tooth-whitening agents containing highly concentrated hydrogen peroxide (with and without laser activation), and the potential of four different toothpastes to remineralize any alterations. The study was performed on 50 human molars divided into two groups: one treated with Opalescence® Boost and one with Mirawhite® Laser Bleaching. Each group was further divided into five subgroups: a control and four subgroups remineralized with Mirasensitive® hap+, Mirawhite® Gelée, GC Tooth Mousse™, or Mirafluor® C. The samples were analysed by SEM/3D-SEM micrographs, SEM/EDX qualitative analysis, and SEM/EDX semiquantitative analysis. The micrographs show that both types of bleaching cause alterations: emphasized perikymata, erosions, and loss of interprismatic substance; the laser treatment is more aggressive, and loss of enamel integrity results from shearing off of the enamel rods. In all samples undergoing remineralization, deposits were observed; those of toothpastes based on calcium phosphate technologies seem to merge with each other and cover almost the entire enamel surface. Loss of integrity and minerals was detected only in the line-scans of the sample remineralized with GC Tooth Mousse™. The semiquantitative EDX analysis of individual elements in the surface layer of the enamel indicates that tooth bleaching with hydrogen peroxide causes a statistically significant loss of Na and Mg, whereas bleaching combined with a laser leads to a statistically significant loss of Ca and P. The results confirm that tooth-whitening procedures lead to enamel alterations, and that laser bleaching is the more aggressive of the two for dental substances. However, these changes are reversible and can be repaired by application of remineralizing toothpastes. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  3. Validation of PC-based Sound Card with Biopac for Digitalization of ECG Recording in Short-term HRV Analysis.

    PubMed

    Maheshkumar, K; Dilara, K; Maruthy, K N; Sundareswaren, L

    2016-07-01

    Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to validate HRV computed from a temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, took part in the study. Following the standard protocol, 5-min ECG was recorded after 10 min of supine rest by the portable analog amplifier with PC-based sound card and, simultaneously, by the Biopac module with surface electrodes in the Lead II position. All ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in Kubios software, using short-term HRV indexes in both the time and frequency domains. The unpaired Student's t-test and Pearson correlation coefficient were used for the analysis in the R statistical software. No statistically significant differences were observed between the HRV values obtained from the two devices. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the time- and frequency-domain values obtained by the two devices. On the basis of these results, we suggest that the calculation of HRV values in the time and frequency domains from RR series obtained with the PC-based sound card is probably as reliable as that obtained with the gold-standard Biopac MP36.
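
    Two standard time-domain HRV indexes of the kind Kubios reports can be computed directly from an RR-interval series. A minimal sketch, with a synthetic RR series that is illustrative only:

```python
import math
import random
import statistics

def sdnn(rr):
    """SDNN: standard deviation of all RR intervals (ms)."""
    return statistics.stdev(rr)

def rmssd(rr):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Synthetic 5-min RR series around 800 ms (75 bpm), ~40 ms scatter
random.seed(0)
rr = [random.gauss(800, 40) for _ in range(375)]
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

    Frequency-domain indexes (LF/HF power) require a spectral estimate of the interpolated RR series and are omitted here.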

  4. Quality assessment of butter cookies applying multispectral imaging

    PubMed Central

    Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne

    2013-01-01

    A method for characterizing butter cookie quality by assessing surface browning and water content from multispectral images is presented. Based on visual evaluations of browning, cookies were manually divided into groups, and from this categorization reference values were calculated for a statistical prediction model correlating the multispectral images with a browning score. The browning score is modeled as a function of oven temperature and baking time and is presented as a quadratic response surface. The investigated process window was 4–16 min and 160–200°C in a forced-convection electrically heated oven. In addition to the browning score, a model for predicting the average water content from the same images is presented, showing how multispectral images of butter cookies may be used to assess several quality parameters. Statistical analysis showed that the wavelengths most significant for browning prediction were in the interval 400–700 nm, whereas those significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
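
    A quadratic response surface of the kind used for the browning score can be evaluated and fitted cheaply. The sketch below only evaluates a hypothetical two-factor quadratic model over the stated process window; the coefficients are invented, not the paper's.

```python
def browning_score(time_min, temp_c, b=(1.0, 0.20, 0.015, 0.004, 0.0002, 0.001)):
    """Hypothetical quadratic response surface in baking time and oven
    temperature: b0 + b1*t + b2*T + b3*t^2 + b4*T^2 + b5*t*T, with
    temperature centered at 160 degrees C for numerical sanity."""
    t, T = time_min, temp_c - 160.0
    b0, b1, b2, b3, b4, b5 = b
    return b0 + b1 * t + b2 * T + b3 * t * t + b4 * T * T + b5 * t * T

# Scan the investigated process window: 4-16 min, 160-200 degrees C
low = browning_score(4, 160)
high = browning_score(16, 200)
print(round(low, 2), round(high, 2))  # browning increases with time and temp
```

    Fitting such a surface to scored images is an ordinary least-squares problem in the six coefficients.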

  5. Groundwater flow and hydrogeochemical evolution in the Jianghan Plain, central China

    NASA Astrophysics Data System (ADS)

    Gan, Yiqun; Zhao, Ke; Deng, Yamin; Liang, Xing; Ma, Teng; Wang, Yanxin

    2018-05-01

    Hydrogeochemical analysis and multivariate statistics were applied to identify flow patterns and the major processes controlling the hydrogeochemistry of groundwater in the Jianghan Plain, which is located in the central Yangtze River Basin (central China) and characterized by intensive surface-water/groundwater interaction. Although HCO3-Ca-(Mg) type water predominated in the study area, the 457 samples (21 surface water and 436 groundwater) were effectively classified into five clusters by hierarchical cluster analysis. The hydrochemical variations among these clusters were governed by three factors identified by factor analysis. Major components (e.g., Ca, Mg and HCO3) in surface water and groundwater originated from carbonate and silicate weathering (factor 1). Redox conditions (factor 2) influenced the geogenic Fe and As contamination in shallow confined groundwater. Anthropogenic activities (factor 3) primarily caused high levels of Cl and SO4 in surface water and phreatic groundwater. Furthermore, the factor-1 scores of samples in the shallow confined aquifer gradually increased along the flow paths. This study demonstrates that the enhanced hydrochemical information that multivariate statistical methods provide for complex groundwater flow systems improves the understanding of groundwater flow and hydrogeochemical evolution under natural and anthropogenic impacts.
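
    Hierarchical cluster analysis of the kind applied here can be sketched with a naive single-linkage agglomeration. The two hydrochemical variables and six samples below are hypothetical; the study used many more components and 457 samples, and typically Ward linkage on standardized data.

```python
import math

def single_linkage(points, n_clusters):
    """Naive agglomerative clustering with single linkage: repeatedly
    merge the two clusters whose closest members are nearest."""
    clusters = [[p] for p in points]

    def dist(a, b):
        return min(math.dist(p, q) for p in a for q in b)

    while len(clusters) > n_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)  # j > i, so index i stays valid
    return clusters

# Hypothetical (Cl, SO4) concentrations (mg/L) for six water samples:
# three dilute samples and three anthropogenically impacted ones
samples = [(10, 12), (11, 14), (9, 11), (80, 95), (85, 90), (83, 97)]
groups = single_linkage(samples, 2)
print(sorted(len(g) for g in groups))
```

    Production analyses would use an optimized linkage algorithm and a dendrogram cut rather than a fixed cluster count.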

  6. Digital data base application to porphyry copper mineralization in Alaska; case study summary

    USGS Publications Warehouse

    Trautwein, Charles M.; Greenlee, David D.; Orr, Donald G.

    1982-01-01

    The purpose of this report is to summarize the progress in use of digital image analysis techniques in developing a conceptual model for assessing porphyry copper mineral potential. The study area consists of approximately the southern one-half of the 1° by 3° Nabesna quadrangle in east-central Alaska. The digital geologic data base consists of data compiled under the Alaskan Mineral Resource Assessment Program (AMRAP) as well as digital elevation data and Landsat spectral reflectance data from the Multispectral Scanner System. The digital data base used to develop and implement a conceptual model for porphyry-type copper mineralization consisted of 16 original data types and 18 derived data sets formatted in a grid-cell (raster) structure and registered to a map base in the Universal Transverse Mercator (UTM) projection. Minimum curvature and inverse distance squared interpolation techniques were used to generate continuous surfaces from sets of irregularly spaced data points. Processing requirements included: (1) merging or overlaying of data sets, (2) display and color coding of maps and images, (3) univariate and multivariate statistical analyses, and (4) compound overlaying operations. Data sets were merged and processed to create stereoscopic displays of continuous surfaces. Ratios of several data sets were calculated to evaluate relative variations and to enhance the display of surface alteration (gossans). Factor analysis and principal components analysis techniques were used to determine complex relationships and correlations between data sets. The resultant model consists of 10 parameters that identify three areas most likely to contain porphyry copper mineralization; two of these areas are known occurrences of mineralization and the third is not well known. Field studies confirmed that the three areas identified by the model have significant copper potential.

  7. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M.ª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel-bearing manufacturing processes. Defects in the products under analysis can directly affect passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not widespread in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes occur under certain operating conditions and do not require high vibration amplitudes, but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of the ball races that are generated during grinding. The purpose of this paper is to identify grinding-process variables that affect the quality of bearings by applying statistical principles in the field of machine tools, and to evaluate the quality of the finished parts under different combinations of process variables. The paper establishes the foundations for predicting product quality through the analysis of self-induced vibrations during contact between the grinding wheel and the parts. To achieve this goal, overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follow a classical approach, considering all potential interactions between variables. Data sets that meet normality and homoscedasticity criteria are analysed through analysis of variance (ANOVA), and the conclusions are supported by further statistical tools such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point for extending the use of predictive techniques (vibration analysis) to quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
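
    The core ANOVA computation used to screen process variables can be sketched as a one-way F statistic. The vibration readings under three machine setups below are invented for illustration; a real analysis would also report the p value and check normality and homoscedasticity first, as the paper does.

```python
import statistics

def one_way_anova_F(groups):
    """F statistic for a one-way fixed-effects ANOVA:
    (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - statistics.fmean(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical overall-vibration readings (mm/s) under three grinding setups;
# setup 2 clearly differs, so the F statistic is large
setups = [[2.1, 2.3, 2.2, 2.4], [2.9, 3.1, 3.0, 3.2], [2.2, 2.1, 2.3, 2.2]]
print(round(one_way_anova_F(setups), 2))
```

    The F statistic would then be compared against the F(k-1, n-k) distribution at the chosen significance level.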

  8. Production of biodiesel from coastal macroalgae (Chara vulgaris) and optimization of process parameters using Box-Behnken design.

    PubMed

    Siddiqua, Shaila; Mamun, Abdullah Al; Enayetul Babar, Sheikh Md

    2015-01-01

    Renewable biodiesels are needed as an alternative to petroleum-derived transport fuels, which contribute to global warming and are of limited availability. Algal biomass is a potential source of renewable energy that can be converted into biofuels. This study introduces an integrated method for the production of biodiesel from Chara vulgaris algae collected from the coastal region of Bangladesh. The Box-Behnken design, based on response surface methodology (RSM), was used as the statistical tool to optimize three variables for predicting the best-performing conditions (calorific value and yield) of the algal biodiesel. The three production parameters were chloroform volume (X1), sodium chloride concentration (X2), and temperature (X3). Optimal conditions were estimated with the aid of statistical regression analysis and surface plots. For 12 g of dry algal biomass, the optimal production conditions were observed to be 198 ml of chloroform with 0.75% sodium chloride at 65 °C, giving a biodiesel calorific value of 9255.106 kcal/kg and a yield of 3.6 ml.
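
    A coded Box-Behnken design can be generated programmatically: for each pair of factors, all ±1 combinations are run with the remaining factors at their center level, plus replicated center points. This is a generic sketch of the design, not the paper's run sheet; the factor names in the comment follow the abstract.

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Coded Box-Behnken design: +/-1 combinations for each factor pair,
    remaining factors held at 0, plus replicated center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

# Three factors, e.g. chloroform (X1), NaCl (X2), temperature (X3),
# in coded units: 12 edge runs + 3 center points = 15 runs
design = box_behnken(3)
print(len(design))
```

    Each coded level is then mapped linearly onto the real factor range before the experiments are run and the second-order model is fitted.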

  9. Statistical optimization of arsenic biosorption by microbial enzyme via Ca-alginate beads.

    PubMed

    Banerjee, Suchetana; Banerjee, Anindita; Sarkar, Priyabrata

    2018-04-16

    Bioremediation of arsenic via microbial enzymes, a green technology, has attracted scientists due to its simplicity and cost effectiveness. Statistical optimization of arsenate bioremediation was conducted with the enzyme arsenate reductase extracted from the arsenic-tolerant bacterium Pseudomonas alcaligenes. Response surface methodology based on a Box-Behnken design matrix was used to determine the optimal operational conditions of the multivariable system and their interactive effects on the bioremediation process. The highest biosorptive activity of 96.2 µg g⁻¹ of beads was achieved under optimized conditions (pH = 7.0; As(V) concentration = 1000 ppb; time = 2 h). SEM analysis showed the morphological changes on the surface of the enzyme-immobilized, glutaraldehyde-crosslinked Ca-alginate beads. The immobilized enzyme retained its activity for 8 cycles. ANOVA with a high correlation coefficient (R² > 0.99) and a low "Prob > F" value (<0.0001) corroborated the second-order polynomial model for the biosorption process. This study of the adsorptive removal of As(V) by the enzyme-loaded biosorbent revealed a possible route to large-scale treatment of As(V)-contaminated water bodies.

  10. A virtual climate library of surface temperature over North America for 1979-2015

    NASA Astrophysics Data System (ADS)

    Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras

    2017-10-01

    The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context—for example, to document trends in extreme events in response to climate change—is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979-2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life.

  13. Nonclassical point of view of the Brownian motion generation via fractional deterministic model

    NASA Astrophysics Data System (ADS)

    Gilardi-Velázquez, H. E.; Campos-Cantón, E.

    In this paper, we present a dynamical system based on the Langevin equation without a stochastic term and using fractional derivatives that exhibits properties of Brownian motion, i.e. a deterministic model to generate Brownian motion is proposed. The stochastic process is replaced by an additional degree of freedom in the second-order Langevin equation, which is thus transformed into a system of three first-order linear differential equations; α-fractional derivatives are additionally considered, which allow us to obtain better statistical properties. Switching surfaces are established as part of the fluctuating acceleration. The final system of three α-order linear differential equations contains no stochastic term, so the system generates motion in a deterministic way. Nevertheless, time series analysis shows that the behavior of the system exhibits statistical properties of Brownian motion, such as linear growth in time of the mean square displacement and a Gaussian distribution of displacements. Furthermore, we use detrended fluctuation analysis to confirm the Brownian character of this motion.
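
    The linear-in-time growth of the mean square displacement used here as a Brownian diagnostic can be checked with a short routine. A plain random walk stands in for the paper's deterministic fractional-order trajectories; the diagnostic itself is the same.

```python
import random

def mean_square_displacement(traj, max_lag):
    """MSD(lag) averaged over all start times; for Brownian-like motion
    MSD grows linearly in the lag."""
    msd = []
    for lag in range(1, max_lag + 1):
        disp = [(traj[i + lag] - traj[i]) ** 2
                for i in range(len(traj) - lag)]
        msd.append(sum(disp) / len(disp))
    return msd

# A 1-D Gaussian random walk as a reference trajectory (the paper's
# trajectories instead come from a deterministic fractional-order system)
random.seed(3)
x, traj = 0.0, [0.0]
for _ in range(20000):
    x += random.gauss(0, 1)
    traj.append(x)

msd = mean_square_displacement(traj, 50)
print(round(msd[9] / msd[4], 2))  # MSD(10)/MSD(5), close to 2 for Brownian motion
```

    Doubling the lag should roughly double the MSD; anomalous diffusion would instead give a ratio of 2^alpha for some alpha != 1.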

  14. Characterization of a neutron sensitive MCP/Timepix detector for quantitative image analysis at a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.

    2017-07-01

    The uncertainties and the stability of a neutron-sensitive MCP/Timepix detector operating in the event-timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant contribution to the uncertainty arises from counting statistics. Based on error propagation, the contribution of the overlap correction to the uncertainty was concluded to be negligible even when the pixel occupation probability exceeds 50%. We additionally took the multiple-counting effect into account in the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing effects of the microchannel plates. Since this efficiency change is position dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using the rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.
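
    A common form of the overlap correction for frame-based event counting is shown below, assuming Poisson arrivals and at most one recorded event per pixel per readout frame; the detector's exact rate equations may differ, so this is a generic sketch rather than the paper's correction.

```python
import math

def overlap_corrected(n_obs, n_frames):
    """Correct counts for event overlap when a pixel can record at most one
    event per readout frame. With Poisson arrivals of mean mu per frame the
    occupancy is p = 1 - exp(-mu), so mu = -ln(1 - p) and the corrected
    total is mu * n_frames."""
    p = n_obs / n_frames
    return -math.log(1.0 - p) * n_frames

# At 50% pixel occupancy the correction is already substantial:
# 5000 observed counts in 10000 frames correspond to ~6931 true events
print(round(overlap_corrected(5000, 10000)))
```

    At low occupancy (p much less than 1) the correction tends to n_obs, which is why it only matters for high-flux measurements.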

  15. Effective Thermal Inactivation of the Spores of Bacillus cereus Biofilms Using Microwave.

    PubMed

    Park, Hyong Seok; Yang, Jungwoo; Choi, Hee Jung; Kim, Kyoung Heon

    2017-07-28

    Microwave sterilization was performed to inactivate the spores of biofilms of Bacillus cereus involved in foodborne illness. The sterilization conditions, such as the amount of water and the operating temperature and treatment time, were optimized using statistical analysis based on 15 runs of experimental results designed by the Box-Behnken method. Statistical analysis showed that the optimal conditions for the inactivation of B. cereus biofilms were 14 ml of water, 108°C of temperature, and 15 min of treatment time. Interestingly, response surface plots showed that the amount of water is the most important factor for microwave sterilization under the present conditions. Complete inactivation by microwaves was achieved in 5 min, and the inactivation efficiency by microwave was obviously higher than that by conventional steam autoclave. Finally, confocal laser scanning microscopy images showed that the principal effect of microwave treatment was cell membrane disruption. Thus, this study can contribute to the development of a process to control food-associated pathogens.

  16. Mapping and monitoring changes in vegetation communities of Jasper Ridge, CA, using spectral fractions derived from AVIRIS images

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Roberts, Dar A.; Adams, John B.; Smith, Milton O.

    1993-01-01

    An important application of remote sensing is to map and monitor changes over large areas of the land surface. This is particularly significant given the current interest in monitoring vegetation communities. Most traditional methods for mapping different types of plant communities are based upon statistical classification techniques (e.g., parallelepiped, nearest-neighbor) applied to uncalibrated multispectral data. Classes from these techniques are typically difficult to interpret (particularly for a field ecologist or botanist). Also, classes derived from one image can be very different from those derived from another image of the same area, making interpretation of observed temporal changes nearly impossible. More recently, neural networks have been applied to classification; however, neural network classification based upon spectral matching is weak at dealing with spectral mixtures, a condition prevalent in images of natural surfaces. Another approach to mapping vegetation communities is based on spectral mixture analysis, which can provide a consistent framework for image interpretation. Roberts et al. (1990) mapped vegetation using the band residuals from a simple mixing model (the same spectral endmembers applied to all image pixels). Sabol et al. (1992b) and Roberts et al. (1992) used different methods to apply the most appropriate spectral endmembers to each image pixel, thereby allowing vegetation to be mapped on the basis of different endmember spectra. In this paper, we describe a new approach to classification of vegetation communities based upon the spectral fractions derived from spectral mixture analysis. This approach was applied to three 1992 AVIRIS images of Jasper Ridge, California to observe seasonal changes in surface composition.
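
    For the simplest case of two endmembers with a sum-to-one constraint, the spectral-fraction computation at each pixel has a closed form. The 4-band reflectances below are hypothetical; real mixture analysis uses more bands, more endmembers (including shade), and a least-squares solver with residual checks.

```python
def unmix_two_endmembers(pixel, em1, em2):
    """Least-squares fraction f of endmember 1 (with 1 - f of endmember 2)
    under the sum-to-one constraint, solved in closed form by projecting
    (pixel - em2) onto (em1 - em2)."""
    d = [a - b for a, b in zip(em1, em2)]
    num = sum((p - b) * di for p, b, di in zip(pixel, em2, d))
    den = sum(di * di for di in d)
    return num / den

# Hypothetical 4-band reflectances for green vegetation and soil
veg = [0.05, 0.08, 0.45, 0.50]
soil = [0.20, 0.25, 0.30, 0.35]
mixed = [0.6 * v + 0.4 * s for v, s in zip(veg, soil)]  # 60/40 mixture
print(round(unmix_two_endmembers(mixed, veg, soil), 2))  # recovers 0.6
```

    Classifying by fractions, as the paper proposes, then amounts to thresholding or clustering these per-pixel fraction vectors rather than raw radiances.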

  17. Statistical Analysis of PDF's for Na Released by Photons from Solid Surfaces

    NASA Astrophysics Data System (ADS)

    Gamborino, D.; Wurz, P.

    2018-05-01

    We analyse the adequacy of three model speed PDFs previously used to describe the desorption of Na from a solid surface by either electron-stimulated desorption (ESD) or photon-stimulated desorption (PSD). We find that the Maxwell PDF is too wide compared with measurements and that non-thermal PDFs are better suited.
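
    The Maxwell speed PDF being compared against the measurements is f(v) = 4πv²(m/2πkT)^(3/2)·exp(−mv²/2kT). A quick numerical check of its normalization for Na atoms at an assumed (hypothetical) 400 K surface temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwell_speed_pdf(v, mass, temp):
    """Maxwell-Boltzmann speed distribution f(v) for particles of the
    given mass (kg) at temperature temp (K)."""
    a = mass / (2.0 * K_B * temp)
    return 4.0 * math.pi * v ** 2 * (a / math.pi) ** 1.5 * math.exp(-a * v * v)

# Na atoms (23 u) from a surface at an assumed 400 K
m_na = 23 * 1.66054e-27
# Riemann sum over 1 m/s bins; the tail beyond 5 km/s is negligible
norm = sum(maxwell_speed_pdf(v, m_na, 400.0) for v in range(0, 5000))
print(round(norm, 3))  # close to 1, i.e. the PDF is normalized
```

    The comparison in the paper is then between this thermal shape and empirically narrower, non-thermal speed distributions.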

  18. Repairability of CAD/CAM high-density PMMA- and composite-based polymers.

    PubMed

    Wiegand, Annette; Stucki, Lukas; Hoffmann, Robin; Attin, Thomas; Stawarczyk, Bogna

    2015-11-01

    The study aimed to analyse the shear bond strength of computer-aided design and computer-aided manufacturing (CAD/CAM) polymethyl methacrylate (PMMA)- and composite-based polymer materials repaired with a conventional methacrylate-based composite after different surface pretreatments. Forty-eight specimens each were prepared from six different CAD/CAM polymer materials (Ambarino high-class, artBloc Temp, CAD-Temp, Lava Ultimate, Telio CAD, Everest C-Temp) and from a conventional dimethacrylate-based composite (Filtek Supreme XTE, control), and aged by thermal cycling (5000 cycles, 5-55 °C). The surfaces were left untreated or were pretreated by mechanical roughening, aluminium oxide air abrasion, or silica coating/silanization (each subgroup n = 12). The surfaces were further conditioned with an etch-and-rinse adhesive (OptiBond FL) before the repair composite (Filtek Supreme XTE) was adhered to the surface. After further thermal cycling, shear bond strength was tested and failure modes were assessed. Shear bond strength was statistically analysed by two- and one-way ANOVAs and Weibull statistics, and failure mode by the chi-squared test (p ≤ 0.05). Shear bond strength was highest for silica coating/silanization > aluminium oxide air abrasion = mechanical roughening > no surface pretreatment. Independently of the repair pretreatment, the highest bond strength values were observed in the control group and for the composite-based Everest C-Temp and Ambarino high-class, while the PMMA-based materials (artBloc Temp, CAD-Temp and Telio CAD) showed significantly lower values. For all materials, repair without any surface pretreatment resulted in adhesive failures only, which were mostly reduced when surface pretreatment was performed. Repair of CAD/CAM high-density polymers requires surface pretreatment prior to adhesive and composite application. However, four of the six tested CAD/CAM materials did not achieve the repair bond strength of a conventional dimethacrylate-based composite. 
Repair of PMMA- and composite-based polymers can be achieved by surface pretreatment followed by application of an adhesive and a conventional methacrylate-based composite.

  19. Axial displacement of external and internal implant-abutment connection evaluated by linear mixed model analysis.

    PubMed

    Seol, Hyon-Woo; Heo, Seong-Joo; Koak, Jai-Young; Kim, Seong-Kyun; Kim, Shin-Koo

    2015-01-01

    To analyze the axial displacement of external and internal implant-abutment connections after cyclic loading, three groups were prepared: external abutments (Ext group), an internal tapered one-piece-type abutment (Int-1 group), and an internal tapered two-piece-type abutment (Int-2 group). Cyclic loading was applied to the implant-abutment assemblies at 150 N with a frequency of 3 Hz. The amount of axial displacement, the Periotest values (PTVs), and the removal torque values (RTVs) were measured. Both a repeated-measures analysis of variance and pattern analysis based on a linear mixed model were used for statistical analysis, and scanning electron microscopy (SEM) was used to evaluate the surface of the implant-abutment connection. The mean axial displacements after 1,000,000 cycles were 0.6 μm in the Ext group, 3.7 μm in the Int-1 group, and 9.0 μm in the Int-2 group. Pattern analysis revealed a breakpoint at 171 cycles: the Ext group showed no declining pattern at all, the Int-1 group showed no declining pattern after the breakpoint, and the Int-2 group experienced continuous axial displacement. After cyclic loading, the PTV decreased in the Int-2 group, and the RTV decreased in all groups. SEM imaging revealed surface wear in all groups. Axial displacement and surface wear occurred in all groups; the PTVs remained stable, but the RTVs decreased after cyclic loading. Based on the linear mixed model analysis, the axial displacements of the Ext and Int-1 groups plateaued after very little cyclic loading, whereas the rate of axial displacement of the Int-2 group slowed only after 100,000 cycles.

  20. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
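
    The dependence of sensitivity rankings on parameter correlations can be illustrated with a small derivative-based sketch. The Langmuir competitive-adsorption coverage below is standard, but all parameter values, uncertainties, and the correlation coefficient are hypothetical, and the correlated index used here (the response to a one-sigma move of one parameter together with the implied co-movement of the other) is one simple construction, not the paper's estimator:

```python
def coverage(KA, KB, pA=1.0, pB=1.0):
    """Langmuir competitive adsorption: fractional coverage of species A."""
    return KA * pA / (1.0 + KA * pA + KB * pB)

def grad(f, theta, h=1e-6):
    """Central finite-difference gradient of f at theta."""
    g = []
    for i in range(len(theta)):
        up = list(theta); up[i] += h
        dn = list(theta); dn[i] -= h
        g.append((f(*up) - f(*dn)) / (2 * h))
    return g

theta = [2.0, 1.0]                       # (KA, KB), hypothetical equilibrium constants
sigma = [0.4, 0.3]                       # hypothetical one-sigma uncertainties
rho = 0.8                                # strong positive correlation, e.g. shared DFT error
cov = [[sigma[0] ** 2, rho * sigma[0] * sigma[1]],
       [rho * sigma[0] * sigma[1], sigma[1] ** 2]]

g = grad(coverage, theta)
# Uncorrelated local index: |df/dK_i| * sigma_i.
s_ind = [abs(g[i]) * sigma[i] for i in range(2)]
# Correlated index: a one-sigma move of K_i drags the other parameter along.
s_cor = [abs(sum(g[j] * cov[i][j] for j in range(2))) / sigma[i] for i in range(2)]
```

    On these invented numbers the uncorrelated indices rank KA as clearly more influential; with the strong positive correlation included, both indices shrink sharply because the two derivatives have opposite signs, illustrating why a ranking that ignores correlations can mislead.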

  1. Assessing the cleanliness of surfaces: Innovative molecular approaches vs. standard spore assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.; Duc, M.T. La; Probst, A.

    2011-04-01

    A bacterial spore assay and a molecular DNA microarray method were compared for their ability to assess relative cleanliness in the context of bacterial abundance and diversity on spacecraft surfaces. Colony counts derived from the NASA standard spore assay were extremely low for spacecraft surfaces. However, the PhyloChip generation 3 (G3) DNA microarray resolved the genetic signatures of a highly diverse suite of microorganisms in the very same sample set. Samples completely devoid of cultivable spores were shown to harbor the DNA of more than 100 distinct microbial phylotypes. Furthermore, samples with higher numbers of cultivable spores did not necessarily give rise to a greater microbial diversity upon analysis with the DNA microarray. The findings of this study clearly demonstrated that there is not a statistically significant correlation between the cultivable spore counts obtained from a sample and the degree of bacterial diversity present. Based on these results, it can be stated that validated state-of-the-art molecular techniques, such as DNA microarrays, can be utilized in parallel with classical culture-based methods to further describe the cleanliness of spacecraft surfaces.

  2. Investigation of priorities in water quality management based on correlations and variations.

    PubMed

    Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal

    2013-04-15

    The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subjected to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced from the variations and correlations in water quality data sets can be helpful in setting priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Vision-based localization of the center of mass of large space debris via statistical shape analysis

    NASA Astrophysics Data System (ADS)

    Biondi, G.; Mauro, S.; Pastorelli, S.

    2017-08-01

    The current overpopulation of artificial objects orbiting the Earth has increased the interest of the space agencies in planning missions for de-orbiting the largest inoperative satellites. Since operations of this kind involve the capture of the debris, accurate knowledge of the position of its center of mass is a fundamental safety requirement. As ground observations are not sufficient to reach the required accuracy level, this information should be acquired in situ just before any contact between the chaser and the target. Some estimation methods in the literature rely on the use of stereo cameras for tracking several features of the target surface. The actual positions of these features are estimated together with the location of the center of mass by state observers. The principal drawback of these methods is the possible sudden disappearance of one or more features from the field of view of the cameras. An alternative method based on 3D kinematic registration is presented in this paper. The method, which does not suffer from the mentioned drawback, uses statistical shape analysis to preliminarily reduce the inaccuracies in detecting features.

  4. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal of the research conducted with the support of this grant was to develop design oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since response to changes in geometry is essential in conceptual design of airplanes, as is the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.

  5. Quantifying the influences of various ecological factors on land surface temperature of urban forests.

    PubMed

    Ren, Yin; Deng, Lu-Ying; Zuo, Shu-Di; Song, Xiao-Dong; Liao, Yi-Lan; Xu, Cheng-Dong; Chen, Qi; Hua, Li-Zhong; Li, Zheng-Wei

    2016-09-01

    Identifying factors that influence the land surface temperature (LST) of urban forests can help improve simulations and predictions of spatial patterns of urban cool islands. This requires a quantitative analytical method that combines spatial statistical analysis with multi-source observational data. The purpose of this study was to reveal how human activities and ecological factors jointly influence LST in clustering regions (hot or cool spots) of urban forests. Using Xiamen City, China, from 1996 to 2006 as a case study, we explored the interactions between human activities and ecological factors, as well as their influences on urban forest LST. Population density was selected as a proxy for human activity. We integrated multi-source data (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) to develop a database on a unified urban scale. The driving mechanism of urban forest LST was revealed through a combination of multi-source spatial data and spatial statistical analysis of clustering regions. The results showed that the main factors contributing to urban forest LST were dominant tree species and elevation. The interactions between human activity and specific ecological factors linearly or nonlinearly increased LST in urban forests. Strong interactions between elevation and dominant species were generally observed and were prevalent in either hot or cold spot areas in different years. In conclusion, quantitative studies based on spatial statistics and GeogDetector models should be conducted in urban areas to reveal the interactions between human activities, ecological factors, and LST. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Prediction of phenotypes of missense mutations in human proteins from biological assemblies.

    PubMed

    Wei, Qiong; Xu, Qifang; Dunbrack, Roland L

    2013-02-01

    Single nucleotide polymorphisms (SNPs) are the most frequent variation in the human genome. Nonsynonymous SNPs that lead to missense mutations can be neutral or deleterious, and several computational methods have been presented that predict the phenotype of human missense mutations. These methods use sequence-based and structure-based features in various combinations, relying on different statistical distributions of these features for deleterious and neutral mutations. One structure-based feature that has not been extensively studied is the accessible surface area within biologically relevant oligomeric assemblies. These assemblies are different from the crystallographic asymmetric unit for more than half of X-ray crystal structures. We find that mutations in the core of proteins or in the interfaces in biological assemblies are significantly more likely to be disease-associated than those on the surface of the biological assemblies. For structures with more than one protein in the biological assembly (whether the same sequence or different), we find the accessible surface area from biological assemblies provides a statistically significant improvement in prediction over the accessible surface area of monomers from protein crystal structures (P = 6e-5). When adding this information to sequence-based features such as the difference between wild-type and mutant position-specific profile scores, the improvement from biological assemblies is statistically significant but much smaller (P = 0.018). Combining this information with sequence-based features in a support vector machine leads to 82% accuracy on a balanced dataset of 50% disease-associated mutations from SwissVar and 50% neutral mutations from human/primate sequence differences in orthologous proteins. Copyright © 2012 Wiley Periodicals, Inc.
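
    A toy version of the classification step can make the feature combination concrete. The sketch below trains a plain logistic-regression classifier by batch gradient descent (not the paper's support vector machine) on synthetic data in which deleterious mutations tend to have a low accessible surface area in the assembly and a large wild-type-minus-mutant profile score; all numbers are invented for illustration:

```python
import math, random

random.seed(0)

def make_sample(deleterious):
    """Toy features: rsa = relative accessible surface area in the biological
    assembly (low for buried/interface positions), dscore = wild-type minus
    mutant profile score (high at conserved positions). Purely synthetic."""
    if deleterious:
        return [random.uniform(0.0, 0.4), random.uniform(1.0, 3.0)], 1
    return [random.uniform(0.3, 1.0), random.uniform(-1.0, 1.5)], 0

data = [make_sample(i % 2 == 0) for i in range(400)]  # balanced, as in the paper

w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(300):                      # plain batch gradient descent
    gw, gb = [0.0, 0.0], 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
        e = p - y
        gw[0] += e * x[0]; gw[1] += e * x[1]; gb += e
    n = len(data)
    w[0] -= lr * gw[0] / n; w[1] -= lr * gw[1] / n; b -= lr * gb / n

correct = sum(
    ((1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))) > 0.5) == (y == 1)
    for x, y in data)
accuracy = correct / len(data)
```

    A support vector machine would replace the logistic loss with a hinge loss and margin maximization; the point here is only how a structural feature and a sequence feature combine into one decision boundary, with the learned weight on surface area coming out negative and the weight on the profile-score difference positive.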

  7. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been realized.
It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
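
    As a concrete example of the most widely used interpolator named above, here is a minimal inverse distance weighted (IDW) estimator; the sample coordinates and Pb concentrations are hypothetical:

```python
def idw(known, query, power=2.0):
    """Inverse distance weighted estimate at `query` from (x, y, value) points."""
    num = den = 0.0
    for x, y, v in known:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v                     # query coincides with a sample point
        w = 1.0 / d2 ** (power / 2.0)    # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Hypothetical topsoil Pb concentrations (mg/kg) at sampled locations (km).
samples = [(0.0, 0.0, 35.0), (1.0, 0.0, 50.0), (0.0, 1.0, 42.0), (1.0, 1.0, 120.0)]
est = idw(samples, (0.5, 0.5))
```

    Ordinary kriging, the other interpolator named in the review, would instead derive the weights from a fitted variogram and would also provide an estimation variance at each location.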

  8. Influence of Structural Features and Fracture Processes on Surface Roughness: A Case Study from the Krosno Sandstones of the Górka-Mucharz Quarry (Little Beskids, Southern Poland)

    NASA Astrophysics Data System (ADS)

    Pieczara, Łukasz

    2015-09-01

    The paper presents the results of analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and the directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as to assess the usefulness of contact profilometry as a method of visualizing the spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy; point load strength tests (cleaved surfaces were obtained through the destruction of rock samples); microscopic analysis (observation of thin sections in order to determine the mechanism of inducing fracture processes); and a test method of measuring surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of the roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).

  9. Analysis of Climatic and Environmental Changes Using CLEARS Web-GIS Information-Computational System: Siberia Case Study

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.

    2011-12-01

    Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using a specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, a computational kernel, a specialized web portal implementing the web mapping application logic, and a graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets are available for processing, including two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 and ERA Interim Reanalyses, meteorological observation data for the territory of the former USSR, and others. First, using the approved statistical methods of the computational kernel, it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of the 20th and beginning of the 21st centuries are provided by the ERA-40/ERA Interim Reanalyses and the APHRODITE JMA Reanalysis, respectively. Namely, those reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the basis of the developed information-computational system reveals fine spatial and temporal details in the heterogeneous patterns obtained for the region earlier. The dynamics of bioclimatic indices determining the impact of climate change on the structure and functioning of regional vegetation cover was investigated as well. 
Analysis shows significant positive trends of growing season length accompanied by statistically significant increase of sum of growing degree days and total annual precipitation over the south of Western Siberia. In particular, we conclude that analysis of trends of growing season length, sum of growing degree-days and total precipitation during the growing season reveals a tendency to an increase of vegetation ecosystems productivity across the south of Western Siberia (55°-60°N, 59°-84°E) in the past several decades. The developed system functionality providing instruments for comparison of modeling and observational data and for reliable climatological analysis allowed us to obtain new results characterizing regional manifestations of global change. It should be added that each analysis performed using the system leads also to generation of the archive of spatio-temporal data fields ready for subsequent usage by other specialists. In particular, the archive of bioclimatic indices obtained will allow performing further detailed studies of interrelations between local climate and vegetation cover changes, including changes of carbon uptake related to variations of types and amount of vegetation and spatial shift of vegetation zones. This work is partially supported by RFBR grants #10-07-00547 and #11-05-01190-a, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7.

  10. Dental Composite Restorations and Neuropsychological Development in Children: Treatment Level Analysis from a Randomized Clinical Trial

    PubMed Central

    Maserejian, Nancy N.; Trachtenberg, Felicia L.; Hauser, Russ; McKinlay, Sonja; Shrader, Peter; Bellinger, David C.

    2012-01-01

    Background Resin-based dental restorations may intra-orally release their components and bisphenol A. Gestational bisphenol A exposure has been associated with poorer executive functioning in children. Objectives To examine whether exposure to resin-based composite restorations is associated with neuropsychological development in children. Methods Secondary analysis of treatment level data from the New England Children’s Amalgam Trial, a 2-group randomized safety trial conducted from 1997–2006. Children (N=534) aged 6–10 y with >2 posterior tooth caries were randomized to treatment with amalgam or resin-based composites (bisphenol-A-diglycidyl-dimethacrylate-composite for permanent teeth; urethane dimethacrylate-based polyacid-modified compomer for primary teeth). Neuropsychological function at 4- and 5-year follow-up (N=444) was measured by a battery of tests of executive function, intelligence, memory, visual-spatial skills, verbal fluency, and problem-solving. Multivariable generalized linear regression models were used to examine the association between composite exposure levels and changes in neuropsychological test scores from baseline to follow-up. For comparison, data on children randomized to amalgam treatment were similarly analyzed. Results With greater exposure to either dental composite material, results were generally consistent in the direction of slightly poorer changes in tests of intelligence, achievement or memory, but there were no statistically significant associations. For the four primary measures of executive function, scores were slightly worse with greater total composite exposure, but statistically significant only for the test of Letter Fluency (10-surface-years β= −0.8, SE=0.4, P=0.035), and the subtest of color naming (β= −1.5, SE=0.5, P=0.004) in the Stroop Color-Word Interference Test. 
Multivariate analysis of variance confirmed that the negative associations between composite level and executive function were not statistically significant (MANOVA P=0.18). Results for greater amalgam exposure were mostly nonsignificant in the opposite direction of slightly improved scores over follow-up. Conclusions Dental composite restorations had statistically insignificant associations of small magnitude with impairments in neuropsychological test change scores over 4- or 5-years of follow-up in this trial. PMID:22906860

  11. Predicting Cell Association of Surface-Modified Nanoparticles Using Protein Corona Structure - Activity Relationships (PCSAR).

    PubMed

    Kamath, Padmaja; Fernandez, Alberto; Giralt, Francesc; Rallo, Robert

    2015-01-01

    Nanoparticles are likely to interact in real-case application scenarios with mixtures of proteins and biomolecules that will adsorb onto their surface, forming the so-called protein corona. Information related to the composition of the protein corona and net cell association was collected from the literature for a library of surface-modified gold and silver nanoparticles. For each protein in the corona, sequence information was extracted and used to calculate physicochemical properties and statistical descriptors. Data cleaning and preprocessing techniques including statistical analysis and feature selection methods were applied to remove highly correlated, redundant and non-significant features. A weighting technique was applied to construct specific signatures that represent the corona composition for each nanoparticle. Using this basic set of protein descriptors, a new Protein Corona Structure-Activity Relationship (PCSAR) that relates net cell association with the physicochemical descriptors of the proteins that form the corona was developed and validated. The features that resulted from the feature selection were in line with already published literature, and the computational model constructed on these features had a good accuracy (R²(LOO) = 0.76 and R²(LMO, 25%) = 0.72) and stability, with the advantage that the fingerprints based on physicochemical descriptors were independent of the specific proteins that form the corona.

  12. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    NASA Astrophysics Data System (ADS)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

    The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate the local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming an isotropic distribution of such flame segments, we convolve suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at the corresponding area-ratio pdfs. Both pdfs are found to be nearly log-normally distributed and show self-similar behavior with increasing radius. Near log-normality and the rather intermittent behavior of the flame-length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.
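
    The closure property being exploited here (products of log-normal length ratios giving a log-normal area ratio) can be checked numerically. The log-mean and log-standard-deviation below are hypothetical; the sketch simply verifies that the product of two independent log-normal draws has log-mean 2μ and log-sd √2·σ:

```python
import math, random, statistics

random.seed(1)
mu, sigma = 0.05, 0.12                 # hypothetical log-mean / log-sd of the length ratio

lengths = [random.lognormvariate(mu, sigma) for _ in range(20000)]
# Area ratio modeled as the product of two independent length ratios
# (isotropy assumption): log A = log L1 + log L2.
areas = [l1 * l2 for l1, l2 in zip(lengths[::2], lengths[1::2])]

log_a = [math.log(a) for a in areas]
mu_a, sd_a = statistics.fmean(log_a), statistics.stdev(log_a)
# Log-normality of the product: log-mean -> 2*mu, log-sd -> sqrt(2)*sigma.
```

    This closure under multiplication is why near log-normal length-ratio pdfs combine (by convolution in log space) into near log-normal area-ratio pdfs.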

  13. Statistical modeling of an integrated boiler for coal fired thermal power plant.

    PubMed

    Chandrasekharan, Sreepradha; Panda, Rames Chandra; Swaminathan, Bhuvaneswari Natrajan

    2017-06-01

    Coal-fired thermal power plants play a major role in power production worldwide, as coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%. Newer plants, however, are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of this work is to enhance the efficiency of the existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical modeling of the boiler units, namely the economizer, drum and superheater, is initially carried out. The effectiveness of the developed models is tested using analysis methods like R² analysis and ANOVA (Analysis of Variance). The dependence of the process variable (temperature) on different manipulated variables is analyzed in the paper. Validation of the models is provided along with error analysis. Response surface methodology (RSM) supported by DOE (design of experiments) is implemented to optimize the operating parameters. Individual models along with the integrated model are used to study and design the predictive control of the coal-fired thermal power plant.
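
    The RSM/DOE step can be illustrated with a one-factor quadratic response surface. The sketch below fits y = b0 + b1·x + b2·x² by solving the normal equations directly and reads off the stationary point; the excess-air ratios and efficiency values are invented, not the paper's data:

```python
def fit_quadratic(xs, ys):
    """Least-squares y = b0 + b1*x + b2*x^2 via the 3x3 normal equations."""
    s = [sum(x ** k for x in xs) for k in range(5)]        # power sums of x
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = t[:]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]; b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                    # back substitution
        coef[i] = (b[i] - sum(A[i][c] * coef[c] for c in range(i + 1, 3))) / A[i][i]
    return coef

# Hypothetical DOE runs: excess-air ratio vs boiler efficiency (%).
x = [1.05, 1.10, 1.15, 1.20, 1.25, 1.30]
y = [32.1, 33.0, 33.4, 33.5, 33.2, 32.6]
b0, b1, b2 = fit_quadratic(x, y)
x_opt = -b1 / (2 * b2)                 # stationary point of the fitted surface
```

    With a negative quadratic coefficient the fitted surface is concave, so the stationary point is the RSM estimate of the optimal operating setting.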

  14. Sequential analysis of hydrochemical data for watershed characterization.

    PubMed

    Thyne, Geoffrey; Güler, Cüneyt; Poeter, Eileen

    2004-01-01

    A methodology for characterizing the hydrogeology of watersheds using hydrochemical data, combining statistical, geochemical, and spatial techniques, is presented. Surface water and ground water base flow and spring runoff samples (180 total) from a single watershed are first classified using hierarchical cluster analysis. The statistical clusters are analyzed for spatial coherence, confirming that the clusters have a geological basis corresponding to topographic flowpaths and showing that the fractured rock aquifer behaves as an equivalent porous medium on the watershed scale. Then principal component analysis (PCA) is used to determine the sources of variation between parameters. PCA shows that the variations within the dataset are related to variations in Ca, Mg, SO4, and HCO3, which are derived from natural weathering reactions, and pH, NO3, and Cl, which indicate anthropogenic impact. PHREEQC modeling is used to quantitatively describe the natural hydrochemical evolution for the watershed and to aid in the discrimination of samples that have an anthropogenic component. Finally, the seasonal changes in the water chemistry of individual sites were analyzed to better characterize the spatial variability of vertical hydraulic conductivity. The integrated result provides a method to characterize the hydrogeology of the watershed that fully utilizes traditional data.
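
    The first step of this workflow, grouping samples before looking for spatial coherence, can be sketched with a tiny agglomerative clustering routine. This is hierarchical clustering on made-up two-parameter water samples; the real study clustered 180 samples on many parameters:

```python
def single_linkage(points, n_clusters):
    """Plain agglomerative clustering (single linkage, squared Euclidean)."""
    clusters = [[i] for i in range(len(points))]

    def d2(a, b):
        return sum((pa - pb) ** 2 for pa, pb in zip(a, b))

    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):          # find the closest pair of clusters
            for j in range(i + 1, len(clusters)):
                d = min(d2(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]              # merge and continue
        del clusters[j]
    return clusters

# Hypothetical samples: (Ca + Mg, NO3) in meq/L; dilute runoff waters and
# evolved/impacted base-flow waters should separate into two groups.
samples = [(0.5, 0.1), (0.6, 0.2), (0.7, 0.1),
           (2.5, 1.2), (2.8, 1.0), (3.0, 1.4)]
groups = single_linkage(samples, 2)
```

    A real hydrochemical workflow would standardize the parameters first and typically use Ward's linkage; single linkage is used here only to keep the sketch short.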

  15. COBRA ATD minefield detection model initial performance analysis

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.

  16. Influence of different staining beverages on color stability, surface roughness and microhardness of silorane and methacrylate-based composite resins.

    PubMed

    Karaman, Emel; Tuncer, Duygu; Firat, Esra; Ozdemir, Oguz Suleyman; Karahan, Sevilay

    2014-05-01

    To investigate the influence of different staining beverages on the color stability, surface roughness and microhardness of silorane- and methacrylate-based composite resins. Three different composite resins (Filtek Silorane, Filtek P60, Filtek Supreme XT) were tested. Thirty cylindrical specimens (10 × 2 mm) per material were prepared and polished with a series of aluminum-oxide polishing disks. Each group was then randomly subdivided into three groups according to the test beverages: distilled water (control), cola and coffee. The samples were immersed in the different beverages for 15 days. Color, surface roughness and microhardness values were measured with a spectrophotometer, a profilometer and a Vickers hardness device, respectively, at baseline and after 15 days. The data were subjected to statistical analysis. Immersion in coffee resulted in significant discoloration of all the composites tested, although the color change was lower in Filtek Silorane than in the methacrylate-based composites (MBCs) (p < 0.05). All the composites tested showed similar surface roughness changes after immersion in the different beverages (p > 0.05), although coffee caused a greater roughness change than the other beverages. Immersion in coffee caused the highest microhardness change in Filtek Supreme XT (p < 0.05). Cola and coffee altered, to some degree, the color, surface roughness and/or microhardness of the tested resin composites, depending on the characteristics of the materials.

  17. Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.

    PubMed

    Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin

    2017-08-01

    The objective of this study was to perform full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials, and QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC); the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion values for the materials (DC, %) were immediately measured using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM). The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition, the data were statistically analyzed using one-way analysis of variance and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p < 0.05. According to the ISO 4049 standards, all the tested materials showed acceptable water sorption and solubility, and a halogen light source was an option for polymerizing bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Local Versus Remote Contributions of Soil Moisture to Near-Surface Temperature Variability

    NASA Technical Reports Server (NTRS)

    Koster, R.; Schubert, S.; Wang, H.; Chang, Y.

    2018-01-01

    Soil moisture variations have a straightforward impact on overlying air temperatures: wetter soils can induce greater evaporative cooling of the soil and thus, locally, cooler temperatures overall. Not known, however, is the degree to which soil moisture variations can affect remote air temperatures through their impact on the atmospheric circulation. In this talk we describe a two-pronged analysis that addresses this question. In the first segment, an extensive ensemble of NASA/GSFC GEOS-5 atmospheric model simulations is analyzed statistically to isolate and quantify the contributions of various soil moisture states, both local and remote, to the variability of air temperature at a given local site. In the second segment, the relevance of the derived statistical relationships is evaluated by applying them to observations-based data. Results from the second segment suggest that the GEOS-5-based relationships do, at least to first order, hold in nature and thus may provide some skill to forecasts of air temperature at subseasonal time scales, at least in certain regions.

  19. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the large-scale remote sensing image is divided into small sub-blocks, and the 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and a clear difference in characteristics between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on this 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm can detect ship targets with a high recall rate and a low miss rate.
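    The block-wise windowed spectral step can be sketched as follows. This is an illustrative stand-in, not the paper's algorithm: it splits an image into sub-blocks, applies a Hann-windowed 2-D FFT (a 2-D STFT) at sliding window positions inside each block, and averages the magnitude spectra; all sizes and the synthetic "ship" anomaly are assumptions for demonstration.

```python
import numpy as np

def block_stft2d(image, block=32, win_size=8, step=4):
    """Windowed 2-D FFT (a 2-D STFT) over each sub-block of an image.

    Returns, per sub-block, the mean magnitude spectrum across all
    window positions -- a crude stand-in for the per-frequency-point
    sea-background statistics described in the abstract (sketch only).
    """
    h, w = image.shape
    win = np.outer(np.hanning(win_size), np.hanning(win_size))
    spectra = {}
    for bi in range(0, h - block + 1, block):
        for bj in range(0, w - block + 1, block):
            sub = image[bi:bi + block, bj:bj + block]
            mags = []
            for i in range(0, block - win_size + 1, step):
                for j in range(0, block - win_size + 1, step):
                    patch = sub[i:i + win_size, j:j + win_size] * win
                    mags.append(np.abs(np.fft.fft2(patch)))
            spectra[(bi, bj)] = np.mean(mags, axis=0)
    return spectra

# A flat "sea" block vs. a block containing a bright ship-like anomaly:
rng = np.random.default_rng(0)
sea = rng.normal(0.0, 1.0, (32, 64))
sea[10:14, 40:46] += 20.0  # synthetic ship target in the second block
spec = block_stft2d(sea)
```

A detector would then flag sub-blocks whose spectra deviate from the fitted sea-background statistics; here the block containing the target shows a much larger low-frequency magnitude.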

  20. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in the laser shock-loaded dynamic fragmentation process. In the shock experiments, the ejecta of the tin sample, which has a V-shaped groove etched in its free surface, are collected by a soft recovery technique. The produced fragments are then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison proves that the proposed model provides a far more reasonable fit for the laser shock-loaded tin.

  1. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    USDA-ARS?s Scientific Manuscript database

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  2. Lateral ventricle morphology analysis via mean latitude axis.

    PubMed

    Paniagua, Beatriz; Lyall, Amanda; Berger, Jean-Baptiste; Vachet, Clement; Hamer, Robert M; Woolson, Sandra; Lin, Weili; Gilmore, John; Styner, Martin

    2013-03-29

    Statistical shape analysis has emerged as an insightful method for evaluating brain structures in neuroimaging studies; however, most shape frameworks are surface-based and thus directly depend on the quality of surface alignment. In contrast, medial descriptions employ thickness information as an alignment-independent shape metric. We propose a joint framework that computes local medial thickness information via a mean latitude axis from the well-known spherical harmonic (SPHARM-PDM) shape framework. In this work, we applied SPHARM-derived medial representations to the morphological analysis of lateral ventricles in neonates. Mild ventriculomegaly (MVM) subjects are compared to healthy controls to highlight the potential of the methodology. Lateral ventricles were obtained from MRI scans of neonates (9-144 days of age) from 30 MVM subjects as well as age- and sex-matched normal controls (60 total). SPHARM-PDM shape analysis was extended to compute a mean latitude axis directly from the spherical parameterization. Local thickness and area were straightforwardly determined. MVM and healthy controls were compared using local MANOVA, and the results were compared with the traditional SPHARM-PDM analysis. Both the surface and mean latitude axis findings successfully differentiate MVM from healthy lateral ventricle morphology. Lateral ventricles in MVM neonates show enlarged shapes in the tail and head. The mean latitude axis is able to find significant differences all along the lateral ventricle shape, demonstrating that local thickness analysis provides significant insight over traditional SPHARM-PDM. This study is the first to precisely quantify 3D lateral ventricle morphology in MVM neonates using shape analysis.

  3. High-precision surface analysis of the roughness of Michelangelo's David

    NASA Astrophysics Data System (ADS)

    Fontana, Raffaella; Gambino, Maria Chiara; Greco, Marinella; Marras, Luciano; Materazzi, Marzia; Pampaloni, Enrico; Pezzati, Luca

    2003-10-01

    The knowledge of the shape of an artwork is an important element for its study and conservation. When dealing with a statue, roughness measurement is a very useful contribution to documenting its surface condition, to assessing changes due to restoration interventions or surface decay due to wearing agents, and to monitoring its time-evolution in terms of shape variations. In this work we present the preliminary results of the statistical analysis carried out on data acquired from six areas of Michelangelo's David marble statue, representative of differently degraded surfaces. Determination of the roughness and its characteristic wavelength is shown.

  4. Translucency of zirconia-based pressable ceramics with different core and veneer thicknesses.

    PubMed

    Jeong, Il-Do; Bae, So-Yeon; Kim, Dong-Yeon; Kim, Ji-Hwan; Kim, Woong-Chul

    2016-06-01

    Little information is available on the translucency of zirconia-based pressable ceramic restorations with a pressed ceramic veneer and zirconia core in various thickness combinations. The purpose of this in vitro study was to assess the translucency of 3 types of zirconia-based pressable ceramics for different core-veneer thickness combinations. A bilayered ceramic specimen was prepared with a pressable ceramic (IPS e.max Zirpress, Initial IQ, Rosetta UltraPress) veneer over a zirconia core (Zenostar Zr). Three groups of specimens (n=7) were formed with the following core + veneer thicknesses: 1.0 + 0.5 mm, 0.7 + 0.8 mm, and 0.5 + 1.0 mm. To obtain consistent thickness and high translucency, all specimens were subjected to surface grinding with a grinding machine. To eliminate the effect of differences in roughness on the translucency, the surface roughness of the ground specimens was measured with a scanning profiler, and the consistency of these measured values was verified through statistical analysis. The luminous transmittance of the specimens was measured with a spectrophotometer. The effects of the pressable ceramic type and core-veneer thickness combination on transmittance were assessed using a 2-way ANOVA (α=.05). The consistency of the surface roughness among the tested specimens was confirmed using a 1-way ANOVA and the Tukey HSD post hoc test (P<.05). The luminous transmittance exhibited a statistically significant dependence on both the type of pressable ceramic and the core-veneer thickness combination (P<.05). The type of pressable ceramic and core-veneer thickness combination affected the translucency of the restoration. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  5. The Influence of Roughness on Gear Surface Fatigue

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy

    2005-01-01

    Gear working surfaces are subjected to repeated rolling and sliding contacts, and designs often require loads sufficient to cause eventual fatigue of the surface. This research provides experimental data and analytical tools to further the understanding of the causal relationship of gear surface roughness to surface fatigue. The research included evaluation and development of statistical tools for gear fatigue data, experimental evaluation of the surface fatigue lives of superfinished gears with a near-mirror finish, and evaluation of the experiments by analytical methods and surface inspections. Alternative statistical methods were evaluated using Monte Carlo studies, leading to a final recommendation to describe gear fatigue data using a Weibull distribution, maximum likelihood estimates of the shape and scale parameters, and a presumed zero-valued location parameter. A new method was developed for comparing two datasets by extending current likelihood-ratio-based statistics. The surface fatigue lives of superfinished gears were evaluated in carefully controlled experiments, and it is shown conclusively that superfinishing of gears can provide significantly greater lives relative to ground gears. The measured life improvement was approximately a factor of five. To assist with application of this finding to products, the experimental condition was evaluated. The fatigue life results were expressed in terms of specific film thickness and shown to be consistent with bearing data. Elastohydrodynamic and stress analyses were completed to relate the stress condition to fatigue. Smooth-surface models do not adequately explain the improved fatigue lives. Based on analyses using a rough-surface model, it is concluded that the improved fatigue lives of superfinished gears are due to a reduced rate of near-surface micropitting fatigue processes, not to any reduced rate of spalling (sub-surface) fatigue processes. To complete the evaluations, surface inspections were performed. The surface topographies of the ground gears changed substantially with running, but the topographies of the superfinished gears were essentially unchanged.
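    The recommended fitting procedure (Weibull distribution, maximum-likelihood shape and scale, location fixed at zero) can be sketched with scipy. The fatigue-life values below are synthetic placeholders, not the study's data; the B10-life formula follows from the Weibull inverse CDF.

```python
import numpy as np
from scipy import stats

# Synthetic gear fatigue lives (load cycles), for illustration only:
rng = np.random.default_rng(0)
lives = stats.weibull_min.rvs(2.5, loc=0, scale=4.0e7, size=40,
                              random_state=rng)

# Two-parameter Weibull MLE: location presumed zero (floc=0):
shape, loc, scale = stats.weibull_min.fit(lives, floc=0)

# The scale parameter is the characteristic life (63.2nd percentile);
# the B10 life (10% failure probability) comes from the inverse CDF:
b10 = scale * (-np.log(0.9)) ** (1.0 / shape)
```

Two datasets (e.g. ground vs. superfinished gears) could then be compared with a likelihood-ratio statistic on the fitted parameters, as the abstract describes.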

  6. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  7. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.

  8. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Cabalín, L. M.; González, A.; Ruiz, J.; Laserna, J. J.

    2010-08-01

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m/s. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.
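    The precision metric quoted above, the relative standard deviation, is simple to compute. A minimal sketch with hypothetical line-intensity readings (arbitrary units, invented for illustration) for the two sample conditions:

```python
import numpy as np

def rsd(values):
    """Relative standard deviation (%), the repeatability/
    reproducibility measure used to compare sample conditions."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical LIBS line intensities for two sample types:
flat_clean = [1020, 1005, 998, 1012, 1008]       # tight spread
irregular_dirty = [1020, 650, 1340, 890, 1190]   # large scatter
```

On these made-up readings the flat, clean samples give an RSD below 1%, while the irregular, dirty samples exceed 25%, mirroring the qualitative finding.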

  9. Wavelet Transform Based Higher Order Statistical Analysis of Wind and Wave Time Histories

    NASA Astrophysics Data System (ADS)

    Habib Huseni, Gulamhusenwala; Balaji, Ramakrishnan

    2017-10-01

    Wind, blowing over the surface of the ocean, imparts the energy to generate waves. Understanding wind-wave interactions is essential for an oceanographer. This study involves higher-order spectral analyses of time histories of wind speed and significant wave height, extracted from the European Centre for Medium-Range Weather Forecasts database at an offshore location off the Mumbai coast, through the continuous wavelet transform. The time histories were divided by season (pre-monsoon, monsoon, post-monsoon and winter) and the analyses were carried out on the individual data sets to assess the effect of the various seasons on the wind-wave interactions. The analysis revealed the frequency coupling of wind speeds and wave heights in the various seasons. The details of the data, the analysis technique and the results are presented in this paper.

  10. Analysis of European ozone trends in the period 1995-2014

    NASA Astrophysics Data System (ADS)

    Yan, Yingying; Pozzer, Andrea; Ojha, Narendra; Lin, Jintai; Lelieveld, Jos

    2018-04-01

    Surface-based measurements from the EMEP and Airbase networks are used to estimate the changes in surface ozone levels during the 1995-2014 period over Europe. We find significant ozone enhancements (0.20-0.59 µg m-3 yr-1 for the annual means; P-value < 0.01 according to an F-test) over the European suburban and urban stations during 1995-2012 based on the Airbase sites. For European background ozone observed at EMEP sites, it is shown that a significantly decreasing trend in the 95th percentile ozone concentrations has occurred, especially at noon (0.9 µg m-3 yr-1; P-value < 0.01), while the 5th percentile ozone concentrations continued to increase with a trend of 0.3 µg m-3 yr-1 (P-value < 0.01) during the study period. With the help of numerical simulations performed with the global chemistry-climate model EMAC, the importance of anthropogenic emission changes in determining these trends over background sites is investigated. The EMAC model is found to successfully capture the observed temporal variability in mean ozone concentrations, as well as the contrast in the trends of 95th and 5th percentile ozone over Europe. Sensitivity simulations and statistical analysis show that a decrease in European anthropogenic emissions had contrasting effects on surface ozone trends between the 95th and 5th percentile levels and that background ozone levels have been influenced by hemispheric transport, while climate variability generally regulated the inter-annual variations of surface ozone in Europe.
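    The percentile-trend statistics quoted above can be sketched generically: compute an annual percentile series and regress it on the year. The data below are synthetic (a distribution that narrows over time, so the 95th percentile falls while the 5th rises), invented purely to exercise the calculation.

```python
import numpy as np
from scipy import stats

def percentile_trend(years, values_by_year, q):
    """Linear trend (units per year) in the q-th percentile of the
    annual value distribution, with its p-value. Illustrative sketch."""
    annual = np.array([np.percentile(v, q) for v in values_by_year])
    fit = stats.linregress(years, annual)
    return fit.slope, fit.pvalue

# Synthetic ozone-like daily series, 1995-2014, with shrinking spread:
rng = np.random.default_rng(0)
years = np.arange(1995, 2015)
values_by_year = [rng.normal(50.0, 20.0 - 0.5 * i, 365)
                  for i in range(len(years))]

slope95, p95 = percentile_trend(years, values_by_year, 95)
slope5, p5 = percentile_trend(years, values_by_year, 5)
```

On this synthetic input the 95th-percentile slope is negative and the 5th-percentile slope positive, both highly significant, matching the qualitative pattern described for the EMEP sites.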

  11. Chitosan based grey wastewater treatment--a statistical design approach.

    PubMed

    Thirugnanasambandham, K; Sivakumar, V; Prakash Maran, J; Kandasamy, S

    2014-01-01

    In this study, grey wastewater was treated under different operating conditions, namely agitation time (1-3 min), pH (2.5-5.5), chitosan dose (0.3-0.6 g/l) and settling time (10-20 min), using response surface methodology (RSM). A four-factor, three-level Box-Behnken response surface design (BBD) was employed to optimize and investigate the effect of the process variables on the responses: turbidity, BOD and COD removal. The results were analyzed by Pareto analysis of variance (ANOVA), and second-order polynomial models were developed to predict the responses. Under the optimum conditions, the experimental values for turbidity (96%), BOD (91%) and COD (73%) removal agreed closely with the predicted values. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology

    PubMed Central

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-01-01

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers. PMID:28793427
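    The second-order polynomial response-surface model used with a Box-Behnken design can be fitted by ordinary least squares. The sketch below is generic and uses a hypothetical two-factor example with coded levels and a known quadratic response; it is not the paper's data or design.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a full second-order polynomial response
    surface: intercept, linear, interaction, and squared terms.
    Coefficient order: [1, x1..xk, xi*xj (i<j), x1^2..xk^2]."""
    X = np.asarray(X, float)
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, A

# Hypothetical two-factor design (coded -1/0/+1), replicated 3 times,
# with a known quadratic response plus small noise:
rng = np.random.default_rng(0)
X = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)] * 3, float)
y = (5.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
     + 1.5 * X[:, 0] ** 2 + rng.normal(0.0, 0.05, len(X)))

beta, A = fit_quadratic_rsm(X, y)
```

The fitted coefficients recover the generating model; in practice one would then test residuals and optimize the fitted surface, as the abstract describes.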

  13. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    USGS Publications Warehouse

    Mladinich, C.

    2010-01-01

    Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.

  14. 3-D High-Lift Flow-Physics Experiment - Transition Measurements

    NASA Technical Reports Server (NTRS)

    McGinley, Catherine B.; Jenkins, Luther N.; Watson, Ralph D.; Bertelrud, Arild

    2005-01-01

    An analysis of the flow state on a trapezoidal wing model from the NASA 3-D High Lift Flow Physics Experiment is presented. The objective of the experiment was to characterize the flow over a non-proprietary semi-span three-element high-lift configuration to aid in assessing the state of the art in the computation of three-dimensional high-lift flows. Surface pressures and hot-film sensors are used to determine the flow conditions on the slat, main, and flap. The locations of the attachment lines and the values of the attachment-line Reynolds number are estimated based on the model surface pressures. Data from the hot-films are used to determine whether the flow is laminar, transitional, or turbulent by examining the hot-film time histories, statistics, and frequency spectra.

  15. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology.

    PubMed

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-07-07

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.

  16. Automatic recognition of surface landmarks of anatomical structures of back and posture

    NASA Astrophysics Data System (ADS)

    Michoński, Jakub; Glinkowski, Wojciech; Witkowski, Marcin; Sitnik, Robert

    2012-05-01

    Faulty postures, scoliosis and sagittal plane deformities should be detected as early as possible to apply preventive and treatment measures against major clinical consequences. To support documentation of the severity of deformity and diminish x-ray exposures, several solutions utilizing analysis of back surface topography data were introduced. A novel approach to automatic recognition and localization of anatomical landmarks of the human back is presented that may provide more repeatable results and speed up the whole procedure. The algorithm was designed as a two-step process involving a statistical model built upon expert knowledge and analysis of three-dimensional back surface shape data. A Voronoi diagram is used to connect mean geometric relations, which provide a first approximation of the positions, with the surface curvature distribution, which further guides the recognition process and gives the final locations of the landmarks. Positions obtained using the developed algorithm are validated against the accuracy of manual landmark indication by experts. Preliminary validation proved that the landmarks were localized correctly, with accuracy depending mostly on the characteristics of a given structure. It was concluded that recognition should mainly take into account the shape of the back surface, putting as little emphasis on the statistical approximation as possible.

  17. Liver segmentation from CT images using a sparse priori statistical shape model (SP-SSM).

    PubMed

    Wang, Xuehu; Zheng, Yongchang; Gan, Lan; Wang, Xuan; Sang, Xinting; Kong, Xiangfeng; Zhao, Jie

    2017-01-01

    This study proposes a new liver segmentation method based on a sparse a priori statistical shape model (SP-SSM). First, mark points are selected in the liver a priori model and the original image. Then, the a priori shape and its mark points are used to obtain a dictionary for the liver boundary information. Second, the sparse coefficient is calculated based on the correspondence between mark points in the original image and those in the a priori model, and then the sparse statistical model is established by combining the sparse coefficients and the dictionary. Finally, the intensity energy and boundary energy models are built based on the intensity information and the specific boundary information of the original image. Then, the sparse matching constraint model is established based on the sparse coding theory. These models jointly drive the iterative deformation of the sparse statistical model to approximate and accurately extract the liver boundaries. This method can solve the problems of deformation model initialization and a priori method accuracy using the sparse dictionary. The SP-SSM can achieve a mean overlap error of 4.8% and a mean volume difference of 1.8%, whereas the average symmetric surface distance and the root mean square symmetric surface distance can reach 0.8 mm and 1.4 mm, respectively.

  18. Cortical surface-based threshold-free cluster enhancement and cortexwise mediation.

    PubMed

    Lett, Tristram A; Waller, Lea; Tost, Heike; Veer, Ilya M; Nazeri, Arash; Erk, Susanne; Brandl, Eva J; Charlet, Katrin; Beck, Anne; Vollstädt-Klein, Sabine; Jorde, Anne; Kiefer, Falk; Heinz, Andreas; Meyer-Lindenberg, Andreas; Chakravarty, M Mallar; Walter, Henrik

    2017-06-01

    Threshold-free cluster enhancement (TFCE) is a sensitive means to incorporate spatial neighborhood information in neuroimaging studies without using arbitrary thresholds. The majority of methods have applied TFCE to voxelwise data. The need to understand the relationship among multiple variables and imaging modalities has become critical. We propose a new method of applying TFCE to vertexwise statistical images as well as cortexwise (either voxel- or vertexwise) mediation analysis. Here we present TFCE_mediation, a toolbox that can be used for cortexwise multiple regression analysis with TFCE, and additionally cortexwise mediation using TFCE. The toolbox is open source and publicly available (https://github.com/trislett/TFCE_mediation). We validated TFCE_mediation in healthy controls from two independent multimodal neuroimaging samples (N = 199 and N = 183). We found a consistent structure-function relationship between surface area and the first independent component (IC1) of the N-back task, that white matter fractional anisotropy is strongly associated with IC1 N-back, and that our voxel-based results are essentially identical to FSL randomise using TFCE (all P_FWE < 0.05). Using cortexwise mediation, we showed that the relationship between white matter FA and IC1 N-back is mediated by surface area in the right superior frontal cortex (P_FWE < 0.05). We also demonstrated that the same mediation model is present using vertexwise mediation (P_FWE < 0.05). In conclusion, cortexwise analysis with TFCE provides an effective analysis of multimodal neuroimaging data. Furthermore, cortexwise mediation analysis may identify or explain a mechanism that underlies an observed relationship among a predictor, intermediary, and dependent variables in which one of these variables is assessed at a whole-brain scale. Hum Brain Mapp 38:2795-2807, 2017. © 2017 Wiley Periodicals, Inc.
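    The TFCE transform itself is easy to illustrate in one dimension: each point's statistic is replaced by the integral over thresholds h of extent(h)^E * h^H, where extent(h) is the size of the supra-threshold cluster containing the point. A minimal 1-D sketch with the common E = 0.5, H = 2 defaults (not the toolbox's surface implementation):

```python
import numpy as np

def tfce_1d(stat, dh=0.1, E=0.5, H=2.0):
    """Threshold-free cluster enhancement of a 1-D statistic map:
    TFCE(p) = sum_h extent(p, h)**E * h**H * dh. Sketch only."""
    stat = np.asarray(stat, float)
    out = np.zeros_like(stat)
    for h in np.arange(dh, stat.max() + dh, dh):
        mask = (stat >= h).view(np.int8)
        # find [start, stop) runs of contiguous supra-threshold points
        edges = np.flatnonzero(np.diff(np.concatenate(([0], mask, [0]))))
        for start, stop in zip(edges[::2], edges[1::2]):
            out[start:stop] += (stop - start) ** E * h ** H * dh
    return out

# A broad 20-point cluster vs. an isolated peak of the same height:
stat = np.zeros(100)
stat[20:40] = 3.0
stat[70] = 3.0
enhanced = tfce_1d(stat)
```

The spatially extended cluster is enhanced by a factor of extent^E (here sqrt(20) ≈ 4.5) relative to the isolated point, which is the sense in which TFCE rewards neighborhood support without a fixed cluster-forming threshold.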

  19. A method of 2D/3D registration of a statistical mouse atlas with a planar X-ray projection and an optical photo.

    PubMed

    Wang, Hongkai; Stout, David B; Chatziioannou, Arion F

    2013-05-01

    The development of sophisticated and high throughput whole body small animal imaging technologies has created a need for improved image analysis and increased automation. The registration of a digital mouse atlas to individual images is a prerequisite for automated organ segmentation and uptake quantification. This paper presents a fully-automatic method for registering a statistical mouse atlas with individual subjects based on an anterior-posterior X-ray projection and a lateral optical photo of the mouse silhouette. The mouse atlas was trained as a statistical shape model based on 83 organ-segmented micro-CT images. For registration, a hierarchical approach is applied which first registers high contrast organs, and then estimates low contrast organs based on the registered high contrast organs. To register the high contrast organs, a 2D-registration-back-projection strategy is used that deforms the 3D atlas based on the 2D registrations of the atlas projections. For validation, this method was evaluated using 55 subjects of preclinical mouse studies. The results showed that this method can compensate for moderate variations of animal postures and organ anatomy. Two different metrics, the Dice coefficient and the average surface distance, were used to assess the registration accuracy of major organs. The Dice coefficients vary from 0.31 ± 0.16 for the spleen to 0.88 ± 0.03 for the whole body, and the average surface distance varies from 0.54 ± 0.06 mm for the lungs to 0.85 ± 0.10 mm for the skin. The method was compared with a direct 3D deformation optimization (without 2D-registration-back-projection) and a single-subject atlas registration (instead of using the statistical atlas). The comparison revealed that the 2D-registration-back-projection strategy significantly improved the registration accuracy, and the use of the statistical mouse atlas led to more plausible organ shapes than the single-subject atlas. 
This method was also tested with shoulder xenograft tumor-bearing mice, and the results showed that the registration accuracy of most organs was not significantly affected by the presence of shoulder tumors, except for the lungs and the spleen. Copyright © 2013 Elsevier B.V. All rights reserved.
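    The Dice coefficient used above to score organ overlap is defined as 2|A ∩ B| / (|A| + |B|) for two binary masks. A minimal sketch on small hypothetical masks:

```python
import numpy as np

def dice(a, b):
    """Dice overlap coefficient between two binary masks, the metric
    used above to assess atlas-to-subject registration accuracy."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Two hypothetical 6x6-voxel "organ" masks with partial overlap:
a = np.zeros((10, 10), bool); a[2:8, 2:8] = True    # 36 voxels
b = np.zeros((10, 10), bool); b[4:10, 4:10] = True  # 36 voxels
```

Here the masks share 16 voxels, giving a Dice score of 32/72 ≈ 0.44; identical masks score 1.0.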

  20. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. Statistical downscaling of IPCC sea surface wind and wind energy predictions for U.S. east coastal ocean, Gulf of Mexico and Caribbean Sea

    NASA Astrophysics Data System (ADS)

    Yao, Zhigang; Xue, Zuo; He, Ruoying; Bao, Xianwen; Song, Jun

    2016-08-01

    A multivariate statistical downscaling method is developed to produce regional, high-resolution, coastal surface wind fields based on IPCC global model predictions for the U.S. east coastal ocean, the Gulf of Mexico (GOM), and the Caribbean Sea. The statistical relationship is built upon linear regressions between the empirical orthogonal function (EOF) spaces of a cross-calibrated, multi-platform, multi-instrument ocean surface wind velocity dataset (predictand) and the global NCEP wind reanalysis (predictor) over a 10-year period from 2000 to 2009. The statistical relationship is validated before application, and its effectiveness is confirmed by the good agreement between wind fields downscaled from the NCEP reanalysis and in-situ surface wind measured at 16 National Data Buoy Center (NDBC) buoys in the U.S. east coastal ocean and the GOM during 1992-1999. The predictand-predictor relationship is then applied to IPCC GFDL model output (2.0°×2.5°) to produce downscaled coastal wind at 0.25°×0.25° resolution. The temporal and spatial variability of future predicted wind speeds and wind energy potential over the study region are further quantified. It is shown that wind speed and power would be significantly reduced in the high-CO2 climate scenario offshore of the mid-Atlantic and northeast U.S., with the speed falling to one quarter of its original value.
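    The EOF-space regression at the core of such downscaling can be sketched as follows. This is a simplified illustration on synthetic fields, not the paper's processing chain: both predictor (coarse grid) and predictand (fine grid) are projected onto their leading EOFs, and a linear map is fit between the principal-component time series.

```python
import numpy as np

def fit_eof_downscaling(coarse, fine, n_modes=3):
    """Fit the EOF-space regression: project both (time x space) fields
    onto their leading EOFs and regress fine-grid PCs on coarse-grid PCs."""
    cm, fm = coarse.mean(0), fine.mean(0)
    Ec = np.linalg.svd(coarse - cm, full_matrices=False)[2][:n_modes]  # coarse EOFs
    Ef = np.linalg.svd(fine - fm, full_matrices=False)[2][:n_modes]    # fine EOFs
    Pc = (coarse - cm) @ Ec.T                     # coarse PCs (time x modes)
    Pf = (fine - fm) @ Ef.T                       # fine PCs (time x modes)
    B = np.linalg.lstsq(Pc, Pf, rcond=None)[0]    # PC-to-PC linear regression
    return cm, fm, Ec, Ef, B

def downscale(coarse_new, model):
    """Project new coarse fields onto coarse EOFs, map PCs, rebuild fine field."""
    cm, fm, Ec, Ef, B = model
    return fm + ((coarse_new - cm) @ Ec.T) @ B @ Ef
```

    The same fitted model can then be driven by a different predictor (here, a climate model's coarse output) to produce the downscaled fields.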

  2. Influence of different resin cements and surface treatments on microshear bond strength of zirconia-based ceramics

    PubMed Central

    Petrauskas, Anderson; Novaes Olivieri, Karina Andrea; Pupo, Yasmine Mendes; Berger, Guilherme; Gonçalves Betiol, Ederson Áureo

    2018-01-01

    Aim: This study aims to evaluate the microshear bond strength of zirconia-based ceramics with different resin cement systems and surface treatments. Materials and Methods: Forty blocks of zirconia-based ceramic were prepared and embedded in polyvinyl chloride (PVC) tubes with acrylic resin. After polishing, the samples were washed in an ultrasonic bath and dried in an oven for 10 min. Half of the samples were subjected to sandblasting with aluminum oxide. Blocks were divided into four groups (n = 10) in which two resin cements were used as follows: (1) RelyX™ U100 with surface-polished zirconia; (2) RelyX™ U100 with surface-blasted zirconia; (3) Multilink with surface-polished zirconia; and (4) Multilink with surface-blasted zirconia. After these surface treatments, translucent tubes (n = 30 per group) were placed on the zirconia specimens, and resin cement was injected into them and light cured. The PVC tubes were adapted in a universal testing machine; a stiletto blade, bolted to the machine, was positioned at the cementation interface. The microshear test was performed at a speed of 0.5 mm/min. Failure mode was analyzed under an optical microscope and classified as adhesive, cohesive, or mixed. Results: The null hypothesis of this study was rejected because a difference was found between the resin cements and the surface treatments. There was a statistical difference (P < 0.005) for RelyX™ U100 with surface-blasted zirconia relative to the other three groups. For the Multilink groups, there was no statistical difference. Conclusion: The self-adhesive resin cement showed a greater tendency toward higher bond strength on grit-blasted zirconia surfaces. PMID:29674825

  3. Bacterial adhesion on conventional and self-ligating metallic brackets after surface treatment with plasma-polymerized hexamethyldisiloxane

    PubMed Central

    Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves

    2017-01-01

    ABSTRACT Introduction: Plasma-polymerized film deposition was developed to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Methods: Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the Plasma-Enhanced Chemical Vapor Deposition (PECVD) radio frequency technique. The samples were divided into two groups according to the kind of bracket and two subgroups after surface treatment. Scanning Electron Microscopy (SEM) analysis was performed to assess the presence of bacterial adhesion on the sample surfaces (slot and wing regions) and the film layer integrity. Surface roughness was assessed by Confocal Interferometry (CI) and surface wettability by goniometry. For bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. The values obtained for surface roughness were analyzed using the Mann-Whitney test, while biofilm adhesion was assessed by the Kruskal-Wallis and SNK tests. Results: Significant statistical differences (p < 0.05) for surface roughness and bacterial adhesion reduction were observed on conventional brackets after surface treatment and between conventional and self-ligating brackets; no significant statistical differences were observed between the self-ligating groups (p > 0.05). Conclusion: Plasma-polymerized film deposition was only effective in reducing surface roughness and bacterial adhesion on conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets despite the absence of film. PMID:28902253

  4. Noninvasive prostate cancer screening based on serum surface-enhanced Raman spectroscopy and support vector machine

    NASA Astrophysics Data System (ADS)

    Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao

    2014-09-01

    This study presents a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements are performed with silver nanoparticles using serum samples from 93 prostate cancer patients and 68 healthy volunteers. Three types of kernel functions, including linear, polynomial, and Gaussian radial basis function (RBF) kernels, are employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparably evaluate the performance of the SVM classification models, the standard multivariate statistical analysis method of principal component analysis (PCA) is also applied to classify the same datasets. The results show that the RBF-kernel SVM diagnostic model achieves a diagnostic accuracy of 98.1%, which is superior to the 91.3% obtained with the PCA method. The receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
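    Since the SERS spectra enter the classifier as plain feature vectors, the classification step can be illustrated with a small RBF-kernel model. As a dependency-free stand-in for a full SVM (which the study used), this sketch fits kernel regularized least squares with assumed hyperparameters `gamma` and `lam`; the kernel machinery is the same, only the loss differs.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class RBFKernelClassifier:
    """Kernel regularized least squares with an RBF kernel — a simple
    stand-in for an RBF-kernel SVM. Rows of X are spectra, y is +/-1."""
    def __init__(self, gamma=0.5, lam=1e-3):
        self.gamma, self.lam = gamma, lam

    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        # solve (K + lam*I) alpha = y for the dual coefficients
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self

    def predict(self, Xnew):
        return np.sign(rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha)
```

    In practice one would tune `gamma` and the regularization by cross-validation, exactly as kernel choice was compared in the study.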

  5. A Statistics-Based Material Property Analysis to Support TPS Characterization

    NASA Technical Reports Server (NTRS)

    Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.

    2012-01-01

    Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in the material properties used as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision of existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on the capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank the primary sources of material property uncertainty in a flight-relevant environment, show how those uncertainty contributions depend on spatial orientation and in-depth location, and quantify the sensitivity of the expected results.
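    The Monte Carlo propagation step can be illustrated on a toy conduction model: sample the uncertain material property, push each sample through the model, and summarize the spread of the in-depth temperature. Everything here (the formula `T = T_surf - q*L/k` and all numbers) is an illustrative stand-in for the high-fidelity ablation model, not the paper's simulation.

```python
import numpy as np

def in_depth_temperature(k, q=2e4, L=0.02, T_surf=1500.0):
    """Toy steady-state conduction: temperature at depth L (m) below a
    surface heated at flux q (W/m^2), for conductivity k (W/m/K).
    A stand-in for the high-fidelity ablation response model."""
    return T_surf - q * L / k

# sample the uncertain conductivity (illustrative lognormal distribution)
rng = np.random.default_rng(1)
k_samples = rng.lognormal(mean=np.log(0.5), sigma=0.1, size=10_000)

# propagate every sample through the model and summarize the spread
T = in_depth_temperature(k_samples)
lo, hi = np.percentile(T, [2.5, 97.5])  # 95% uncertainty band
```

    Replacing `in_depth_temperature` with a Kriging surrogate of the expensive model, as the paper does, leaves this propagation loop unchanged.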

  6. A review of downscaling procedures - a contribution to the research on climate change impacts at city scale

    NASA Astrophysics Data System (ADS)

    Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan

    2016-04-01

    Humankind is now predominantly urban-based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also impact hot spots. Furthermore, climate change impacts are commonly managed at city scale. Therefore, assessing climate change impacts on urban systems is a very relevant subject of research. Climate and its impacts on all levels (local, meso and global scale), as well as the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews of downscaling procedures cover the various methods according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology which may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique, yet in the context of spatial interpolation procedures it is commonly classified as a deterministic technique, while kriging approaches are classified as stochastic.
Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both could be seen as identical since they refer to methods that handle input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution has led to the introduction of new terms such as hybrid or semi-stochastic approaches, which makes the effort to systematically classify downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures which classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations. The methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces, which correspond to different realizations of the same processes. Those simulations are essential because real observational data are limited, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and the research of climate change impacts at city scale.

  7. Electromyographic analysis of the serratus anterior and trapezius muscles during push-ups on stable and unstable bases in subjects with scapular dyskinesis.

    PubMed

    Pirauá, André Luiz Torres; Pitangui, Ana Carolina Rodarti; Silva, Juliana Pereira; Pereira dos Passos, Muana Hiandra; Alves de Oliveira, Valéria Mayaly; Batista, Laísla da Silva Paixão; Cappato de Araújo, Rodrigo

    2014-10-01

    The present study was performed to assess the electromyographic activity of the scapular muscles during push-ups on stable and unstable surfaces in subjects with scapular dyskinesis. Muscle activation (upper trapezius [UT]; lower trapezius [LT]; upper serratus anterior [SA_5th]; lower serratus anterior [SA_7th]) and ratio (UT/LT; UT/SA_5th; UT/SA_7th) levels were determined by surface EMG in 30 asymptomatic men with scapular dyskinesis during push-ups performed on a stable and an unstable surface. Multivariate analysis of variance with repeated measures was used for statistical analyses. The unstable surface caused a decrease in the EMG activity of the serratus anterior and an increase in the EMG activity of the trapezius (p=0.001). UT/SA_5th and UT/SA_7th ratios were higher during unstable push-ups (p=0.001). The results suggest that, in individuals with scapular dyskinesis, there is increased EMG activity of the trapezius and decreased EMG activity of the serratus anterior in response to an unstable surface. These results suggest that performing the push-up exercise on an unstable surface may be more favorable for producing higher levels of trapezius activation and lower levels of serratus anterior activation. However, if the goal of the exercise program is strengthening of the SA muscle, it is suggested to perform the push-up on a stable surface. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Is the Maxillary Sinus Really Suitable in Sex Determination? A Three-Dimensional Analysis of Maxillary Sinus Volume and Surface Depending on Sex and Dentition.

    PubMed

    Möhlhenrich, Stephan Christian; Heussen, Nicole; Peters, Florian; Steiner, Timm; Hölzle, Frank; Modabber, Ali

    2015-11-01

    The morphometric analysis of the maxillary sinus was recently presented as a helpful instrument for sex determination. The aim of the present study was to examine the volume and surface of the fully dentate, partially edentulous, and completely edentulous maxillary sinus depending on sex. Computed tomography data from 276 patients were imported in DICOM format into special virtual planning software, and the surfaces (mm²) and volumes (mm³) of the maxillary sinuses were measured. In sex-specific comparisons (women vs men), statistically significant differences in the mean maxillary sinus volume and surface were found for fully dentate (volume, 13,267.77 mm³ vs 16,623.17 mm³, P < 0.0001; surface, 3480.05 mm² vs 4100.83 mm², P < 0.0001), partially edentulous (volume, 10,577.35 mm³ vs 14,608.10 mm³, P = 0.0002; surface, 2980.11 mm² vs 3797.42 mm², P < 0.0001), and completely edentulous sinuses (volume, 11,200.99 mm³ vs 15,382.29 mm³, P < 0.0001; surface, 3118.32 mm² vs 3877.25 mm², P < 0.0001). For males, statistically different mean values were calculated between fully dentate and partially edentulous (volume, P = 0.0022; surface, P = 0.0048) maxillary sinuses. Between the sexes, no differences were found only for partially edentulous sinuses (2 teeth missing) and between partially edentulous sinuses in women and men (1 tooth vs 2 teeth missing). With a corresponding software program, it is possible to analyze the maxillary sinus precisely. The dentition influences the volume and surface of the pneumatized maxillary sinus. Therefore, sex determination is possible by analysis of the maxillary sinus even with increased pneumatization.

  9. EFFECTS OF LASER RADIATION ON MATTER: Influence of fluctuations of the size and number of surface microdefects on the thresholds of laser plasma formation

    NASA Astrophysics Data System (ADS)

    Borets-Pervak, I. Yu; Vorob'ev, V. S.

    1990-08-01

    An analysis is made of the influence of the statistical scatter of the size of thermally insulated microdefects and of their number in the focusing spot on the threshold energies of plasma formation by microsecond laser pulses interacting with metal surfaces. The coordinates of the laser pulse intensity and the surface density of the laser energy are used in constructing plasma formation regions corresponding to different numbers of microdefects within the focusing spot area; the same coordinates are used to represent laser pulses. Various threshold and nonthreshold plasma formation mechanisms are discussed. The sizes of microdefects and their statistical characteristics deduced from limited experimental data provide a consistent description of the characteristics of plasma formation near polished and nonpolished surfaces.

  10. Deriving inertial wave characteristics from surface drifter velocities - Frequency variability in the tropical Pacific

    NASA Technical Reports Server (NTRS)

    Poulain, Pierre-Marie; Luther, Douglas S.; Patzert, William C.

    1992-01-01

    Two techniques were developed for estimating statistics of inertial oscillations from satellite-tracked drifters that overcome the difficulties inherent in estimating such statistics from data dependent upon space coordinates that are a function of time. Application of these techniques to tropical surface drifter data collected during the NORPAX, EPOCS, and TOGA programs reveals a latitude-dependent, statistically significant 'blue shift' of inertial wave frequency. The latitudinal dependence of the blue shift is similar to predictions based on 'global' internal-wave spectral models, with a superposition of frequency shifting due to modification of the effective local inertial frequency by the presence of strongly sheared zonal mean currents within 12 deg of the equator.

  11. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80, Tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using design of experiments - a one-factor response surface method. Results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP on MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
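    A one-factor response surface of the kind described, relating a CPP (rotation speed) to the CQA (particle size), can be sketched with a quadratic fit. The data points below are hypothetical illustrations, not the study's measurements.

```python
import numpy as np

# Hypothetical one-factor DoE data: homogenization speed (krpm)
# vs mean particle size (um); numbers are illustrative only.
speed_krpm = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
size_um = np.array([42.0, 30.5, 24.8, 21.9, 20.6])

# quadratic response-surface model fitted by least squares
coeffs = np.polyfit(speed_krpm, size_um, deg=2)
model = np.poly1d(coeffs)

# interpolate the response at an untested factor setting (9 krpm)
predicted = float(model(9.0))
```

    The fitted polynomial is the "mathematical model and equation" in miniature: once validated, it is interrogated (or optimized) instead of running new batches at every candidate setting.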

  12. From The Pierre Auger Observatory to AugerPrime

    NASA Astrophysics Data System (ADS)

    Parra, Alejandra; Martínez Bravo, Oscar; Pierre Auger Collaboration

    2017-06-01

    In the present work we report the principal motivations for the new stage of the Pierre Auger Observatory, AugerPrime. The principal goal of this upgrade is to clarify the origin of the highest energy cosmic rays through improved studies of the mass composition. To accomplish this goal, AugerPrime will use air shower universality, which states that extensive air showers can be completely described by three parameters: the primary energy E0, the atmospheric depth of shower maximum Xmax, and the number of muons Nμ. The Auger Collaboration has planned to complement its surface array (SD), based on water-Cherenkov detectors (WCD), with scintillator detectors, called the SSD (Scintillator Surface Detector). These will be placed on top of each WCD station. The SSD will allow a shower-by-shower analysis, instead of the statistical analysis that the Observatory has previously done, to determine the mass composition of the primary particle from the electromagnetic-to-muonic ratio.

  13. Cloud cover analysis with Arctic Advanced Very High Resolution Radiometer data. II - Classification with spectral and textural measures

    NASA Technical Reports Server (NTRS)

    Key, J.

    1990-01-01

    The spectral and textural characteristics of polar clouds and surfaces for a 7-day summer series of AVHRR data in two Arctic locations are examined, and the results used in the development of a cloud classification procedure for polar satellite data. Since spatial coherence and texture sensitivity tests indicate that a joint spectral-textural analysis based on the same cell size is inappropriate, cloud detection with AVHRR data and surface identification with passive microwave data are first done on the pixel level as described by Key and Barry (1989). Next, cloud patterns within 250-sq-km regions are described, then the spectral and local textural characteristics of cloud patterns in the image are determined and each cloud pixel is classified by statistical methods. Results indicate that both spectral and textural features can be utilized in the classification of cloudy pixels, although spectral features are most useful for the discrimination between cloud classes.

  14. A histomorphometric analysis of the effects of various surface treatment methods on osseointegration.

    PubMed

    Kim, Yeon-Hee; Koak, Jai-Young; Chang, Ik-Tae; Wennerberg, Ann; Heo, Seong-Joo

    2003-01-01

    One major factor in the success and biocompatibility of an implant is its surface properties. The purposes of this study were to analyze the surface characteristics of implants after blasting and thermal oxidation and to evaluate the bone response around these implants with histomorphometric analysis. Threaded implants (3.75 mm in diameter, 8.0 mm in length) were manufactured by machining commercially pure titanium (grade 2). A total of 48 implants were evaluated with histomorphometric methods and included in the statistical analyses. Two different groups of samples were prepared according to the following procedures: Group 1 samples were blasted with 50-microm aluminum oxide (Al2O3) particles, and group 2 samples were blasted with 50-microm Al2O3 and then thermally oxidized at 800 degrees C for 2 hours in a pure oxygen atmosphere. A noncontacting optical profilometer was used to measure the surface topography. The surface composition of the implants and the oxide thickness were investigated with Rutherford backscattering spectrometry. The different preparations produced implant surfaces with essentially similar chemical composition, but with different oxide thickness and roughness. The morphologic evaluation of the bone formation revealed that: (1) the percentage of bone-to-implant contact of the oxidized implants (33.3%) after 4 weeks was greater than that of the blasted group (23.1%); (2) the percentages of bone-to-implant contact after 12 weeks were not statistically significantly different between the groups; (3) the percentages of bone area inside the thread after 4 weeks and 12 weeks were not statistically significantly different between groups. This investigation demonstrated the possibility that different surface treatments, such as blasting and oxidation, have an effect on the ingrowth of bone into the thread. 
However, the clinical implications of surface treatments on implants, and the exact mechanisms by which the surface properties of the implant affect the process of osseointegration, remain subjects for further study.

  15. Shape and rotational elements of comet 67P/ Churyumov-Gerasimenko derived by stereo-photogrammetric analysis of OSIRIS NAC image data

    NASA Astrophysics Data System (ADS)

    Preusker, Frank; Scholten, Frank; Matz, Klaus-Dieter; Roatsch, Thomas; Willner, Konrad; Hviid, Stubbe; Knollenberg, Jörg; Kührt, Ekkehard; Sierks, Holger

    2015-04-01

    The European Space Agency's Rosetta spacecraft is equipped with the OSIRIS imaging system, which consists of a wide-angle and a narrow-angle camera (WAC and NAC). After the approach phase, Rosetta was inserted into a descent trajectory towards comet 67P/Churyumov-Gerasimenko (C-G) in early August 2014. Until early September, OSIRIS acquired several hundred NAC images of C-G's surface at different scales (from ~5 m/pixel during approach to ~0.9 m/pixel during descent). In that one-month observation period, the surface was imaged several times within different mapping sequences. With the comet's rotation period of ~12.4 h and the low spacecraft velocity (< 1 m/s), the entire NAC dataset provides multiple NAC stereo coverage, adequate for stereo-photogrammetric (SPG) analysis towards the derivation of 3D surface models. We constrained the OSIRIS NAC images with our stereo requirements (15° < stereo angles < 45°, incidence angles < 85°, emission angles < 45°, differences in illumination < 10°, scale better than 5 m/pixel) and extracted about 220 NAC images that provide at least triple stereo coverage of the entire illuminated surface in about 250 independent multi-stereo image combinations. For each image combination we determined tie points by multi-image matching in order to set up a 3D control network and a dense surface point cloud for the precise reconstruction of C-G's shape. The control point network defines the input for a stereo-photogrammetric least squares adjustment. Based on the statistical analysis of the adjustments, we first refined C-G's rotational state (pole orientation and rotation period) and its behavior over time. 
Based upon this description of the orientation of C-G's body-fixed reference frame, we derived corrections for the nominal navigation data (pointing and position) within a final stereo-photogrammetric block adjustment where the mean 3D point accuracy of more than 100 million surface points has been improved from ~10 m to the sub-meter range. We finally applied point filtering and interpolation techniques to these surface 3D points and show the resulting SPG-based 3D surface model with a lateral sampling rate of about 2 m.

  16. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
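    Two of the commonly cited summary-statistic approximations, the range/4 rule for a missing SD and a quartile-based estimate of a missing mean, are easy to state concretely. A minimal sketch (the review evaluates several variants, so treat these particular formulas as illustrative rather than as its recommended estimators):

```python
def sd_from_range(minimum, maximum):
    """Range/4 approximation for a missing standard deviation.
    A commonly cited practical rule for moderate sample sizes;
    range/6 is often suggested for very large samples."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """Approximate a missing mean as the average of the median and
    the lower and upper quartiles (one of several quartile-based rules)."""
    return (q1 + median + q3) / 3.0
```

    Either estimate can then be fed into a standard inverse-variance meta-analysis in place of the unreported summary, at the cost of some precision.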

  17. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20 km in diameter and larger significantly shaped the lunar landscape, and the statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (using simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baseline) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. 
The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was computed over multiple scales. This slope analysis showed that local slope distributions are non-Gaussian for both crater walls and floors. Over larger baselines (~100 meters), crater wall slope probability distributions approximate Gaussian distributions better, but have long distribution tails. Crater floor probability distributions, however, were always asymmetric (for the baseline scales analyzed) and less affected by baseline scale variations. Accordingly, our results suggest that long-tailed probability distributions (such as the Cauchy) and a baseline-dependent multi-scale model can be more effective in describing the slope statistics of lunar topography. References: [1] Moore, H. (1971), JGR, 75(11). [2] Marcus, A. H. (1969), JGR, 74(22). [3] Pike, R. J. (1970), U.S. Geological Survey Working Paper. [4] Costes, N. C., Farmer, J. E. and George, E. B. (1972), NASA Technical Report TR R-401. [5] Parker, M. N. and Tyler, G. L. (1973), Radio Science, 8(3), 177-184. [6] Alekseev, V. A. et al. (1968), Soviet Astronomy, 11, 860. [7] Burns et al. (2012), Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B4, 483-488. [8] Smith et al. (2010), GRL, 37, L18204, doi:10.1029/2010GL043751. [9] Wagner, R., Robinson, M., Speyerer, E. and Mahanti, P. (2013), LPSC, #2924.
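The Gaussian-versus-long-tailed comparison above can be illustrated with a small sketch. The data below are synthetic heavy-tailed "slopes" (a scaled Student-t draw), not LROC measurements, and the comparison is a simple maximum-likelihood fit, not the authors' pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic slope sample with heavy tails (Student-t, df=2),
# standing in for small-baseline slope estimates.
slopes = 10.0 * rng.standard_t(df=2, size=5000)

def loglik(dist, data):
    """Total log-likelihood of `data` under the ML-fitted distribution."""
    params = dist.fit(data)
    return np.sum(dist.logpdf(data, *params))

ll_gauss = loglik(stats.norm, slopes)
ll_cauchy = loglik(stats.cauchy, slopes)
print(ll_cauchy > ll_gauss)  # the long-tailed Cauchy fits better
```

Under heavy tails the Gaussian fit inflates its standard deviation to accommodate outliers and fits the distribution body poorly, which is why a Cauchy (or other long-tailed) model wins on likelihood.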

  18. Design of a nanoplatform for treating pancreatic cancer

    NASA Astrophysics Data System (ADS)

    Manawadu, Harshi Chathurangi

Pancreatic cancer is the fourth leading cause of cancer-related deaths in the USA. Asymptomatic early cancer stages and late diagnosis lead to very low survival rates for pancreatic cancers compared to other cancers. Treatment options for advanced pancreatic cancer are limited to chemotherapy and/or radiation therapy, as surgical removal of the cancerous tissue becomes impossible at later stages. Therefore, there is a critical need for innovative and improved chemotherapeutic treatment of (late) pancreatic cancers. It is mandatory for successful treatment strategies to overcome the drug resistance associated with pancreatic cancers. Nanotechnology-based drug formulations have been providing promising alternatives in cancer treatment due to their selective targeting and accumulation in tumor vasculature, which can be used for efficient delivery of chemotherapeutic agents to tumors and metastases. The research in my thesis follows the principal approach to high therapeutic efficacy first described by Dr. Helmut Ringsdorf in 1975. However, I have extended the use of the Ringsdorf model from polymeric to nanoparticle-based drug carriers by exploring an iron / iron oxide nanoparticle-based drug delivery system. A series of drug delivery systems has been synthesized by varying the total numbers and the ratio of the tumor-homing peptide sequence CGKRK and the chemotherapeutic drug doxorubicin at the surfaces of Fe/Fe3O4 nanoparticles. The cytotoxicity of these nanoformulations was tested against murine pancreatic cancer cell lines (Pan02) to assess their therapeutic capabilities for effective treatment of pancreatic cancers. Healthy mouse fibroblast cells (STO) were also tested for comparison, because an effective chemotherapeutic drug has to be selective towards cancer cells. Optimal Experimental Design methodology was applied to identify the nanoformulation with the highest therapeutic activity.
A statistical analysis method known as response surface methodology was carried out to evaluate the in-vitro cytotoxicity data, and to determine whether the chosen experimental parameters truly express the optimized conditions of the nanoparticle based drug delivery system. The overall goal was to optimize the therapeutic efficacy in nanoparticle-based pancreatic cancer treatment. Based on the statistical data, the most effective iron/iron oxide nanoparticle-based drug delivery system has been identified. Its Fe/Fe3O4 core has a diameter of 20 nm. The surface of this nanoparticle is loaded with the homing sequence CGKRK (139-142 peptide molecules per nanoparticle surface) and the chemotherapeutic agent doxorubicin (156-159 molecules per surface), This nanoplatform is a promising candidate for the nanoparticle-based chemotherapy of pancreatic cancer.
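Response surface methodology, as used above, fits a second-order polynomial to the measured response and locates its stationary point. A minimal sketch, with entirely hypothetical factor values and responses (coded units, not the thesis data):

```python
import numpy as np

# Hypothetical two-factor experiment (e.g. peptide and drug loading, coded
# units) with a quadratic response peaking near (0.3, -0.2); illustrative only.
rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 40)
x2 = rng.uniform(-1, 1, 40)
y = 80 - 5*(x1 - 0.3)**2 - 7*(x2 + 0.2)**2 + rng.normal(0, 0.5, 40)

# Second-order response surface:
#   y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point (candidate optimum): set the gradient to zero,
#   [[2*b11, b12], [b12, 2*b22]] @ [x1*, x2*] = -[b1, b2]
H = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print(opt)  # near the true optimum (0.3, -0.2)
```

Whether the stationary point is a maximum is checked via the eigenvalues of `H` (both negative here, since the fitted surface is concave).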

  19. Surface pretreatment of plastics with an atmospheric pressure plasma jet - Influence of generator power and kinematics

    NASA Astrophysics Data System (ADS)

    Moritzer, E.; Leister, C.

    2014-05-01

The industrial use of atmospheric pressure plasmas in the plastics processing industry has increased significantly in recent years. Users of this treatment process can influence the target values (e.g. bond strength or surface energy) through kinematic and electrical parameters. Until now, the parameters have been adapted to process or product requirements by systematic but very time-consuming procedures. For this reason, the relationship between influencing values and target values is examined here for the example of a pretreatment in the bonding process, with the help of statistical experimental design. Because of the large number of parameters involved, the analysis is restricted to the kinematic and electrical parameters. In the experimental tests, the following factors are taken as parameters: the gap between nozzle and substrate, treatment velocity (kinematic data), voltage and duty cycle (electrical data). The statistical evaluation shows significant relationships between the parameters and surface energy in the case of polypropylene. An increase in the voltage and duty cycle increases the polar proportion of the surface energy, while a larger gap and higher velocity lead to lower energy levels. The bond strength of the overlapping bond is also significantly influenced by the voltage, velocity and gap. The direction of these effects is identical to that for the surface energy. In addition to the kinematic influences of the motion of an atmospheric pressure plasma jet, it is therefore especially important that the parameters for plasma production are taken into account when designing pretreatment processes.

  20. Quantitative metrics for assessment of chemical image quality and spatial resolution

    DOE PAGES

    Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.

    2016-02-28

Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
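The abstract does not give the ChemIC formula, but a signal-to-noise based contrast metric of the kind it describes can be sketched generically. The function and the toy image below are illustrative stand-ins, not the published definition:

```python
import numpy as np

def snr_contrast(image, feature_mask):
    """Generic signal-to-noise contrast between feature and background
    pixels; a stand-in for the (unspecified) ChemIC formula."""
    signal = image[feature_mask]
    background = image[~feature_mask]
    return (signal.mean() - background.mean()) / background.std(ddof=1)

# Toy "chemical image": a bright 4x4 feature on a noisy background.
rng = np.random.default_rng(2)
img = rng.normal(10.0, 1.0, (16, 16))
mask = np.zeros((16, 16), dtype=bool)
mask[6:10, 6:10] = True
img[mask] += 20.0

print(snr_contrast(img, mask))  # high contrast for a clearly resolved feature
```

Such a per-image scalar lets different acquisition settings (scan speed, lane spacing) be ranked on the same standardized surface.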

  2. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression were implemented online on top of the database software, using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates interface tables that can be exported directly to R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements statistical analysis online and can provide real-time analysis results to its users.
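Server-side summary tables of the kind this system generates can be produced directly in SQL. A minimal sketch using SQLite in place of the (unnamed) production database; the table and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for the monitoring system's backend.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pm25 (city TEXT, ug_m3 REAL)")
con.executemany("INSERT INTO pm25 VALUES (?, ?)",
                [("A", 35.0), ("A", 55.0), ("B", 20.0), ("B", 30.0)])

# Summary statistics computed server-side with aggregate SQL, as an
# "online" statistical function would do before rendering tables.
rows = con.execute(
    "SELECT city, COUNT(*), AVG(ug_m3), MIN(ug_m3), MAX(ug_m3) "
    "FROM pm25 GROUP BY city ORDER BY city").fetchall()
print(rows)  # [('A', 2, 45.0, 35.0, 55.0), ('B', 2, 25.0, 20.0, 30.0)]
```

The same result sets can then be serialized to CSV or other interchange formats for loading into R, SAS or SPSS.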

  3. Statistical Analyses of Brain Surfaces Using Gaussian Random Fields on 2-D Manifolds

    PubMed Central

    Staib, Lawrence H.; Xu, Dongrong; Zhu, Hongtu; Peterson, Bradley S.

    2008-01-01

    Interest in the morphometric analysis of the brain and its subregions has recently intensified because growth or degeneration of the brain in health or illness affects not only the volume but also the shape of cortical and subcortical brain regions, and new image processing techniques permit detection of small and highly localized perturbations in shape or localized volume, with remarkable precision. An appropriate statistical representation of the shape of a brain region is essential, however, for detecting, localizing, and interpreting variability in its surface contour and for identifying differences in volume of the underlying tissue that produce that variability across individuals and groups of individuals. Our statistical representation of the shape of a brain region is defined by a reference region for that region and by a Gaussian random field (GRF) that is defined across the entire surface of the region. We first select a reference region from a set of segmented brain images of healthy individuals. The GRF is then estimated as the signed Euclidean distances between points on the surface of the reference region and the corresponding points on the corresponding region in images of brains that have been coregistered to the reference. Correspondences between points on these surfaces are defined through deformations of each region of a brain into the coordinate space of the reference region using the principles of fluid dynamics. The warped, coregistered region of each subject is then unwarped into its native space, simultaneously bringing into that space the map of corresponding points that was established when the surfaces of the subject and reference regions were tightly coregistered. The proposed statistical description of the shape of surface contours makes no assumptions, other than smoothness, about the shape of the region or its GRF. 
The description also allows for the detection and localization of statistically significant differences in the shapes of the surfaces across groups of subjects at both a fine and coarse scale. We demonstrate the effectiveness of these statistical methods by applying them to study differences in shape of the amygdala and hippocampus in a large sample of normal subjects and in subjects with attention deficit/hyperactivity disorder (ADHD). PMID:17243583

  4. Pattern of structural brain changes in social anxiety disorder after cognitive behavioral group therapy: a longitudinal multimodal MRI study.

    PubMed

    Steiger, V R; Brühl, A B; Weidt, S; Delsignore, A; Rufer, M; Jäncke, L; Herwig, U; Hänggi, J

    2017-08-01

Social anxiety disorder (SAD) is characterized by fears of social and performance situations. Cognitive behavioral group therapy (CBGT) has, in general, positive effects on symptoms, distress and avoidance in SAD. Prior studies found increased cortical volumes and decreased fractional anisotropy (FA) in SAD compared with healthy controls (HCs). Thirty-three participants diagnosed with SAD attended a 10-week CBGT and were scanned before and after therapy. We applied three neuroimaging methods, surface-based morphometry, diffusion tensor imaging and network-based statistics, each with specific longitudinal processing protocols, to investigate CBGT-induced structural brain alterations of the gray and white matter (WM). Surface-based morphometry revealed a significant cortical volume reduction (pre- to post-treatment) in the left inferior parietal cortex, as well as a positive partial correlation between treatment success (indexed by reductions in the Liebowitz Social Anxiety Scale) and reductions in cortical volume in bilateral dorsomedial prefrontal cortex. Diffusion tensor imaging analysis revealed a significant increase in FA in bilateral uncinate fasciculus and right inferior longitudinal fasciculus. Network-based statistics revealed a significant increase of structural connectivity in a frontolimbic network. No partial correlations with treatment success were found in the WM analyses. For what we believe is the first time, we present a distinctive pattern of longitudinal structural brain changes after CBGT measured with three established magnetic resonance imaging analysis techniques. Our findings are in line with previous cross-sectional, unimodal SAD studies and extend them by highlighting anatomical brain alterations that shift toward the level of HCs in parallel with a reduction in SAD symptomatology.

  5. Effects of silicate weathering on water chemistry in forested, upland, felsic terrane of the USA

    NASA Astrophysics Data System (ADS)

    Stauffer, Robert E.; Wittchen, Bruce D.

    1991-11-01

We use data from the US EPA National Surface Water Survey (NSWS), the USGS Bench-Mark Station monitoring program, and the National Acid Deposition Program (NADP) to evaluate the role of weathering in supplying base cations to surface waters in forested, upland, felsic terrane of the northeastern, northcentral, and northwestern (Idaho batholith) United States. Multivariate regression reveals differential effects of discharge on individual base cations and silica, but no secular trend in the Ca/Na denudation rate over 24 yr (1965-1988) for the Wild River catchment in the White Mountains. Because the turn-over time for Na in the soil-exchange complex is only ca. 1.5 yr, the long-term behavior of the ratios Ca/Na and Si/Na in waters leaving this catchment indicates that weathering is compensating for base cation export. In every subregion, Ca and Mg concentrations in lakes are statistically linked to nonmarine Na, but the median Ca/Na ratio is greater than the ratio in local plagioclase. We attribute this inequality to nonstoichiometric weathering of calcium in juvenile (formerly glaciated) terrane, not to leaching of exchangeable cations by SO4, because intraregional and cross-regional statistical analysis reveals no effect of atmospherically derived sulfate ion. The median base cation denudation rates (meq m^-2 yr^-1) for these American lake regions are: Maine granites (108); western Adirondack felsic gneiss (85); Vermilion batholith (42); Idaho batholith (52). The regional rates are high enough to compensate for present wet deposition of acidifying anions except in some vulnerable lake watersheds in the western Adirondacks.

  6. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
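Among the metamodeling techniques the paper reviews, kriging can be sketched compactly. Below, a cheap analytic function stands in for an expensive deterministic analysis code, and a zero-nugget Gaussian-process interpolant with a squared-exponential kernel serves as the surrogate (an illustrative sketch, not a production kriging implementation):

```python
import numpy as np

# "Expensive" deterministic analysis code, stood in by a cheap function.
def expensive_code(x):
    return np.sin(3 * x) + x

# Training runs at a handful of design points (design of experiments).
X = np.linspace(0, 2, 8)
y = expensive_code(X)

# Kriging-style surrogate: exact interpolation with a squared-exponential
# kernel; the tiny diagonal jitter only stabilizes the linear solve.
def kernel(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

K = kernel(X, X) + 1e-10 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def surrogate(xq):
    return kernel(np.atleast_1d(xq), X) @ alpha

# The surrogate is orders of magnitude cheaper yet closely tracks the code.
xq = np.linspace(0, 2, 50)
err = np.max(np.abs(surrogate(xq) - expensive_code(xq)))
print(err)
```

Once fitted, such a surrogate can be handed to an optimizer in place of the original code, which is exactly the metamodeling use case the paper describes.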

  7. Evapotranspiration variability and its association with vegetation dynamics in the Nile Basin, 2002–2011

    USGS Publications Warehouse

    Alemu, Henok; Senay, Gabriel B.; Kaptue, Armel T.; Kovalskyy, Valeriy

    2014-01-01

Evapotranspiration (ET) is a vital component in land-atmosphere interactions. In drylands, over 90% of annual rainfall evaporates. The Nile Basin in Africa is about 42% dryland, in a region experiencing rapid population growth and development. The relationship of ET with climate, vegetation and land cover in the basin during 2002–2011 is analyzed using thermal-based Simplified Surface Energy Balance Operational (SSEBop) ET, Normalized Difference Vegetation Index (NDVI)-based MODIS Terrestrial (MOD16) ET, MODIS-derived NDVI as a proxy for vegetation productivity, and rainfall from the Tropical Rainfall Measuring Mission (TRMM). Interannual variability and trends are analyzed using established statistical methods. Analysis based on thermal-based ET revealed that >50% of the study area exhibited negative ET anomalies for 7 years (2009, driest), while >60% exhibited positive ET anomalies for 3 years (2007, wettest). NDVI-based monthly ET correlated more strongly with vegetation (r > 0.77) than thermal-based ET did (0.52 < r < 0.73) at p < 0.001. Climate-zone averaged thermal-based ET anomalies positively correlated (r = 0.6, p < 0.05) with rainfall in 4 of the 9 investigated climate zones. Thermal-based and NDVI-based ET estimates revealed minor discrepancies over rainfed croplands (60 mm/yr higher for thermal-based ET), but a significant divergence over wetlands (440 mm/yr higher for thermal-based ET). Only 5% of the study area exhibited statistically significant trends in ET.

  8. Using MERRA, AMIP II, CMIP5 Outputs to Assess Actual and Potential Building Climate Zone Change and Variability From the Last 30 Years Through 2100

    NASA Astrophysics Data System (ADS)

    Stackhouse, P. W.; Westberg, D. J.; Hoell, J. M., Jr.; Chandler, W.; Zhang, T.

    2014-12-01

In the US, residential and commercial building infrastructure combined consumes about 40% of total energy usage and emits about 39% of total CO2 emissions (DOE/EIA "Annual Energy Outlook 2013"). Thus, increasing the energy efficiency of buildings is paramount to reducing energy costs and emissions. Building codes, as used by local and state enforcement entities, are typically tied to the dominant climate within an enforcement jurisdiction, classified according to various climate zones. These climate zones are based upon a 30-year average of local surface observations and are developed by DOE and ASHRAE (formerly the American Society of Heating, Refrigerating and Air-Conditioning Engineers). A significant shortcoming of the methodology used in constructing such maps is the use of surface observations (located mainly near airports) that are unequally distributed and frequently have periods of missing data that need to be filled by various approximation schemes. This paper demonstrates the usefulness of using NASA's Modern Era Retrospective-analysis for Research and Applications (MERRA) atmospheric data assimilation to derive the ASHRAE climate zone maps and then using MERRA to define the last 30 years of variability in climate zones. These results show that there is a statistically significant increase in the area covered by warmer climate zones and some tendency for a reduction of area in colder climate zones that requires longer time series to confirm. Using the uncertainties of the basic surface temperature and precipitation parameters from MERRA as determined by comparison to surface measurements, we first compare patterns and variability of ASHRAE climate zones from MERRA relative to present day climate model runs from AMIP simulations to establish baseline sensitivity.
Based upon these results, we assess the variability of the ASHRAE climate zones according to CMIP runs through 2100 using an ensemble analysis that classifies model output changes by percentiles. Estimates of statistical significance are then compared to original model variability during the AMIP period. This work quantifies and tests for significance the changes seen in the various US regions that represent a potential contribution by NASA to the ongoing National Climate Assessment.

  9. Effect of accelerated ageing and surface sealing on the permanent deformation of auto-polymerising soft linings.

    PubMed

    da Silva, Joaquim; Takahashi, Jessica; Nuňez, Juliana; Consani, Rafael; Mesquita, Marcelo

    2012-09-01

    To compare the effects of different ageing methods on the permanent deformation of two permanent soft liners. The materials selected were auto-polymerising acrylic resin and silicone-based reliners. Sealer coating was also evaluated. Sixty specimens of each reliner were manufactured (12.7 mm diameter and 19 mm length). Specimens were randomly distributed into 12 groups (n = 10) and submitted to one of the accelerated ageing processes. Permanent deformation tests were conducted with a mechanical device described within the American Dental Association specification number 18 with a compressive load of 750 gf applied for 30 s. All data were submitted for statistical analysis. Mann-Whitney test compared the effect of the surface sealer on each material and the permanent deformation of the materials in the same ageing group (p = 0.05). Kruskal-Wallis and Dunn tests compared all ageing groups of each material (p = 0.05). The silicone-based reliner presented a lower permanent deformation than the acrylic resin-based reliner, regardless of the ageing procedure. The surface sealer coating was effective only for the thermocycled silicone group and the accelerated ageing processes affected only the permanent deformation of the acrylic resin-based material. The silicone-based reliner presented superior elastic properties and the thermocycling was more effective in ageing the materials. © 2010 The Gerodontology Society and John Wiley & Sons A/S.

  10. CO2 Accounting and Risk Analysis for CO2 Sequestration at Enhanced Oil Recovery Sites.

    PubMed

    Dai, Zhenxue; Viswanathan, Hari; Middleton, Richard; Pan, Feng; Ampomah, William; Yang, Changbing; Jia, Wei; Xiao, Ting; Lee, Si-Yong; McPherson, Brian; Balch, Robert; Grigg, Reid; White, Mark

    2016-07-19

    Using CO2 in enhanced oil recovery (CO2-EOR) is a promising technology for emissions management because CO2-EOR can dramatically reduce sequestration costs in the absence of emissions policies that include incentives for carbon capture and storage. This study develops a multiscale statistical framework to perform CO2 accounting and risk analysis in an EOR environment at the Farnsworth Unit (FWU), Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil/gas-water flow and transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major risk metrics: CO2/water injection/production rates, cumulative net CO2 storage, cumulative oil/gas productions, and CO2 breakthrough time. The median and confidence intervals are estimated for quantifying uncertainty ranges of the risk metrics. A response-surface-based economic model has been derived to calculate the CO2-EOR profitability for the FWU site with a current oil price, which suggests that approximately 31% of the 1000 realizations can be profitable. If government carbon-tax credits are available, or the oil price goes up or CO2 capture and operating expenses reduce, more realizations would be profitable. The results from this study provide valuable insights for understanding CO2 storage potential and the corresponding environmental and economic risks of commercial-scale CO2-sequestration in depleted reservoirs.
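The risk metrics above are summarized by medians and confidence intervals over an ensemble of Monte Carlo realizations, and profitability is the fraction of realizations clearing an economic threshold. A minimal sketch with a hypothetical ensemble (synthetic values, not FWU results):

```python
import numpy as np

# Hypothetical ensemble of one risk metric (e.g. cumulative net CO2 storage)
# from 1000 geostatistical Monte Carlo realizations; values are illustrative.
rng = np.random.default_rng(4)
metric = rng.lognormal(mean=2.0, sigma=0.4, size=1000)

median = np.median(metric)
lo, hi = np.percentile(metric, [2.5, 97.5])   # 95% confidence interval
# Share of realizations exceeding an assumed economic break-even level.
frac_profitable = np.mean(metric > 7.0)
print(median, (lo, hi), frac_profitable)
```

The same percentile machinery applies to each metric in the study (injection/production rates, breakthrough time, profitability), one ensemble at a time.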

  11. A novel framework for the local extraction of extra-axial cerebrospinal fluid from MR brain images

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Shen, Mark D.; Kim, SunHyung; Swanson, Meghan; Collins, D. Louis; Fonov, Vladimir; Gerig, Guido; Piven, Joseph; Styner, Martin A.

    2018-03-01

The quantification of cerebrospinal fluid (CSF) in the human brain has been shown to play an important role in early postnatal brain development. Extra-axial fluid (EA-CSF), which is characterized by the CSF in the subarachnoid space, is promising for the early detection of children at risk for neurodevelopmental disorders. Currently, though, there is no tool to extract local EA-CSF measurements in a way that is suitable for localized analysis. In this paper, we propose a novel framework for the localized, cortical surface-based analysis of EA-CSF. In our proposed processing, we combine probabilistic brain tissue segmentation, cortical surface reconstruction and streamline-based local EA-CSF quantification. For streamline computation, we employ the vector field generated by solving a Laplacian partial differential equation (PDE) between the cortical surface and the outer CSF hull. To achieve sub-voxel accuracy while minimizing numerical errors, fourth-order Runge-Kutta (RK4) integration was used to generate the streamlines. Finally, the local EA-CSF is computed by integrating the CSF probability along the generated streamlines. The proposed local EA-CSF extraction tool was used to study early postnatal brain development in typically developing infants. The results show that the proposed localized EA-CSF extraction pipeline can produce statistically significant regions that are not observed with previous global approaches.
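The RK4 streamline tracing step can be sketched in 2-D with an analytic Laplace field. Between two concentric circles (radii 1 and 2) the potential log(r) solves Laplace's equation, its normalized gradient points radially outward, and streamlines are straight radial segments of length 1. This analytic field stands in for the numerically solved PDE between the cortical surface and the CSF hull:

```python
import numpy as np

def direction(p):
    """Normalized gradient of phi = log(r): the unit radial vector."""
    return p / np.linalg.norm(p)

def rk4_streamline(p0, h=0.01, r_stop=2.0):
    """Trace a streamline with classical 4th-order Runge-Kutta steps
    until it reaches the outer boundary; return endpoint and arc length."""
    p, length = np.array(p0, float), 0.0
    while np.linalg.norm(p) < r_stop:
        k1 = direction(p)
        k2 = direction(p + 0.5 * h * k1)
        k3 = direction(p + 0.5 * h * k2)
        k4 = direction(p + h * k3)
        step = (h / 6.0) * (k1 + 2*k2 + 2*k3 + k4)
        p += step
        length += np.linalg.norm(step)
    return p, length

end, length = rk4_streamline([1.0, 0.0])
print(length)  # close to 1.0 (inner radius 1 to outer radius 2)
```

In the actual pipeline, CSF probability would be accumulated along each step instead of plain arc length, and the field would come from the discretized Laplacian solve.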

  12. Spatiotemporal analysis of urban environment based on the vegetation-impervious surface-soil model

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Huang, Qingni; Li, Xinwu; Sun, Zhongchang; Zhang, Ying

    2014-01-01

This study explores a spatiotemporal comparative analysis of urban agglomeration, comparing the Greater Toronto and Hamilton Area (GTHA) of Canada and the city of Tianjin in China. The vegetation-impervious surface-soil (V-I-S) model is used to quantify the ecological composition of urban/peri-urban environments with multitemporal Landsat images (3 stages, 18 scenes) and LULC data from 1985 to 2005. The support vector machine algorithm and several knowledge-based methods are applied to derive the V-I-S component fractions with high accuracy. The statistical results show that the urban expansion in the GTHA occurred mainly between 1985 and 1999, and only two districts revealed increasing trends for impervious surfaces for the period from 1999 to 2005. In contrast, Tianjin has been experiencing rapid urban sprawl at all stages and this has been accelerating since 1999. The urban growth patterns in the GTHA evolved from a monocentric and dispersed pattern to a polycentric and aggregated pattern, while in Tianjin it changed from monocentric to polycentric. Central Tianjin has become more centralized, while most other municipal areas have developed dispersed patterns. The GTHA also has a higher level of greenery and a more balanced ecological environment than Tianjin. These differences in the two areas may play an important role in urban planning and decision-making in developing countries.

  13. Skin injury model classification based on shape vector analysis

    PubMed Central

    2012-01-01

Background: Skin injuries can be crucial in judicial decision making. Forensic experts base their classifications on subjective opinions. This study investigates whether known classes of simulated skin injuries are correctly classified statistically based on 3D surface models and derived numerical shape descriptors. Methods: Skin injury surface characteristics are simulated with plasticine. Six injury classes (abrasions, incised wounds, gunshot entry wounds, smooth and textured strangulation marks, and patterned injuries), with 18 instances each, are used for a k-fold cross-validation with six partitions. Deformed plasticine models are captured with a 3D surface scanner. Mean curvature is estimated for each polygon surface vertex. Subsequently, distance distributions and derived aspect ratios, convex hulls, concentric spheres, hyperbolic points and Fourier transforms are used to generate 1284-dimensional shape vectors. Subsequent descriptor reduction maximizing the SNR (signal-to-noise ratio) results in an average of 41 descriptors (varying across k-folds). With a non-normal multivariate distribution of heteroskedastic data, the requirements for LDA (linear discriminant analysis) are not met. Thus, the shrinkage parameters of RDA (regularized discriminant analysis) are optimized, yielding best performance with λ = 0.99 and γ = 0.001. Results: The Receiver Operating Characteristic of a descriptive RDA yields an ideal Area Under the Curve of 1.0 for all six categories. Predictive RDA results in an average CRR (correct recognition rate) of 97.22% under a six-partition k-fold. Adding uniform noise within the range of one standard deviation degrades the average CRR to 71.3%. Conclusions: Digitized 3D surface shape data can be used to automatically classify idealized shape models of simulated skin injuries.
    Deriving well-established descriptors such as histograms, the saddle shape of hyperbolic points or convex hulls, with subsequent reduction of dimensionality while maximizing SNR, works well for the data at hand, as predictive RDA results in a CRR of 97.22%. An objective basis for the discrimination of non-overlapping hypotheses or categories is a major issue in medicolegal skin injury analysis, and that is where this method appears to be strong. Technical surface quality is important, in that adding noise clearly degrades the CRR. Trial registration: This study does not cover the results of a controlled health care intervention, as only plasticine was used. Thus, there was no trial registration. PMID:23497357
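
    The classification step can be sketched as a Friedman-style regularized discriminant analysis, with λ shrinking each class covariance toward the pooled covariance and γ shrinking further toward a scaled identity. This is an illustrative reimplementation under Gaussian class assumptions, not the authors' code:

```python
import numpy as np

def rda_fit(X, y, lam=0.99, gam=0.001):
    """Minimal regularized discriminant analysis (Friedman-style).
    lam shrinks each class covariance toward the pooled covariance;
    gam shrinks further toward a scaled identity matrix."""
    classes = np.unique(y)
    n, p = X.shape
    pooled = np.zeros((p, p))
    per_class = {}
    for c in classes:
        Xc = X[y == c]
        S = np.cov(Xc, rowvar=False)
        per_class[c] = (Xc.mean(axis=0), S, len(Xc) / n)
        pooled += (len(Xc) - 1) * S
    pooled /= n - len(classes)
    model = {}
    for c, (mu, S, prior) in per_class.items():
        S_lam = (1 - lam) * S + lam * pooled
        S_reg = (1 - gam) * S_lam + gam * (np.trace(S_lam) / p) * np.eye(p)
        logdet = np.linalg.slogdet(S_reg)[1]
        model[c] = (mu, np.linalg.inv(S_reg), logdet, np.log(prior))
    return model

def rda_predict(model, X):
    """Assign each row to the class with the smallest quadratic score."""
    preds = []
    for x in X:
        scores = {c: (x - mu) @ Sinv @ (x - mu) + logdet - 2 * logprior
                  for c, (mu, Sinv, logdet, logprior) in model.items()}
        preds.append(min(scores, key=scores.get))
    return np.array(preds)
```

    A correct recognition rate can then be estimated by fitting on k-1 folds and scoring the held-out fold, as in the six-partition cross validation described above.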

  14. The National Water-Quality Assessment Program of the United States: Strategies for Monitoring Trends and Results from the First Two Decades of Study: 1991-2011

    NASA Astrophysics Data System (ADS)

    Lindsey, B.; McMahon, P.; Rupert, M.; Tesoriero, J.; Starn, J.; Anning, D.; Green, C.

    2012-04-01

    The U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program was implemented in 1991 to provide long-term, consistent, and comparable information on the quality of surface-water and groundwater resources of the United States. Findings are used to support national, regional, state, and local information needs with respect to water quality. The three main goals of the program are to 1) assess the condition of the nation's streams, rivers, groundwater, and aquatic systems; 2) assess how conditions are changing over time; and 3) determine how natural features and human activities affect these conditions, and where those effects are most pronounced. As data collection progressed into the second decade, the emphasis of data interpretation shifted from primarily understanding status to evaluating trends. The program has conducted national and regional evaluations of change in the quality of water in streams, rivers, and groundwater, and in the health of aquatic systems. Evaluating trends in environmental systems requires complex analytical and statistical methods and a periodic re-evaluation of the monitoring methods used to collect these data. Examples given herein summarize the lessons learned from the evaluation of changes in water quality during the past two decades, with an emphasis on findings with respect to groundwater. The analysis of trends in groundwater is based on 56 well networks located in 22 principal aquifers of the United States. Analysis has focused on three approaches: 1) statistical analysis of results of sampling over various time scales, 2) studies of factors affecting trends in groundwater quality, and 3) use of models to simulate groundwater trends and forecast future trends. Data collection for the analysis of changes in groundwater quality has focused on decadal resampling of wells.
    Understanding the trends in groundwater quality and the factors affecting those trends has been pursued using quarterly sampling, biennial sampling, and, more recently, continuous monitoring of selected parameters in a small number of wells. Models such as MODFLOW have been used for simulation and forecasting of future trends. Important outcomes from the groundwater-trends studies include issues involving statistics, sampling frequency, and changes in laboratory analytical methods over time; the need for groundwater age-dating information; the value of understanding geochemical conditions and contaminant degradation; the need to understand groundwater-surface water interaction; and the value of modeling in understanding trends and forecasting potential future conditions. Statistically significant increases in chloride, dissolved solids, and nitrate concentrations were found in a large number of well networks over the first decadal sampling period; statistically significant decreases in these constituents were found in a very small number of networks. Trends in surface water are analyzed within eight major river basins of the United States, with a focus on issues of regional importance. Examples of regional surface-water issues include an analysis of trends in dissolved solids in the Southeastern United States, trends in pesticides in the north-central United States, and trends in nitrate in the Mississippi River Basin. Evaluations of ecological indicators of water quality include temporal changes in stream habitat and in aquatic-invertebrate and fish assemblages.
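
    The abstract does not name the specific trend tests used; as an illustration, the Mann-Kendall test with a Sen slope estimator is a standard nonparametric choice for decadal water-quality records. A minimal sketch (without the tie correction used in production implementations):

```python
import math
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test with Sen slope estimator.
    Returns (S statistic, two-sided p-value, Sen slope per sampling step)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    pairs = [(i, j) for i in range(n - 1) for j in range(i + 1, n)]
    # S: sum of signs of all later-minus-earlier pairwise differences
    s = sum(np.sign(x[j] - x[i]) for i, j in pairs)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # H0 variance, no tie correction
    z = (s - np.sign(s)) / math.sqrt(var_s) if s != 0 else 0.0
    p = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal p-value
    # Sen slope: median of all pairwise slopes, robust to outliers
    slope = float(np.median([(x[j] - x[i]) / (j - i) for i, j in pairs]))
    return s, p, slope
```

    Applied to a well network's concentration series, a small p-value with a positive Sen slope would correspond to the statistically significant increases reported above.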

  15. Statistical analysis of polarization-inhomogeneous Fourier spectra of laser radiation scattered by human skin in the tasks of differentiation of benign and malignant formations

    NASA Astrophysics Data System (ADS)

    Ushenko, Alexander G.; Dubolazov, Alexander V.; Ushenko, Vladimir A.; Novakovskaya, Olga Y.

    2016-07-01

    An optical model of the formation of the polarization structure of laser radiation scattered by polycrystalline networks of human skin in the Fourier plane was developed. The results of investigating the statistical parameters (statistical moments of the 1st to 4th order) of polarization-inhomogeneous images of the skin surface in the Fourier plane are presented. Diagnostic criteria for pathological processes in human skin, and for differentiation of their severity, were determined.
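
    The statistical moments of the 1st to 4th order used as diagnostic parameters can be computed directly from an intensity map; a sketch using the conventional normalized central moments (the exact normalization in the paper may differ):

```python
import numpy as np

def first_four_moments(img):
    """First- to fourth-order statistical moments of an intensity map,
    as commonly used to characterize polarization-inhomogeneous images."""
    v = np.asarray(img, dtype=float).ravel()
    m1 = v.mean()                          # 1st: mean
    m2 = v.var()                           # 2nd: variance
    sd = np.sqrt(m2)
    m3 = ((v - m1) ** 3).mean() / sd ** 3  # 3rd: skewness
    m4 = ((v - m1) ** 4).mean() / sd ** 4  # 4th: kurtosis (non-excess)
    return m1, m2, m3, m4
```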

  16. Sandmeier model-based topographic correction to lunar Spectral Profiler (SP) data from the KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration. Spectral analysis is one of the key techniques for determining the rock and mineral compositions of the lunar surface. However, lunar topographic relief is more pronounced than that of the Earth, so it is necessary to apply a topographic correction to lunar spectral data before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model is proposed that accounts for the radiance effects of macro- and ambient topographic relief, and a reflectance correction model is derived from it. Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle are taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter are used to calculate the slope, aspect, incidence and emergence angles, and the terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data is then corrected by the proposed model after the direct component of irradiance on a horizontal surface is derived. As a result, the high spectral reflectance of sun-facing slopes is decreased and the low spectral reflectance of slopes facing away from the sun is compensated. The statistical histogram of the corrected reflectance values presents a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance.
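
    The geometric core of any such correction is the local solar incidence angle computed from slope, aspect and sun position. A sketch with a simple cosine correction (the full Sandmeier model additionally treats diffuse and terrain-reflected irradiance, which this sketch omits):

```python
import numpy as np

def local_incidence_cos(slope, aspect, sun_zenith, sun_azimuth):
    """cos(i) on a tilted facet: standard terrain-illumination geometry.
    All angles in radians."""
    return (np.cos(sun_zenith) * np.cos(slope)
            + np.sin(sun_zenith) * np.sin(slope) * np.cos(sun_azimuth - aspect))

def cosine_correction(reflectance, slope, aspect, sun_zenith, sun_azimuth):
    """Simple cosine topographic correction: darkens sun-facing slopes and
    brightens slopes facing away from the sun, as described in the abstract."""
    cos_i = local_incidence_cos(slope, aspect, sun_zenith, sun_azimuth)
    return reflectance * np.cos(sun_zenith) / np.clip(cos_i, 1e-3, None)
```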

  17. Modified retrieval algorithm for three types of precipitation distribution using x-band synthetic aperture radar

    NASA Astrophysics Data System (ADS)

    Xie, Yanan; Zhou, Mingliang; Pan, Dengke

    2017-10-01

    A forward-scattering model is introduced to describe the response of the normalized radar cross section (NRCS) of precipitation with synthetic aperture radar (SAR). Since the distribution of near-surface rainfall is related to the near-surface rainfall rate and a horizontal distribution factor, a retrieval algorithm called modified regression empirical and model-oriented statistical (M-M), based on Volterra integration theory, is proposed. Compared with the model-oriented statistical and Volterra integration (MOSVI) algorithm, the biggest difference is that the M-M algorithm retrieves the near-surface rainfall rate with a modified regression empirical algorithm rather than a linear regression formula. Half of the empirical parameters are eliminated from the weighted-integral work, and a smaller average relative error is obtained while the rainfall rate is less than 100 mm/h. The algorithm proposed in this paper can therefore obtain high-precision rainfall information.

  18. Surface shape analysis with an application to brain surface asymmetry in schizophrenia.

    PubMed

    Brignell, Christopher J; Dryden, Ian L; Gattone, S Antonio; Park, Bert; Leask, Stuart; Browne, William J; Flynn, Sean

    2010-10-01

    Some methods for the statistical analysis of surface shapes and asymmetry are introduced. We focus on a case study in which magnetic resonance images of the brain are available from groups of 30 schizophrenia patients and 38 controls, and we investigate large-scale brain surface shape differences. Key aspects of shape analysis are to remove nuisance transformations by registration and to identify which parts of one object correspond with the parts of another. We introduce maximum likelihood and Bayesian methods for registering brain images and providing large-scale correspondences of the brain surfaces. Brain surface size-and-shape analysis is considered using random field theory, and dimension reduction is carried out using principal and independent components analysis. Some small but significant differences are observed between the patient and control groups. We then investigate a particular type of asymmetry called torque. Differences in asymmetry are observed between the control and patient groups, which add strength to other observations in the literature. Further investigations of the midline plane location in the two groups and the fitting of nonplanar curved midlines are also considered.
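
    Registration to remove nuisance translation, scale and rotation is central to the method. The paper uses maximum likelihood and Bayesian registration; the core idea can be sketched with a simpler least-squares (ordinary Procrustes) alignment of corresponding landmark configurations:

```python
import numpy as np

def procrustes_register(X, Y):
    """Ordinary Procrustes registration: find the translation, scale and
    rotation that best map landmark configuration Y onto X (least squares)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, s, Vt = np.linalg.svd(Yc.T @ Xc)
    R = U @ Vt                       # optimal rotation applied to Yc
    if np.linalg.det(R) < 0:         # guard against reflections
        U[:, -1] *= -1
        s = s.copy()
        s[-1] *= -1
        R = U @ Vt
    scale = s.sum() / (Yc ** 2).sum()
    return scale * Yc @ R + X.mean(axis=0)
```

    After registration, residual differences between configurations carry the shape signal that the size-and-shape and asymmetry analyses operate on.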

  19. Effect of surface treatment methods on the shear bond strength of auto-polymerized resin to thermoplastic denture base polymer.

    PubMed

    Koodaryan, Roodabeh; Hafezeqoran, Ali

    2016-12-01

    Polyamide polymers do not provide sufficient bond strength to auto-polymerized resins for repairing fractured dentures or replacing dislodged denture teeth, and few treatment methods have been developed to improve the bond strength between auto-polymerized reline resins and polyamide denture base materials. The objective of the present study was to evaluate the effect of surface modification by acetic acid on the surface characteristics and bond strength of reline resin to a polyamide denture base. Eighty-four polyamide specimens were divided into three surface treatment groups (n=28): control (N), silica-coated (S), and acid-treated (A). Two different auto-polymerized reline resins (GC and Triplex) were bonded to the samples (subgroups G and T, respectively, n=14). The specimens were subjected to a shear bond strength test after they were stored in distilled water for 1 week and thermocycled for 5000 cycles. Data were analyzed with the independent t-test, two-way analysis of variance (ANOVA), and Tukey's post hoc multiple comparison test (α=.05). The bond strength values of A and S were significantly higher than those of N (P<.001 for both); however, no statistically significant difference was observed between groups A and S. According to the independent Student's t-test, the shear bond strength values of AT were significantly higher than those of AG (P<.001). Surface treatment of polyamide denture base materials with acetic acid may be an efficient and cost-effective method for increasing shear bond strength to auto-polymerized reline resin.
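
    The subgroup comparison (independent Student's t-test) can be sketched with SciPy. The bond-strength values below are simulated stand-ins, since the study's raw data are not reproduced in the abstract:

```python
import numpy as np
from scipy import stats

# Hypothetical shear-bond-strength samples (MPa) for the two acid-treated
# subgroups; the real study used n=14 per subgroup. Means are assumed.
rng = np.random.default_rng(4)
group_AT = rng.normal(14.0, 1.5, 14)   # acid-treated + Triplex (assumed values)
group_AG = rng.normal(10.0, 1.5, 14)   # acid-treated + GC (assumed values)

# independent two-sample Student's t-test, as in the study
t, p = stats.ttest_ind(group_AT, group_AG)
```

    A small p with positive t corresponds to the reported finding that AT exceeded AG.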

  20. Comparative evaluation of topographical data of dental implant surfaces applying optical interferometry and scanning electron microscopy.

    PubMed

    Kournetas, N; Spintzyk, S; Schweizer, E; Sawada, T; Said, F; Schmid, P; Geis-Gerstorfer, J; Eliades, G; Rupp, F

    2017-08-01

    Comparability of topographical data on implant surfaces in the literature is low, and their clinical relevance is often equivocal. The aim of this study was to investigate the ability of scanning electron microscopy and optical interferometry to yield statistically similar 3-dimensional roughness parameter results and to evaluate these data based on predefined criteria regarded as relevant for a favorable biological response. Four different commercial dental screw-type implants (NanoTite Certain Prevail, TiUnite Brånemark Mk III, XiVE S Plus and SLA Standard Plus) were analyzed by stereo scanning electron microscopy and white light interferometry. Surface height, spatial and hybrid roughness parameters (Sa, Sz, Ssk, Sku, Sal, Str, Sdr) were assessed from raw and filtered data (Gaussian 50μm and 5μm cut-off filters), respectively. Data were statistically compared by one-way ANOVA and the Tukey-Kramer post-hoc test. For a clinically relevant interpretation, a categorizing evaluation approach was used based on predefined threshold criteria for each roughness parameter. The two methods predominantly exhibited statistically significant differences. Depending on the roughness parameters and filter settings, both methods showed variations in rankings of the implant surfaces and differed in their ability to discriminate the different topographies. Overall, the analyses revealed scale-dependent roughness data. Compared to the purely statistical approach, the categorizing evaluation resulted in many more similarities between the two methods. This study suggests reconsidering current approaches to the topographical evaluation of implant surfaces and further seeking appropriate experimental settings. Furthermore, the specific role of different roughness parameters in the bioresponse has to be studied in detail in order to better define clinically relevant, scale-dependent and parameter-specific thresholds and ranges. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
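
    Several of the height-based roughness parameters listed above (Sa, Sz, Ssk, Sku) reduce to simple moments of the levelled height map; a sketch assuming a raw, already-levelled height map as input (the spatial and hybrid parameters Sal, Str and Sdr require autocorrelation and surface-area computations not shown here):

```python
import numpy as np

def areal_roughness(z):
    """Height-based areal roughness parameters (ISO 25178 style) from a
    levelled height map z: Sa, Sz, Ssk, Sku."""
    z = np.asarray(z, dtype=float)
    z = z - z.mean()                  # level to the mean plane
    sq = np.sqrt((z ** 2).mean())     # RMS height (Sq)
    return {
        "Sa": np.abs(z).mean(),              # arithmetic mean height
        "Sz": z.max() - z.min(),             # maximum height range
        "Ssk": (z ** 3).mean() / sq ** 3,    # skewness of heights
        "Sku": (z ** 4).mean() / sq ** 4,    # kurtosis of heights
    }
```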

  1. Extracting Galaxy Cluster Gas Inhomogeneity from X-Ray Surface Brightness: A Statistical Approach and Application to Abell 3667

    NASA Astrophysics Data System (ADS)

    Kawahara, Hajime; Reese, Erik D.; Kitayama, Tetsu; Sasaki, Shin; Suto, Yasushi

    2008-11-01

    Our previous analysis indicates that small-scale fluctuations in the intracluster medium (ICM) from cosmological hydrodynamic simulations follow the lognormal probability density function. In order to test the lognormal nature of the ICM directly against X-ray observations of galaxy clusters, we develop a method of extracting statistical information about the three-dimensional properties of the fluctuations from the two-dimensional X-ray surface brightness. We first create a set of synthetic clusters with lognormal fluctuations around their mean profile given by spherical isothermal β-models, later considering polytropic temperature profiles as well. Performing mock observations of these synthetic clusters, we find that the resulting X-ray surface brightness fluctuations also follow the lognormal distribution fairly well. Systematic analysis of the synthetic clusters provides an empirical relation between the three-dimensional density fluctuations and the two-dimensional X-ray surface brightness. We analyze Chandra observations of the galaxy cluster Abell 3667 and find that its X-ray surface brightness fluctuations follow the lognormal distribution. While the lognormal model was originally motivated by cosmological hydrodynamic simulations, this is the first observational confirmation of the lognormal signature in a real cluster. Finally, we check the synthetic cluster results against clusters from cosmological hydrodynamic simulations. Because of the complex structure exhibited by simulated clusters, the empirical relation between the two- and three-dimensional fluctuation properties, calibrated with synthetic clusters, shows large scatter when applied to simulated clusters. Nevertheless, we are able to reproduce the true value of the fluctuation amplitude of simulated clusters within a factor of 2 from their two-dimensional X-ray surface brightness alone.
Our current methodology combined with existing observational data is useful in describing and inferring the statistical properties of the three-dimensional inhomogeneity in galaxy clusters.
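
    The lognormal check at the heart of the method can be sketched as follows: if surface brightness is lognormally distributed, its logarithm should pass a normality test. The synthetic brightness map and its fluctuation amplitude (0.3) are assumptions for illustration, not values from the paper:

```python
import numpy as np
from scipy import stats

def lognormal_check(surface_brightness):
    """Check the lognormal hypothesis for an X-ray surface brightness map:
    if S/<S> is lognormal, log(S/<S>) should pass a normality test."""
    s = np.asarray(surface_brightness, dtype=float).ravel()
    logf = np.log(s / s.mean())
    k2, p = stats.normaltest(logf)   # D'Agostino-Pearson normality test
    return p, logf.std()             # p-value and log-fluctuation amplitude

# synthetic lognormal brightness map with assumed log-amplitude 0.3
rng = np.random.default_rng(6)
sb = np.exp(rng.normal(0.0, 0.3, (100, 100)))
p, sigma = lognormal_check(sb)
```

    A high p-value means the lognormal hypothesis cannot be rejected; sigma estimates the fluctuation amplitude that the paper's empirical relation maps to three dimensions.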

  2. Utilization of Skylab (EREP) system for appraising changes in continental migratory bird habitat

    USGS Publications Warehouse

    Work, E.A.; Gilmer, D.S.

    1975-01-01

    The author has identified the following significant results. Surface water statistics using data obtained by supporting aircraft were generated. Signature extraction and refinement preliminary to wetland and associated upland vegetation recognition were accomplished, using a selected portion of the aircraft data. Final classification mapping and analysis of surface water trends will be accomplished.

  3. Time-dynamics of the two-color emission from vertical-external-cavity surface-emitting lasers

    NASA Astrophysics Data System (ADS)

    Chernikov, A.; Wichmann, M.; Shakfa, M. K.; Scheller, M.; Moloney, J. V.; Koch, S. W.; Koch, M.

    2012-01-01

    The temporal stability of a two-color vertical-external-cavity surface-emitting laser is studied using single-shot streak-camera measurements. The collected data is evaluated via quantitative statistical analysis schemes. Dynamically stable and unstable regions for the two-color operation are identified and the dependence on the pump conditions is analyzed.

  4. Vegetation Coverage and Impervious Surface Area Estimated Based on the Estarfm Model and Remote Sensing Monitoring

    NASA Astrophysics Data System (ADS)

    Hu, Rongming; Wang, Shu; Guo, Jiao; Guo, Liankun

    2018-04-01

    Impervious surface area and vegetation coverage are important biophysical indicators of urban surface features that can be derived from medium-resolution images. However, remote sensing data obtained by a single sensor are easily affected by factors such as weather conditions, and their spatial and temporal resolution cannot meet the needs of soil erosion estimation. Integrating multi-source remote sensing data is therefore needed to estimate vegetation coverage at high spatio-temporal resolution. Vegetation coverage and impervious surface data at two spatial and temporal scales were obtained from MODIS and Landsat 8 remote sensing images. Based on the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM), the vegetation coverage data at the two scales were fused, yielding fused vegetation coverage (ESTARFM FVC) and an impervious layer with high spatio-temporal resolution (30 m, 8 days). On this basis, the spatial variability of the impervious surface and vegetation cover landscape in the study area was measured by means of statistics and spatial autocorrelation analysis. The results showed that: 1) ESTARFM FVC and the impervious surface layer have high accuracy and can characterize the biophysical components of the land surface; 2) the average impervious surface proportion and the spatial configuration of each area differ, affected by natural conditions and urbanization. In the urban area of Xi'an, which has typical characteristics of spontaneous urbanization, landscapes are fragmented and have less spatial dependence.
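
    The spatial autocorrelation analysis mentioned above is typically summarized with a global statistic such as Moran's I; a minimal sketch for a raster, assuming rook (4-neighbour) weights:

```python
import numpy as np

def morans_i(grid):
    """Global Moran's I for a 2-D raster with rook (4-neighbour) weights;
    values near +1 indicate spatial clustering, near 0 spatial randomness."""
    z = np.asarray(grid, dtype=float)
    z = z - z.mean()
    num, w_sum = 0.0, 0.0
    # horizontally and vertically adjacent cell pairs, counted both ways
    for dz in (z[:, 1:] * z[:, :-1], z[1:, :] * z[:-1, :]):
        num += 2 * dz.sum()
        w_sum += 2 * dz.size
    return (z.size / w_sum) * (num / (z ** 2).sum())
```

    A fragmented landscape with low spatial dependence, as reported for Xi'an, corresponds to Moran's I values closer to zero.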

  5. Mars: Noachian hydrology by its statistics and topology

    NASA Technical Reports Server (NTRS)

    Cabrol, N. A.; Grin, E. A.

    1993-01-01

    Discrimination between fluvial features generated by surface drainage and those generated by subsurface aquifer discharge will provide clues to the understanding of early Mars' climatic history. Our approach is to define the process of formation of the oldest fluvial valleys by statistical and topological analyses. Formation of fluvial valley systems reached its highest statistical concentration during the Noachian Period. Nevertheless, such valleys are a scarce phenomenon in Martian history, localized on the cratered uplands and subject to a latitudinal distribution. They occur sparsely on Noachian geological units with a weak distribution density, appearing in small isolated areas (around 5 x 10^3 sq km) occupied by short streams (100-300 km in length). Topological analysis of the internal organization of 71 surveyed Noachian fluvial valley networks also provides information on the mechanisms of formation.

  6. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, the proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors describe the methodological differences between literature published in Chinese and in Western journals in the design and analysis of acupuncture RCTs and in the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis of clinical symptom scores is commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors conclude that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis, while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such issues in clinical research design and thus contribute to increased scientific rigor in TCM research.

  7. Polysaccharide-derived mesoporous materials (Starbon®) for sustainable separation of complex mixtures.

    PubMed

    Zuin, Vânia G; Budarin, Vitaliy L; De Bruyn, Mario; Shuttleworth, Peter S; Hunt, Andrew J; Pluciennik, Camille; Borisova, Aleksandra; Dodson, Jennifer; Parker, Helen L; Clark, James H

    2017-09-21

    The recovery and separation of high value and low volume extractives are a considerable challenge for the commercial realisation of zero-waste biorefineries. Using solid-phase extractions (SPE) based on sustainable sorbents is a promising method to enable efficient, green and selective separation of these complex extractive mixtures. Mesoporous carbonaceous solids derived from renewable polysaccharides are ideal stationary phases due to their tuneable functionality and surface structure. In this study, the structure-separation relationships of thirteen polysaccharide-derived mesoporous materials and two modified types as sorbents for ten naturally-occurring bioactive phenolic compounds were investigated. For the first time, a comprehensive statistical analysis of the key molecular and surface properties influencing the recovery of these species was carried out. The obtained results show the possibility of developing tailored materials for purification, separation or extraction, depending on the molecular composition of the analyte. The wide versatility and application span of these polysaccharide-derived mesoporous materials offer new sustainable and inexpensive alternatives to traditional silica-based stationary phases.

  8. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and of the data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to integrating amplification efficiency. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination.
    These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
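
    The first design above (an external calibration curve from serially diluted templates) can be sketched as follows. The Ct readings, dilution levels and sample values are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical serial-dilution standard curve: Ct vs log10(template copies).
log10_copies = np.array([3, 4, 5, 6, 7], dtype=float)
ct = np.array([29.8, 26.5, 23.1, 19.8, 16.4])        # assumed readings

slope, intercept = np.polyfit(log10_copies, ct, 1)   # simple linear regression
efficiency = 10 ** (-1.0 / slope) - 1                # ~1.0 means 100% efficient

def estimate_copies(ct_sample):
    """Invert the calibration curve to estimate template copy number."""
    return 10 ** ((ct_sample - intercept) / slope)

# ratio of a putative two-copy event to a single-copy control (assumed Cts)
ratio = estimate_copies(21.5) / estimate_copies(22.6)
```

    A slope near -3.32 corresponds to ~100% amplification efficiency; the confidence interval around the regression line is what makes the copy number call unambiguous.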

  9. Dental students' perception of their approaches to learning in a PBL programme.

    PubMed

    Haghparast, H; Ghorbani, A; Rohlin, M

    2017-08-01

    To compare dental students' perceptions of their learning approaches between different years of a problem-based learning (PBL) programme. The hypothesis was that, in a comparison between senior and junior students, the senior students would perceive themselves as having a higher level of deep learning approach and a lower level of surface learning approach than the junior students. This hypothesis was based on the fact that senior students have longer experience of a student-centred educational context, which is supposed to underpin student learning. Students of three cohorts (first year, third year and fifth year) of a PBL-based dental programme were asked to respond to a questionnaire (R-SPQ-2F) developed to analyse students' learning approaches, that is, deep approach and surface approach, using four subscales: deep strategy, surface strategy, deep motive and surface motive. The results of the three cohorts were compared using one-way analysis of variance (ANOVA), with statistical significance set at P < 0.05. The fifth-year students demonstrated a lower surface approach than the first-year students (P = 0.020). There was a significant decrease in surface strategy from the first to the fifth year (P = 0.003). No differences were found concerning deep approach or its subscales (deep strategy and deep motive) between the mean scores of the three cohorts. The results did not show the expected increase in depth of learning approaches over the programme years. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Effects of the peracetic acid and sodium hypochlorite on the colour stability and surface roughness of the denture base acrylic resins polymerised by microwave and water bath methods.

    PubMed

    Fernandes, Flavio H C N; Orsi, Iara A; Villabona, Camilo A

    2013-03-01

    This study evaluated the surface roughness (Ra) and color stability of acrylic resins (Lucitone 550, QC-20 and Vipi-Wave) used for fabricating bases for complete removable dentures, overdentures and prosthetic protocols, after immersion in chemical disinfectants (1% sodium hypochlorite and 2% peracetic acid) for 30 and 60 minutes. Sixty specimens were made of each commercial brand of resin and divided into two groups according to the chemical disinfectant. After the specimens had undergone finishing and polishing procedures, the initial color and roughness measurements were taken (t=0); then ten test specimens of each brand were immersed in sodium hypochlorite and ten in peracetic acid, for 30 and 60 minutes, with measurements taken after each immersion period. These data were submitted to statistical analysis. There was evidence of an increase in Ra after 30 minutes of immersion in the disinfectants for all the resins, with QC-20 presenting the highest Ra values and Vipi-Wave the lowest. After 60 minutes of immersion in the disinfectants, all the resins presented statistically significant color alteration. Disinfection with 1% sodium hypochlorite and 2% peracetic acid altered the roughness and color of the resins. © 2012 The Gerodontology Society and John Wiley & Sons A/S.

  11. Ground-Based Navigation and Dispersion Analysis for the Orion Exploration Mission 1

    NASA Technical Reports Server (NTRS)

    D' Souza, Christopher; Holt, Greg; Zanetti, Renato; Wood, Brandon

    2016-01-01

    This paper presents the Orion Exploration Mission 1 Linear Covariance Analysis for the DRO mission using ground-based navigation. The Delta V statistics for each maneuver are presented. In particular, the statistics of the lunar encounters and the Entry Interface are presented.

  12. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  13. Corneal modeling for analysis of photorefractive keratectomy

    NASA Astrophysics Data System (ADS)

    Della Vecchia, Michael A.; Lamkin-Kennard, Kathleen

    1997-05-01

    Procedurally, excimer photorefractive keratectomy is based on the refractive correction of composite spherical and cylindrical ophthalmic errors of the entire eye. These refractive errors are input for correction at the corneal plane and for the properly controlled duration and location of laser energy. Topography is usually taken to monitor the corresponding spherical and cylindrical corneorefractive errors. While a corneal topographer provides surface morphologic information, the keratorefractive photoablation is based on the patient's spherical and cylindrical spectacle correction; topography is at present not directly part of the procedural deterministic parameters. Examining how corneal curvature at each of the keratometric reference loci affects the shape of the resultant photoablated corneal surface may enhance the accuracy of the desired correction. The objective of this study was to develop a methodology for using corneal topography to construct models depicting pre- and post-operative keratomorphology for the analysis of photorefractive keratectomy. Multiple types of models were developed and then recreated in optical design software for examination of focal lengths and other optical characteristics. The corneal models were developed using data extracted from the TMS I corneal modeling system (Computed Anatomy, New York, NY). The TMS I does not allow manipulation of data or differentiation of pre- and post-operative surfaces within its platform, so models needed to be created for analysis. The data were imported into Matlab, where 3D models, surface meshes, and contour plots were created. The data used to generate the models were pre- and post-operative curvatures, heights from the corneal apex, and x-y positions at 6400 locations on the corneal surface. Outlying non-contributory points were eliminated through statistical operations.
Pre- and post- operative models were analyzed to obtain the resultant changes in the corneal surfaces during PRK. A sensitivity analysis of the corneal topography system was also performed. Ray tracings were performed using the height data and the optical design software Zemax (Focus Software, Inc., Tucson, AZ). Examining pre- and post-operative values of corneal surfaces may further the understanding of how areas of the cornea contribute toward desired visual correction. Gross resultant power across the corneal surface is used in PRK, however, understanding the contribution of each point to the average power may have important implications and prove to be significant for achieving projected surgical results.

  14. Analysis of Occupational Accidents in Underground and Surface Mining in Spain Using Data-Mining Techniques

    PubMed Central

    Sanmiquel, Lluís; Bascompta, Marc; Rossell, Josep M.; Anticoi, Hernán Francisco; Guash, Eduard

    2018-01-01

    An analysis of occupational accidents in the mining sector was conducted using data from the Spanish Ministry of Employment and Social Safety between 2005 and 2015, applying data-mining techniques. The data were processed with the software Weka. Two scenarios were chosen from the accidents database: surface and underground mining. The most important variables involved in occupational accidents and their association rules were determined. These rules are composed of several predictor variables that cause accidents, defining their characteristics and context. This study exposes the 20 most important association rules in the sector—either surface or underground mining—based on the statistical confidence level of each rule as obtained by Weka. The outcomes display the most typical immediate causes, along with the percentage of accidents based on each association rule. The most important immediate cause is body movement with physical effort or overexertion, and the corresponding type of accident is physical effort or overexertion. On the other hand, the second most important immediate cause and type of accident differ between the two scenarios. Data-mining techniques proved a useful tool for finding the root causes of the accidents. PMID:29518921

  16. Statistical parsimony networks and species assemblages in Cephalotrichid nemerteans (nemertea).

    PubMed

    Chen, Haixia; Strand, Malin; Norenburg, Jon L; Sun, Shichun; Kajihara, Hiroshi; Chernyshev, Alexey V; Maslakova, Svetlana A; Sundberg, Per

    2010-09-21

    It has been suggested that statistical parsimony network analysis can give an indication of the species represented in a set of nucleotide data, and the approach has been used to discuss species boundaries in some taxa. Based on 635 base pairs of the mitochondrial protein-coding gene cytochrome c oxidase I (COI), we analyzed 152 nemertean specimens using statistical parsimony network analysis with the connection probability set to 95%. The analysis revealed 15 distinct networks together with seven singletons. Statistical parsimony yielded three networks supporting the species status of Cephalothrix rufifrons, C. major and C. spiralis as they are currently delineated by morphological characters and geographical location. Many other networks contained haplotypes from nearby geographical locations. The cladistic structure obtained by maximum likelihood analysis overall supported the network analysis, but indicated a false positive result where subnetworks should have been connected into one network/species; this is probably caused by undersampling of the intraspecific haplotype diversity. Statistical parsimony network analysis provides a rapid and useful tool for detecting possible undescribed/cryptic species among cephalotrichid nemerteans based on the COI gene. It should be combined with phylogenetic analysis to obtain indications of false positive results, i.e., subnetworks that would have been connected with more extensive haplotype sampling.
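    The grouping step described above can be caricatured in a few lines. The sketch below is not the TCS algorithm itself: it links haplotypes single-linkage-style under a fixed substitution limit and reports the resulting networks, omitting the statistical derivation of that limit from the 95% connection probability.

    ```python
    def hamming(a, b):
        """Number of differing positions between two aligned sequences."""
        return sum(x != y for x, y in zip(a, b))

    def parsimony_networks(haplotypes, limit):
        """Group haplotypes into networks: two haplotypes join the same network
        when they can be chained by steps of at most `limit` substitutions
        (single-linkage; a simplification of statistical parsimony, which
        derives the limit from the 95% connection probability)."""
        parent = list(range(len(haplotypes)))

        def find(i):  # union-find with path compression
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(len(haplotypes)):
            for j in range(i + 1, len(haplotypes)):
                if hamming(haplotypes[i], haplotypes[j]) <= limit:
                    parent[find(i)] = find(j)

        groups = {}
        for i in range(len(haplotypes)):
            groups.setdefault(find(i), []).append(i)
        return list(groups.values())

    # Toy 8-bp haplotypes: two tight clusters separated by many substitutions.
    haps = ["AAAAAAAA", "AAAAAAAT", "AAAAAATT", "GGGGGGGG", "GGGGGGGA"]
    nets = parsimony_networks(haps, limit=2)
    print(len(nets))  # 2 networks, mirroring how distinct species separate
    ```
    
    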

  17. PAH Baselines for Amazonic Surficial Sediments: A Case of Study in Guajará Bay and Guamá River (Northern Brazil).

    PubMed

    Rodrigues, Camila Carneiro Dos Santos; Santos, Ewerton; Ramos, Brunalisa Silva; Damasceno, Flaviana Cardoso; Correa, José Augusto Martins

    2018-06-01

    The 16 priority PAHs were determined in sediment samples from the insular zone of Guajará Bay and the Guamá River (southern Amazon River mouth). Low hydrocarbon levels were observed, and naphthalene was the most representative PAH. The low-molecular-weight PAHs represented 51% of the total PAH content. Statistical analysis showed that the sampling sites are not significantly different. Source analysis by PAH ratios and principal component analysis revealed that the PAHs derive primarily from a low rate of fossil fuel combustion, mainly related to the activity of the local small community. According to the sediment quality guidelines, no sample presented biological stress or damage potential. This study discusses baselines for PAHs in surface sediments from Amazonian aquatic systems based on source determination by PAH ratios and principal component analysis, on sediment quality guidelines, and on comparison with data from previous studies.

  18. On the use of statistical methods to interpret electrical resistivity data from the Eumsung basin (Cretaceous), Korea

    NASA Astrophysics Data System (ADS)

    Kim, Ji-Soo; Han, Soo-Hyung; Ryang, Woo-Hun

    2001-12-01

    Electrical resistivity mapping was conducted to delineate the boundaries and architecture of the Eumsung Basin (Cretaceous). Basin boundaries are effectively clarified in electrical dipole-dipole resistivity sections as bands of high resistivity contrast. The high resistivities most likely originate from the basement of Jurassic granite and Precambrian gneiss, contrasting with the lower resistivities of the infilling sedimentary rocks. The electrical properties of the basin-margin boundaries are compatible with the results of vertical electrical soundings and very-low-frequency electromagnetic surveys. A statistical analysis of the resistivity sections in terms of standard deviation is found to be an effective scheme for the subsurface reconstruction of the basin architecture as well as the surface demarcation of basin-margin faults and brittle fracture zones, which are characterized by much higher standard deviations. The pseudo three-dimensional architecture of the basin is delineated by integrating the composite resistivity structure information from two cross-basin E-W magnetotelluric lines and the dipole-dipole resistivity lines. Based on the statistical analysis, the maximum depth of the basin varies from about 1 km in the northern part to 3 km or more in the middle part. This strong variation supports the view that the basin experienced pull-apart opening with rapid subsidence of the central blocks and asymmetric cross-basinal extension.
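    The standard-deviation scheme described in the abstract can be sketched as a sliding-window statistic over a resistivity profile. The profile values and window size below are hypothetical, chosen only to show how a sharp basement/fill contrast produces a standard-deviation peak.

    ```python
    import numpy as np

    def sliding_std(profile, window=5):
        """Standard deviation of log-resistivity in a sliding window;
        high values flag sharp contrasts such as basin-margin faults."""
        x = np.log10(np.asarray(profile, dtype=float))
        half = window // 2
        out = np.empty_like(x)
        for i in range(len(x)):
            seg = x[max(0, i - half): i + half + 1]
            out[i] = seg.std()
        return out

    # Hypothetical profile: conductive basin fill (~30 ohm-m) against
    # resistive granite basement (~3000 ohm-m); the contrast sits mid-array.
    profile = [30, 32, 28, 31, 3000, 3100, 2900, 3050]
    s = sliding_std(profile)
    print(int(np.argmax(s)))  # index nearest the resistivity contrast
    ```
    
    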

  19. Articular cartilage degeneration classification by means of high-frequency ultrasound.

    PubMed

    Männicke, N; Schöne, M; Oelze, M; Raum, K

    2014-10-01

    To date, only single ultrasound parameters have been considered in statistical analyses to characterize osteoarthritic changes in articular cartilage, and the potential benefit of using parameter combinations for characterization remains unclear. Therefore, the aim of this work was to perform feature selection and classification of a Mankin subset score (i.e., cartilage surface and cell sub-scores) using ultrasound-based parameter pairs and to investigate both the classification accuracy and the sensitivity towards different degeneration stages. Forty punch biopsies of human cartilage had previously been scanned ex vivo with a 40-MHz transducer. Ultrasound-based surface parameters, as well as backscatter and envelope statistics parameters, were available. Logistic regression was performed with each unique US parameter pair as predictor and different degeneration stages as response variables. The best ultrasound-based parameter pair for each Mankin subset score value was selected by highest classification accuracy and used in receiver operating characteristic (ROC) analysis. The classifications discriminating between early degenerations yielded area under the ROC curve (AUC) values of 0.94-0.99 (mean ± SD: 0.97 ± 0.03). In contrast, classifications among higher Mankin subset scores resulted in lower AUC values: 0.75-0.91 (mean ± SD: 0.84 ± 0.08). Variable sensitivities of the different ultrasound features were observed with respect to the different degeneration stages. Our results strongly suggest that combinations of high-frequency ultrasound-based parameters can characterize different, particularly very early, degeneration stages of hyaline cartilage. The variable sensitivities towards different degeneration stages suggest that concurrent estimation of multiple ultrasound-based parameters is diagnostically valuable. 
In-vivo application of the present findings is conceivable in both minimally invasive arthroscopic ultrasound and high-frequency transcutaneous ultrasound. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
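    The AUC values quoted above have a direct probabilistic reading: by the Mann-Whitney identity, the AUC equals the probability that a randomly chosen positive sample scores higher than a randomly chosen negative one. A minimal sketch of that computation (not the study's code) is:

    ```python
    import numpy as np

    def roc_auc(scores, labels):
        """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
        AUC = P(score of a random positive > score of a random negative),
        with ties counted as 1/2."""
        scores = np.asarray(scores, dtype=float)
        labels = np.asarray(labels, dtype=bool)
        pos, neg = scores[labels], scores[~labels]
        # Pairwise comparison; fine for the small sample sizes of such studies.
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    print(roc_auc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]))  # 1.0 (perfect separation)
    ```
    
    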

  20. A simulations approach for meta-analysis of genetic association studies based on additive genetic model.

    PubMed

    John, Majnu; Lencz, Todd; Malhotra, Anil K; Correll, Christoph U; Zhang, Jian-Ping

    2018-06-01

    Meta-analysis of genetic association studies is increasingly used to assess phenotypic differences between genotype groups. When the underlying genetic model is assumed to be dominant or recessive, assessing phenotype differences from the summary statistics reported for the individual studies in a meta-analysis is a valid strategy. However, when the genetic model is additive, the same summary-statistics strategy leads to biased results; we establish this fact in this paper using simulations. The main goal of the paper is to present an alternative strategy for the additive model based on simulating individual-level data for each study. We show that this alternative strategy is far superior to the strategy based on summary statistics.
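    The simulation strategy can be sketched as follows, assuming a quantitative phenotype and Hardy-Weinberg genotype frequencies; the study's actual simulation details are not given in the abstract, so all parameters here are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_study(n, maf, beta, sd=1.0):
        """Simulate one study under an additive genetic model:
        genotype g in {0, 1, 2} copies of the risk allele (HWE with
        allele frequency `maf`), phenotype y = beta * g + noise."""
        g = rng.binomial(2, maf, size=n)
        y = beta * g + rng.normal(0.0, sd, size=n)
        return g, y

    def per_allele_effect(g, y):
        """Least-squares slope of y on allele count (the additive effect)."""
        gc = g - g.mean()
        return np.dot(gc, y - y.mean()) / np.dot(gc, gc)

    # Pool effect estimates over simulated studies (simple unweighted mean here;
    # a real meta-analysis would use inverse-variance weights).
    effects = [per_allele_effect(*simulate_study(500, 0.3, 0.25)) for _ in range(50)]
    print(round(float(np.mean(effects)), 2))  # close to the true beta of 0.25
    ```
    
    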

  1. Effect of soldering on the metal-ceramic bond strength of an Ni-Cr base alloy.

    PubMed

    Nikellis, Ioannis; Levi, Anna; Zinelis, Spiros

    2005-11-01

    Although soldering is a common laboratory procedure, the use of soldering alloys may adversely affect metal-ceramic bond strength and potentially decrease the longevity of metal-ceramic restorations. The purpose of this study was to investigate the effect of soldering on metal-ceramic bond strength of a representative Ni-Cr base metal alloy. Twenty-eight rectangular (25 x 3 x 0.5 mm) Ni-based alloy (Wiron 99) specimens were equally divided into soldering (S) and reference (R) groups. Soldering group specimens were covered with a 0.1-mm layer of the appropriate solder (Wiron-Lot) and reduced by 0.1 mm on the opposite side. Five specimens of each group were used for the measurement of surface roughness parameter (R(z)) and hardness, and 3 were used for measurement of the modulus of elasticity. Six specimens of each group were covered with porcelain (Ceramco 3) and subjected to a 3-point bending test for evaluation of the metal-ceramic bond strength according to the ISO 9693 specification. The data from surface roughness, hardness, modulus of elasticity, and metal-ceramic bond strength were analyzed statistically, using independent t tests (alpha=.05). Statistical analysis of the R(z) surface roughness parameter (S: 3.4 +/- 0.3 μm; R: 3.7 +/- 0.7 μm; P=.07) and bond strength (S: 46 +/- 3 MPa; R: 40 +/- 5 MPa; P=.057) failed to reveal any significant difference between the 2 groups. The specimens of the soldering group demonstrated significantly lower values both in hardness (S: 128 +/- 11 VHN; R: 217 +/- 4 VHN; P<.001) and in modulus of elasticity (S: 135 +/- 4 GPa; R: 183 +/- 6 GPa; P=.035) than the reference group. Under the conditions of the present study, the addition of solder to the base metal alloy did not affect the metal-ceramic bond strength.

  2. Experimental investigation and modelling of surface roughness and resultant cutting force in hard turning of AISI H13 Steel

    NASA Astrophysics Data System (ADS)

    Boy, M.; Yaşar, N.; Çiftçi, İ.

    2016-11-01

    In recent years, turning of hardened steels has replaced grinding for finishing operations. Compared with grinding, hard turning offers higher material removal rates, greater process flexibility, lower equipment costs, and shorter setup times. CBN or ceramic cutting tools are widely used in hard part machining. For successful application of hard turning, selection of suitable cutting parameters for a given cutting tool is an important step. For this purpose, an experimental investigation was conducted to determine the effects of cutting tool edge geometry, feed rate, and cutting speed on surface roughness and resultant cutting force in hard turning of AISI H13 steel with ceramic cutting tools. Machining experiments were conducted on a CNC lathe based on a Taguchi experimental design (L16) at different levels of the cutting parameters. In the experiments, a Kistler 9257B piezoelectric dynamometer was used to measure the three cutting force components (Fc, Ff and Fr). Surface roughness measurements were performed using a Mahrsurf PS1 device. For the statistical analysis, analysis of variance was performed, and mathematical models were developed for surface roughness and resultant cutting force. The analysis of variance showed that cutting edge geometry, cutting speed, and feed rate were the most significant factors for the resultant cutting force, while cutting edge geometry and feed rate were the most significant factors for surface roughness. Regression analysis was applied to predict the outcomes of the experiment; the predicted and measured values were very close to each other. Afterwards, confirmation tests were performed to compare the predicted results with the measured results. According to the confirmation test results, the measured values fall within the 95% confidence interval.
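    The regression step can be illustrated with a first-order model fitted by ordinary least squares. The feed-rate, cutting-speed, and roughness values below are invented for illustration and are not the paper's data.

    ```python
    import numpy as np

    # Hypothetical hard-turning data: feed rate f (mm/rev), cutting speed v
    # (m/min), and measured surface roughness Ra (um); illustrative only.
    f = np.array([0.05, 0.05, 0.10, 0.10, 0.15, 0.15, 0.20, 0.20])
    v = np.array([100., 150., 100., 150., 100., 150., 100., 150.])
    Ra = np.array([0.31, 0.29, 0.52, 0.49, 0.78, 0.74, 1.05, 1.01])

    # First-order regression model Ra = b0 + b1*f + b2*v, fitted by least
    # squares, in the spirit of the abstract's predictive roughness model.
    X = np.column_stack([np.ones_like(f), f, v])
    coef, *_ = np.linalg.lstsq(X, Ra, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((Ra - pred) ** 2) / np.sum((Ra - Ra.mean()) ** 2)
    print(coef.round(4), round(r2, 3))
    ```

    With data like this, the fitted coefficients show roughness rising with feed rate and falling slightly with cutting speed, matching the qualitative finding that feed rate dominates surface roughness.
    
    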

  3. Smooth extrapolation of unknown anatomy via statistical shape models

    NASA Astrophysics Data System (ADS)

    Grupp, R. B.; Chiang, H.; Otake, Y.; Murphy, R. J.; Gordon, C. R.; Armand, M.; Taylor, R. H.

    2015-03-01

    Several methods to perform extrapolation of unknown anatomy were evaluated. The primary application is to enhance surgical procedures that may use partial medical images or medical images of incomplete anatomy. Le Fort-based face-jaw-teeth transplant is one such procedure. From CT data of 36 skulls and 21 mandibles, separate Statistical Shape Models of the anatomical surfaces were created. Using the Statistical Shape Models, incomplete surfaces were projected to obtain complete surface estimates. The surface estimates exhibit non-zero error in regions where the true surface is known; it is desirable to keep the true surface and seamlessly merge in the estimated unknown surface. Existing extrapolation techniques produce non-smooth transitions from the true surface to the estimated surface, resulting in additional error and a less aesthetically pleasing result. The three extrapolation techniques evaluated were: copying and pasting of the surface estimate (non-smooth baseline), feathering between the patient surface and the surface estimate, and an estimate generated via a Thin Plate Spline trained from displacements between the surface estimate and corresponding vertices of the known patient surface. The feathering and Thin Plate Spline approaches both yielded smooth transitions; however, feathering corrupted known vertex values. Leave-one-out analyses were conducted, with 5% to 50% of known anatomy removed from the left-out patient and estimated via the proposed approaches. The Thin Plate Spline approach yielded smaller errors than the other two approaches, with an average vertex error improvement of 1.46 mm and 1.38 mm for the skull and mandible, respectively, over the baseline approach.
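    A Thin Plate Spline of the kind used for the smooth-extrapolation step can be sketched in 2-D. This is a generic TPS formulation with the usual U(r) = r² log r kernel plus an affine part, not the authors' 3-D implementation; the control points and values are invented.

    ```python
    import numpy as np

    def tps_fit(pts, vals, reg=0.0):
        """Fit a 2-D thin plate spline z(x, y) through scattered points,
        solving kernel weights and affine terms as one linear system."""
        pts = np.asarray(pts, float)
        n = len(pts)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            K = np.where(d > 0, d ** 2 * np.log(d), 0.0)  # U(r) = r^2 log r
        K += reg * np.eye(n)  # reg > 0 trades exactness for smoothness
        P = np.column_stack([np.ones(n), pts])
        A = np.zeros((n + 3, n + 3))
        A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
        b = np.concatenate([np.asarray(vals, float), np.zeros(3)])
        w = np.linalg.solve(A, b)
        return pts, w

    def tps_eval(model, q):
        """Evaluate the fitted spline at query points q (m x 2)."""
        pts, w = model
        q = np.atleast_2d(np.asarray(q, float))
        d = np.linalg.norm(q[:, None, :] - pts[None, :, :], axis=-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            U = np.where(d > 0, d ** 2 * np.log(d), 0.0)
        return U @ w[:len(pts)] + w[len(pts)] + q @ w[len(pts) + 1:]

    # Displacements at known vertices; the spline passes through them exactly
    # (reg=0) and extrapolates smoothly in between and beyond.
    pts = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.2)]
    vals = [0.0, 1.0, 1.0, 2.0, 0.7]
    model = tps_fit(pts, vals)
    print(float(tps_eval(model, [(1, 1)])[0]))  # ~2.0, up to solver precision
    ```
    
    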

  4. A Hessian-based methodology for automatic surface crack detection and classification from pavement images

    NASA Astrophysics Data System (ADS)

    Ghanta, Sindhu; Shahini Shamsabadi, Salar; Dy, Jennifer; Wang, Ming; Birken, Ralf

    2015-04-01

    Around 3 trillion vehicle miles are traveled annually on the US transportation system alone. In addition to road traffic safety, maintaining the road infrastructure in a sound condition promotes a more productive and competitive economy. Because of the significant financial and human resources required to detect surface cracks by visual inspection, detection of these surface defects is often delayed, resulting in deferred maintenance operations. This paper introduces an automatic system for acquisition, detection, classification, and evaluation of pavement surface cracks by unsupervised analysis of images collected from a camera mounted on the rear of a moving vehicle. A Hessian-based multi-scale filter is utilized to detect ridges in these images at various scales. Post-processing of the extracted features produces statistics of the length, width, and area covered by cracks, which are crucial for roadway agencies to assess pavement quality. The process was applied to three sets of roads with different pavement conditions in the city of Brockton, MA. A manually labeled ground truth dataset is made available to evaluate the algorithm, and the results showed more than 90% segmentation accuracy, demonstrating the feasibility of employing this approach at a larger scale.
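    The core of a Hessian-based ridge filter can be shown at a single scale on a synthetic crack image; the paper's multi-scale filtering, smoothing, and post-processing are omitted, and the finite-difference Hessian below is only a minimal stand-in.

    ```python
    import numpy as np

    def ridge_response(img):
        """Largest-magnitude Hessian eigenvalue at each pixel. A dark crack on
        bright pavement yields a large positive eigenvalue across the crack and
        a near-zero one along it (single scale; no Gaussian pre-smoothing)."""
        gy, gx = np.gradient(img.astype(float))
        gyy, gyx = np.gradient(gy)
        gxy, gxx = np.gradient(gx)
        # Eigenvalues of the symmetric 2x2 Hessian [[gxx, gxy], [gxy, gyy]].
        tr = gxx + gyy
        det = gxx * gyy - gxy * gxy
        disc = np.sqrt(np.maximum(tr * tr / 4 - det, 0.0))
        l1, l2 = tr / 2 + disc, tr / 2 - disc
        return np.where(np.abs(l1) >= np.abs(l2), l1, l2)

    # Synthetic image: bright background with one dark vertical "crack".
    img = np.full((32, 32), 200.0)
    img[:, 15] = 50.0
    resp = ridge_response(img)
    print(resp[16, 15] > resp[16, 5])  # True: strong response on the crack
    ```
    
    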

  5. A pediatric brain structure atlas from T1-weighted MR images

    NASA Astrophysics Data System (ADS)

    Shan, Zuyao Y.; Parra, Carlos; Ji, Qing; Ogg, Robert J.; Zhang, Yong; Laningham, Fred H.; Reddick, Wilburn E.

    2006-03-01

    In this paper, we develop a digital atlas of the pediatric human brain. Human brain atlases, used to visualize spatially complex structures of the brain, are indispensable tools in model-based segmentation and quantitative analysis of brain structures. However, adult brain atlases do not adequately represent the normal maturational patterns of the pediatric brain, and the use of an adult model in pediatric studies may introduce substantial bias. The atlas was therefore constructed from a T1-weighted MR data set of a 9-year-old, right-handed girl. Furthermore, we extracted and simplified the boundary surfaces of 25 manually defined brain structures (cortical and subcortical) based on surface curvature: higher-curvature surfaces were simplified with more reference points, lower-curvature surfaces with fewer. We constructed a 3D triangular mesh model for each structure by triangulation of the structure's reference points. Kappa statistics (cortical, 0.97; subcortical, 0.91) indicated substantial similarities between the mesh-defined and the original volumes. Our brain atlas and structural mesh models (www.stjude.org/BrainAtlas) can be used to plan treatment, to conduct knowledge- and model-driven segmentation, and to analyze the shapes of brain structures in pediatric patients.

  6. Statistical downscaling of GCM simulations to streamflow using relevance vector machine

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Mujumdar, P. P.

    2008-01-01

    General circulation models (GCMs), the climate models often used in assessing the impact of climate change, operate on a coarse scale, and thus the simulation results obtained from GCMs are not particularly useful for hydrology at the comparatively smaller river-basin scale. This article presents a methodology for statistical downscaling based on sparse Bayesian learning and the Relevance Vector Machine (RVM) to model streamflow at the river-basin scale for the monsoon period (June, July, August, September) using GCM-simulated climatic variables. NCEP/NCAR reanalysis data have been used for training the model to establish a statistical relationship between streamflow and climatic variables. The relationship thus obtained is used to project future streamflow from GCM simulations. The statistical methodology involves principal component analysis, fuzzy clustering, and RVM; different kernel functions are used for comparison. The model is applied to the Mahanadi river basin in India. The results obtained using the RVM are compared with those of the state-of-the-art Support Vector Machine (SVM) to present the advantages of RVMs over SVMs. A decreasing trend is observed for the monsoon streamflow of the Mahanadi, due to high surface warming in the future, under the CCSR/NIES GCM and the B2 scenario.
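    The principal component analysis step of the downscaling chain can be sketched with an SVD on synthetic gridded predictors; the data shape and the injected shared mode of variability below are invented for illustration.

    ```python
    import numpy as np

    # PCA of gridded climate predictors (rows = time steps, columns = grid-point
    # variables): the dimension-reduction step that precedes clustering and the
    # RVM regression in the downscaling chain. Synthetic data for illustration.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 40))           # e.g. 120 months x 40 grid values
    X[:, :20] += rng.normal(size=(120, 1))   # one shared mode of variability

    Xc = X - X.mean(axis=0)                  # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)      # variance fraction per component
    pcs = Xc @ Vt.T                          # principal-component time series

    print(explained[0] > explained[1], pcs.shape)
    ```
    
    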

  7. Spatial distribution and ecological risk assessment of heavy metal on surface sediment in west part of Java Sea

    NASA Astrophysics Data System (ADS)

    Effendi, Hefni; Wardiatno, Yusli; Kawaroe, Mujizat; Mursalin; Fauzia Lestari, Dea

    2017-01-01

    Surface sediments from the west part of the Java Sea were analyzed to evaluate the spatial distribution and potential ecological risk of heavy metals (Hg, As, Cd, Cr, Cu, Pb, Zn and Ni). The samples were taken from surface sediment (<0.5 m) at water depths of 26 m to 80 m with an Ekman grab. The average material composition of the sediment samples was clay (9.86%), sand (8.57%) and mud sand (81.57%). The analysis showed that Pb (11.2%), Cd (49.7%), and Ni (59.5%) exceeded the Probable Effect Level (PEL). Based on the ecological risk analysis, Cd (E_r^i = 300.64) and Cr (E_r^i = 0.02) fell into the high-risk and low-risk criteria, respectively. The ecological risk potential sequence in this study was Cd>Hg>Pb>Ni>Cu>As>Zn>Cr. Furthermore, multivariate statistical analysis shows that the correlations among heavy metals (As/Ni, Cd/Ni, and Cu/Zn) and between heavy metals and the Risk Index (Cd/RI and Ni/RI) were positive at the significance level p<0.05. The total variance of the factor analysis was 80.04%, resolved into 3 factors (eigenvalues >1). In the cluster analysis, Cd, Ni and Pb were identified as having a fairly high contamination level (cluster 1), Hg a moderate contamination level (cluster 2), and Cu, Zn and Cr a lower contamination level (cluster 3).
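    The E_r^i values above follow Hakanson-style potential-ecological-risk bookkeeping: a contamination factor C_f^i = C^i / C_n^i against a background value, scaled by a toxic-response factor, E_r^i = T_r^i * C_f^i, and summed into the risk index RI. A sketch with illustrative toxic-response factors, concentrations, and background levels (not the study's values):

    ```python
    # Toxic-response factors T_r per metal (commonly cited Hakanson-style
    # values; treat as illustrative, not as this study's parameters).
    toxic_response = {"Cd": 30, "Hg": 40, "Pb": 5, "Ni": 5, "Cu": 5, "Zn": 1, "Cr": 2}

    def risk_index(measured, background):
        """Per-metal potential ecological risk E_r = T_r * C / C_n,
        and their sum, the risk index RI."""
        e_r = {m: toxic_response[m] * measured[m] / background[m] for m in measured}
        return e_r, sum(e_r.values())

    # Hypothetical concentrations and backgrounds (mg/kg) for two metals.
    e_r, ri = risk_index({"Cd": 3.0, "Cr": 0.9}, {"Cd": 0.3, "Cr": 90.0})
    print(round(e_r["Cd"], 2), round(e_r["Cr"], 3))  # 300.0 0.02
    ```
    
    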

  8. Statistical characteristics of polar lows over the Nordic Seas based on satellite passive microwave data

    NASA Astrophysics Data System (ADS)

    Smirnova, J. E.; Zabolotskikh, E. V.; Bobylev, L. P.; Chapron, B.

    2016-12-01

    In this study polar lows over the Nordic Seas for the period of 1995-2008 have been detected and studied using the Special Sensor Microwave Imager (SSM/I) data. A new methodology for polar low detection and monitoring based on the analysis of the total atmospheric water vapor content (WVC) fields retrieved from SSM/I was used. Lifetimes, diameters, translation speeds, distances traveled, and intensities were estimated for the detected polar lows using SSM/I WVC, sea surface wind speed fields and infrared imagery. Over the Norwegian and Barents Seas, the polar low activity was found to be almost equal. A positive tendency in the total number of polar lows for the time period of 1995-2008 was detected.

  9. A general scientific information system to support the study of climate-related data

    NASA Technical Reports Server (NTRS)

    Treinish, L. A.

    1984-01-01

    The development and use of NASA's Pilot Climate Data System (PCDS) are discussed. The PCDS is used as a focal point for managing and providing access to a large collection of actively used data for the Earth, ocean and atmospheric sciences. The PCDS provides uniform data catalogs, inventories, and access methods for selected NASA and non-NASA data sets. Scientific users can preview the data sets using graphical and statistical methods. The system has evolved from its original purpose as a climate database management system, in response to a national climate program, into an extensive package of capabilities to support many types of data sets from both spaceborne and surface-based measurements, with flexible data selection and analysis functions.

  10. Active Structural Acoustic Control as an Approach to Acoustic Optimization of Lightweight Structures

    DTIC Science & Technology

    2001-06-01

    appropriate approach based on Statistical Energy Analysis (SEA) would facilitate investigations of the structural behavior at a high modal density. On the way...higher frequency investigations an approach based on the Statistical Energy Analysis (SEA) is recommended to describe the structural dynamic behavior

  11. Some regularity on how to locate electrodes for higher fECG SNRs

    NASA Astrophysics Data System (ADS)

    Zhang, Jie-Min; Huang, Xiao-Lin; Guan, Qun; Liu, Tie-Bing; Li, Ping; Zhao, Ying; Liu, Hong-Xing

    2015-03-01

    The electrocardiogram (ECG) recorded from the abdominal surface of a pregnant woman is a composite of the maternal ECG, the fetal ECG (fECG) and other noise, while only the fECG component is needed. With different locations of the electrode pairs on the maternal abdominal surface, the signal-to-noise ratios (SNRs) of the recorded abdominal ECGs also differ correspondingly. Some regularity on how to locate the electrodes to obtain higher fECG SNRs is needed in practice. In this paper, 343 groups of abdominal ECG records were acquired from 78 pregnant women with different electrode pair locations, forming an appropriately extended research database. The regularity of fECG SNRs corresponding to the different electrode pair locations was then studied. Based on statistical analysis, it is shown that the fECG SNRs are significantly higher at certain locations than at others. A reasonable explanation of the statistical result is also provided using the theories of the fetal cardiac electrical axis and signal phase delay. Project supported by the National Natural Science Foundation of China (Grant No. 61271079) and the Supporting Plan Project of Jiangsu Province, China (Grant No. BE2010720).
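    SNR comparisons of this kind reduce to a power ratio expressed in decibels. A minimal sketch with invented fECG and residual (maternal ECG plus noise) traces, purely to show the computation:

    ```python
    import numpy as np

    def snr_db(signal, noise):
        """SNR in dB from the mean-power ratio: 10 * log10(P_signal / P_noise)."""
        p_s = np.mean(np.square(signal))
        p_n = np.mean(np.square(noise))
        return 10.0 * np.log10(p_s / p_n)

    # Hypothetical decomposition of one abdominal channel into an fECG
    # component and a residual; values are invented for illustration.
    fecg = np.array([0.0, 0.2, -0.1, 0.3, -0.2])
    residual = np.array([0.01, -0.02, 0.02, -0.01, 0.01])
    print(round(snr_db(fecg, residual), 1))
    ```
    
    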

  12. Improved cellulase production by Botryosphaeria rhodina from OPEFB at low level moisture condition through statistical optimization.

    PubMed

    Bahrin, E K; Ibrahim, M F; Abd Razak, M N; Abd-Aziz, S; Shah, U K Md; Alitheen, N; Salleh, M Md

    2012-01-01

    The response surface method was applied in this study to improve cellulase production from oil palm empty fruit bunch (OPEFB) by Botryosphaeria rhodina. An experimental design based on a two-level factorial was employed to screen the environmental factors significant for cellulase production. The locally isolated fungus Botryosphaeria rhodina was cultivated on OPEFB under solid-state fermentation (SSF). From the analysis of variance (ANOVA), the initial moisture content, amount of substrate, and initial pH of the nutrient supplied in the SSF system significantly influenced cellulase production. The variables were then optimized using the response surface method according to a central composite design (CCD). Botryosphaeria rhodina exhibited its best performance, with a high predicted FPase production (17.95 U/g), when the initial moisture content was 24.32%, the initial pH of the nutrient was 5.96, and 3.98 g of substrate was present. The statistical optimization resulted in a significant increase in FPase production in the actual experiment, from 3.26 to 17.91 U/g (5.49-fold). High cellulase production at low moisture content is a very rare condition for fungi cultured in solid-state fermentation.
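    The central composite design mentioned above augments a two-level factorial with axial (star) and center points. A sketch of the coded design points for a generic CCD (not the study's exact design or factor levels):

    ```python
    import itertools
    import numpy as np

    def ccd_points(k, alpha=None):
        """Coded design points of a central composite design for k factors:
        2^k factorial corners, 2k axial points at +/-alpha, and one center
        point. alpha defaults to the rotatable choice (2^k)^(1/4)."""
        if alpha is None:
            alpha = (2 ** k) ** 0.25
        corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
        axial = []
        for i in range(k):
            for a in (-alpha, alpha):
                pt = [0.0] * k
                pt[i] = a
                axial.append(pt)
        return np.array(corners + axial + [[0.0] * k])

    pts = ccd_points(3)
    print(len(pts))  # 8 corners + 6 axial + 1 center = 15 runs
    ```

    A quadratic response surface is then fitted over these runs and optimized, which is how the predicted optimum (moisture, pH, substrate amount) is obtained.
    
    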

  13. Analysis of surface sputtering on a quantum statistical basis

    NASA Technical Reports Server (NTRS)

    Wilhelm, H. E.

    1975-01-01

    Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. At low ion energies, the theoretical sputtering ratio S(E) is proportional to the square of the difference between the incident ion energy and the threshold energy for sputtering of surface atoms. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low-energy region from the threshold sputtering energy to 120 eV above the respective threshold. Theory and experiment are shown to be in good agreement.
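    The low-energy behavior stated above can be written directly as a formula, S(E) ≈ k (E − E_th)² for E ≥ E_th; the constant k and the threshold E_th below are placeholders, not values from the paper.

    ```python
    def sputtering_ratio(E, E_th, k):
        """Quadratic low-energy sputtering law: S(E) = k * (E - E_th)^2 above
        threshold, zero below (the saturation/decline at high E is omitted)."""
        return k * (E - E_th) ** 2 if E >= E_th else 0.0

    # Illustrative numbers only (k and E_th are not given in the abstract).
    print(round(sputtering_ratio(60.0, 30.0, 1e-4), 4))  # 0.09
    ```
    
    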

  14. Shear bond strength in zirconia veneered ceramics using two different surface treatments prior veneering.

    PubMed

    Gasparić, Lana Bergman; Schauperl, Zdravko; Mehulić, Ketij

    2013-03-01

    The aim of the study was to assess the effect of different surface treatments on the shear bond strength (SBS) of veneering ceramics to a zirconia core. In a shear test, the influence of grinding and sandblasting of the zirconia surface on bonding was assessed. Statistical analysis was performed using the SPSS statistical package (version 17.0, SPSS Inc., Chicago, IL, USA) and Microsoft Office Excel 2003 (Microsoft, Seattle, WA, USA). There was a significant difference between the groups in shear bond strength (SBS) values, i.e. ground-and-sandblasted samples had significantly higher SBS values than ground-only samples (mean difference = -190.67; df = 10, t = -6.386, p < 0.001). The results of the present study indicate that ground-and-sandblasted cores are superior to ground-only cores, providing significantly higher surface roughness and significantly higher shear bond strength between the core and the veneering material.

  15. A Comparison of the Effects of Electrode Implantation and Targeting on Pattern Classification Accuracy for Prosthesis Control

    PubMed Central

    Farrell, Todd R.; Weir, Richard F. ff.

    2011-01-01

    The use of surface versus intramuscular electrodes as well as the effect of electrode targeting on pattern-recognition-based multifunctional prosthesis control was explored. Surface electrodes are touted for their ability to record activity from relatively large portions of muscle tissue. Intramuscular electromyograms (EMGs) can provide focal recordings from deep muscles of the forearm and independent signals relatively free of crosstalk. However, little work has been done to compare the two. Additionally, while previous investigations have either targeted electrodes to specific muscles or used untargeted (symmetric) electrode arrays, no work has compared these approaches to determine if one is superior. The classification accuracies of pattern-recognition-based classifiers utilizing surface and intramuscular as well as targeted and untargeted electrodes were compared across 11 subjects. A repeated-measures analysis of variance revealed that when only EMG amplitude information was used from all available EMG channels, the targeted surface, targeted intramuscular, and untargeted surface electrodes produced similar classification accuracies while the untargeted intramuscular electrodes produced significantly lower accuracies. However, no statistical differences were observed between any of the electrode conditions when additional features were extracted from the EMG signal. It was concluded that the choice of electrode should be driven by clinical factors, such as signal robustness/stability, cost, etc., instead of by classification accuracy. PMID:18713689

  16. A New Approach to Galaxy Morphology. I. Analysis of the Sloan Digital Sky Survey Early Data Release

    NASA Astrophysics Data System (ADS)

    Abraham, Roberto G.; van den Bergh, Sidney; Nair, Preethi

    2003-05-01

    In this paper we present a new statistic for quantifying galaxy morphology based on measurements of the Gini coefficient of galaxy light distributions. This statistic is easy to measure and is commonly used in econometrics to measure how wealth is distributed in human populations. When applied to galaxy images, the Gini coefficient provides a quantitative measure of the inequality with which a galaxy's light is distributed among its constituent pixels. We measure the Gini coefficient of local galaxies in the Early Data Release of the Sloan Digital Sky Survey and demonstrate that this quantity is closely correlated with measurements of central concentration, but with significant scatter. This scatter is almost entirely due to variations in the mean surface brightness of galaxies. By exploring the distribution of galaxies in the three-dimensional parameter space defined by the Gini coefficient, central concentration, and mean surface brightness, we show that all nearby galaxies lie on a well-defined two-dimensional surface (a slightly warped plane) embedded within a three-dimensional parameter space. By associating each galaxy sample with the equation of this plane, we can encode the morphological composition of the entire SDSS g*-band sample using the following three numbers: {22.451, 5.366, 7.010}. The i*-band sample is encoded as {22.149, 5.373, and 7.627}.
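As a concrete illustration (not the authors' pipeline), the Gini coefficient of an image's pixel values can be computed with the standard sorted-value formula:

```python
def gini(values):
    """Gini coefficient of a set of non-negative values
    (here, pixel intensities of a galaxy image)."""
    x = sorted(values)
    n = len(x)
    total = sum(x)
    if total == 0:
        return 0.0
    # Standard mean-difference formulation on sorted values
    return sum((2 * i - n - 1) * v for i, v in enumerate(x, 1)) / (n * total)

# Perfectly uniform light distribution -> G = 0
assert gini([1.0] * 100) == 0.0
# All light concentrated in one pixel -> G approaches 1
assert gini([0.0] * 99 + [1.0]) > 0.98
```

A value near 0 means the galaxy's light is spread evenly over its pixels; a value near 1 means it is concentrated in a few pixels.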

  17. Quantitative estimation of global patterns of surface ocean biological productivity and its seasonal variation on timescales from centuries to millennia

    NASA Astrophysics Data System (ADS)

    Loubere, Paul; Fariduddin, Mohammad

    1999-03-01

    We present a quantitative method, based on the relative abundances of benthic foraminifera in deep-sea sediments, for estimating surface ocean biological productivity on timescales of centuries to millennia. We calibrate the method using a global data set composed of 207 samples from the Atlantic, Pacific, and Indian Oceans from a water depth range between 2300 and 3600 m. The sample set was developed so that other, potentially significant, environmental variables would be uncorrelated with overlying surface ocean productivity. A regression of assemblages against productivity yielded an r2 = 0.89, demonstrating a strong productivity signal in the faunal data. In addition, we examined assemblage response to annual variability in biological productivity (seasonality). Our data set included a range of seasonalities, which we quantified into a seasonality index using the pigment color bands from the coastal zone color scanner (CZCS). The response of benthic foraminiferal assemblage composition to our seasonality index was tested with regression analysis. We obtained a statistically highly significant r2 = 0.75. Further, discriminant function analysis revealed a clear separation among sample groups based on surface ocean productivity and our seasonality index. Finally, we tested the response of benthic foraminiferal assemblages to three different modes of seasonality. We observed a distinct separation of our samples into groups representing low seasonal variability, strong seasonality with a single main productivity event in the year, and strong seasonality with multiple productivity events in the year. Reconstructing surface ocean biological productivity with benthic foraminifera will aid in modeling marine biogeochemical cycles. Also, estimating the mode and range of annual seasonality will provide insight into changing oceanic processes, allowing the examination of the mechanisms causing changes in the marine biotic system over time.
This article contains supplementary material.
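The reported r2 values (0.89 and 0.75) are coefficients of determination from regressing productivity on assemblage data. A minimal sketch of that statistic, computed generically (the synthetic numbers below are illustrative, not the study's):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Perfect prediction gives R^2 = 1
assert r_squared([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 1.0
```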

  18. Assessment of climate variability of the Greenland Ice Sheet: Integration of in situ and satellite data

    NASA Technical Reports Server (NTRS)

    Steffen, K.; Abdalati, W.; Stroeve, J.; Key, J.

    1994-01-01

    The proposed research involves the application of multispectral satellite data in combination with ground truth measurements to monitor surface properties of the Greenland Ice Sheet which are essential for describing the energy and mass of the ice sheet. Several key components of the energy balance are parameterized using satellite data and in situ measurements. The analysis will be done for a ten year time period in order to get statistics on the seasonal and interannual variations of the surface processes and the climatology. Our goal is to investigate to what accuracy and over what geographic areas large scale snow properties and radiative fluxes can be derived based upon a combination of available remote sensing and meteorological data sets. Operational satellite sensors are calibrated based on ground measurements and atmospheric modeling prior to large scale analysis to ensure the quality of the satellite data. Further, several satellite sensors of different spatial and spectral resolution are intercompared to assess the parameter accuracy. Proposed parameterization schemes to derive key components of the energy balance from satellite data are validated. For the understanding of the surface processes, a field program was designed to collect information on spectral albedo, specular reflectance, soot content, grain size and the physical properties of different snow types. Further, the radiative and turbulent fluxes at the ice/snow surface are monitored for the parameterization and interpretation of the satellite data. The expected results include several baseline data sets of albedo, surface temperature, radiative fluxes, and different snow types of the entire Greenland Ice Sheet. These climatological data sets will be of potential use for climate sensitivity studies in the context of future climate change.

  19. Effect of Various Laser Surface Treatments on Repair Shear Bond Strength of Aged Silorane-Based Composite

    PubMed Central

    Alizadeh Oskoee, Parnian; Savadi Oskoee, Siavash; Rikhtegaran, Sahand; Pournaghi-Azar, Fatemeh; Gholizadeh, Sarah; Aleyasin, Yasaman; Kasrae, Shahin

    2017-01-01

    Introduction: Successful repair of composite restorations depends on a strong bond between the old composite and the repair composite. This study sought to assess the repair shear bond strength of aged silorane-based composite following surface treatment with Nd:YAG, Er,Cr:YSGG and CO2 lasers. Methods: Seventy-six Filtek Silorane composite cylinders were fabricated and aged by 2 months of water storage at 37°C. The samples were randomly divided into 4 groups (n=19) of no surface treatment (group 1) and surface treatment with Er,Cr:YSGG (group 2), Nd:YAG (group 3) and CO2 (group 4) lasers. The repair composite was applied and the shear bond strength was measured. The data were analyzed using one-way analysis of variance (ANOVA) and the Tukey post hoc test. Prior to the application of the repair composite, 2 samples were randomly selected from each group and topographic changes on their surfaces following laser irradiation were studied using a scanning electron microscope (SEM). Seventeen other samples were also fabricated for assessment of the cohesive strength of the composite. Results: The highest and the lowest mean bond strength values were 8.99 MPa and 6.69 MPa for the Er,Cr:YSGG and control groups, respectively. The difference in repair bond strength was statistically significant between the Er,Cr:YSGG and the other groups. Bond strength of the control, Nd:YAG and CO2 groups was not significantly different. The SEM micrographs revealed variable degrees of ablation and surface roughness in the laser-treated groups. Conclusion: Surface treatment with the Er,Cr:YSGG laser significantly increases the repair bond strength of aged silorane-based composite resin. PMID:29071025

  20. Influence of surface treatment on the in-vitro fracture resistance of zirconia-based all-ceramic anterior crowns.

    PubMed

    Schmitter, M; Lotze, G; Bömicke, W; Rues, S

    2015-12-01

    The purpose of this study was to assess the effect of surface treatment on the fracture resistance of zirconia-based all-ceramic anterior crowns. Sixty-four zirconia-based all-ceramic anterior crowns, veneered by use of a press-on technique, were produced. For 48 crowns intraoral adjustment was simulated (A-group), 16 crowns remained unadjusted (WA-group). The adjusted area was then treated in three ways: 1. no further surface treatment; 2. polishing, with irrigation, using polishers interspersed with diamond grit for ceramics; and 3. polishing and glaze firing. Half of the specimens were loaded until fracture in a universal testing device without artificial ageing; the other crowns underwent thermocycling and chewing simulation before ultimate-load testing. Explorative statistical analysis was performed by use of non-parametric and parametric tests. In addition, fracture-strength tests according to ISO 6872 were performed for veneer ceramic subjected to the different surface treatments. Finite element analysis was also conducted for the crowns, and surface roughness was measured. Crowns in the A-group were more sensitive to aging than crowns in the WA-group (p=0.038). Although both polishing and glaze firing slightly improved the fracture resistance of the specimens, the fracture resistance in the WA-group (initial fracture resistance (IFR): 652.0 ± 107.7N, remaining fracture resistance after aging (RFR): 560.6 ± 233.3N) was higher than the fracture resistance in the A-group (polished: IFR: 477.9 ± 108.8N, RFR: 386.0 ± 218.5N; glaze firing: IFR: 535.5 ± 128.0N, RFR: 388.6 ± 202.2N). Surface roughness without adjustment was Ra=0.1 μm; for adjustment but without further treatment it was Ra=1.4 μm; for adjustment and polishing it was Ra=0.3 μm; and for adjustment, polishing, and glazing it was Ra=0.6 μm.
Stress distributions obtained by finite element analysis in combination with fracture strength tests showed that fractures most probably originated from the occlusal surface. To improve fracture resistance and reduce the incidence of failure, extensive occlusal adjustment of veneered anterior zirconia restorations should be avoided. Neither polishing nor glazing could restore the fracture resistance to the level maintained with unadjusted crowns. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  1. Deriving inertial wave characteristics from surface drifter velocities: Frequency variability in the Tropical Pacific

    NASA Astrophysics Data System (ADS)

    Poulain, Pierre-Marie; Luther, Douglas S.; Patzert, William C.

    1992-11-01

    Two techniques have been developed for estimating statistics of inertial oscillations from satellite-tracked drifters. These techniques overcome the difficulties inherent in estimating such statistics from data dependent upon space coordinates that are a function of time. Application of these techniques to tropical surface drifter data collected during the NORPAX, EPOCS, and TOGA programs reveals a latitude-dependent, statistically significant "blue shift" of inertial wave frequency. The latitudinal dependence of the blue shift is similar to predictions based on "global" internal wave spectral models, with a superposition of frequency shifting due to modification of the effective local inertial frequency by the presence of strongly sheared zonal mean currents within 12° of the equator.
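The "blue shift" described above is measured relative to the local inertial frequency f = 2Ω sin(latitude). A minimal sketch of that reference quantity (the shift itself would be estimated from the drifter spectra, which this does not attempt):

```python
import math

OMEGA = 7.2921e-5  # Earth's rotation rate (rad/s)

def inertial_period_hours(lat_deg):
    """Local inertial period 2*pi/f, where f = 2*Omega*sin(latitude)."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    return 2.0 * math.pi / f / 3600.0

# At 30 deg latitude, f = Omega, so the period is ~23.93 h (one sidereal day);
# it lengthens toward the equator, where f vanishes.
print(round(inertial_period_hours(30.0), 2))
```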

  2. Removal of antibiotics in a parallel-plate thin-film-photocatalytic reactor: Process modeling and evolution of transformation by-products and toxicity.

    PubMed

    Özkal, Can Burak; Frontistis, Zacharias; Antonopoulou, Maria; Konstantinou, Ioannis; Mantzavinos, Dionissios; Meriç, Süreyya

    2017-10-01

    Photocatalytic degradation of the antibiotic sulfamethoxazole (SMX) has been studied under recycling batch and homogeneous flow conditions in a thin-film-coated immobilized system, namely a parallel-plate (PPL) reactor. The experiments were designed and statistically evaluated with a factorial design (FD) approach, with the intent of providing a mathematical model that takes into account the parameters influencing process performance. Initial antibiotic concentration, UV energy level, irradiated surface area, water matrix (ultrapure and secondary treated wastewater), and time were defined as model parameters. A full 2^5 experimental design consisted of 32 random experiments. PPL reactor test experiments were carried out in order to set boundary levels for the hydraulic, volumetric, and defined process parameters. TTIP-based thin films with polyethylene glycol + TiO2 additives were fabricated according to the previously described methodology. Antibiotic degradation was monitored by high-performance liquid chromatography analysis, while the degradation products were identified by LC-TOF-MS analysis. Acute toxicity of untreated and treated SMX solutions was tested by the standard Daphnia magna method. Based on the obtained mathematical model, the response of the immobilized PC system is described by a polynomial equation. The statistically significant positive effects are initial SMX concentration, process time, and the combined effect of both, while the combined effect of water matrix and irradiated surface area displays an adverse effect on the rate of antibiotic degradation by photocatalytic oxidation. Process efficiency and the validity of the acquired mathematical model were also verified for the antibiotics levofloxacin and cefaclor. Immobilized PC degradation in the PPL reactor configuration was found capable of providing reduced effluent toxicity by simultaneous degradation of the SMX parent compound and TBPs. Copyright © 2017. Published by Elsevier B.V.
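A full 2^5 factorial design enumerates every low/high combination of the five parameters, giving the 32 runs mentioned in the abstract. A sketch of that enumeration (factor names abbreviate the abstract's parameters; the coded -1/+1 levels are placeholders, not the study's settings):

```python
from itertools import product

# Five factors from the abstract, each at a low (-1) and high (+1) coded level
factors = ["SMX_conc", "UV_energy", "irradiated_area", "water_matrix", "time"]

design = [dict(zip(factors, levels))
          for levels in product((-1, +1), repeat=len(factors))]

print(len(design))  # 2**5 = 32 runs
```

In practice the 32 runs would then be executed in randomized order, as the abstract notes.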

  3. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-11-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  4. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.

    2016-12-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR; http://car.gsfc.nasa.gov/). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  5. PolarBRDF: A General Purpose Python Package for Visualization and Quantitative Analysis of Multi-Angular Remote Sensing Measurements

    NASA Technical Reports Server (NTRS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-01-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  6. Reassessment of urbanization effect on surface air temperature trends at an urban station of North China

    NASA Astrophysics Data System (ADS)

    Bian, Tao; Ren, Guoyu

    2017-11-01

    Based on a homogenized data set of monthly mean temperature, minimum temperature, and maximum temperature at Shijiazhuang City Meteorological Station (Shijiazhuang station) and four rural meteorological stations selected by applying a more sophisticated methodology, we reanalyzed the urbanization effects on annual, seasonal, and monthly mean surface air temperature (SAT) trends for the updated time period 1960-2012 at the typical urban station in North China. The results showed that (1) urbanization effects on the long-term trends of annual mean SAT, minimum SAT, and diurnal temperature range (DTR) in the last 53 years reached 0.25, 0.47, and -0.50 °C/decade, respectively, all statistically significant at the 0.001 significance level, with the contributions from urbanization effects to the overall long-term trends reaching 67.8, 78.6, and 100%, respectively; (2) the urbanization effects on the trends of seasonal mean SAT, minimum SAT, and DTR were also large and statistically highly significant. Except for November and December, the urbanization effects on monthly mean SAT, minimum SAT, and DTR were also all statistically significant at the 0.05 significance level; and (3) the annual, seasonal, and monthly mean maximum SAT series at the urban station registered a generally weaker and non-significant urbanization effect. The updated analysis showed that our previous work for this same urban station had underestimated the urbanization effect and its contribution to the overall changes in the SAT series. Many similar urban stations are included in current national and regional SAT data sets, and the results of this paper further indicate the importance and urgency of paying more attention to the urbanization bias in the monitoring and detection of global and regional SAT change based on these data sets.
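The urbanization effect is conventionally estimated as the urban-minus-rural trend difference, and its contribution as that difference divided by the overall urban trend. A sketch with synthetic series (not the Shijiazhuang data), using an ordinary least-squares slope:

```python
def linear_trend_per_decade(years, values):
    """Ordinary least-squares slope of values on years, expressed per decade."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    slope = (sum((y - my) * (v - mv) for y, v in zip(years, values))
             / sum((y - my) ** 2 for y in years))
    return slope * 10.0

# Illustrative series: urban warming 0.30 C/decade, rural reference 0.10 C/decade
years = list(range(1960, 2013))
urban = [0.03 * (y - 1960) for y in years]
rural = [0.01 * (y - 1960) for y in years]

# Urbanization effect and its percentage contribution to the urban trend
effect = linear_trend_per_decade(years, urban) - linear_trend_per_decade(years, rural)
contribution = effect / linear_trend_per_decade(years, urban) * 100.0
```

For these synthetic series the effect is 0.2 °C/decade, a contribution of about 67% to the urban trend.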

  7. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
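One simple non-Gaussian choice for data bounded in (0, 1), such as methylation beta values, is the Beta distribution (the abstract does not name its specific models, so this is purely illustrative); a method-of-moments fit takes only a few lines:

```python
def beta_method_of_moments(mean, var):
    """Method-of-moments estimates (alpha, beta) for a Beta distribution,
    a natural model for values bounded in (0, 1)."""
    assert 0.0 < mean < 1.0 and 0.0 < var < mean * (1.0 - mean)
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

# A sample with mean 0.2 and variance 0.01 maps to Beta(3, 12)
a, b = beta_method_of_moments(0.2, 0.01)
```

Unlike a Gaussian, the fitted Beta density has support exactly on (0, 1), matching the bounded nature of methylation data highlighted in the abstract.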

  8. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  9. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
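The elliptical isodiscrimination contours imply a quadratic combination rule for thresholds along mixed directions. A minimal sketch of that prediction, assuming an axis-aligned ellipse whose semi-axes are the per-statistic thresholds (the fitted ellipses in the study need not be axis-aligned):

```python
import math

def combined_threshold(t1, t2, w1, w2):
    """Predicted threshold along a mixed direction (w1, w2) when the
    isodiscrimination contour is an axis-aligned ellipse with semi-axes
    t1 and t2: 1/t^2 = (w1/t1)^2 + (w2/t2)^2 for a unit direction."""
    norm = math.hypot(w1, w2)
    w1, w2 = w1 / norm, w2 / norm
    return 1.0 / math.sqrt((w1 / t1) ** 2 + (w2 / t2) ** 2)

# Equal thresholds along both axes: the 45-degree threshold equals each axis value
print(round(combined_threshold(0.4, 0.4, 1.0, 1.0), 3))
```

Along a pure axis direction the rule reduces to the corresponding single-statistic threshold, as it must.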

  10. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice – a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  11. Basin-scale heterogeneity in Antarctic precipitation and its impact on surface mass variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fyke, Jeremy; Lenaerts, Jan T. M.; Wang, Hailong

    Annually averaged precipitation in the form of snow, the dominant term of the Antarctic Ice Sheet surface mass balance, displays large spatial and temporal variability. Here we present an analysis of spatial patterns of regional Antarctic precipitation variability and their impact on integrated Antarctic surface mass balance variability simulated as part of a preindustrial 1800-year global, fully coupled Community Earth System Model simulation. Correlation and composite analyses based on this output allow for a robust exploration of Antarctic precipitation variability. We identify statistically significant relationships between precipitation patterns across Antarctica that are corroborated by climate reanalyses, regional modeling and ice core records. These patterns are driven by variability in large-scale atmospheric moisture transport, which itself is characterized by decadal- to centennial-scale oscillations around the long-term mean. We suggest that this heterogeneity in Antarctic precipitation variability has a dampening effect on overall Antarctic surface mass balance variability, with implications for regulation of Antarctic-sourced sea level variability, detection of an emergent anthropogenic signal in Antarctic mass trends and identification of Antarctic mass loss accelerations.

  12. Comparison of surface roughness and chip characteristics obtained under different modes of lubrication during hard turning of AISI H13 tool work steel.

    NASA Astrophysics Data System (ADS)

    Raj, Anil; Wins, K. Leo Dev; Varadarajan, A. S.

    2016-09-01

    Surface roughness is one of the important parameters, which not only affects the service life of a component but also serves as a good index of machinability. Near Dry Machining (NDM) methods are considered a sustainable alternative for workshops trying to bring down their dependence on cutting fluids and the hazards associated with their indiscriminate usage. The present work presents a comparison of the surface roughness and chip characteristics during hard turning of AISI H13 tool work steel using hard metal inserts under two popular NDM techniques, namely minimal fluid application and the Minimum Quantity Lubrication (MQL) technique, using an experiment designed based on Taguchi's techniques. The statistical method of analysis of variance (ANOVA) was used to determine the relative significance of input parameters consisting of cutting speed, feed and depth of cut on the attainable surface finish and the chip characteristics. It was observed that the performance during minimal fluid application was better than that during MQL application.
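In a Taguchi analysis of surface roughness, responses at each factor setting are typically condensed into a smaller-the-better signal-to-noise ratio before ANOVA. A sketch of that statistic (the Ra readings below are illustrative, not the study's):

```python
import math

def sn_smaller_the_better(values):
    """Taguchi signal-to-noise ratio for a smaller-the-better response
    such as surface roughness: S/N = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical Ra readings (micrometres) from replicate cuts at one setting
print(round(sn_smaller_the_better([0.8, 1.0, 1.2]), 2))
```

Higher S/N is better; a setting whose replicate roughness values are uniformly small and consistent scores highest.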

  13. Basin-scale heterogeneity in Antarctic precipitation and its impact on surface mass variability

    DOE PAGES

    Fyke, Jeremy; Lenaerts, Jan T. M.; Wang, Hailong

    2017-11-15

    Annually averaged precipitation in the form of snow, the dominant term of the Antarctic Ice Sheet surface mass balance, displays large spatial and temporal variability. Here we present an analysis of spatial patterns of regional Antarctic precipitation variability and their impact on integrated Antarctic surface mass balance variability simulated as part of a preindustrial 1800-year global, fully coupled Community Earth System Model simulation. Correlation and composite analyses based on this output allow for a robust exploration of Antarctic precipitation variability. We identify statistically significant relationships between precipitation patterns across Antarctica that are corroborated by climate reanalyses, regional modeling and ice core records. These patterns are driven by variability in large-scale atmospheric moisture transport, which itself is characterized by decadal- to centennial-scale oscillations around the long-term mean. We suggest that this heterogeneity in Antarctic precipitation variability has a dampening effect on overall Antarctic surface mass balance variability, with implications for regulation of Antarctic-sourced sea level variability, detection of an emergent anthropogenic signal in Antarctic mass trends and identification of Antarctic mass loss accelerations.

  14. A New Femtosecond Laser-Based Three-Dimensional Tomography Technique

    NASA Astrophysics Data System (ADS)

    Echlin, McLean P.

    2011-12-01

    Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and data collection. Femtosecond laser based tomographic techniques have been developed in both standard atmosphere (femtosecond laser-based serial sectioning technique - FSLSS) and in vacuum (Tri-Beam System) for the fast collection (10 5mum3/s) of mm3 sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove layer-by-layer areas of material with low collateral damage and a negligible heat affected zone. To the authors knowledge, femtosecond lasers have never been used to serial section and these techniques have been entirely and uniquely developed by the author and his collaborators at the University of Michigan and University of California Santa Barbara. The FSLSS was applied to measure the 3D distribution of TiN particles in a 4330 steel. Single pulse ablation morphologies and rates were measured and collected from literature. Simultaneous two-phase ablation of TiN and steel matrix was shown to occur at fluences of 0.9-2 J/cm2. Laser scanning protocols were developed minimizing surface roughness to 0.1-0.4 mum for laser-based sectioning. The FSLSS technique was used to section and 3D reconstruct titanium nitride (TiN) containing 4330 steel. Statistical analysis of 3D TiN particle sizes, distribution parameters, and particle density were measured. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed and variability in the toughness data agreed well with Ruggieri et al. bulk toughness measurements. 
The Tri-Beam system combines the benefits of laser based material removal (speed, low-damage, automated) with detectors that collect chemical, structural, and topological information. Multi-modal sectioning information was collected after many laser scanning passes demonstrating the capability of the Tri-Beam system.

  15. Detailed analysis of grid-based molecular docking: A case study of CDOCKER-A CHARMm-based MD docking algorithm.

    PubMed

    Wu, Guosheng; Robertson, Daniel H; Brooks, Charles L; Vieth, Michal

    2003-10-01

    The influence of various factors on the accuracy of protein-ligand docking is examined. The factors investigated include the role of a grid representation of protein-ligand interactions, the initial ligand conformation and orientation, the sampling rate of the energy hyper-surface, and the final minimization. A representative docking method is used to study these factors, namely CDOCKER, a molecular dynamics (MD) simulated-annealing-based algorithm. A major emphasis in these studies is to compare the relative performance and accuracy of various grid-based approximations to explicit all-atom force field calculations. In these docking studies, the protein is kept rigid while the ligands are treated as fully flexible, and a final minimization step is used to refine the docked poses. A docking success rate of 74% is observed when an explicit all-atom representation of the protein (full force field) is used, while accuracies of 66-76% are observed for the grid-based methods. All docking experiments considered a 41-member protein-ligand validation set. A significant improvement in accuracy (76 vs. 66%) for grid-based docking is achieved if the explicit all-atom force field is used in a final minimization step to refine the docking poses. Statistical analysis shows that even lower-accuracy grid-based energy representations can be effectively used when followed with full force field minimization. The results of these grid-based protocols are statistically indistinguishable from the detailed atomic dockings and provide up to a sixfold reduction in computation time. For the test case examined here, improving the docking accuracy did not necessarily enhance the ability to estimate binding affinities using the docked structures. Copyright 2003 Wiley Periodicals, Inc.

  16. A Novel Genome-Information Content-Based Statistic for Genome-Wide Association Analysis Designed for Next-Generation Sequencing Data

    PubMed Central

    Luo, Li; Zhu, Yun

    2012-01-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the current variant-by-variant analysis paradigm for GWAS of common variants toward the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ2 test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has markedly better type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812

  17. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    PubMed

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the current variant-by-variant analysis paradigm for GWAS of common variants toward the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T2, the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ2 test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has markedly better type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
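As a rough illustration of the group tests these records contrast with (not the authors' genome-information content-based statistic itself), a collapsing test reduces the rare variants in a region to a single per-subject carrier indicator and compares carrier counts between cases and controls. All genotype data below are invented:

```python
# Minimal collapsing test sketch: collapse rare variants into a carrier
# indicator, then compare cases vs. controls with a 2x2 chi-square statistic.

def collapse(genotypes, freqs, maf_threshold):
    """Per-subject indicator: carries at least one rare variant allele."""
    rare = [j for j, f in enumerate(freqs) if f < maf_threshold]
    return [int(any(row[j] > 0 for j in rare)) for row in genotypes]

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

freqs = [0.001, 0.3, 0.002]  # variant 2 is common, so it is excluded
cases = [[1, 0, 0], [0, 1, 1], [0, 0, 0], [1, 1, 0]]
controls = [[0, 0, 0], [0, 1, 0], [0, 0, 0], [0, 0, 1]]

case_carriers = sum(collapse(cases, freqs, 0.01))
ctrl_carriers = sum(collapse(controls, freqs, 0.01))
stat = chi2_2x2(case_carriers, len(cases) - case_carriers,
                ctrl_carriers, len(controls) - ctrl_carriers)
```

Note how the collapse step discards which variant each subject carries; this is exactly the loss of per-SNP effect information the abstract criticizes.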

  18. The impact of catchment source group classification on the accuracy of sediment fingerprinting outputs.

    PubMed

    Pulley, Simon; Foster, Ian; Collins, Adrian L

    2017-06-01

    The objective classification of sediment source groups is at present an under-investigated aspect of source tracing studies, which has the potential to statistically improve discrimination between sediment sources and reduce uncertainty. This paper investigates this potential using three different source group classification schemes. The first classification scheme was simple surface and subsurface groupings (Scheme 1). The tracer signatures were then used in a two-step cluster analysis to identify the sediment source groupings naturally defined by the tracer signatures (Scheme 2). The cluster source groups were then modified by splitting each one into a surface and subsurface component to suit catchment management goals (Scheme 3). The schemes were tested using artificial mixtures of sediment source samples. Controlled corruptions were made to some of the mixtures to mimic the potential causes of tracer non-conservatism present when using tracers in natural fluvial environments. It was determined how accurately the known proportions of sediment sources in the mixtures were identified after unmixing modelling using the three classification schemes. The cluster analysis derived source groups (2) significantly increased tracer variability ratios (inter-/intra-source group variability) (up to 2122%, median 194%) compared to the surface and subsurface groupings (1). As a result, the composition of the artificial mixtures was identified an average of 9.8% more accurately on the 0-100% contribution scale. It was found that the cluster groups could be reclassified into a surface and subsurface component (3) with no significant increase in composite uncertainty (a 0.1% increase over Scheme 2). The far smaller effects of simulated tracer non-conservatism for the cluster analysis based schemes (2 and 3) were primarily attributed to the increased inter-group variability producing a far larger sediment source signal than the non-conservatism noise (1).
Modified cluster analysis based classification methods have the potential to reduce composite uncertainty significantly in future source tracing studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
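The inter-/intra-source group variability ratio reported above can be sketched as follows; the tracer concentrations and source-group names are hypothetical, not values from the study:

```python
# Tracer variability ratio: spread of source-group means divided by the
# mean within-group spread. Larger ratios mean better source discrimination.

def std(x):
    m = sum(x) / len(x)
    return (sum((v - m) ** 2 for v in x) / len(x)) ** 0.5

def variability_ratio(groups):
    """Inter-group variability / mean intra-group variability for one tracer."""
    group_means = [sum(g) / len(g) for g in groups]
    inter = std(group_means)
    intra = sum(std(g) for g in groups) / len(groups)
    return inter / intra

# Tracer concentrations for three cluster-defined source groups (invented)
topsoil = [12.1, 11.8, 12.4, 12.0]
channel_banks = [8.2, 8.5, 7.9, 8.3]
road_dust = [15.6, 15.2, 15.9, 15.5]
ratio = variability_ratio([topsoil, channel_banks, road_dust])
```

A cluster-derived grouping that raises this ratio gives the unmixing model a source signal that stands further above tracer non-conservatism noise.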

  19. Utilization of Skylab EREP system for appraising changes in continental migratory bird habitat. [using multispectral band scanner

    NASA Technical Reports Server (NTRS)

    Gilmer, D. S. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Surface water statistics using data obtained by supporting aircraft were generated. Signature extraction and refinement preliminary to wetland and associated upland vegetation recognition were accomplished, using a selected portion of the aircraft data. Final classification mapping and analysis of surface water trends will be accomplished.

  20. Experimental Design for a Sponge-Wipe Study to Relate the Recovery Efficiency and False Negative Rate to the Concentration of a Bacillus anthracis Surrogate for Six Surface Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Krauter, Paula

    2011-05-01

    Two concerns were raised by the Government Accountability Office following the 2001 building contaminations via letters containing Bacillus anthracis (BA). These included the: 1) lack of validated sampling methods, and 2) need to use statistical sampling to quantify the confidence of no contamination when all samples have negative results. Critical to addressing these concerns is quantifying the false negative rate (FNR). The FNR may depend on the 1) method of contaminant deposition, 2) surface concentration of the contaminant, 3) surface material being sampled, 4) sample collection method, 5) sample storage/transportation conditions, 6) sample processing method, and 7) sample analytical method. A review of the literature found 17 laboratory studies that focused on swab, wipe, or vacuum samples collected from a variety of surface materials contaminated by BA or a surrogate, and used culture methods to determine the surface contaminant concentration. These studies quantified performance of the sampling and analysis methods in terms of recovery efficiency (RE) and not FNR (which left a major gap in available information). Quantifying the FNR under a variety of conditions is a key aspect of validating sample and analysis methods, and also for calculating the confidence in characterization or clearance decisions based on a statistical sampling plan. A laboratory study was planned to partially fill the gap in FNR results. This report documents the experimental design developed by Pacific Northwest National Laboratory and Sandia National Laboratories (SNL) for a sponge-wipe method. The testing was performed by SNL and is now completed. The study investigated the effects on key response variables from six surface materials contaminated with eight surface concentrations of a BA surrogate (Bacillus atrophaeus).
The key response variables include measures of the contamination on test coupons of surface materials tested, contamination recovered from coupons by sponge-wipe samples, RE, and FNR. The experimental design involves 16 test runs, performed in two blocks of eight runs. Three surface materials (stainless steel, vinyl tile, and ceramic tile) were tested in the first block, while three other surface materials (plastic, painted wood paneling, and faux leather) were tested in the second block. The eight surface concentrations of the surrogate were randomly assigned to test runs within each block. Some of the concentrations were very low and presented challenges for deposition, sampling, and analysis. However, such tests are needed to investigate RE and FNR over the full range of concentrations of interest. In each run, there were 10 test coupons of each of the three surface materials. A positive control sample was generated at the same time as each test sample. The positive control results will be used to 1) calculate RE values for the wipe sampling and analysis method, and 2) fit RE- and FNR-concentration equations, for each of the six surface materials. Data analyses will support 1) estimating the FNR for each combination of contaminant concentration and surface material, 2) estimating the surface concentrations and their uncertainties of the contaminant for each combination of concentration and surface material, 3) estimating RE (%) and their uncertainties for each combination of contaminant concentration and surface material, 4) fitting FNR-concentration and RE-concentration equations for each of the six surface materials, 5) assessing goodness-of-fit of the equations, and 6) quantifying the uncertainty in FNR and RE predictions made with the fitted equations.
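The two response measures can be sketched in their simplest form; the CFU counts below are invented, and the study itself fits RE- and FNR-concentration equations rather than using these bare ratios:

```python
# Recovery efficiency (RE) compares contamination recovered by a wipe
# sample with its paired positive-control coupon; the false negative rate
# (FNR) is the fraction of samples from known-contaminated coupons that
# culture negative. All numbers are hypothetical.

def recovery_efficiency(recovered_cfu, control_cfu):
    """RE (%): CFU recovered by the wipe vs. CFU on its positive control."""
    return 100.0 * recovered_cfu / control_cfu

def false_negative_rate(sample_cfus):
    """Fraction of samples from contaminated coupons that read zero CFU."""
    return sum(1 for cfu in sample_cfus if cfu == 0) / len(sample_cfus)

re_pct = recovery_efficiency(recovered_cfu=32, control_cfu=80)  # 40.0 %
fnr = false_negative_rate([12, 0, 5, 0, 9, 3, 0, 7, 1, 4])      # 3 of 10
```

At the very low surrogate concentrations in the design, FNR rises toward 1 even when RE is nonzero, which is why both measures are fitted against concentration separately.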

  1. Experimental Design for a Sponge-Wipe Study to Relate the Recovery Efficiency and False Negative Rate to the Concentration of a Bacillus anthracis Surrogate for Six Surface Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Krauter, Paula

    2010-12-16

    Two concerns were raised by the Government Accountability Office following the 2001 building contaminations via letters containing Bacillus anthracis (BA). These included the: 1) lack of validated sampling methods, and 2) need to use statistical sampling to quantify the confidence of no contamination when all samples have negative results. Critical to addressing these concerns is quantifying the probability of correct detection (PCD) (or equivalently the false negative rate FNR = 1 - PCD). The PCD/FNR may depend on the 1) method of contaminant deposition, 2) surface concentration of the contaminant, 3) surface material being sampled, 4) sample collection method, 5) sample storage/transportation conditions, 6) sample processing method, and 7) sample analytical method. A review of the literature found 17 laboratory studies that focused on swab, wipe, or vacuum samples collected from a variety of surface materials contaminated by BA or a surrogate, and used culture methods to determine the surface contaminant concentration. These studies quantified performance of the sampling and analysis methods in terms of recovery efficiency (RE) and not PCD/FNR (which left a major gap in available information). Quantifying the PCD/FNR under a variety of conditions is a key aspect of validating sample and analysis methods, and also for calculating the confidence in characterization or clearance decisions based on a statistical sampling plan. A laboratory study was planned to partially fill the gap in PCD/FNR results. This report documents the experimental design developed by Pacific Northwest National Laboratory and Sandia National Laboratories (SNL) for a sponge-wipe method. The study will investigate the effects on key response variables from six surface materials contaminated with eight surface concentrations of a BA surrogate (Bacillus atrophaeus).
The key response variables include measures of the contamination on test coupons of surface materials tested, contamination recovered from coupons by sponge-wipe samples, RE, and PCD/FNR. The experimental design involves 16 test runs, to be performed in two blocks of eight runs. Three surface materials (stainless steel, vinyl tile, and ceramic tile) were tested in the first block, while three other surface materials (plastic, painted wood paneling, and faux leather) will be tested in the second block. The eight surface concentrations of the surrogate were randomly assigned to test runs within each block. Some of the concentrations will be very low and may present challenges for deposition, sampling, and analysis. However, such tests are needed to investigate RE and PCD/FNR over the full range of concentrations of interest. In each run, there will be 10 test coupons of each of the three surface materials. A positive control sample will be generated prior to each test sample. The positive control results will be used to 1) calculate RE values for the wipe sampling and analysis method, and 2) fit RE- and PCD-concentration equations, for each of the six surface materials. Data analyses will support 1) estimating the PCD for each combination of contaminant concentration and surface material, 2) estimating the surface concentrations and their uncertainties of the contaminant for each combination of concentration and surface material, 3) estimating RE (%) and their uncertainties for each combination of contaminant concentration and surface material, 4) fitting PCD-concentration and RE-concentration equations for each of the six surface materials, 5) assessing goodness-of-fit of the equations, and 6) quantifying the uncertainty in PCD and RE predictions made with the fitted equations.

  2. Enamel surface topography analysis for diet discrimination. A methodology to enhance and select discriminative parameters

    NASA Astrophysics Data System (ADS)

    Francisco, Arthur; Blondel, Cécile; Brunetière, Noël; Ramdarshan, Anusha; Merceron, Gildas

    2018-03-01

    Tooth wear and, more specifically, dental microwear texture is a dietary proxy that has been used for years in vertebrate paleoecology and ecology. DMTA, dental microwear texture analysis, relies on a few parameters related to the surface complexity, anisotropy and heterogeneity of the enamel facets at the micrometric scale. Working with few but physically meaningful parameters helps in comparing published results and in defining levels for classification purposes. Other dental microwear approaches are based on ISO parameters coupled with statistical tests to find the most relevant ones. The present study draws on most of the aforementioned parameters, in modified form where appropriate. More than the parameters themselves, however, we propose a new approach: instead of a single parameter characterizing the whole surface, we sample the surface and thus generate 9 derived parameters in order to broaden the parameter set. The identification of the most discriminative parameters is performed with an automated procedure that is an extended and refined version of the workflows encountered in some studies. The procedure in its initial form includes the most common tools, such as ANOVA and correlation analysis, along with the required mathematical tests. The discrimination results show that a simplified form of the procedure identifies the desired number of discriminative parameters more efficiently. Also highlighted are some trends, such as the relevance of working with both height and spatial parameters and the potential benefits of dimensionless surfaces. On a set of 45 surfaces from 45 specimens of three modern ruminants with differences in feeding preferences (grazing, leaf-browsing and fruit-eating), it is clearly shown that the level of wear discrimination is improved with the new methodology compared with the other approaches.

  3. Water levels and groundwater and surface-water exchanges in lakes of the northeast Twin Cities Metropolitan Area, Minnesota, 2002 through 2015

    USGS Publications Warehouse

    Jones, Perry M.; Trost, Jared J.; Erickson, Melinda L.

    2016-10-19

    Overview: This study assessed lake-water levels and regional and local groundwater and surface-water exchanges near northeast Twin Cities Metropolitan Area lakes by applying three approaches: statistical analysis, field study, and groundwater-flow modeling. Statistical analyses of lake levels were completed to assess the effect of physical setting and climate on lake-level fluctuations of selected lakes. A field study of groundwater and surface-water interactions in selected lakes was completed to (1) estimate potential percentages of surface-water contributions to well water across the northeast Twin Cities Metropolitan Area, (2) estimate general ages for waters extracted from the wells, and (3) assess groundwater inflow to lakes and lake-water outflow to aquifers downgradient from White Bear Lake. Groundwater flow was simulated using a steady-state groundwater-flow model to assess regional groundwater and surface-water exchanges and the effects of groundwater withdrawals, climate, and other factors on water levels of northeast Twin Cities Metropolitan Area lakes.

  4. A study of radiometric surface temperatures: Their fluctuations, distribution and meaning. [Voves, France

    NASA Technical Reports Server (NTRS)

    Perrier, A.; Itier, B.; Boissard, P. (Principal Investigator); Goillot, C.; Belluomo, P.; Valery, P.

    1980-01-01

    A consecutive night and day flight, together with measurements on the ground, was made in the region of Voves, south of Chartres. The statistical analysis of the thermal scanner data permitted the establishment of criteria for the homogeneity of surfaces. These criteria were used in defining the surface temperature values which are most representative for use in an energy balance approach to evapotranspiration (day) and heat balance (night). For a number of maize fields, the airborne thermal scanner data permitted a detailed energy analysis of different fields of the same crop to be carried out. Such a detailed analysis was not necessary for a calculation of crop evapotranspiration, which could be evaluated from the mean temperature of the crop surface. A differential day/night analysis is of interest for enhancing the contrast between types of surfaces, as well as for a better definition of the daily energy balance. It should be stressed that, for a homogeneous region, a study such as the present one could be carried out on a relatively small part of the total surface, as the results for a surface of 2.5 x 2 sq km were not significantly different from those obtained from a surface three times larger.

  5. The Correlation Between Atmospheric Dust Deposition to the Surface Ocean and SeaWiFS Ocean Color: A Global Satellite-Based Analysis

    NASA Technical Reports Server (NTRS)

    Erickson, D. J., III; Hernandez, J.; Ginoux, P.; Gregg, W.; Kawa, R.; Behrenfeld, M.; Esaias, W.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Since the atmospheric deposition of iron has been linked to primary productivity in various oceanic regions, we have conducted an objective study of the correlation of dust deposition and satellite remotely sensed surface ocean chlorophyll concentrations. We present a global analysis of the correlation between atmospheric dust deposition derived from a satellite-based 3-D atmospheric transport model and SeaWiFS estimates of ocean color. We use the monthly mean dust deposition fields of Ginoux et al., which are based on a global model of dust generation and transport. This model is driven by atmospheric circulation from the Data Assimilation Office (DAO) for the period 1995-1998. This global dust model is constrained by several satellite estimates of standard circulation characteristics. We then perform an analysis of the correlation between the dust deposition and the 1998 SeaWiFS ocean color data for each 2.0 deg x 2.5 deg lat/long grid point, for each month of the year. The results are surprisingly robust. The region between 40 S and 60 S has correlation coefficients from 0.6 to 0.95, statistically significant at the 0.05 level. There are swaths of high correlation at the edges of some major ocean current systems. We interpret these correlations as reflecting areas where shear-related turbulence brings nitrogen and phosphorus from depth into the surface ocean; the atmospheric supply of iron then provides the limiting nutrient, so the correlation between iron deposition and surface ocean chlorophyll is high. There is a region in the western North Pacific with high correlation, reflecting the input of Asian dust to that region. The southern hemisphere has an average correlation coefficient of 0.72, compared with 0.42 in the northern hemisphere, consistent with present conceptual models of where atmospheric iron deposition may play a role in surface ocean biogeochemical cycles.
The spatial structure of the correlation fields will be discussed within the context of guiding the design of field programs.
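The per-grid-point calculation can be sketched for a single cell; the monthly dust and chlorophyll values below are invented, and the 0.576 cutoff is the standard two-tailed critical value of Pearson's r at the 0.05 level for n = 12 pairs:

```python
# Pearson correlation between 12 monthly dust-deposition values and 12
# monthly chlorophyll values at one hypothetical grid cell, as the study
# computes for each 2.0 deg x 2.5 deg cell.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

dust = [0.2, 0.3, 0.5, 0.9, 1.2, 1.0, 0.8, 0.6, 0.4, 0.3, 0.2, 0.2]
chl = [0.11, 0.14, 0.22, 0.40, 0.55, 0.47, 0.38, 0.27, 0.19, 0.15, 0.10, 0.12]
r = pearson_r(dust, chl)
# For n = 12 monthly pairs, |r| > ~0.576 is significant at the 0.05 level
significant = abs(r) > 0.576
```

Mapping r (and its significance flag) over all cells yields the correlation fields whose spatial structure the record discusses.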

  6. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

    Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or difference in fiber structure between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm or SAFIRA,1 which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors.2 This software will be freely disseminated to the neuroimaging research community.

  7. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  8. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    PubMed Central

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  9. Estimation of sensible and latent heat flux from natural sparse vegetation surfaces using surface renewal

    NASA Astrophysics Data System (ADS)

    Zapata, N.; Martínez-Cob, A.

    2001-12-01

    This paper reports a study undertaken to evaluate the feasibility of the surface renewal method to accurately estimate long-term evaporation from the playa and margins of an endorreic salty lagoon (Gallocanta lagoon, Spain) under semiarid conditions. High-frequency temperature readings were taken for two time lags (r) and three measurement heights (z) in order to get surface renewal sensible heat flux (HSR) values. These values were compared against eddy covariance sensible heat flux (HEC) values for a calibration period (25-30 July 2000). Error analysis statistics (index of agreement, IA; root mean square error, RMSE; and systematic mean square error, MSEs) showed that the agreement between HSR and HEC improved as measurement height decreased and time lag increased. Calibration factors α were obtained for all analyzed cases. The best results were obtained for the z=0.9 m (r=0.75 s) case, for which α=1.0 was observed. In this case, uncertainty was about 10% in terms of relative error (RE). Latent heat flux values were obtained by solving the energy balance equation for both the surface renewal (LESR) and the eddy covariance (LEEC) methods, using HSR and HEC, respectively, and measurements of net radiation and soil heat flux. For the calibration period, error analysis statistics for LESR were quite similar to those for HSR, although errors were mostly at random. LESR uncertainty was less than 9%. Calibration factors were applied to a validation data subset (30 July-4 August 2000) for which meteorological conditions were somewhat different (higher temperatures and wind speed and lower solar and net radiation). Error analysis statistics for both HSR and LESR were quite good for all cases, showing the goodness of the calibration factors. Nevertheless, the results obtained for the z=0.9 m (r=0.75 s) case were still the best ones.
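The error statistics used to compare HSR with HEC can be sketched as follows; the flux values are invented, and the origin-forced least-squares slope is one plausible way to estimate a calibration factor α, not necessarily the authors' exact procedure:

```python
# RMSE, Willmott's index of agreement, and a calibration factor alpha
# relating surface renewal to eddy covariance sensible heat flux.
# All flux values (W m-2) below are hypothetical.
import math

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def index_of_agreement(pred, obs):
    """Willmott's IA: 1 is perfect agreement, 0 is none."""
    ob = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for p, o in zip(pred, obs))
    den = sum((abs(p - ob) + abs(o - ob)) ** 2 for p, o in zip(pred, obs))
    return 1.0 - num / den

def calibration_alpha(h_sr, h_ec):
    """Least-squares slope through the origin: H_EC ~ alpha * H_SR."""
    return sum(s * e for s, e in zip(h_sr, h_ec)) / sum(s * s for s in h_sr)

h_sr = [50.0, 120.0, 200.0, 260.0, 180.0, 90.0]  # surface renewal (invented)
h_ec = [55.0, 115.0, 210.0, 255.0, 185.0, 95.0]  # eddy covariance (invented)
alpha = calibration_alpha(h_sr, h_ec)
```

An α near 1, as found for the z=0.9 m (r=0.75 s) case, means the surface renewal fluxes need essentially no rescaling against the eddy covariance reference.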

  10. Assimilation of altimeter data into a quasigeostrophic ocean model using optimal interpolation and eofs

    NASA Astrophysics Data System (ADS)

    Rienecker, M. M.; Adamec, D.

    1995-01-01

    An ensemble of fraternal-twin experiments is used to assess the utility of optimal interpolation and model-based vertical empirical orthogonal functions (eofs) of streamfunction variability to assimilate satellite altimeter data into ocean models. Simulated altimeter data are assimilated into a basin-wide 3-layer quasi-geostrophic model with a horizontal grid spacing of 15 km. The effects of bottom topography are included and the model is forced by a wind stress curl distribution which is constant in time. The simulated data are extracted, along altimeter tracks with spatial and temporal characteristics of Geosat, from a reference model ocean with a slightly different climatology from that generated by the model used for assimilation. The use of vertical eofs determined from the model-generated streamfunction variability is shown to be effective in aiding the model's dynamical extrapolation of the surface information throughout the rest of the water column. After a single repeat cycle (17 days), the analysis errors are reduced markedly from the initial level, by 52% in the surface layer, 41% in the second layer and 11% in the bottom layer. The largest differences between the assimilation analysis and the reference ocean are found in the nonlinear regime of the mid-latitude jet in all layers. After 100 days of assimilation, the error in the upper two layers has been reduced by over 50% and that in the bottom layer by 38%. The essence of the method is that the eofs capture the statistics of the dynamical balances in the model and ensure that this balance is not inappropriately disturbed during the assimilation process. This statistical balance includes any potential vorticity homogeneity which may be associated with the eddy stirring by mid-latitude surface jets.
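    The core idea, extrapolating a surface-only increment through the water column along model-derived vertical EOFs, can be sketched with a toy three-layer example (the rank-one statistics and layer weights below are invented for illustration, not the quasi-geostrophic model's values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer streamfunction anomalies at 500 sample points
# (rows: samples, cols: layers), with variance decreasing with depth.
modes_true = np.array([1.0, 0.6, 0.2])
samples = rng.standard_normal((500, 1)) * modes_true

# Vertical EOFs from the covariance of the layered anomalies (via SVD)
anom = samples - samples.mean(axis=0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
e1 = vt[0]                          # leading vertical EOF

# Project a surface-only altimeter increment onto the leading EOF so
# the correction extends through the rest of the water column.
surf_increment = 0.5
correction = surf_increment / e1[0] * e1
```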

  11. Probabilistic Thermal Analysis During Mars Reconnaissance Orbiter Aerobraking

    NASA Technical Reports Server (NTRS)

    Dec, John A.

    2007-01-01

    A method for performing a probabilistic thermal analysis during aerobraking has been developed. The analysis is performed on the Mars Reconnaissance Orbiter solar array during aerobraking. The methodology makes use of a response surface model derived from a more complex finite element thermal model of the solar array. The response surface is a quadratic equation which calculates the peak temperature for a given orbit drag pass at a specific location on the solar panel. Five different response surface equations are used, one of which predicts the overall maximum solar panel temperature, and the remaining four predict the temperatures of the solar panel thermal sensors. The variables used to define the response surface can be characterized as either environmental, material property, or modeling variables. Response surface variables are statistically varied in a Monte Carlo simulation. The Monte Carlo simulation produces mean temperatures and 3 sigma bounds as well as the probability of exceeding the designated flight allowable temperature for a given orbit. Response surface temperature predictions are compared with the Mars Reconnaissance Orbiter flight temperature data.
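    A minimal sketch of this Monte Carlo approach, with an invented quadratic response surface standing in for the fitted solar-array model (the coefficients, variable distributions, and the 160 C allowable below are hypothetical, not MRO values):

```python
import numpy as np

rng = np.random.default_rng(42)

def peak_temp(q_env, absorptivity, thickness):
    """Hypothetical quadratic response surface for peak panel temperature (C).
    Coefficients are illustrative only."""
    return (20.0 + 0.8 * q_env + 45.0 * absorptivity - 300.0 * thickness
            + 0.001 * q_env ** 2)

n = 20000
q = rng.normal(100.0, 10.0, n)      # environmental variable (heating)
a = rng.normal(0.85, 0.02, n)       # material property variable
t = rng.normal(0.02, 0.002, n)      # modeling variable

temps = peak_temp(q, a, t)
mean_t = temps.mean()
bound_3s = mean_t + 3 * temps.std()         # mean plus 3-sigma bound
p_exceed = np.mean(temps > 160.0)           # prob. of exceeding allowable
```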

  12. Dental enamel defect diagnosis through different technology-based devices.

    PubMed

    Kobayashi, Tatiana Yuriko; Vitor, Luciana Lourenço Ribeiro; Carrara, Cleide Felício Carvalho; Silva, Thiago Cruvinel; Rios, Daniela; Machado, Maria Aparecida Andrade Moreira; Oliveira, Thais Marchini

    2018-06-01

    Dental enamel defects (DEDs) are faulty or deficient enamel formations of primary and permanent teeth. Changes during tooth development result in hypoplasia (a quantitative defect) and/or hypomineralisation (a qualitative defect). To compare technology-based diagnostic methods for detecting DEDs. Two hundred and nine dental surfaces of anterior permanent teeth were selected in patients, 6-11 years of age, with cleft lip with/without cleft palate. First, a conventional clinical examination was conducted according to the modified Developmental Defects of Enamel Index (DDE Index). Dental surfaces were then evaluated using an operating microscope and a fluorescence-based device. Interexaminer reproducibility was determined using the kappa test. To compare groups, McNemar's test was used. Cramer's V test was used for comparing the distribution of index codes obtained after classification of all dental surfaces. Cramer's V test revealed statistically significant differences (P < .0001) in the distribution of index codes obtained using the different methods; the coefficients were 0.365 for conventional clinical examination versus fluorescence, 0.961 for conventional clinical examination versus operating microscope and 0.358 for operating microscope versus fluorescence. The sensitivity of the operating microscope and fluorescence methods was statistically significant (P = .008 and P < .0001, respectively). However, the results did not show statistically significant differences in accuracy or specificity for either the operating microscope or the fluorescence method. This study suggests that the operating microscope performed better than the fluorescence-based device and could be an auxiliary method for the detection of DEDs. © 2017 FDI World Dental Federation.

  13. Shock and Vibration Symposium (59th) Held in Albuquerque, New Mexico on 18-20 October 1988. Volume 3

    DTIC Science & Technology

    1988-10-01

    Table-of-contents fragments from the proceedings: ... (N. F. Rieger); Statistical Energy Analysis: An Overview of Its Development and Engineering Applications (J. E. Manning); DATA BASES: DOE/DOD Environmental ...; Vibroacoustic Response Using the Finite Element Method and Statistical Energy Analysis (F. L. Gloyna); Study of Helium Effect on Spacecraft Random Vibration ... Analysis (S. A. Wilkerson); DYNAMIC ANALYSIS: Modeling of Vibration Transmission in a Damped Beam Structure Using Statistical Energy Analysis (S. S ...)

  14. Thermodynamic analysis of Bacillus subtilis endospore protonation using isothermal titration calorimetry

    NASA Astrophysics Data System (ADS)

    Harrold, Zoë R.; Gorman-Lewis, Drew

    2013-05-01

    Bacterial proton and metal adsorption reactions have the capacity to affect metal speciation and transport in aqueous environments. We coupled potentiometric titration and isothermal titration calorimetry (ITC) analyses to study Bacillus subtilis spore-proton adsorption. We modeled the potentiometric data using four- and five-site non-electrostatic surface complexation models (NE-SCMs). Heats of spore surface protonation from coupled ITC analyses were used to determine site-specific enthalpies of protonation based on the NE-SCMs. The five-site model gave a substantially better fit to the heats of protonation but did not significantly improve the potentiometric titration fit. The improvement observed in the five-site protonation heat model suggests the presence of a highly exothermic protonation reaction near pH 7 that cannot be resolved in the less sensitive potentiometric data. From the log Ks and enthalpies we calculated corresponding site-specific entropies. Log Ks and site concentrations describing spore surface protonation are statistically equivalent to B. subtilis cell surface protonation constants. Spore surface protonation enthalpies, however, are more exothermic than for cell-based adsorption, suggesting a different bonding environment. The thermodynamic parameters defined in this study provide insight into molecular-scale spore-surface protonation reactions. Coupled ITC and potentiometric titrations can reveal highly exothermic, and possibly endothermic, adsorption reactions that are overshadowed in potentiometric models alone. The spore-proton adsorption NE-SCMs derived in this study provide a framework for future metal adsorption studies.

  15. Broadband Noise Predictions Based on a New Aeroacoustic Formulation

    NASA Technical Reports Server (NTRS)

    Casper, J.; Farassat, F.

    2002-01-01

    A new analytic result in acoustics called 'Formulation 1B,' proposed by Farassat, is used to compute the loading noise from an unsteady surface pressure distribution on a thin airfoil in the time domain. This formulation is a new solution of the Ffowcs Williams-Hawkings equation with the loading source term. The formulation contains a far-field surface integral that depends on the time derivative and the surface gradient of the pressure on the airfoil, as well as a contour integral on the boundary of the airfoil surface. As a first test case, the new formulation is used to compute the noise radiated from a flat plate, moving through a sinusoidal gust of constant frequency. The unsteady surface pressure for this test case is specified analytically from a result that is based on linear airfoil theory. This test case is used to examine the velocity scaling properties of Formulation 1B, and to demonstrate its equivalence to Formulation 1A, of Farassat. The new acoustic formulation, again with an analytic surface pressure, is then used to predict broadband noise radiated from an airfoil immersed in homogeneous turbulence. The results are compared with experimental data previously reported by Paterson and Amiet. Good agreement between predictions and measurements is obtained. The predicted results also agree very well with those of Paterson and Amiet, who used a frequency-domain approach. Finally, an alternative form of Formulation 1B is described for statistical analysis of broadband noise.

  16. Optical Measurements of Diamond-Turned Surfaces

    NASA Astrophysics Data System (ADS)

    Politch, Jacob

    1989-07-01

    We describe here a system for measuring diamond-turned surfaces very accurately. The system is based on heterodyne interferometry and measures surface height variations with an accuracy of 4 Å and a spatial resolution of 1 micrometer. From the measured data we calculated the statistical properties of the surface, enabling us to identify the spatial frequencies caused by vibrations of the diamond-turning machine and the measuring machine, as well as the frequency of the grid.

  17. Analysis of statistical and standard algorithms for detecting muscle onset with surface electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-01-01

    The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity.
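    A toy version of changepoint-based onset detection can be sketched as follows, using a profile likelihood for a variance shift with a uniform prior over onset location; this is a deliberate simplification of the Bayesian changepoint analysis evaluated in the paper, on simulated rather than recorded EMG:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated EMG: baseline noise, then a burst with much higher variance
onset_true = 300
signal = np.concatenate([rng.normal(0, 0.05, onset_true),
                         rng.normal(0, 0.5, 200)])

n = len(signal)
log_post = np.full(n, -np.inf)
for k in range(10, n - 10):
    pre, post = signal[:k], signal[k:]
    # Profile log-likelihood of a variance change at sample k
    log_post[k] = (-0.5 * k * np.log(pre.var())
                   - 0.5 * (n - k) * np.log(post.var()))

# Uniform prior over k -> posterior proportional to the likelihood
post_prob = np.exp(log_post - log_post.max())
post_prob /= post_prob.sum()
onset_est = int(np.argmax(post_prob))
```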

  18. An Automated Method of Scanning Probe Microscopy (SPM) Data Analysis and Reactive Site Tracking for Mineral-Water Interface Reactions Observed at the Nanometer Scale

    NASA Astrophysics Data System (ADS)

    Campbell, B. D.; Higgins, S. R.

    2008-12-01

    Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges), and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis on the Hough Transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough Transform, combined with the computed step speed, yields the surface area normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing this analysis by hand, with results being statistically similar for both analysis methods. This advance in analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
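    The tracking step, locating a step edge in each sequential image and differencing positions to obtain a step speed, can be illustrated in one dimension; the synthetic height profiles and the gradient-maximum detector below stand in for the paper's Hough-transform analysis:

```python
import numpy as np

# Hypothetical height profiles from sequential SPM scans: a step
# (height 0.3 nm) whose position advances 5 pixels per frame.
def profile(edge_px, width=200):
    h = np.zeros(width)
    h[edge_px:] = 0.3
    return h

frames = [profile(50 + 5 * i) for i in range(4)]

# Locate the step in each frame at the gradient maximum
positions = [int(np.argmax(np.abs(np.diff(f)))) for f in frames]
speeds = np.diff(positions)         # pixels per frame
step_speed = speeds.mean()
```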

  19. Statistical optimization of process parameters for lipase-catalyzed synthesis of triethanolamine-based esterquats using response surface methodology in 2-liter bioreactor.

    PubMed

    Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat

    2013-01-01

    Lipase-catalyzed production of triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in 2 L stirred-tank reactor. A set of experiments was designed by central composite design to process modeling and statistically evaluate the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrates molar ratio of OA to TEA, and agitation speed, were studied under the given conditions designed by Design Expert software. Experimental data were examined for normality test before data processing stage and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate to predict the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has been proven to be adequate for the design and optimization of the enzymatic process.

  20. A statistical shape model of the human second cervical vertebra.

    PubMed

    Clogenson, Marine; Duff, John M; Luethi, Marcel; Levivier, Marc; Meuli, Reto; Baur, Charles; Henein, Simon

    2015-07-01

    Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. The input dataset is composed of manually segmented anonymized patient computerized tomography (CT) scans. The alignment of the different datasets is done with Procrustes alignment on surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
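    The PCA step of building an SSM from aligned training shapes can be sketched as follows, assuming alignment and registration have already been done; the landmark data here are synthetic, not the C2 dataset:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical aligned training shapes: 30 samples of 50 2-D landmarks,
# flattened to 100-vectors, varying along a single invented mode.
mean_shape = rng.standard_normal(100)
mode = rng.standard_normal(100)
mode /= np.linalg.norm(mode)
shapes = mean_shape + rng.normal(0, 2.0, (30, 1)) * mode

# PCA-based SSM: mean shape plus principal modes of variation
mu = shapes.mean(axis=0)
_, s, vt = np.linalg.svd(shapes - mu, full_matrices=False)
var = s ** 2 / (len(shapes) - 1)
explained = var[0] / var.sum()      # variance captured by the first mode

# Generate a new plausible shape at +2 SD along the first mode
new_shape = mu + 2.0 * np.sqrt(var[0]) * vt[0]
```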

  1. Corrosion of Type 7075-T73 Aluminum in a 10% HNO3 + Fe2(SO4)3 Deoxidizer Solution

    NASA Astrophysics Data System (ADS)

    Savas, Terence P.; Earthman, James C.

    2009-03-01

    Localized corrosion damage in Type 7075-T73 aluminum was investigated for a HNO3 + Fe2(SO4)3 deoxidizer solution which is frequently used for surface pretreatment prior to anodizing. The corrosion damage was quantified in the time domain using the electrochemical noise resistance ( Rn) and in the frequency domain using the spectral noise impedance ( Rsn). The Rsn was derived from an equivalent electrical circuit model that represented the corrosion cell implemented in the present study. These data are correlated with scanning electron microscopy (SEM) examinations and corresponding statistical analysis based on digital image analysis of the corroded surfaces. Other data used to better understand the corrosion mechanisms include the open circuit potential (OCP) and coupling-current time records. Based on statistical analysis of the pit structures for 600 and 1200 s exposures, the best fit was achieved with a 3-parameter lognormal distribution. It was observed for the 1200 s exposure that a small population of pits continued to grow beyond a threshold critical size of 10 μm. In addition, significant grain boundary attack was observed after 1200 s exposure. These data are in good agreement with the electrochemical data. Specifically, the Rn was computed to be 295 and 96 Ω-cm2 for 600 and 1200 s exposures, respectively. The calculated value of Rsn, theoretically shown to be equal to Rn in the low frequency limit, was higher than Rn for a 1200 s exposure period. However, better agreement between the Rn and Rsn was found for frequencies above 0.01 Hz. Experimental results on the measurement performance for potassium chloride (KCl) saturated double-junction Ag/AgCl and single-junction Hg/Hg2Cl2 reference electrodes in the low-pH deoxidizer solution are also compared.
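    Fitting pit-size data to a 3-parameter lognormal reduces, once the threshold (location) parameter is fixed, to estimating the mean and standard deviation of the log-transformed excess sizes. A sketch on synthetic pit diameters (all values below are invented, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical pit diameters (um): 3-parameter lognormal = a location
# threshold plus a lognormal spread, mimicking fitted pit-size data.
loc_true, mu_true, sigma_true = 2.0, 1.5, 0.4
pits = loc_true + rng.lognormal(mu_true, sigma_true, 2000)

# With the threshold fixed (assumed known here), the remaining two
# parameters follow from the log-transformed excess over the threshold.
logs = np.log(pits - loc_true)
mu_hat, sigma_hat = logs.mean(), logs.std()

# Fraction of pits grown past a 10-um critical size
frac_large = np.mean(pits > 10.0)
```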

  2. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of northeast/post-monsoon rainfall, which occurs during October, November and December (OND) over the Indian peninsula, is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) and b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques were applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using the OND rainfall time series for 1948-2014, forecasting up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Thus, both statistical and empirical methods are found to be useful for long-range climatic projections.

  3. Persistent Elongated Particle Total Surface Area Dose to Rat Pleura is Optimum Predictor of Mesothelioma Incidence

    EPA Science Inventory

    Based on preliminary statistical analyses of 29 reanalyzed (quantitative TEM) diverse elongated particle (EP) test samples from the well known and often cited study of Stanton et al. 1981, total surface area (TSA) of biodurable EPs was reported at the 2008 Johnson Conference to b...

  4. A model for generating Surface EMG signal of m. Tibialis Anterior.

    PubMed

    Siddiqi, Ariba; Kumar, Dinesh; Arjunan, Sridhar P

    2014-01-01

    A model that simulates the surface electromyogram (sEMG) signal of m. Tibialis Anterior has been developed and tested. It has a firing rate equation based on experimental findings and a recruitment threshold based on an observed statistical distribution. Importantly, it considers both slow and fast motor unit types, distinguished by their conduction velocities. The model assumes that the deeper unipennate half of the muscle does not contribute significantly to the potential induced on the surface of the muscle, and approximates the muscle as having a parallel structure. The model was validated by comparing simulated and experimental sEMG signal recordings. Experiments were conducted on eight subjects who performed isometric dorsiflexion at 10, 20, 30, 50, 75, and 100% maximal voluntary contraction. The normalized root mean square and median frequency of the experimental and simulated EMG signals were computed, and the slopes of their linear relationships with force were statistically analyzed. The gradients were found to be similar (p>0.05) for the experimental and simulated sEMG signals, validating the proposed model.
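    The two model ingredients named above, a statistical spread of recruitment thresholds and a threshold-dependent firing rate equation, can be sketched as follows; the exponential threshold distribution and the rate constants are common modeling choices used here for illustration, not the paper's exact parameters:

```python
import numpy as np

# Hypothetical motor-unit pool: exponential spread of recruitment
# thresholds from ~1% to 30% of maximal voluntary contraction (MVC)
n_units = 120
i = np.arange(1, n_units + 1)
rte = np.exp(np.log(30.0) * i / n_units)    # thresholds, % MVC

def firing_rate(excitation, thresholds, gain=2.0, min_rate=8.0, peak=35.0):
    """Linear rate increase above threshold, saturating at a peak rate (Hz)."""
    rate = min_rate + gain * (excitation - thresholds)
    rate[excitation < thresholds] = 0.0     # unit not yet recruited
    return np.minimum(rate, peak)

# At 20% MVC, only units with thresholds below 20% are active
rates = firing_rate(20.0, rte)
active = np.count_nonzero(rates)
```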

  5. Application of a statistical design to the optimization of parameters and culture medium for alpha-amylase production by Aspergillus oryzae CBS 819.72 grown on gruel (wheat grinding by-product).

    PubMed

    Kammoun, Radhouane; Naili, Belgacem; Bejar, Samir

    2008-09-01

    The production optimization of alpha-amylase (E.C.3.2.1.1) from the Aspergillus oryzae CBS 819.72 fungus, using a by-product of wheat grinding (gruel) as sole carbon source, was performed with a statistical methodology based on three experimental designs. The optimization of temperature, agitation and inoculum size was attempted using a Box-Behnken design under the response surface methodology. The screening of nineteen nutrients for their influence on alpha-amylase production was achieved using a Plackett-Burman design. KH(2)PO(4), urea, glycerol, (NH(4))(2)SO(4), CoCl(2), casein hydrolysate, soybean meal hydrolysate and MgSO(4) were selected based on their positive influence on enzyme formation. The optimized nutrient concentrations were obtained using a Taguchi experimental design, and the analysis of the data predicted a theoretical increase in alpha-amylase expression of 73.2% (from 40.1 to 151.1 U/ml). These conditions were validated experimentally and revealed an enhanced alpha-amylase yield of 72.7%.

  6. Non-Born-Oppenheimer molecular dynamics of the spin-forbidden reaction O(3P) + CO(X 1Σ+) → CO2(X̃ 1Σg+)

    NASA Astrophysics Data System (ADS)

    Jasper, Ahren W.; Dawes, Richard

    2013-10-01

    The lowest-energy singlet (1 1A') and two lowest-energy triplet (1 3A' and 1 3A″) electronic states of CO2 are characterized using dynamically weighted multireference configuration interaction (dw-MRCI+Q) electronic structure theory calculations extrapolated to the complete basis set (CBS) limit. Global analytic representations of the dw-MRCI+Q/CBS singlet and triplet surfaces and of their CASSCF/aug-cc-pVQZ spin-orbit coupling surfaces are obtained via the interpolated moving least squares (IMLS) semiautomated surface fitting method. The spin-forbidden kinetics of the title reaction is calculated using the coupled IMLS surfaces and coherent switches with decay of mixing non-Born-Oppenheimer molecular dynamics. The calculated spin-forbidden association rate coefficient (corresponding to the high pressure limit of the rate coefficient) is 7-35 times larger at 1000-5000 K than the rate coefficient used in many detailed chemical models of combustion. A dynamical analysis of the multistate trajectories is presented. The trajectory calculations reveal direct (nonstatistical) and indirect (statistical) spin-forbidden reaction mechanisms and may be used to test the suitability of transition-state-theory-like statistical methods for spin-forbidden kinetics. Specifically, we consider the appropriateness of the "double passage" approximation, of assuming statistical distributions of seam crossings, and of applications of the unified statistical model for spin-forbidden reactions.

  7. Elemental, microstructural, and mechanical characterization of high gold orthodontic brackets after intraoral aging.

    PubMed

    Hersche, Sepp; Sifakakis, Iosif; Zinelis, Spiros; Eliades, Theodore

    2017-02-01

    The purpose of the present study was to investigate the elemental composition, microstructure, and selected mechanical properties of high-gold orthodontic brackets after intraoral aging. Thirty Incognito™ (3M Unitek, Bad Essen, Germany) lingual brackets were studied: 15 brackets as received (control group) and 15 brackets retrieved from different patients after orthodontic treatment. The surface of the wing area was examined by scanning electron microscopy (SEM). Backscattered electron imaging (BEI) was performed, and the elemental composition was determined by X-ray EDS analysis (EDX). After appropriate metallographic preparation, the mechanical properties tested were Martens hardness (HM), indentation modulus (EIT), elastic index (ηIT), and Vickers hardness (HV), determined by instrumented indentation testing (IIT) with a Vickers indenter. The results were statistically analyzed by unpaired t-test (α=0.05). No statistically significant differences in surface morphology or elemental content were found between the control and experimental groups. Moreover, the mean values of HM, EIT, ηIT, and HV did not differ significantly between the groups (p>0.05). Within the limitations of this study, it may be concluded that the surface elemental content and microstructure, as well as the evaluated mechanical properties, of Incognito™ lingual brackets remain unaffected by intraoral aging.

  8. The leaf angle distribution of natural plant populations: assessing the canopy with a novel software tool.

    PubMed

    Müller-Linow, Mark; Pinto-Espinosa, Francisco; Scharr, Hanno; Rascher, Uwe

    2015-01-01

    Three-dimensional canopies form complex architectures with temporally and spatially changing leaf orientations. Variations in canopy structure are linked to canopy function, and they occur within the scope of genetic variability as well as in reaction to environmental factors like light, water and nutrient supply, and stress. An important key measure to characterize these structural properties is the leaf angle distribution, which in turn requires knowledge of the 3-dimensional single leaf surface. Despite a large number of 3-d sensors and methods, only a few systems are applicable for fast and routine measurements in plants and natural canopies. A suitable approach is stereo imaging, which combines depth and color information and allows for easy segmentation of green leaf material and the extraction of plant traits, such as leaf angle distribution. We developed a software package which provides tools for the quantification of leaf surface properties within natural canopies via 3-d reconstruction from stereo images. Our approach includes a semi-automatic selection process for single leaves and different modes of surface characterization via polygon smoothing or surface model fitting. Based on the resulting surface meshes, leaf angle statistics are computed at the whole-leaf level or from local derivations. We include a case study to demonstrate the functionality of our software. 48 images of small sugar beet populations (4 varieties) were analyzed on the basis of their leaf angle distributions in order to investigate seasonal, genotypic and fertilization effects on leaf angle distributions. We could show that leaf angle distributions change during the course of the season, with all varieties having a comparable development. Additionally, different varieties had different leaf angle orientations that could be separated by principal component analysis. In contrast, nitrogen treatment had no effect on leaf angles.
We show that a stereo imaging setup together with the appropriate image processing tools is capable of retrieving the geometric leaf surface properties of plants and canopies. Our software package provides whole-leaf statistics but also a local estimation of leaf angles, which may have great potential to better understand and quantify structural canopy traits for guided breeding and optimized crop management.
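    The leaf-angle computation at the heart of such a tool reduces, per mesh facet, to the angle between the facet's surface normal and the vertical axis; a minimal sketch (the facet coordinates are illustrative, not taken from the software):

```python
import numpy as np

def leaf_angle_deg(p0, p1, p2):
    """Inclination of a leaf facet: angle between its surface normal
    and the vertical (z) axis, from three 3-D points on the leaf."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    cos_z = abs(n[2])               # sign-independent inclination
    return np.degrees(np.arccos(cos_z))

# A horizontal facet has angle 0; this tilted facet has angle 45 degrees
flat = leaf_angle_deg(np.array([0., 0., 0.]),
                      np.array([1., 0., 0.]),
                      np.array([0., 1., 0.]))
tilted = leaf_angle_deg(np.array([0., 0., 0.]),
                        np.array([1., 0., 1.]),
                        np.array([0., 1., 0.]))
```

Whole-leaf statistics then follow by aggregating these per-facet angles, e.g. as a histogram over the mesh.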

  9. Patellar segmentation from 3D magnetic resonance images using guided recursive ray-tracing for edge pattern detection

    NASA Astrophysics Data System (ADS)

    Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.

    2016-03-01

    The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search-space outlines are incorporated into recursive ray-tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to between an outer and inner search region. A section-based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) agreement between the auto-segmentation and ground-truth surfaces.

  10. Standardization of a Volumetric Displacement Measurement for Two-Body Abrasion Scratch Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Kobrick, Ryan L.; Klaus, David M.; Street, Kenneth W., Jr.

    2010-01-01

    A limitation has been identified in the existing test standards used for making controlled, two-body abrasion scratch measurements based solely on the width of the resultant score on the surface of the material. A new, more robust method is proposed for analyzing a surface scratch that takes into account the full three-dimensional profile of the displaced material. To accomplish this, a set of four volume displacement metrics is systematically defined by normalizing the overall surface profile to statistically denote the area of relevance, termed the Zone of Interaction (ZOI). From this baseline, the depth of the trough and the height of the ploughed material are factored into the overall deformation assessment. Proof-of-concept data were collected and analyzed to demonstrate the performance of this proposed methodology. This technique takes advantage of advanced imaging capabilities that now allow the scratched surface to be quantified in greater detail than was previously achievable. A quantified understanding of fundamental particle-material interaction is critical to anticipating how well components can withstand prolonged use in highly abrasive environments, specifically for our intended applications on the surface of the Moon and other planets or asteroids, as well as in similarly demanding, harsh terrestrial settings.
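    The proposed volume-displacement idea can be illustrated on a synthetic scratch cross-section: integrate the profile below the baseline (trough) and above it (pile-up) over the zone of interaction. The profile shape and the baseline threshold below are invented for illustration:

```python
import numpy as np

# Hypothetical post-scratch cross-section profile (heights in nm),
# normalized so the undisturbed surface sits at height 0: a central
# trough with pile-up shoulders on either side.
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]
profile = -5.0 * np.exp(-x**2 / 2.0) + 1.0 * np.exp(-(np.abs(x) - 3.0)**2)

# Zone of interaction: where the profile departs from the baseline
zoi = np.abs(profile) > 0.05
trough_area = -profile[zoi & (profile < 0)].sum() * dx   # displaced downward
pileup_area = profile[zoi & (profile > 0)].sum() * dx    # ploughed upward
net_removed = trough_area - pileup_area                  # net material loss
```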

  11. Cortex-based inter-subject analysis of iEEG and fMRI data sets: application to sustained task-related BOLD and gamma responses.

    PubMed

    Esposito, Fabrizio; Singer, Neomi; Podlipsky, Ilana; Fried, Itzhak; Hendler, Talma; Goebel, Rainer

    2013-02-01

    Linking regional metabolic changes with fluctuations in the local electromagnetic fields directly on the surface of the human cerebral cortex is of tremendous importance for a better understanding of detailed brain processes. Functional magnetic resonance imaging (fMRI) and intra-cranial electro-encephalography (iEEG) measure two technically unrelated but spatially and temporally complementary sets of functional descriptions of human brain activity. In order to allow fine-grained spatio-temporal human brain mapping at the population-level, an effective comparative framework for the cortex-based inter-subject analysis of iEEG and fMRI data sets is needed. We combined fMRI and iEEG recordings of the same patients with epilepsy during alternated intervals of passive movie viewing and music listening to explore the degree of local spatial correspondence and temporal coupling between blood oxygen level dependent (BOLD) fMRI changes and iEEG spectral power modulations across the cortical surface after cortex-based inter-subject alignment. To this purpose, we applied a simple model of the iEEG activity spread around each electrode location and the cortex-based inter-subject alignment procedure to transform discrete iEEG measurements into cortically distributed group patterns by establishing a fine anatomic correspondence of many iEEG cortical sites across multiple subjects. Our results demonstrate the feasibility of a multi-modal inter-subject cortex-based distributed analysis for combining iEEG and fMRI data sets acquired from multiple subjects with the same experimental paradigm but with different iEEG electrode coverage. The proposed iEEG-fMRI framework allows for improved group statistics in a common anatomical space and preserves the dynamic link between the temporal features of the two modalities.

  12. Estimation of Mouse Organ Locations Through Registration of a Statistical Mouse Atlas With Micro-CT Images

    PubMed Central

    Stout, David B.; Chatziioannou, Arion F.

    2012-01-01

    Micro-CT is widely used in preclinical studies of small animals. Due to the low soft-tissue contrast in typical studies, segmentation of soft-tissue organs from noncontrast-enhanced micro-CT images is a challenging problem. Here, we propose an atlas-based approach for estimating the major organs in mouse micro-CT images. A statistical atlas of major trunk organs was constructed based on 45 training subjects. The statistical shape model technique was used to include inter-subject anatomical variations. The shape correlations between different organs were described using a conditional Gaussian model. For registration, first the high-contrast organs in micro-CT images were registered by fitting the statistical shape model, while the low-contrast organs were subsequently estimated from the high-contrast organs using the conditional Gaussian model. The registration accuracy was validated based on 23 noncontrast-enhanced and 45 contrast-enhanced micro-CT images. Three different accuracy metrics (Dice coefficient, organ volume recovery coefficient, and surface distance) were used for evaluation. The Dice coefficients vary from 0.45 ± 0.18 for the spleen to 0.90 ± 0.02 for the lungs, the volume recovery coefficients vary from … for the liver to 1.30 ± 0.75 for the spleen, and the surface distances vary from 0.18 ± 0.01 mm for the lungs to 0.72 ± 0.42 mm for the spleen. The registration accuracy of the statistical atlas was compared with two publicly available single-subject mouse atlases, i.e., the MOBY phantom and the DIGIMOUSE atlas, and the results showed that the statistical atlas is more accurate than the single-subject atlases. To evaluate the influence of the training set size, different numbers of training subjects were used for atlas construction and registration. The results showed an improvement of the registration accuracy when more training subjects were used for the atlas construction. The statistical atlas-based registration was also compared with thin-plate spline-based deformable registration, commonly used in mouse atlas registration. The results revealed that the statistical atlas has the advantage of improving the estimation of low-contrast organs. PMID:21859613
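The conditional Gaussian step used to infer low-contrast organs from high-contrast ones can be illustrated with a minimal bivariate sketch. All statistics below are invented for illustration, not taken from the 45-subject atlas:

```python
# Sketch of conditional Gaussian estimation: given jointly Gaussian shape
# parameters (x_high for a high-contrast organ such as the lungs, x_low
# for a low-contrast organ such as the spleen), estimate x_low from the
# fitted x_high via the conditional mean.

def conditional_mean(mu_high, mu_low, var_high, cov, x_high):
    """E[x_low | x_high] for a bivariate Gaussian."""
    return mu_low + cov / var_high * (x_high - mu_high)

# Toy training statistics (e.g. first shape-model coefficient per organ)
mu_high, mu_low = 0.0, 0.0
var_high, cov = 4.0, 3.0        # var(x_high) and cov(x_high, x_low)

x_high_fitted = 2.0             # from registering the high-contrast organ
x_low_est = conditional_mean(mu_high, mu_low, var_high, cov, x_high_fitted)
```

In the full model this is a matrix identity over all shape coefficients, but the scalar case shows the mechanism: the stronger the learned correlation, the more the fitted organ constrains its unseen neighbor.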

  13. ADHESION OF AN ENDODONTIC SEALER TO DENTIN AND GUTTA-PERCHA: SHEAR AND PUSH-OUT BOND STRENGTH MEASUREMENTS AND SEM ANALYSIS

    PubMed Central

    Teixeira, Cleonice Silveira; Alfredo, Edson; Thomé, Luis Henrique de Camargo; Gariba-Silva, Ricardo; Silva-Sousa, Yara T. Correa; Sousa, Manoel Damião

    2009-01-01

    The use of an adequate method for evaluation of the adhesion of root canal filling materials provides more reliable results to allow comparison of the materials and substantiate their clinical choice. The aims of this study were to compare the shear bond strength (SBS) test and push-out test for evaluation of the adhesion of an epoxy-based endodontic sealer (AH Plus) to dentin and gutta-percha, and to assess the failure modes on the debonded surfaces by means of scanning electron microscopy (SEM). Three groups were established (n=7): in group 1, root cylinders obtained from human canines were embedded in acrylic resin and had their canals prepared and filled with sealer; in group 2, longitudinal sections of dentin cylinders were embedded in resin with the canal surface smoothed and turned upwards; in group 3, gutta-percha cylinders were embedded in resin. Polyethylene tubes filled with sealer were positioned on the polished surface of the specimens (groups 2 and 3). The push-out test (group 1) and the SBS test (groups 2 and 3) were performed in an Instron universal testing machine running at a crosshead speed of 1 mm/min. Means (±SD) in MPa were: G1 (8.8±1.13), G2 (5.9±1.05) and G3 (3.8±0.55). Statistical analysis by ANOVA and Student's t-test (α=0.05) revealed statistically significant differences (p<0.01) among the groups. SEM analysis showed a predominance of adhesive and mixed failures of AH Plus sealer. The tested surface significantly affected the results, with the sealer reaching higher bond strength to dentin than to gutta-percha in the SBS test. The comparison of the employed methodologies showed that the SBS test produced significantly lower bond strength values than the push-out test, was effective in determining the adhesion of AH Plus sealer to dentin and gutta-percha, and required specimens that could be easily prepared for SEM, presenting as a viable alternative for further experiments. PMID:19274399

  14. An impression cytology based study of ocular surface in an urban population.

    PubMed

    Mukhopadhyay, Somnath; Dutta, Jayanta; Mitra, Jayati; Prakash, Ratnesh; Datta, Himadri

    2013-04-01

    To assess the health of the ocular surface in a defined urban population, conjunctival goblet cell density and degree of surface squamous metaplasia were utilized as study tools. Two thousand names of those aged between 20 and 79 years from the 2006 electoral register in ward number 63 of the Kolkata Corporation area were initially selected. Normal healthy human volunteers without any history of ocular surface disorder were recruited and divided into five age-groups. Impression cytology samples were obtained from the interpalpebral part of the bulbar conjunctiva of all participants, then fixed and stained by a single observer. A stratified, clustered, disproportionate, random sampling method was used. The software used in the statistical analysis was Epi Info. The tests applied were the t test and ANOVA. A variation in the number of goblet cells according to gender (women having fewer cells) and age (the 20-30 years group having the highest number of cells) was found. Those working outdoors were found to have fewer goblet cells than those working indoors. The majority of people, both male and female, had a grade 1 cytological appearance. There was no statistically significant difference in Nelson's grading with age. People using coal and kerosene to cook were found to have a lower goblet cell density than those who cooked on LPG or those who did not cook at all. Besides age and sex, environmental factors like the method of cooking and occupational variables (like outdoor activity, prolonged periods of computer use, etc.) modify the health of the ocular surface. The results of this study will help put these findings into perspective as public health problems.

  15. High-speed detection of DNA translocation in nanopipettes

    NASA Astrophysics Data System (ADS)

    Fraccari, Raquel L.; Ciccarella, Pietro; Bahrami, Azadeh; Carminati, Marco; Ferrari, Giorgio; Albrecht, Tim

    2016-03-01

    We present a high-speed electrical detection scheme based on a custom-designed CMOS amplifier which allows the analysis of DNA translocation in glass nanopipettes on a microsecond timescale. Translocation of different DNA lengths in KCl electrolyte provides a scaling factor of the DNA translocation time equal to p = 1.22, which is different from values observed previously with nanopipettes in LiCl electrolyte or with nanopores. Based on a theoretical model involving electrophoresis, hydrodynamics and surface friction, we show that the experimentally observed range of p-values may be the result of, or at least be affected by, DNA adsorption and friction between the DNA and the substrate surface. Electronic supplementary information (ESI) available: Gel electrophoresis confirming lengths and purity of DNA samples, comparison between Axopatch 200B and custom-built setup, comprehensive low-noise amplifier characterization, representative I-V curves of nanopipettes used, typical scatter plots of τ vs. peak amplitude for the four λ-DNAs used, table of most probable τ values, a comparison between different fitting models for the DNA translocation time distribution, further details on the stochastic numerical simulation of the scaling statistics and the derivation of the extended model for the length dependence of τ. See DOI: 10.1039/c5nr08634e

  16. Analyses of global sea surface temperature 1856-1991

    NASA Astrophysics Data System (ADS)

    Kaplan, Alexey; Cane, Mark A.; Kushnir, Yochanan; Clement, Amy C.; Blumenthal, M. Benno; Rajagopalan, Balaji

    1998-08-01

    Global analyses of monthly sea surface temperature (SST) anomalies from 1856 to 1991 are produced using three statistically based methods: optimal smoothing (OS), the Kalman filter (KF) and optimal interpolation (OI). Each of these is accompanied by estimates of the error covariance of the analyzed fields. The spatial covariance function these methods require is estimated from the available data; the time-marching model is a first-order autoregressive model, again estimated from data. The data input for the analyses are monthly anomalies from the United Kingdom Meteorological Office historical sea surface temperature data set (MOHSST5) [Parker et al., 1994] of the Global Ocean Surface Temperature Atlas (GOSTA) [Bottomley et al., 1990]. These analyses are compared with each other, with GOSTA, and with an analysis generated by projection (P) onto a set of empirical orthogonal functions (as in Smith et al. [1996]). In theory, the quality of the analyses should rank in the order OS, KF, OI, P, and GOSTA. It is found that the first four give comparable results in the data-rich period (1951-1991), but at times when data are sparse the first three differ significantly from P and GOSTA. At these times the latter two often have extreme and fluctuating values, prima facie evidence of error. The statistical schemes are also verified against data not used in any of the analyses (proxy records derived from corals and air temperature records from coastal and island stations). We also present evidence that the analysis error estimates are indeed indicative of the quality of the products. At most times the OS and KF products are close to the OI product, but at times of especially poor coverage their use of information from other times is advantageous. The methods appear to reconstruct the major features of the global SST field from very sparse data. Comparisons with other indicators of the El Niño-Southern Oscillation cycle show that the analyses provide usable information on interannual variability as far back as the 1860s.
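The optimal interpolation (OI) update underlying one of the three methods can be sketched for a toy two-point grid with a single observation; the covariance and error values below are invented for illustration, not taken from the MOHSST5 analysis:

```python
# Minimal OI sketch: one SST-anomaly observation updates two grid points
# through the background error covariance B. The gain K = B H^T (H B H^T
# + R)^-1 reduces to a scalar denominator for a single observation.

def oi_update(xb, B, obs, r, i):
    """Analysis for grid vector xb given one obs at index i (error var r)."""
    denom = B[i][i] + r                                  # H B H^T + R
    gain = [B[j][i] / denom for j in range(len(xb))]     # Kalman gain column
    innov = obs - xb[i]                                  # innovation y - H xb
    return [xb[j] + gain[j] * innov for j in range(len(xb))]

xb = [0.0, 0.0]                    # background anomalies (deg C)
B = [[0.5, 0.3], [0.3, 0.5]]       # background error covariance
xa = oi_update(xb, B, obs=1.0, r=0.5, i=0)
```

The off-diagonal covariance is what spreads a sparse observation to unobserved grid points, which is exactly why these schemes outperform direct projection when coverage is poor.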

  17. Prediction of rainfall anomalies during the dry to wet transition season over the Southern Amazonia using machine learning tools

    NASA Astrophysics Data System (ADS)

    Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.

    2017-12-01

    Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence the rainfall anomalies during the dry-to-wet transition season over Southern Amazonia. Based on these key pre-conditions during the dry season, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry-to-wet transition for Southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) Analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim) spanning 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition energy (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained as inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While regularized Ridge Regression and Lasso Regression can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests of various prediction skill metrics and hindcasts also suggest this Neural Network prediction approach can significantly improve seasonal prediction skill over dynamic predictions and regression-based statistical predictions. Thus, this statistical prediction system shows potential to improve real-time seasonal rainfall predictions in the future.
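As a minimal illustration of the regularized-regression baseline described above, the sketch below fits a one-predictor ridge regression to hypothetical EOF-coefficient/rainfall pairs. The data and penalty value are assumptions; the actual system uses ten EOF modes and a neural network:

```python
# Sketch of a ridge-regression baseline: predict a (centered) SON rainfall
# anomaly from a single (centered) leading-EOF coefficient of the JJA
# predictor fields. With one predictor the ridge solution is closed-form.

def ridge_1d(x, y, lam):
    """Ridge slope w = sum(x*y) / (sum(x*x) + lambda) for centered data."""
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    return sxy / (sxx + lam)

x = [-2.0, -1.0, 0.0, 1.0, 2.0]    # JJA EOF-1 coefficient (illustrative)
y = [-1.9, -1.1, 0.1, 0.9, 2.0]    # SON rainfall anomaly (illustrative)
w = ridge_1d(x, y, lam=1.0)
pred = w * 1.5                      # forecast for a new EOF coefficient
```

The penalty shrinks the slope below the least-squares value, trading a little bias for stability, which matters when ten correlated EOF coefficients are used as predictors.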

  18. The impact of particle size and initial solid loading on thermochemical pretreatment of wheat straw for improving sugar recovery.

    PubMed

    Rojas-Rejón, Oscar A; Sánchez, Arturo

    2014-07-01

    This work studies the effect of initial solid load (4-32 %; w/v, DS) and particle size (0.41-50 mm) on the monosaccharide yield of wheat straw subjected to dilute H2SO4 (0.75 %, v/v) pretreatment and enzymatic saccharification. Response surface methodology (RSM) based on a full factorial design (FFD) was used for the statistical analysis of pretreatment and enzymatic hydrolysis. The highest xylose yield obtained during pretreatment (ca. 86 % of theoretical) was achieved at 4 % (w/v, DS) and 25 mm. The solid fraction obtained from the first set of experiments was subjected to enzymatic hydrolysis at a constant enzyme dosage (17 FPU/g); statistical analysis revealed that glucose yield was favored with solids pretreated at low initial solid loads and small particle sizes. Dynamic experiments showed that glucose yield did not increase after 48 h of enzymatic hydrolysis. Once the pretreatment conditions were established, experiments were carried out with several initial solid loadings (4-24 %; w/v, DS) and enzyme dosages (5-50 FPU/g). Two straw sizes (0.41 and 50 mm) were used for verification purposes. The highest glucose yield (ca. 55 % of theoretical) was achieved at 4 % (w/v, DS), 0.41 mm and 50 FPU/g. Statistical analysis of the experiments showed that at low enzyme dosage, particle size had a remarkable effect on glucose yield, and initial solid load was the main factor for glucose yield.

  19. Thermodynamics of rough colloidal surfaces

    NASA Astrophysics Data System (ADS)

    Goldstein, Raymond E.; Halsey, Thomas C.; Leibig, Michael

    1991-03-01

    In Debye-Hückel theory, the free energy of an electric double layer near a colloidal (or any other) surface can be related to the statistics of random walks near that surface. We present a numerical method based on this correspondence for the calculation of the double-layer free energy for an arbitrary charged or conducting surface. For self-similar surfaces, we propose a scaling law for the behavior of the free energy as a function of the screening length and the surface dimension. This scaling law is verified by numerical computation. Capacitance measurements on rough surfaces of, e.g., colloids can test these predictions.

  20. Model-based image analysis of a tethered Brownian fibre for shear stress sensing

    PubMed Central

    2017-01-01

    The measurement of fluid dynamic shear stress acting on a biologically relevant surface is a challenging problem, particularly in the complex environment of, for example, the vasculature. While an experimental method for the direct detection of wall shear stress via the imaging of a synthetic biology nanorod has recently been developed, the data interpretation so far has been limited to phenomenological random walk modelling, small-angle approximation, and image analysis techniques which do not take into account the production of an image from a three-dimensional subject. In this report, we develop a mathematical and statistical framework to estimate shear stress from rapid imaging sequences based firstly on stochastic modelling of the dynamics of a tethered Brownian fibre in shear flow, and secondly on a novel model-based image analysis, which reconstructs fibre positions by solving the inverse problem of image formation. This framework is tested on experimental data, providing the first mechanistically rational analysis of the novel assay. What follows further develops the established theory for an untethered particle in a semi-dilute suspension, which is of relevance to, for example, the study of Brownian nanowires without flow, and presents new ideas in the field of multi-disciplinary image analysis. PMID:29212755

  1. The effect of cleaning substances on the surface of denture base material.

    PubMed

    Žilinskas, Juozas; Junevičius, Jonas; Česaitis, Kęstutis; Junevičiūtė, Gabrielė

    2013-12-11

    The aim of this study was to evaluate the effect of substances used for hygienic cleaning of dentures on the surface of the denture base material. Meliodent Heat Cure (Heraeus-Kulzer, Germany) heat-polymerized acrylic resin was used to produce plates with all the characteristics of removable denture bases (subsequently, "plates"). Oral-B Complete toothbrushes of various brush head types were fixed to a device that imitated tooth brushing movements; table salt and baking soda (frequently used by patients to improve tooth brushing results), toothpaste ("Colgate Total"), and water were also applied. Changes in plate surfaces were monitored by measuring surface reflection alterations on spectrometry. Measurements were conducted before the cleaning and at 2 and 6 hours after cleaning. No statistically significant differences were found between the 3 test series. All 3 plates used in the study underwent statistically significant changes (p<0.05)--the reflection became poorer. The plates were most affected by the medium-bristle toothbrush with baking soda--the total reflection reduction was 4.82 ± 0.1%; among toothbrushes with toothpaste, the hard-type toothbrush had the greatest reflection-reducing effect--4.6 ± 0.05%, while the toothbrush with table salt inflicted the least damage (3.5 ± 0.16%) due to the presence of rounded crystals between the bristles and the resin surface. Toothbrushes with water had a uniform negative effect on the plate surface--3.89 ± 0.07%. All substances used by the patients caused surface abrasion of the denture base material, which reduced the reflection; a hard toothbrush with toothpaste had the greatest abrasive effect, while soft toothbrushes inflicted the least damage.

  2. Spectral analysis of groove spacing on Ganymede

    NASA Technical Reports Server (NTRS)

    Grimm, R. E.

    1984-01-01

    The technique used to analyze groove spacing on Ganymede is presented. Data from Voyager images are used to determine the surface topography and position of the grooves. Power spectral estimates are statistically analyzed and sample data are included.

  3. A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network

    DTIC Science & Technology

    1980-07-08

    to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...the weight assigned to each variable whenever a new one is added. Jennrich, R. I. (1977). Stepwise discriminant analysis, in Statistical Methods for

  4. Remote sensing of atmospheric water content from Bhaskara SAMIR data. [using statistical linear regression analysis

    NASA Technical Reports Server (NTRS)

    Gohil, B. S.; Hariharan, T. A.; Sharma, A. K.; Pandey, P. C.

    1982-01-01

    The 19.35 GHz and 22.235 GHz passive microwave radiometers (SAMIR) on board the Indian satellite Bhaskara have provided very useful data. These data have demonstrated the feasibility of deriving atmospheric and ocean surface parameters such as water vapor content, liquid water content, rainfall rate and ocean surface winds. Different approaches have been tried for deriving the atmospheric water content. Statistical and empirical methods have been used by others for the analysis of Nimbus data. A simulation technique has been applied for the first time to the 19.35 GHz and 22.235 GHz radiometer data. The results obtained from the three different methods are compared with radiosonde data. A case study of a tropical depression has been undertaken to demonstrate the capability of Bhaskara SAMIR data to show the variation of total water vapor and liquid water contents.

  5. From Sub-basin to Grid Scale Soil Moisture Disaggregation in SMART, A Semi-distributed Hydrologic Modeling Framework

    NASA Astrophysics Data System (ADS)

    Ajami, H.; Sharma, A.

    2016-12-01

    A computationally efficient, semi-distributed hydrologic modeling framework is developed to simulate water balance at a catchment scale. The Soil Moisture and Runoff simulation Toolkit (SMART) is based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). In SMART, HRUs are delineated using thresholds obtained from topographic and geomorphic analysis of a catchment, and simulation elements are distributed cross sections or equivalent cross sections (ECS) delineated in first-order sub-basins. ECSs are formulated by aggregating topographic and physiographic properties of part or all of a first-order sub-basin to further reduce computational time in SMART. Previous investigations using SMART have shown that the temporal dynamics of soil moisture are well captured at the HRU level using the ECS delineation approach. However, the spatial variability of soil moisture within a given HRU is ignored. Here, we examined a number of disaggregation schemes for soil moisture distribution in each HRU. The disaggregation schemes are based either on topographic indices or on a covariance matrix obtained from distributed soil moisture simulations. To assess the performance of the disaggregation schemes, soil moisture simulations from an integrated land surface-groundwater model, ParFlow.CLM, in the Baldry sub-catchment, Australia, are used. ParFlow is a variably saturated sub-surface flow model that is coupled to the Common Land Model (CLM). Our results illustrate that the statistical disaggregation scheme performs better than the methods based on topographic data in approximating soil moisture distribution at a 60 m scale. Moreover, the statistical disaggregation scheme maintains the temporal correlation of simulated daily soil moisture while preserving the mean sub-basin soil moisture. Future work will focus on assessing the performance of this scheme in catchments with various topographic and climate settings.

  6. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    NASA Technical Reports Server (NTRS)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and the body surface at reduced time and cost.

  7. Metal Matrix Composite Material by Direct Metal Deposition

    NASA Astrophysics Data System (ADS)

    Novichenko, D.; Marants, A.; Thivillon, L.; Bertrand, P. H.; Smurov, I.

    Direct Metal Deposition (DMD) is a laser cladding process for producing a protective coating on the surface of a metallic part or manufacturing layer-by-layer parts in a single-step process. The objective of this work is to demonstrate the possibility of creating carbide-reinforced metal matrix composite objects. Powders of steel 16NCD13 with different volume contents of titanium carbide are tested. On the basis of statistical analysis, a laser cladding processing map is constructed. Relationships between the different titanium carbide contents in the powder mixture and the material microstructure are found. The mechanism of formation of the various precipitated titanium carbides is investigated.

  8. Changes in the frequency of extreme air pollution events over the Eastern United States and Europe

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Fiore, A. M.; Fang, Y.; Staehelin, J.

    2011-12-01

    Over the past few decades, thresholds for national air quality standards, intended to protect public health and welfare, have been lowered repeatedly. At the same time, observations over Europe and the Eastern U.S. demonstrate that extreme air pollution events (high O3 and PM2.5) are typically associated with stagnation events. Recent work showed that in a changing climate high air pollution events are likely to increase in frequency and duration. In this work we examine meteorological and surface ozone observations from CASTNet over the U.S. and EMEP over Europe and "idealized" simulations with the GFDL AM3 chemistry-climate model, which isolate the role of climate change on air quality. Specifically, we examine an "idealized 1990s" simulation, forced with 20-year mean monthly climatologies for sea surface temperatures and sea ice from observations for 1981-2000, and an "idealized 2090s" simulation forced by the observed climatologies plus the multi-model mean changes in sea surface temperature and sea ice simulated by 19 IPCC AR-4 models under the A1B scenario for 2081-2100. With innovative statistical tools (empirical orthogonal functions (EOFs) and extreme value theory (EVT)), we analyze the frequency distribution of past, present and future extreme air pollution events over the Eastern United States and Europe. The upper tail of observed values at individual stations (e.g., within CASTNet), i.e., the extremes (maximum daily 8-hour average (MDA8) O3 > 60 ppb), is poorly described by a Gaussian distribution. However, further analysis showed that Peak-Over-Threshold models better capture the extremes and allow us to estimate return levels for pollution events above threshold values of interest. We next apply EOF analysis to identify regions that vary coherently within the ground-based monitoring networks.
Over the United States, the first EOF obtained from the model in both the 1990s and 2090s idealized simulations identifies the Northeast as a region that varies coherently. Correlation analysis reveals that this EOF pattern is most strongly expressed in association with high surface temperature and high surface pressure conditions, consistent with previous work showing that observed O3 episodes over this area reflect the combined impacts of stagnation and increased chemical production. Next steps include the extension of this analysis applying EVT tools to the principal component time series associated with this EOF. The combination of EOF and EVT tools applied to the GFDL AM3 1990s vs. 2090s idealized simulations will enable us to quantify changes in the return levels of air pollution extremes. Therefore the combination of observational data and numerical and statistical models should allow us to identify key driving forces between high air pollution events and to estimate changes in the frequency of such events under different climate change scenarios.
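The Peak-Over-Threshold idea used above can be sketched with a method-of-moments generalized Pareto (GPD) fit to synthetic ozone exceedances. The data, threshold, and estimator choice are illustrative only; the study's actual fitting procedure may differ:

```python
# Sketch: fit a GPD to MDA8 ozone exceedances over a threshold u by the
# method of moments, then estimate the level exceeded once every m
# observations on average. Synthetic data, not CASTNet/EMEP observations.

def gpd_mom(exceedances):
    """Method-of-moments GPD shape (xi) and scale (sigma)."""
    n = len(exceedances)
    mean = sum(exceedances) / n
    var = sum((e - mean) ** 2 for e in exceedances) / (n - 1)
    xi = 0.5 * (1.0 - mean * mean / var)
    sigma = 0.5 * mean * (mean * mean / var + 1.0)
    return xi, sigma

def return_level(u, xi, sigma, zeta_u, m):
    """m-observation return level; assumes xi != 0 (the xi -> 0 limit
    would be u + sigma * log(m * zeta_u))."""
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

u = 60.0                                       # ppb threshold
ozone = [58, 61, 63, 62, 66, 71, 64, 59, 75, 68, 62, 80]
exc = [x - u for x in ozone if x > u]          # exceedances over u
xi, sigma = gpd_mom(exc)
rl = return_level(u, xi, sigma, zeta_u=len(exc) / len(ozone), m=100)
```

The return level is the quantity the abstract refers to: the concentration exceeded, on average, once per chosen return period, which can then be compared between the 1990s and 2090s simulations.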

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moritzer, E., E-mail: elmar.moritzer@ktp.upb.de; Leister, C., E-mail: elmar.moritzer@ktp.upb.de

    The industrial use of atmospheric pressure plasmas in the plastics processing industry has increased significantly in recent years. Users of this treatment process have the possibility to influence the target values (e.g. bond strength or surface energy) with the help of kinematic and electrical parameters. Until now, systematic procedures have been used with which the parameters can be adapted to the process or product requirements, but only by very time-consuming methods. For this reason, the relationship between influencing values and target values will be examined based on the example of a pretreatment in the bonding process with the help of statistical experimental design. Because of the large number of parameters involved, the analysis is restricted to the kinematic and electrical parameters. In the experimental tests, the following factors are taken as parameters: gap between nozzle and substrate, treatment velocity (kinematic data), voltage and duty cycle (electrical data). The statistical evaluation shows significant relationships between the parameters and surface energy in the case of polypropylene. An increase in the voltage and duty cycle increases the polar proportion of the surface energy, while a larger gap and higher velocity lead to lower energy levels. The bond strength of the overlapping bond is also significantly influenced by the voltage, velocity and gap. The direction of their effects is identical with those of the surface energy. In addition to the kinematic influences of the motion of an atmospheric pressure plasma jet, it is therefore especially important that the parameters for the plasma production are taken into account when designing the pretreatment processes.

  10. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies can be ascribed different weights in the analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the relevance of the pairwise correlations of the GSMA statistics across different bins to this limiting distribution. We also remark on aggregate criteria and multiple testing for determining the significance of GSMA results. Conclusion The theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930

  11. Comparative Effect of Different Polymerization Techniques on the Flexural and Surface Properties of Acrylic Denture Bases.

    PubMed

    Gad, Mohammed M; Fouda, Shaimaa M; ArRejaie, Aws S; Al-Thobity, Ahmad M

    2017-05-22

    Polymerization techniques have been modified to improve the physical and mechanical properties of polymethylmethacrylate (PMMA) denture bases, as have the laboratory procedures that facilitate denture construction techniques. The purpose of the present study was to investigate the effect of autoclave polymerization on the flexural strength, elastic modulus, surface roughness, and hardness of PMMA denture base resins. Major Base and Vertex Implacryl heat-polymerized acrylic resins were used to fabricate 180 specimens. According to the polymerization technique, the tested groups were: group I (water-bath polymerization), group II (short autoclave polymerization cycle, 60°C for 30 minutes, then 130°C for 10 minutes), and group III (long autoclave polymerization cycle, 60°C for 30 minutes, then 130°C for 20 minutes). Each group was divided into two subgroups based on the materials used. Flexural strength and elastic modulus were determined by a three-point bending test. Surface roughness and hardness were evaluated with a profilometer and the Vickers hardness (VH) test, respectively. One-way ANOVA and the Tukey-Kramer multiple-comparison test were used for analysis of the results, with significance set at p ≤ 0.05. Autoclave polymerization showed a significant increase in the flexural strength and hardness of the two resins (p < 0.05). The elastic modulus showed a significant increase in the Major Base resin, while a significant decrease was seen for Vertex Implacryl in all groups (p < 0.05); however, there was no significant difference in surface roughness between autoclave polymerization and water-bath polymerization (p > 0.05). Autoclave polymerization significantly increased the flexural properties and hardness of PMMA denture bases, while surface roughness remained within acceptable clinical limits. The long autoclave polymerization cycle could therefore be used as an alternative to water-bath polymerization. © 2017 by the American College of Prosthodontists.
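
The one-way ANOVA used in the study can be sketched by hand on synthetic flexural-strength data; the group means and spreads below are made up for illustration, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative flexural-strength data (MPa) for three polymerization
# groups, 10 specimens each (values are synthetic).
groups = {
    "water_bath":      rng.normal(80, 5, 10),
    "short_autoclave": rng.normal(90, 5, 10),
    "long_autoclave":  rng.normal(95, 5, 10),
}

# One-way ANOVA by hand: F = between-group mean square / within-group
# mean square. A large F indicates the group means differ.
data = list(groups.values())
k = len(data)
n = sum(len(g) for g in data)
grand = np.concatenate(data).mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in data)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in data)
F = (ss_between / (k - 1)) / (ss_within / (n - k))
```

A significant F would then be followed by a pairwise multiple-comparison procedure such as Tukey-Kramer, as in the abstract.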

  12. Multivariate analysis of subsurface radiometric data in Rongsohkham area, East Khasi Hills district, Meghalaya (India): implication on uranium exploration.

    PubMed

    Kukreti, B M; Pandey, Pradeep; Singh, R V

    2012-08-01

    Non-coring exploratory drilling was undertaken in the sedimentary environment of the Rangsohkham block, East Khasi Hills district, to examine the eastern extension of the existing uranium resources located at Domiasiat and Wakhyn in the Mahadek basin of Meghalaya (India). Although radiometric surveys and radiometric analysis of surface grab/channel samples in the block indicated high uranium content, gamma-ray logging of the exploratory boreholes did not yield the expected results. To understand this discrepancy between the two data sets (surface and subsurface), multivariate statistical analysis of the primordial radioactive elements (K-40, U-238, and Th-232) was performed using representative subsurface samples drawn from 11 randomly selected boreholes of this block. The study was performed to a high confidence level (99%), and the results are discussed with respect to U and Th behavior in the block. The results confirm both the continuation of three distinct geological formations in the area and the uranium-bearing potential of the Mahadek sandstone in the eastern part of the Mahadek basin. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Runoff potentiality of a watershed through SCS and functional data analysis technique.

    PubMed

    Adham, M I; Shirazi, S M; Othman, F; Rahman, S; Yusop, Z; Ismail, Z

    2014-01-01

    Runoff potentiality of a watershed was assessed by identifying the curve number (CN) using the Soil Conservation Service (SCS) method and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and smoothed using the lowess method. As runoff data represent a periodic pattern in each watershed, a Fourier series was fitted to the smoothed curve of each of the eight watersheds. Seven Fourier terms gave the best fit for watersheds 5 and 8, while eight terms were used for the remaining watersheds. Bootstrapped smooth-curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling.
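
The Fourier-series fitting step can be sketched as a linear least-squares problem: build a design matrix of sine/cosine terms at the annual period and solve for the coefficients. The synthetic runoff series, 12-month period, and three-term truncation below are illustrative, not the study's data:

```python
import numpy as np

# Synthetic monthly runoff with an annual cycle (mean 25 mm, amplitude 8 mm).
t = np.arange(48)                      # four years of monthly data
period = 12.0
rng = np.random.default_rng(3)
runoff = 25 + 8 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, t.size)

def fourier_design(t, n_terms, period):
    """Design matrix [1, cos(k*w*t), sin(k*w*t)] for k = 1..n_terms."""
    w = 2 * np.pi / period
    cols = [np.ones_like(t, dtype=float)]
    for k in range(1, n_terms + 1):
        cols += [np.cos(k * w * t), np.sin(k * w * t)]
    return np.column_stack(cols)

# Fit a truncated Fourier series by ordinary least squares.
X = fourier_design(t, n_terms=3, period=period)
coef, *_ = np.linalg.lstsq(X, runoff, rcond=None)
fitted = X @ coef
mean_runoff = coef[0]                  # the constant term ~ mean runoff level
```

Increasing `n_terms` (7-8 in the study) trades smoothness for fidelity; the constant term recovers the monthly mean runoff reported per watershed.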

  14. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    PubMed Central

    Adham, M. I.; Shirazi, S. M.; Othman, F.; Rahman, S.; Yusop, Z.; Ismail, Z.

    2014-01-01

    Runoff potentiality of a watershed was assessed by identifying the curve number (CN) using the Soil Conservation Service (SCS) method and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and smoothed using the lowess method. As runoff data represent a periodic pattern in each watershed, a Fourier series was fitted to the smoothed curve of each of the eight watersheds. Seven Fourier terms gave the best fit for watersheds 5 and 8, while eight terms were used for the remaining watersheds. Bootstrapped smooth-curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling. PMID:25152911

  15. The Importance of Statistical Modeling in Data Analysis and Inference

    ERIC Educational Resources Information Center

    Rollins, Derrick, Sr.

    2017-01-01

    Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…

  16. Statistical fluctuations of an ocean surface inferred from shoes and ships

    NASA Astrophysics Data System (ADS)

    Lerche, Ian; Maubeuge, Frédéric

    1995-12-01

    This paper shows that it is possible to roughly estimate some ocean properties using simple time-dependent statistical models of ocean fluctuations. Based on a real incident, the loss of a container of Nike shoes from a vessel in the North Pacific Ocean, a statistical model was tested on data sets consisting of the shoes found by beachcombers a few months later. This statistical treatment of the shoes' motion allows one to infer velocity trends of the Pacific Ocean, together with their fluctuation strengths. The idea is to suppose that there is a mean bulk flow speed that can depend on location on the ocean surface and on time. The fluctuations of the surface flow speed are then treated as statistically random. The distribution of shoes is described in space and time using Markov probability processes related to the mean and fluctuating ocean properties. The aim of the exercise is to provide some of the properties of the Pacific Ocean that are otherwise calculated using a sophisticated numerical model, OSCURS, which requires numerous data. The relevant quantities are estimated sharply, which can be useful to (1) constrain output results from OSCURS computations and (2) elucidate the behavior patterns of ocean flow characteristics on long time scales.
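
The mean-flow-plus-random-fluctuation idea can be sketched as a Markov random walk with drift: each drifting object takes daily steps drawn around a mean flow, and the ensemble of final positions lets one invert for the drift and fluctuation strength. The drift vector, fluctuation strength, and ensemble size below are made-up numbers, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(4)

# Each "shoe" takes a daily step = mean flow + random fluctuation
# (an isotropic Gaussian, for simplicity).
n_shoes, n_days = 2000, 200
drift = np.array([0.5, 0.1])           # assumed mean flow, km/day
sigma = 2.0                            # assumed fluctuation strength, km/day

steps = drift + sigma * rng.normal(size=(n_days, n_shoes, 2))
positions = steps.sum(axis=0)          # final displacement of each shoe, km

# Inverting the landfall distribution for the flow statistics:
# mean displacement grows like drift * t, spread like sigma * sqrt(t).
est_drift = positions.mean(axis=0) / n_days
est_sigma = positions.std(axis=0).mean() / np.sqrt(n_days)
```

In the paper's setting the drift varies with position and time, but the same moment relations (mean ~ t, spread ~ sqrt(t)) underlie the inference.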

  17. Air-flow distortion and turbulence statistics near an animal facility

    NASA Astrophysics Data System (ADS)

    Prueger, J. H.; Eichinger, W. E.; Hipps, L. E.; Hatfield, J. L.; Cooper, D. I.

    The emission and dispersion of particulates and gases from concentrated animal feeding operations (CAFO) at local to regional scales is a current issue in science and society. The transport of particulates, odors, and toxic chemical species from the source into the local and eventually regional atmosphere is largely determined by turbulence. Any model that attempts to simulate the dispersion of particles must either specify or assume various statistical properties of the turbulence field. Statistical properties of turbulence are well documented for idealized boundary layers above uniform surfaces. However, an animal production facility is a complex surface with structures that act as bluff bodies and distort the turbulence intensity near the buildings. As a result, the initial release and subsequent dispersion of effluents in the region near a facility will be affected by the complex nature of the surface. Previous lidar studies of plume dispersion over the facility used in this study indicated that plumes move in complex yet organized patterns that are not explained by the properties of turbulence generally assumed in models. The objective of this study was to characterize the near-surface turbulence statistics in the flow field around an array of animal confinement buildings. Eddy covariance towers were erected in the upwind, within-array, and downwind regions of the flow field. Substantial changes in turbulence intensity statistics and turbulence kinetic energy (TKE) were observed as the mean wind flow encountered the building structures. Spectral analysis demonstrated a distinct distribution of spectral energy in the vertical profile above the buildings.
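
The basic eddy-covariance statistics mentioned above can be sketched directly: turbulent fluctuations are deviations from the mean wind components, TKE is half the sum of their variances, and turbulence intensity is the fluctuation standard deviation over the mean speed. The synthetic series below stand in for sonic-anemometer data (all magnitudes are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic high-frequency wind components, e.g. 1 h at 10 Hz.
n = 36000
u = 3.0 + 0.6 * rng.normal(size=n)     # streamwise, m/s
v = 0.0 + 0.5 * rng.normal(size=n)     # crosswind, m/s
w = 0.0 + 0.3 * rng.normal(size=n)     # vertical, m/s

# Fluctuations = deviations from the (Reynolds-averaged) mean.
up, vp, wp = u - u.mean(), v - v.mean(), w - w.mean()

# Turbulence kinetic energy and streamwise turbulence intensity.
tke = 0.5 * (np.var(up) + np.var(vp) + np.var(wp))
turb_intensity = np.sqrt(np.var(up)) / u.mean()
```

The study's spectral analysis would additionally Fourier-transform `up`, `vp`, `wp` to examine how energy is distributed across frequencies at each tower height.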

  18. Variability in source sediment contributions by applying different statistic test for a Pyrenean catchment.

    PubMed

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contribution and transport dynamics from contributing catchments is needed to develop management plans that tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km2, Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study, the <63 μm fraction of the surface reservoir sediments (top 2 cm) was investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the estimated source contributions. Three optimum composite fingerprints were selected from the same dataset to discriminate between source contributions based on land uses/land covers, by applying (1) discriminant function analysis alone, and discriminant function analysis combined (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option (3), the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions produced by the applied mixing model, together with the conceptual understanding of the catchment, showed that the most reliable solution was achieved using option (2), the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint in sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
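
The Kruskal-Wallis screening step can be sketched from scratch: for each candidate tracer, compute the rank-based H statistic across source groups and keep tracers whose H exceeds the chi-square critical value. The tracer values, group labels, and separations below are hypothetical:

```python
import numpy as np

def kruskal_h(*samples):
    """Kruskal-Wallis H statistic (no tie correction) for k samples."""
    data = np.concatenate(samples)
    ranks = np.argsort(np.argsort(data)) + 1.0   # ranks 1..N (assumes no ties)
    n_total = data.size
    h, start = 0.0, 0
    for s in samples:
        r = ranks[start:start + s.size]
        h += s.size * (r.mean() - (n_total + 1) / 2) ** 2
        start += s.size
    return 12.0 / (n_total * (n_total + 1)) * h

rng = np.random.default_rng(6)

# Hypothetical tracer measured in three source groups (land uses/covers).
# A tracer whose H exceeds the critical value discriminates between
# sources and is a candidate for the composite fingerprint.
badlands    = rng.normal(10, 1, 15)
agriculture = rng.normal(13, 1, 15)
forest      = rng.normal(16, 1, 15)

h = kruskal_h(badlands, agriculture, forest)
crit = 5.991                                     # chi2(df=2) at alpha = 0.05
selected = h > crit
```

In the two-step procedure the tracers passing this test would then feed a discriminant function analysis to pick the final composite fingerprint.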

  19. Optimization of Coolant Technique Conditions for Machining A319 Aluminium Alloy Using Response Surface Method (RSM)

    NASA Astrophysics Data System (ADS)

    Zainal Ariffin, S.; Razlan, A.; Ali, M. Mohd; Efendee, A. M.; Rahman, M. M.

    2018-03-01

    Background/Objectives: The paper discusses the optimum cutting parameters and coolant technique conditions (1.0 mm nozzle orifice, wet, and dry) for optimizing surface roughness, temperature, and tool wear in the machining process, based on the selected setting parameters. The cutting parameters selected for this study were the cutting speed, feed rate, depth of cut, and coolant technique condition. Methods/Statistical Analysis: Experiments were conducted and investigated based on a Design of Experiments (DOE) with the Response Surface Method. This research on the aggressive machining of aluminium alloy A319 for automotive applications is an effort to understand the machining concept, which is widely used in a variety of manufacturing industries, especially the automotive industry. Findings: The results show that surface roughness, temperature, and tool wear increase during machining with the 1.0 mm nozzle orifice, and that the technique can also help minimize built-up edge on the A319. The exploration of surface roughness, productivity, and the optimization of cutting speed in the technical and commercial aspects of manufacturing A319 automotive components is discussed as further work. Applications/Improvements: The results are also beneficial in minimizing costs and improving the productivity of manufacturing firms. According to the mathematical model and equations generated by CCD-based RSM, experiments were performed, and a coolant condition technique using the selected nozzle size was obtained that reduces tool wear, surface roughness, and temperature. The results were analyzed and optimization was carried out for the selection of cutting parameters, showing that the effectiveness and efficiency of the system can be identified, which helps to solve potential problems.
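
The RSM core of such a study is a second-order polynomial fit to the response followed by locating its stationary point. A one-factor sketch (e.g. coded cutting speed vs. surface roughness) shows the idea; the "true" quadratic response and its minimum below are invented, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Coded factor levels spanning the axial range of a central composite
# design (CCD), and a synthetic roughness response with minimum at x = 0.3.
x = np.linspace(-1.68, 1.68, 13)
roughness = 1.2 + 0.4 * (x - 0.3) ** 2 + rng.normal(0, 0.02, x.size)

# Second-order RSM model y = b0 + b1*x + b2*x^2, fitted by least squares.
X = np.column_stack([np.ones_like(x), x, x ** 2])
b0, b1, b2 = np.linalg.lstsq(X, roughness, rcond=None)[0]

# Stationary point of the fitted surface (a minimum when b2 > 0):
x_opt = -b1 / (2 * b2)
```

With several factors (speed, feed, depth of cut, coolant condition) the model gains cross-terms and the optimum comes from setting all partial derivatives to zero, but the fitting step is the same linear least squares.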

  20. Modeling Ka-band low elevation angle propagation statistics

    NASA Technical Reports Server (NTRS)

    Russell, Thomas A.; Weinfield, John; Pearson, Chris; Ippolito, Louis J.

    1995-01-01

    The statistical variability of the secondary atmospheric propagation effects on satellite communications cannot be ignored at frequencies of 20 GHz or higher, particularly if the propagation margin allocation is such that link availability falls below 99 percent. The secondary effects considered in this paper are gaseous absorption, cloud absorption, and tropospheric scintillation; rain attenuation is the primary effect. Techniques and example results are presented for estimating the overall combined impact of the atmosphere on satellite communications reliability. Statistical methods are employed throughout, and the most widely accepted models for the individual effects are used wherever possible. The degree of correlation between the effects is addressed, and some bounds on the expected variability in the combined-effects statistics are derived from the expected variability in correlation. Example estimates of combined-effects statistics are presented for the Washington, D.C. area at 20 GHz and a 5 deg elevation angle. The statistics of water vapor are shown to be sufficient for estimating the statistics of gaseous absorption at 20 GHz. A computer model based on monthly surface weather is described and tested. Significant improvement in the prediction of absorption extremes is demonstrated with the use of path weather data instead of surface data.
