Science.gov

Sample records for advanced statistical techniques

  1. Classification of human colonic tissues using FTIR spectra and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.

    2010-04-01

    One of the major public health hazards is colon cancer, and there is a great need for new methods for its early detection. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microspectroscopy (MSP), which has shown good potential over the last 20 years in medical diagnostics and the early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five subgroups were included in our database: normal and cancer tissues as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. We applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to the human colonic FTIR spectra in order to differentiate among these subgroups' tissues. Good classification among the normal, polyp and cancer groups was achieved, with an approximately 85% success rate. Our results show that FTIR microspectroscopy has great potential as a simple, reagent-free tool for early detection of colon cancer, in particular of the early stages of premalignancy among benign colonic polyps.
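
    As a concrete illustration of the pipeline this abstract describes, the sketch below applies PCA followed by LDA to synthetic stand-ins for FTIR spectra. The biopsy count matches the abstract, but the spectra, subgroup labels, component count, and train/test split are invented for illustration.

```python
# Minimal sketch of a PCA + LDA classification pipeline like the one the
# abstract describes. Synthetic spectra stand in for the FTIR data; labels,
# component count, and split are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 230, 450          # 230 biopsies, absorbance values per spectrum
X = rng.normal(size=(n_samples, n_wavenumbers))
y = rng.integers(0, 5, size=n_samples)       # 5 subgroups: normal, 3 polyp stages, cancer
X += y[:, None] * 0.05                       # inject a weak class-dependent offset

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# PCA reduces the high-dimensional spectra; LDA then separates the subgroups.
model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
model.fit(X_train, y_train)
print(f"classification accuracy: {model.score(X_test, y_test):.2f}")
```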

  2. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as de facto process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed discussion of the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, identifies gaps that exist in the literature, and presents a comparison analysis of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  3. COLLABORATIVE RESEARCH: USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    SciTech Connect

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the project reported here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. First, it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. Second, this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques well suited to diagnosing model cloud simulations using ARM observations (Shen).
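
    For a sense of what a Bayesian posterior estimate looks like in a cloud-diagnostic setting, the sketch below performs a conjugate Beta-Binomial update for the probability that a given hour is cloudy. The model, counts, and flat prior are illustrative assumptions, not the project's actual formulation.

```python
# Minimal sketch of a Bayesian posterior estimate applied to a toy cloud
# diagnostic: a conjugate Beta-Binomial update for the cloud-occurrence
# probability. The counts and prior are invented placeholders.
from scipy import stats

alpha0, beta0 = 1.0, 1.0      # flat Beta prior on the cloud-occurrence probability
cloudy, clear = 412, 321      # hypothetical counts from an ARM-like observation record

posterior = stats.beta(alpha0 + cloudy, beta0 + clear)
lo, hi = posterior.ppf([0.025, 0.975])
print(f"posterior mean = {posterior.mean():.3f}, "
      f"95% credible interval = [{lo:.3f}, {hi:.3f}]")
```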

  4. Large ensemble modeling of the last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert

    2016-05-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
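
    A minimal sketch of the simpler of the two methods, score-weighted ensemble averaging, appears below. The misfit scores, sea-level contributions, and Gaussian weighting kernel are invented for illustration, not the paper's exact scheme.

```python
# Minimal sketch of score-weighted ensemble averaging: each of the 625 runs
# contributes in proportion to how well it fits the calibration data.
# All numbers here are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n_runs = 625
sea_level_rise = rng.normal(3.3, 1.0, size=n_runs)   # equivalent sea-level rise per run (m)
misfit = rng.gamma(2.0, 1.0, size=n_runs)            # aggregate model-data misfit per run

weights = np.exp(-0.5 * misfit**2)                   # down-weight runs that fit poorly
weights /= weights.sum()

mean = np.sum(weights * sea_level_rise)
var = np.sum(weights * (sea_level_rise - mean) ** 2)
print(f"weighted-ensemble sea-level rise: {mean:.2f} +/- {np.sqrt(var):.2f} m")
```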

  5. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  6. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  7. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  8. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  9. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  10. Advanced Communication Processing Techniques

    NASA Astrophysics Data System (ADS)

    Scholtz, Robert A.

    This document contains the proceedings of the workshop Advanced Communication Processing Techniques, held May 14 to 17, 1989, near Ruidoso, New Mexico. Sponsored by the Army Research Office (under Contract DAAL03-89-G-0016) and organized by the Communication Sciences Institute of the University of Southern California, the workshop had as its objective to determine those applications of intelligent/adaptive communication signal processing that have been realized and to define areas of future research. We at the Communication Sciences Institute believe that there are two emerging areas which deserve considerably more study in the near future: (1) Modulation characterization, i.e., the automation of modulation format recognition so that a receiver can reliably demodulate a signal without using a priori information concerning the signal's structure, and (2) the incorporation of adaptive coding into communication links and networks. (Encoders and decoders which can operate with a wide variety of codes exist, but the way to utilize and control them in links and networks is an issue). To support these two new interest areas, one must have both a knowledge of (3) the kinds of channels and environments in which the systems must operate, and of (4) the latest adaptive equalization techniques which might be employed in these efforts.

  11. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  12. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S.; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M.

    1993-12-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  13. Advanced qualification techniques

    NASA Astrophysics Data System (ADS)

    Winokur, P. S.; Shaneyfelt, M. R.; Meisenheimer, T. L.; Fleetwood, D. M.

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML 'builds in' the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish 'process capability' is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  14. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S.; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M.

    1994-06-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  15. Advanced qualification techniques

    NASA Astrophysics Data System (ADS)

    Winokur, P. S.; Shaneyfelt, M. R.; Meisenheimer, T. L.; Fleetwood, D. M.

    1994-06-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML 'builds in' the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish 'process capability' is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  16. Advanced Coating Removal Techniques

    NASA Technical Reports Server (NTRS)

    Seibert, Jon

    2006-01-01

    An important step in the repair and protection against corrosion damage is the safe removal of the oxidation and protective coatings without further damaging the integrity of the substrate. Two such methods that are proving to be safe and effective in this task are liquid nitrogen and laser removal operations. Laser technology used for the removal of protective coatings is currently being researched and implemented in various areas of the aerospace industry. Delivering thousands of focused energy pulses, the laser ablates the coating surface by heating and dissolving the material applied to the substrate. The metal substrate will reflect the laser and redirect the energy to any remaining protective coating, thus preventing any collateral damage the substrate may suffer throughout the process. Liquid nitrogen jets are comparable to blasting with an ultra-high-pressure water jet but without the residual liquid that requires collection and removal. As the liquid nitrogen reaches the surface it is transformed into gaseous nitrogen and reenters the atmosphere without any contamination to surrounding hardware. These innovative technologies simplify corrosion repair by eliminating hazardous chemicals and repetitive manual labor from the coating removal process. One very significant advantage is the reduction of particulate contamination exposure to personnel. With the removal of coatings adjacent to sensitive flight hardware, a benefit of each technique for the space program is that no contamination such as beads, water, or sanding residue is left behind when the job is finished. One primary concern is the safe removal of coatings from thin aluminum honeycomb face sheet. NASA recently conducted thermal testing on liquid nitrogen systems and found that no damage occurred on 1/6" aluminum substrates. Wright Patterson Air Force Base in conjunction with Boeing and NASA is currently testing the laser removal technique for process qualification. Other applications of liquid

  17. Advanced Wavefront Control Techniques

    SciTech Connect

    Olivier, S S; Brase, J M; Avicola, K; Thompson, C A; Kartz, M W; Winters, S; Hartley, R; Wihelmsen, J; Dowla, F V; Carrano, C J; Bauman, B J; Pennington, D M; Lande, D; Sawvel, R M; Silva, D A; Cooke, J B; Brown, C G

    2001-02-21

    In this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase-measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.

  18. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids, which can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases") due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. An enormous number of statistical approaches can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods," a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
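
    A minimal sketch of such a MOS-method comparison is given below, with synthetic stand-ins for the NWP predictors and observed generation; the feature set, model settings, and error metric are illustrative assumptions, not the experiment's configuration.

```python
# Minimal sketch of a MOS-style comparison: several regressors map raw NWP
# forecast variables onto observed generation and are scored out of sample.
# All data and settings are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))          # e.g. NWP wind speed, direction, stability, shear
y = X[:, 0] ** 3 + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=2000)  # generation proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "linear regression (baseline)": LinearRegression(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "random forest": RandomForestRegressor(random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    "support vector regression": SVR(C=10.0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:30s} MAE = {mean_absolute_error(y_te, model.predict(X_te)):.3f}")
```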

  19. Techniques in Advanced Language Teaching.

    ERIC Educational Resources Information Center

    Ager, D. E.

    1967-01-01

    For ease of presentation, advanced grammar teaching techniques are briefly considered under the headings of structuralism (belief in the effectiveness of presenting grammar rules) and contextualism (belief in the maximum use by students of what they know in the target language). The structuralist's problem of establishing a syllabus is discussed…

  20. Identifying the Drivers and Occurrence of Historical and Future Extreme Air-quality Events in the United States Using Advanced Statistical Techniques

    NASA Astrophysics Data System (ADS)

    Porter, W. C.; Heald, C. L.; Cooley, D. S.; Russell, B. T.

    2013-12-01

    Episodes of air-quality extremes are known to be heavily influenced by meteorological conditions, but traditional statistical analysis techniques focused on means and standard deviations may not capture important relationships at the tails of the air-quality and meteorological distributions. Using quantile regression (QR) and extreme value theory (EVT), methodologies specifically developed to examine the behavior of heavy-tailed phenomena, we analyze extremes in the multi-decadal record of ozone (O3) and fine particulate matter (PM2.5) in the United States. We investigate observations from the Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) networks for connections to meteorological drivers, as provided by the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) product. Through regional characterization by quantile behavior and EVT modeling of the meteorological covariates most responsible for extreme levels of O3 and PM2.5, we estimate pollutant exceedance frequencies and uncertainties in the United States under current and projected future climates, highlighting those meteorological covariates and interactions whose influence on air-quality extremes differs most significantly from the behavior of the bulk of the distribution. As current policy may be influenced by air-quality projections, we then compare these estimated frequencies to those produced by NCAR's Community Earth System Model (CESM), identifying regions, covariates, and species whose extreme behavior may not be adequately captured by current models.
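
    A minimal sketch of the two statistical tools named above, quantile regression and an extreme-value fit, follows, on synthetic ozone-temperature data; the covariate, block size, and return period are illustrative assumptions.

```python
# Minimal sketch: quantile regression of ozone on a meteorological covariate,
# plus a GEV fit to block maxima for a return-level estimate. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
temp = rng.normal(25, 6, size=3000)                     # daily max temperature (C)
ozone = 20 + 1.5 * temp + rng.gamma(2, 4, size=3000)    # daily max ozone (ppb), skewed errors

df = pd.DataFrame({"ozone": ozone, "temp": temp})
for q in (0.5, 0.95):   # compare the median slope with the 95th-percentile slope
    fit = smf.quantreg("ozone ~ temp", df).fit(q=q)
    print(f"q={q}: d(ozone)/d(temp) = {fit.params['temp']:.2f} ppb/C")

# EVT side: fit a GEV to block maxima and estimate a 10-block return level.
block_max = ozone.reshape(30, 100).max(axis=1)          # 30 pseudo-seasons of 100 days
shape, loc, scale = stats.genextreme.fit(block_max)
print(f"10-season return level: {stats.genextreme.ppf(1 - 1/10, shape, loc, scale):.1f} ppb")
```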

  1. Advances in wound debridement techniques.

    PubMed

    Nazarko, Linda

    2015-06-01

    Dead and devitalised tissue interferes with the process of wound healing. Debridement is a natural process that occurs in all wounds and is crucial to healing; it reduces the bacterial burden in a wound and promotes effective inflammatory responses that encourage the formation of healthy granulation tissue (Wolcott et al., 2009). Wound care should be part of holistic patient care. Recent advances in debridement techniques include biosurgery, hydrosurgery, mechanical debridement, and ultrasound. Biosurgery and mechanical debridement can be practiced by nonspecialist nurses and can be provided in a patient's home, thus increasing the patient's access to debridement therapy and accelerating wound healing.

  2. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology, applied in the context of site-specific crop yield response, contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
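
    A minimal sketch of a spatial statistic of the kind this review covers appears below: Moran's I for correlation among neighboring field plots, computed with plain NumPy; the grid, counts, and rook-neighbor weights are illustrative assumptions.

```python
# Minimal sketch of Moran's I, a measure of correlation among neighboring
# observations, on synthetic nematode counts over a 10x10 grid of plots.
import numpy as np

rng = np.random.default_rng(0)
grid = rng.poisson(5, size=(10, 10)).astype(float)   # nematode counts per plot

n = grid.size
x = grid.ravel() - grid.mean()                       # centered counts
num, wsum = 0.0, 0.0
for i in range(10):
    for j in range(10):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # rook contiguity
            ni, nj = i + di, j + dj
            if 0 <= ni < 10 and 0 <= nj < 10:
                num += x[i * 10 + j] * x[ni * 10 + nj]
                wsum += 1.0

moran_I = (n / wsum) * num / np.sum(x**2)
print(f"Moran's I = {moran_I:.3f} (expectation under spatial randomness: {-1/(n-1):.3f})")
```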

  3. Statistical problems in design technique validation

    SciTech Connect

    Cohen, J.S.

    1980-04-01

    This work is concerned with the statistical validation process for measuring the accuracy of design techniques for solar energy systems. This includes a discussion of the statistical variability inherent in the design and measurement processes and the way in which this variability can dictate the choice of experimental design, the choice of data, the accuracy of the results, and the choice of questions that can be reliably answered in such a study. The approach here is primarily concerned with design-procedure validation in the context of the realistic process of system design, where the discrepancy between measured and predicted results is due to limitations in the mathematical models employed by the procedures and inaccuracies in the input data. A set of guidelines for successful validation methodologies is discussed, and a simplified validation methodology for domestic hot water heaters is presented.

  4. Enhanced bio-manufacturing through advanced multivariate statistical technologies.

    PubMed

    Martin, E B; Morris, A J

    2002-11-13

    The paper describes the interrogation of data from a reaction vessel producing an active pharmaceutical ingredient (API), using advanced multivariate statistical techniques. Due to the limited number of batches available, data augmentation was used to increase the number of batches, thereby enabling the extraction of more subtle process behaviour from the data. A second methodology investigated was multi-group modelling. This allowed between-cluster variability to be removed, allowing attention to focus on within-process variability. The paper describes how the different approaches enabled a better understanding of the factors causing the onset of an impurity formation, as well as demonstrating the power of multivariate statistical data analysis techniques to provide an enhanced understanding of the process.
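
    A minimal sketch of one standard multivariate batch-monitoring tool from this literature follows: a PCA model of reference batches with a Hotelling T-squared statistic. It is a generic illustration under invented data, not the authors' exact procedure.

```python
# Minimal sketch of PCA-based multivariate batch monitoring with Hotelling T^2.
# A common latent factor gives the reference batches correlated structure;
# one deliberately shifted batch should score far above the reference level.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(40, 1))                     # shared batch-to-batch factor
good = 0.8 * latent + rng.normal(scale=0.3, size=(40, 12))   # 40 batches x 12 summaries
suspect = rng.normal(scale=0.3, size=12) + 2.5        # one batch shifted on every variable

pca = PCA(n_components=3).fit(good)

def t2(batch):
    """Hotelling T^2 of one batch in the retained score space."""
    scores = pca.transform(batch.reshape(1, -1))[0]
    return float(np.sum(scores**2 / pca.explained_variance_))

print(f"mean T2 over reference batches: {np.mean([t2(b) for b in good]):.1f}")
print(f"T2 for the suspect batch:       {t2(suspect):.1f}")
```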

  5. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are observed to be in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such approaches. With the advancement in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting of air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are observed to be in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
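
    A minimal sketch of the proposed coupling follows, with PCA compressing the meteorological predictors before a neural network predicts the pollutant concentration; the predictors, network size, and data are invented placeholders, and the fuzzy-logic stage is omitted.

```python
# Minimal sketch of a PCA + ANN forecasting pipeline: principal components of
# the meteorological inputs feed a small neural network regressor.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
met = rng.normal(size=(1500, 8))     # e.g. wind, temperature, humidity, mixing height ...
pm = 80 + 15 * met[:, 0] - 10 * met[:, 1] + rng.normal(scale=5, size=1500)  # pollutant level

model = make_pipeline(StandardScaler(), PCA(n_components=4),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0))
model.fit(met[:1200], pm[:1200])                     # train on the first 1200 days
print(f"R^2 on held-out days: {model.score(met[1200:], pm[1200:]):.2f}")
```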

  6. Advanced Bode Plot Techniques for Ultrasonic Transducers

    NASA Astrophysics Data System (ADS)

    DeAngelis, D. A.; Schulze, G. W.

    The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease of use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a priori process assessment. These advanced techniques expand from the basic constant-voltage versus frequency sweep to include constant current and constant velocity interrogated locally on the transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects like jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used with welding transducers for semiconductor wire bonding. Several metrics are investigated, such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry and coupled-field finite element analysis.
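
    A minimal sketch of a computed impedance Bode plot follows, using the standard Butterworth-Van Dyke equivalent circuit for a piezoelectric transducer; the component values are illustrative, not measured PZT8 parameters.

```python
# Minimal sketch of an impedance-vs-frequency Bode plot for a piezoelectric
# transducer via the Butterworth-Van Dyke model: a series R-L-C motional
# branch in parallel with a clamped capacitance C0. Values are illustrative.
import numpy as np

R, L, C, C0 = 25.0, 0.1, 0.405e-9, 4.0e-9    # motional R (ohm), L (H), C (F); clamped C0 (F)
f = np.linspace(20e3, 30e3, 4000)            # frequency sweep (Hz)
w = 2 * np.pi * f

Zm = R + 1j * w * L + 1 / (1j * w * C)       # motional branch impedance
Z = 1 / (1 / Zm + 1j * w * C0)               # motional branch in parallel with C0

i_s, i_p = np.argmin(np.abs(Z)), np.argmax(np.abs(Z))
print(f"series resonance   ~ {f[i_s] / 1e3:.2f} kHz, |Z| = {np.abs(Z[i_s]):.1f} ohm")
print(f"parallel resonance ~ {f[i_p] / 1e3:.2f} kHz, |Z| = {np.abs(Z[i_p]):.0f} ohm")
```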

  7. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  8. Advanced techniques of laser telemetry

    NASA Astrophysics Data System (ADS)

    Donati, S.; Gilardini, A.

    The relationships that govern a laser telemeter, its noise sources, and the measurement accuracy achievable with pulsed and sinusoidal intensity-modulation techniques are discussed. Developments in telemetry instrumentation and optical detection are considered. Meteorological interferometers, geodimeters, and military telemeters are described. Propagation attenuation and signal-to-noise ratios are treated. It is shown that accuracy depends on the product of measurement time and received power. The frequency-scanning technique of CW and long-pulse telemetry, multifrequency techniques, pulse compression, and the vernier technique are outlined.
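
    A minimal sketch of the pulsed time-of-flight relation underlying laser telemetry follows: range comes from half the round-trip delay, and range error scales with timing error. The numbers are illustrative.

```python
# Minimal sketch of pulsed time-of-flight ranging: d = c * t / 2, with the
# range uncertainty set by the timing uncertainty. Numbers are illustrative.
c = 299_792_458.0          # speed of light (m/s)

round_trip = 6.67e-6       # measured round-trip delay (s), ~1 km target
timing_err = 1e-9          # 1 ns timing uncertainty

distance = c * round_trip / 2
print(f"range = {distance:.1f} m, range error = {c * timing_err / 2:.3f} m")
```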

  9. Splitting advancement genioplasty: a new genioplasty technique.

    PubMed

    Celik, M; Tuncer, S; Büyükçayir, I

    1999-08-01

    A new genioplasty technique has been described and performed on 16 patients since 1995. The technique has been developed to avoid some undesired results of the current osseous genioplasty techniques and to achieve a more natural appearance in advancement genioplasty. According to the authors' technique, a rectangular part of the outer table of the mentum is split away from the mandible, then advanced and fixated to the mandible. This technique can be used for advancement cases but not for reduction genioplasty. This technique was performed on 16 patients with only minor complications, including one case of wound dehiscence, one hematoma, and one case of osteomyelitis, which was managed with systemic antibiotic therapy. Aesthetic results were found to be satisfactory according to an evaluation by the authors. When the results were evaluated using pre- and postoperative photos, lip position and projection of the mentum were found to be natural in shape and appearance. During the late postoperative period, the new bone formation between the advanced segment and the mandible was demonstrated radiographically. Advantages of the technique include having more contact surfaces for bony healing, a natural position of the lower lip, more natural projection of the mentum, tridimensional movement of the mentum, and improvement in the soft tissue of the neck. The disadvantages of the technique are the potential risk of infection due to dead space from the advancement, manipulation problems during surgery, and possible mental nerve injury. Splitting advancement genioplasty was found to be a useful technique for advancement genioplasty, and a more physiological osteotomy technique than most osseous genioplasty techniques. PMID:10454320

  10. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure.

  11. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. PMID:26786791
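
    A minimal sketch of one exploratory technique from this family follows: principal coordinates analysis (PCoA) of a Bray-Curtis distance matrix, on synthetic abundance profiles; sample and taxon counts are invented.

```python
# Minimal sketch of PCoA (classical multidimensional scaling) on Bray-Curtis
# distances between synthetic microbial abundance profiles.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
abundances = rng.poisson(10, size=(30, 200))     # 30 samples x 200 taxa

D = squareform(pdist(abundances, metric="braycurtis"))

# Classical PCoA: double-center the squared distances, then eigendecompose.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D**2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
coords = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0))

print("variance explained by first two axes:",
      np.round(vals[order[:2]] / vals[vals > 0].sum(), 3))
print("sample 0 coordinates:", np.round(coords[0], 3))
```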

  12. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  13. Stitching Techniques Advance Optics Manufacturing

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Because NASA depends on the fabrication and testing of large, high-quality aspheric (nonspherical) optics for applications like the James Webb Space Telescope, it sought an improved method for measuring large aspheres. Through Small Business Innovation Research (SBIR) awards from Goddard Space Flight Center, QED Technologies, of Rochester, New York, upgraded and enhanced its stitching technology for aspheres. QED developed the SSI-A, which earned the company an R&D 100 award, and also developed a breakthrough machine tool called the aspheric stitching interferometer. The equipment is applied to advanced optics in telescopes, microscopes, cameras, medical scopes, binoculars, and photolithography.

  14. Advanced Spectroscopy Technique for Biomedicine

    NASA Astrophysics Data System (ADS)

    Zhao, Jianhua; Zeng, Haishan

    This chapter presents an overview of the applications of optical spectroscopy in biomedicine. We focus on the optical design aspects of advanced biomedical spectroscopy systems, Raman spectroscopy systems in particular. Details of components and system integration are provided. As examples, two real-time in vivo Raman spectroscopy systems, one for skin cancer detection and the other for endoscopic lung cancer detection, and an in vivo confocal Raman spectroscopy system for skin assessment are presented. The applications of Raman spectroscopy in cancer diagnosis of the skin, lung, colon, oral cavity, gastrointestinal tract, breast, and cervix are summarized.

  15. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because only a limited amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.
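
    A minimal sketch of the kind of choice the report discusses follows: a parametric paired t-test next to the nonparametric Wilcoxon signed-rank test on a small sample; the measurements are hypothetical.

```python
# Minimal sketch contrasting a parametric and a nonparametric test on a small
# paired sample. The measurements are invented.
from scipy import stats

pre = [4.1, 3.8, 4.4, 4.0, 3.9, 4.3]     # n = 6 paired observations
post = [4.6, 4.0, 4.8, 4.1, 4.2, 4.9]

# The paired t-test assumes roughly normal differences ...
t, p_t = stats.ttest_rel(pre, post)
# ... while the Wilcoxon signed-rank test drops that assumption
# (SciPy uses an exact p-value automatically for small samples without ties).
w, p_w = stats.wilcoxon(pre, post)
print(f"paired t-test p = {p_t:.4f}, Wilcoxon signed-rank p = {p_w:.4f}")
```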

  16. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  17. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion cover the following factors: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  18. Advanced techniques in abdominal surgery.

    PubMed Central

    Monson, J R

    1993-01-01

    Almost every abdominal organ is now amenable to laparoscopic surgery. Laparoscopic appendicectomy is a routine procedure which also permits identification of other conditions initially confused with an inflamed appendix. However, assessment of appendiceal inflammation is more difficult. Almost all colonic procedures can be performed laparoscopically, at least partly, though resection for colonic cancer is still controversial. For simple patch repair of perforated duodenal ulcers laparoscopy is ideal, and inguinal groin hernia can be repaired satisfactorily with a patch of synthetic mesh. Many upper abdominal procedures, however, still take more time than the open operations. These techniques reduce postoperative pain and the incidence of wound infections and allow a much earlier return to normal activity compared with open surgery. They have also brought new disciplines: surgeons must learn different hand-eye coordination, meticulous haemostasis is needed to maintain picture quality, and delivery of specimens may be problematic. The widespread introduction of laparoscopic techniques has emphasised the need for adequate training (operations that were straight-forward open procedures may require considerable laparoscopic expertise) and has raised questions about trainee surgeons acquiring adequate experience of open procedures. PMID:8257893

  19. Advanced prosthetic techniques for below knee amputations.

    PubMed

    Staats, T B

    1985-02-01

    Recent advances in the evaluation of the amputation stump, the materials that are available for prosthetic application, techniques of improving socket fit, and prosthetic finishings promise to dramatically improve amputee function. Precision casting techniques for providing optimal fit of the amputation stump using materials such as alginate are described. The advantages of transparent check sockets for fitting the complicated amputation stump are described. Advances in research that promise to provide more functional prosthetic feet and faster and more reliable socket molding are the use of CAD-CAM (computer aided design-computer aided manufacturing) and the use of gait analysis techniques to aid in the alignment of the prosthesis after socket fitting. Finishing techniques to provide a more natural appearing prosthesis are described. These advances will gradually spread to the entire prosthetic profession.

  20. Advanced sialoendoscopy techniques, rare findings, and complications.

    PubMed

    Nahlieli, Oded

    2009-12-01

    This article presents and discusses advanced minimally invasive sialoendoscopy and combined methods: endoscopy, endoscopic-assisted techniques, and external-lithotripsy combined procedures. It also presents rare situations and complications encountered during sialoendoscopic procedures. Sialoendoscopy is a relatively novel technique, which adds significant new dimensions to the surgeon's armamentarium for management of inflammatory salivary gland diseases. Because of the rapid development in minimally invasive surgical techniques, surgeons are capable of more facilely treating complicated inflammatory and obstructive conditions of the salivary glands.

  1. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  2. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of the RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB with root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, the comparisons with a large number of techniques based on all the combinations of the RBF, extreme learning machine, backpropagation, support vector machine used in the first step of the algorithm; and IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step of the algorithm, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594
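
    A minimal sketch of amplitude demodulation via the Hilbert transform, the basic operation behind the paper's far more elaborate multivariate IHT stage, follows; a simulated amplitude-modulated oscillation stands in for tremor EMG, and all signal parameters are invented.

```python
# Minimal sketch of Hilbert-transform demodulation: recover the amplitude
# envelope and instantaneous frequency of a simulated ~5 Hz tremor band.
import numpy as np
from scipy.signal import hilbert

fs = 2000.0                                   # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
envelope = 1 + 0.5 * np.sin(2 * np.pi * 0.3 * t)    # slow amplitude modulation
signal = envelope * np.cos(2 * np.pi * 5 * t)       # ~5 Hz tremor oscillation

analytic = hilbert(signal)
est_env = np.abs(analytic)                    # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

print(f"envelope RMS error: {np.sqrt(np.mean((est_env - envelope) ** 2)):.3f}")
print(f"median instantaneous frequency: {np.median(inst_freq):.2f} Hz")
```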

  3. Hybrid mesh generation using advancing reduction technique

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study presents an extension of the application of the advancing reduction technique to the hybrid mesh generation. The proposed algorithm is based on a pre-generated rectangle mesh (RM) with a certain orientation. The intersection points between the two sets of perpendicular mesh lines in RM an...

  4. Techniques in teaching statistics : linking research production and research use.

    SciTech Connect

    Martinez-Moyano, I .; Smith, A.

    2012-01-01

    In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.

  5. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    ERIC Educational Resources Information Center

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
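
    A minimal sketch of the Central Limit Theorem point made above follows: means of simple random samples from a strongly skewed population become approximately normal as the sample size grows; the population and sample sizes are illustrative.

```python
# Minimal sketch of the Central Limit Theorem: sample means from a skewed
# (exponential) population lose their skewness as the sample size grows.
import numpy as np

rng = np.random.default_rng(0)
population = rng.exponential(scale=2.0, size=100_000)   # strongly skewed population

for n in (2, 30):
    means = np.array([rng.choice(population, size=n).mean() for _ in range(5000)])
    skew = ((means - means.mean()) ** 3).mean() / means.std() ** 3
    print(f"n={n:2d}: mean of sample means = {means.mean():.2f}, skewness ~ {skew:.2f}")
```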

  6. Recent advancement of turbulent flow measurement techniques

    NASA Technical Reports Server (NTRS)

    Battle, T.; Wang, P.; Cheng, D. Y.

    1974-01-01

    Advancements of the fluctuating density gradient cross beam laser Schlieren technique, the fluctuating line-reversal temperature measurement and the development of the two-dimensional drag-sensing probe to a three-dimensional drag-sensing probe are discussed. The three-dimensionality of the instantaneous momentum vector can shed some light on the nature of turbulence especially with swirling flow. All three measured fluctuating quantities (density, temperature, and momentum) can provide valuable information for theoreticians.

  7. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  8. Tools for the advancement of undergraduate statistics education

    NASA Astrophysics Data System (ADS)

    Schaffner, Andrew Alan

    To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors stress a more concept oriented undergraduate introductory statistics course which emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.

  9. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  10. Advanced decision aiding techniques applicable to space

    NASA Technical Reports Server (NTRS)

    Kruchten, Robert J.

    1987-01-01

    RADC has had an intensive program to show the feasibility of applying advanced technology to Air Force decision aiding situations. Some aspects of the program, such as Satellite Autonomy, are directly applicable to space systems. For example, RADC has shown the feasibility of decision aids that combine the advantages of laser disks and computer generated graphics; decision aids that interface object-oriented programs with expert systems; decision aids that solve path optimization problems; etc. Some of the key techniques that could be used in space applications are reviewed. Current applications are reviewed along with their advantages and disadvantages, and examples are given of possible space applications. The emphasis is to share RADC experience in decision aiding techniques.

  11. Statistical Techniques for Efficient Indexing and Retrieval of Document Images

    ERIC Educational Resources Information Center

    Bhardwaj, Anurag

    2010-01-01

    We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…

  12. Advanced AE Techniques in Composite Materials Research

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform based acoustic emission (AE) techniques have been successfully used to evaluate damage mechanisms in laboratory testing of composite coupons. An example is presented in which the initiation of transverse matrix cracking was monitored. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite specimens or structures, the effects of modal wave propagation over larger distances and through structural complexities must be well characterized and understood. To demonstrate these effects, measurements of the far field, peak amplitude attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels are discussed. These measurements demonstrated that the flexural mode attenuation is dominated by dispersion effects. Thus, it is significantly affected by the thickness of the composite plate. Furthermore, the flexural mode attenuation can be significantly larger than that of the extensional mode even though its peak amplitude consists of much lower frequency components.

  13. Advanced flow MRI: emerging techniques and applications.

    PubMed

    Markl, M; Schnell, S; Wu, C; Bollache, E; Jarvis, K; Barker, A J; Robinson, J D; Rigsby, C K

    2016-08-01

    Magnetic resonance imaging (MRI) techniques provide non-invasive and non-ionising methods for the highly accurate anatomical depiction of the heart and vessels throughout the cardiac cycle. In addition, the intrinsic sensitivity of MRI to motion offers the unique ability to acquire spatially registered blood flow simultaneously with the morphological data, within a single measurement. In clinical routine, flow MRI is typically accomplished using methods that resolve two spatial dimensions in individual planes and encode the time-resolved velocity in one principal direction, typically oriented perpendicular to the two-dimensional (2D) section. This review describes recently developed advanced MRI flow techniques, which allow for more comprehensive evaluation of blood flow characteristics, such as real-time flow imaging, 2D multiple-venc phase contrast MRI, four-dimensional (4D) flow MRI, quantification of complex haemodynamic properties, and highly accelerated flow imaging. Emerging techniques and novel applications are explored. In addition, applications of these new techniques for the improved evaluation of cardiovascular (aorta, pulmonary arteries, congenital heart disease, atrial fibrillation, coronary arteries) as well as cerebrovascular disease (intra-cranial arteries and veins) are presented. PMID:26944696

  14. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467

  15. Lightweight and Statistical Techniques for Petascale Debugging

    SciTech Connect

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root cause of the error.

  16. Advances in nanodiagnostic techniques for microbial agents.

    PubMed

    Syed, Muhammad Ali

    2014-01-15

    Infectious diseases account for millions of sufferings and deaths in both developing and developed countries, with substantial economic loss. Massive increases in world population and international travel have facilitated their spread from one part of the world to other areas, making them one of the most significant global health risks. Furthermore, detection of bioterrorism agents in water, food and environmental samples, as well as in travelers' baggage, is a great challenge of the time for security purposes. Prevention strategies against infectious agents demand rapid and accurate detection and identification of the causative agents with the highest sensitivity, which should be equally available in different parts of the globe. Similarly, rapid and early diagnosis of infectious diseases has always been indispensable for their prompt cure and management, which has stimulated scientists to develop highly sophisticated techniques over centuries, and the efforts continue unabated. Conventional diagnostic techniques are time consuming, tedious, expensive, less sensitive, and unsuitable for field situations. Nanodiagnostic assays have been promising for early, sensitive, point-of-care and cost-effective detection of microbial agents. There has been explosive research in this area of science in the last two decades, yielding highly fascinating results. This review highlights some of the advancements made in the field of nanotechnology-based assays for microbial detection since 2005, along with providing the basic understanding. PMID:24012709

  17. Advanced techniques in current signature analysis

    SciTech Connect

    Smith, S.F.; Castleberry, K.N.

    1992-03-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise-linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (>1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable "smart" CSA instrumentation in the next several years. 3 refs.
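
    As an illustration of the modulation-sideband idea described above, the following Python sketch builds a synthetic stator-current record whose 60 Hz line carrier is weakly amplitude-modulated at a 20 Hz running speed, then locates the resulting sidebands in the spectrum. All signal parameters are invented for illustration; this is not the Oak Ridge instrumentation code.

      import numpy as np

      fs = 10_000.0                        # sampling rate, Hz (assumed)
      t = np.arange(0, 10, 1 / fs)         # 10 s record
      f_line, f_mod = 60.0, 20.0           # line frequency and running-speed modulation
      # Synthetic stator current: line carrier weakly amplitude-modulated at 20 Hz
      i_t = (1.0 + 0.02 * np.cos(2 * np.pi * f_mod * t)) * np.cos(2 * np.pi * f_line * t)
      i_t += 0.001 * np.random.randn(t.size)    # measurement noise

      spec = np.abs(np.fft.rfft(i_t * np.hanning(t.size)))
      freqs = np.fft.rfftfreq(t.size, 1 / fs)

      # AM sidebands appear at f_line +/- f_mod (40 and 80 Hz here)
      for f in (f_line - f_mod, f_line, f_line + f_mod):
          k = np.argmin(np.abs(freqs - f))
          print(f"{freqs[k]:6.1f} Hz  amplitude {spec[k]:.3e}")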

  18. Inverse lithography technique for advanced CMOS nodes

    NASA Astrophysics Data System (ADS)

    Villaret, Alexandre; Tritchkov, Alexander; Entradas, Jorge; Yesilada, Emek

    2013-04-01

    Resolution enhancement techniques (RETs) have continuously improved over the last decade, driven by the ever-growing constraints of the lithography process. Despite the large number of RETs applied, some hotspot configurations remain challenging for advanced nodes due to aggressive design rules. Inverse Lithography Technique (ILT) is evaluated here as a substitute for the dense OPC baseline. Indeed, ILT has been known for several years for its near-to-ideal mask quality, while also being potentially more time consuming in terms of OPC run and mask processing. We chose to evaluate Mentor Graphics' ILT engine "pxOPC" on both line and via hotspot configurations. These hotspots were extracted from real 28nm test cases where the dense OPC solution is not satisfactory. For both layer types, the reference OPC consists of a dense OPC engine coupled to a rule-based and/or model-based assist generation method. The same CM1 model is used for the reference and the ILT OPC. ILT quality improvement is presented through Optical Rule Check (ORC) results with various adequate detectors. Several mask manufacturing rule constraints (MRC) are considered for the ILT solution and their impact on processability is checked after mask processing. A hybrid OPC approach allowing localized ILT usage is presented in order to optimize both quality and runtime. A real mask is prepared and fabricated with this method. Finally, results analyzed on silicon are presented to compare localized ILT to the reference dense OPC.

  19. Statistical Modeling of Photovoltaic Reliability Using Accelerated Degradation Techniques (Poster)

    SciTech Connect

    Lee, J.; Elmore, R.; Jones, W.

    2011-02-01

    We introduce a cutting-edge life-testing technique, accelerated degradation testing (ADT), for PV reliability testing. The ADT technique is a cost-effective and flexible reliability testing method with multiple (MADT) and Step-Stress (SSADT) variants. In an environment with limited resources, including equipment (chambers), test units, and testing time, these techniques can provide statistically rigorous prediction of lifetime and other interesting parameters, such as failure rate, warranty time, mean time to failure, degradation rate, activation energy, acceleration factor, and upper limit level of stress. J-V characterization can be used for degradation data and the generalized Eyring model can be used for the thermal-humidity stress condition. The SSADT model can be constructed based on the cumulative damage model (CEM), which assumes that the remaining test united are failed according to cumulative density function of current stress level regardless of the history on previous stress levels.
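
    The acceleration-factor idea mentioned above can be made concrete with a small sketch. The parameterization below is a generic Eyring/Peck-style temperature-humidity form with placeholder activation energy and humidity exponent; the paper's exact generalized Eyring model and fitted constants are not reproduced here.

      import math

      K_B = 8.617e-5        # Boltzmann constant, eV/K

      def acceleration_factor(T_use, RH_use, T_str, RH_str, Ea=0.7, n=2.7):
          """AF = (RH_str/RH_use)^n * exp[Ea/k * (1/T_use - 1/T_str)].
          Ea (eV) and the humidity exponent n are illustrative placeholders."""
          thermal = math.exp(Ea / K_B * (1.0 / T_use - 1.0 / T_str))
          humidity = (RH_str / RH_use) ** n
          return humidity * thermal

      # Example: 85 C / 85% RH chamber versus 25 C / 50% RH field conditions
      af = acceleration_factor(T_use=298.15, RH_use=50, T_str=358.15, RH_str=85)
      print(f"acceleration factor ~ {af:.0f}")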

  20. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that the highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
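
    A minimal sketch of the weights-of-evidence weighting described above: positive and negative weights for a single binary factor class (for example, one slope class B) computed from pixel counts against a landslide inventory. The counts are hypothetical and the GAR2013 implementation details are not reproduced.

      import numpy as np

      def weights_of_evidence(n_B_L, n_B, n_L, n_total):
          """W+ uses presence of factor class B; W- its absence.
          n_B_L: landslide pixels inside B; n_B: all pixels in B;
          n_L: landslide pixels overall; n_total: all pixels."""
          p_B_given_L = n_B_L / n_L
          p_B_given_notL = (n_B - n_B_L) / (n_total - n_L)
          w_plus = np.log(p_B_given_L / p_B_given_notL)
          w_minus = np.log((1 - p_B_given_L) / (1 - p_B_given_notL))
          return w_plus, w_minus

      w_plus, w_minus = weights_of_evidence(n_B_L=400, n_B=20_000,
                                            n_L=1_000, n_total=1_000_000)
      print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast = {w_plus - w_minus:.2f}")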

  1. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
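
    The following sketch illustrates the WCS idea, including the spurious-coincidence caveat: hits between a stellar line list and a laboratory list are counted within a tolerance, and a Monte Carlo null distribution estimates how many hits pure chance would produce. All line lists are synthetic placeholders, not real spectra.

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical observed stellar line list (Angstroms)
      star = np.sort(rng.uniform(4000.0, 4500.0, 300))
      # Laboratory list for a candidate species: 30 true matches plus 50 others
      lab = np.sort(np.concatenate([rng.choice(star, 30, replace=False)
                                    + rng.normal(0, 0.01, 30),
                                    rng.uniform(4000.0, 4500.0, 50)]))
      tol = 0.05                                  # coincidence tolerance (Angstroms)

      def n_coincidences(obs, lab_list):
          idx = np.clip(np.searchsorted(obs, lab_list), 1, obs.size - 1)
          nearest = np.minimum(np.abs(obs[idx] - lab_list),
                               np.abs(obs[idx - 1] - lab_list))
          return int(np.sum(nearest <= tol))

      observed = n_coincidences(star, lab)
      # Null distribution: random lists with the same number of lines
      trials = np.array([n_coincidences(star, np.sort(rng.uniform(4000.0, 4500.0, 80)))
                         for _ in range(2000)])
      p = np.mean(trials >= observed)
      print(f"hits = {observed}, chance mean = {trials.mean():.1f}, p = {p:.3f}")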

  2. A Hierarchical Statistic Methodology for Advanced Memory System Evaluation

    SciTech Connect

    Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.

    1999-04-12

    Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed. The performance of these advanced memory systems, however, varies with applications and problem sizes. How to reach an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it also identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and parallel computer systems is possible and should be further explored.
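
    A toy version of the factorial-analysis step: for a 2x2 design (memory system by problem size), main effects and the interaction are estimated from measured runtimes. The runtimes are invented for illustration and do not come from the ASCI experiments.

      import numpy as np

      # runtime[i, j]: i = memory system (0: baseline, 1: advanced),
      #                j = problem size (0: small, 1: large)
      runtime = np.array([[10.0, 95.0],
                          [ 8.0, 60.0]])

      effect_memory = runtime[1, :].mean() - runtime[0, :].mean()
      effect_size = runtime[:, 1].mean() - runtime[:, 0].mean()
      interaction = (runtime[1, 1] - runtime[1, 0]) - (runtime[0, 1] - runtime[0, 0])

      print(f"memory-system main effect: {effect_memory:+.1f} s")
      print(f"problem-size main effect:  {effect_size:+.1f} s")
      print(f"interaction:               {interaction:+.1f} s")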

  3. Statistical Techniques for Assessing water‐quality effects of BMPs

    USGS Publications Warehouse

    Walker, John F.

    1994-01-01

    Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
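
    The Monte Carlo sensitivity analysis described can be sketched as follows: given the residual scatter left after regression against climate and seasonality, simulate step changes of increasing size and record how often a two-sample test detects them. Sample sizes, the residual standard deviation, and the test choice are placeholders, not the study's actual values.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n_pre, n_post, sigma = 30, 30, 1.0      # storms before/after BMP; residual SD

      def detection_power(step, n_sim=2000):
          hits = 0
          for _ in range(n_sim):
              pre = rng.normal(0.0, sigma, n_pre)
              post = rng.normal(-step, sigma, n_post)     # BMP reduces transport
              if stats.ttest_ind(pre, post).pvalue < 0.05:
                  hits += 1
          return hits / n_sim

      for step in (0.25, 0.5, 0.75, 1.0):
          print(f"step = {step:4.2f} sigma -> power = {detection_power(step):.2f}")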

  4. Statistical optimisation techniques in fatigue signal editing problem

    SciTech Connect

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
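
    A compact, hedged sketch of the constrained-GA idea: choose a subset of labelled segments that minimises retained length while keeping cumulative damage, RMS, and kurtosis within a deviation level of the original signal. The RDE segmentation, the authors' damage model, and their GA operators are not reproduced; the cubic-amplitude damage proxy and all parameters below are invented.

      import numpy as np
      from scipy.stats import kurtosis

      rng = np.random.default_rng(3)
      # Hypothetical pre-segmented strain record: 40 segments of varying length
      segments = [rng.normal(0.0, s, rng.integers(200, 800))
                  for s in rng.uniform(0.5, 3.0, 40)]
      damage = np.array([np.sum(np.abs(s) ** 3) for s in segments])  # crude proxy
      full = np.concatenate(segments)
      rms0, kur0 = np.sqrt(np.mean(full ** 2)), kurtosis(full)

      def fitness(mask, tol=0.10):
          # Retained fraction of the record (to be minimised), heavily penalised
          # if damage, RMS, or kurtosis deviate from the original by more than tol
          if not mask.any():
              return np.inf
          kept = np.concatenate([s for s, m in zip(segments, mask) if m])
          dev = max(abs(damage[mask].sum() - damage.sum()) / damage.sum(),
                    abs(np.sqrt(np.mean(kept ** 2)) - rms0) / rms0,
                    abs(kurtosis(kept) - kur0) / max(abs(kur0), 1e-9))
          return kept.size / full.size + (1e3 * dev if dev > tol else 0.0)

      pop = rng.random((60, len(segments))) < 0.8         # population of keep-masks
      for _ in range(150):
          scores = np.array([fitness(m) for m in pop])
          parents = pop[np.argsort(scores)[:30]]          # truncation selection
          cuts = rng.integers(1, len(segments), 30)
          kids = np.array([np.r_[parents[i][:c], parents[(i + 1) % 30][c:]]
                           for i, c in enumerate(cuts)])  # one-point crossover
          kids ^= rng.random(kids.shape) < 0.02           # bit-flip mutation
          pop = np.vstack([parents, kids])

      best = min(pop, key=fitness)
      print(f"kept {int(best.sum())} of {len(segments)} segments")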

  5. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian Motion is defined as P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze Keplerian motion of systems based on stellar age to seek if Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of closed-form probability distributions are learned in order to detect the outliers. Linear models use regression analysis based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we will use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
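
    An illustrative proximity-based scoring of the kind described (not the authors' pipeline): k-nearest-neighbour distances in the (log period, log planet mass) plane, with the paper's validation rule Mp/Ms < 0.001 applied afterwards. The catalogue below is simulated.

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(0)
      # Hypothetical catalogue: orbital period (days), planet and star masses (solar)
      period = 10 ** rng.normal(1.0, 0.8, 500)
      m_planet = 10 ** rng.normal(-3.0, 0.5, 500)
      m_star = rng.normal(1.0, 0.2, 500)

      X = np.column_stack([np.log10(period), np.log10(m_planet)])
      nn = NearestNeighbors(n_neighbors=6).fit(X)       # 5 neighbours + the point itself
      dist, _ = nn.kneighbors(X)
      score = dist[:, 1:].mean(axis=1)                  # mean distance to 5 nearest

      outliers = np.argsort(score)[-10:]                # ten most isolated systems
      valid = m_planet[outliers] / m_star[outliers] < 0.001
      print(f"flagged {outliers.size} systems; {valid.sum()} pass the mass-ratio check")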

  6. Seasonal drought predictability in Portugal using statistical-dynamical techniques

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. F. S.; Pires, C. A. L.

    2016-08-01

    Atmospheric forecasting and predictability are important to promote adaptation and mitigation measures in order to minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3-months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead times up to 6 months. ERA-Interim reanalysis data are used for the purpose of building a set of SPI predictors integrating recent past information prior to the forecast launch. Then, the advantage of combining predictors with both dynamical and statistical background in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. A second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after the pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for the four seasons, and it was found that winter is the most predictable season and that most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary information (though modest) beyond that of persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and offer some guidance to users (such as farmers) in their decision-making processes.
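
    A hedged sketch of the two-step hybridization: PCA of geopotential-height fields, then a regression from forecasted and persistent PCs to the regional SPI, scored by cross-validated R2. All data here are synthetic stand-ins (in practice the PCA would be fitted on training folds only), so only the structure of the method is illustrated.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(4)
      n_months, n_grid = 240, 500
      z500_obs = rng.normal(size=(n_months, n_grid))            # observed Z500 anomalies
      z500_fcst = z500_obs + rng.normal(0, 0.8, (n_months, n_grid))  # imperfect forecast

      pcs_persist = PCA(n_components=5).fit_transform(z500_obs)   # persistent predictors
      pcs_fcst = PCA(n_components=5).fit_transform(z500_fcst)     # dynamical predictors
      spi = 0.6 * pcs_persist[:, 0] + 0.3 * pcs_fcst[:, 1] + rng.normal(0, 1, n_months)

      for name, X in [("persistent only", pcs_persist),
                      ("forecast only", pcs_fcst),
                      ("hybrid", np.hstack([pcs_persist, pcs_fcst]))]:
          r2 = cross_val_score(LinearRegression(), X, spi, cv=5, scoring="r2").mean()
          print(f"{name:16s} R2 = {r2:.2f}")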

  7. Comparison of Three Statistical Classification Techniques for Maser Identification

    NASA Astrophysics Data System (ADS)

    Manning, Ellen M.; Holland, Barbara R.; Ellingsen, Simon P.; Breen, Shari L.; Chen, Xi; Humphries, Melissa

    2016-04-01

    We applied three statistical classification techniques, namely linear discriminant analysis (LDA), logistic regression, and random forests, to three astronomical datasets associated with searches for interstellar masers. We compared the performance of these methods in identifying whether specific mid-infrared or millimetre continuum sources are likely to have associated interstellar masers. We also discuss the interpretability of the results of each classification technique. Non-parametric methods have the potential to make accurate predictions when there are complex relationships between critical parameters. We found that for the small datasets the parametric methods, logistic regression and LDA, performed best; for the largest dataset the non-parametric method of random forests performed with accuracy comparable to the parametric techniques, rather than offering any significant improvement. This suggests that, at least for the specific examples investigated here, the accuracy of the predictions obtained is not being limited by the use of parametric models. We also found that for LDA, transformation of the data to match a normal distribution led to a significant improvement in accuracy. The different classification techniques had significant overlap in their predictions; further astronomical observations will enable the accuracy of these predictions to be tested.
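
    A minimal version of the three-way comparison, using a synthetic stand-in for the maser catalogues: cross-validated accuracy of LDA, logistic regression, and a random forest.

      from sklearn.datasets import make_classification
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                                 random_state=0)       # stand-in for source catalogues
      models = {
          "LDA": LinearDiscriminantAnalysis(),
          "logistic regression": LogisticRegression(max_iter=1000),
          "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
      }
      for name, model in models.items():
          acc = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name:20s} accuracy = {acc:.2f}")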

  8. Indications and general techniques for lasers in advanced operative laparoscopy.

    PubMed

    Dorsey, J H

    1991-09-01

    Lasers are but one of the several energy delivery systems used by the operative laparoscopist in the performance of advanced operative laparoscopy. Safety is a key factor in the selection of a laser because the tissue damage produced by this instrument is absolutely predictable. The surgeon must be totally familiar with the chosen wavelength and its tissue reaction if this safety factor is to be realized. Other instruments complement the use of lasers in advanced operative laparoscopy, and without thorough knowledge of all available techniques and instruments, the operative laparoscopist will not achieve the full potential of this specialty. It is beyond the scope of this issue on gynecologic laser surgery to present all of the useful nonlaser techniques. Suffice it to say that we often use laser, loop ligature, sutures, hemoclips, bipolar electricity, hydrodissection, and endocoagulation during the course of a day in the operating room and sometimes during one case. As enthusiasm for advanced operative laparoscopy grows and endoscopic capability increases, more complicated and prolonged surgical feats are reported. Radical hysterectomy and lymphadenectomy have been performed by the laparoscopic route, and endoscopic management of ovarian tumors also has been reported. At this moment, these must be viewed as "show and tell" procedures unsupported by statistics to demonstrate any advantage (or disadvantage) when compared with conventional surgical methods. The time required of advanced operative laparoscopy for any given procedure is certainly an important factor. Prolonged operative and anesthesia time certainly can negate the supposed benefit of small incisions and minimally invasive surgery. What goes on inside the abdomen is certainly the most important part of advanced operative laparoscopy. Good surgeons must recognize their own limitations and the limitations of available technology. The operative laparoscopist must know when to quit and institute a

  9. Advances in procedural techniques--antegrade.

    PubMed

    Wilson, William; Spratt, James C

    2014-05-01

    There have been many technological advances in antegrade CTO PCI, but perhaps most important has been the evolution of the 'hybrid' approach, where ideally there exists a seamless interplay of antegrade wiring, antegrade dissection re-entry and retrograde approaches as dictated by procedural factors. Antegrade wire escalation with intimal tracking remains the preferred initial strategy in short CTOs without proximal cap ambiguity. More complex CTOs, however, usually require either a retrograde or an antegrade dissection re-entry approach, or both. Antegrade dissection re-entry is well suited to long occlusions where there is a healthy distal vessel and limited "interventional" collaterals. Early use of a dissection re-entry strategy will increase success rates, reduce complications, and minimise radiation exposure, contrast use as well as procedural times. Antegrade dissection can be achieved with a knuckle wire technique or the CrossBoss catheter whilst re-entry will be achieved in the most reproducible and reliable fashion by the Stingray balloon/wire. It should be avoided where there is potential for loss of large side branches. It remains to be seen whether use of newer dissection re-entry strategies will be associated with lower restenosis rates compared with the more uncontrolled subintimal tracking strategies such as STAR and whether stent insertion in the subintimal space is associated with higher rates of late stent malapposition and stent thrombosis. It is to be hoped that the algorithms, which have been developed to guide CTO operators, allow for a better transfer of knowledge and skills to increase uptake and acceptance of CTO PCI as a whole. PMID:24694104

  10. Advanced statistical methods for the definition of new staging models.

    PubMed

    Kates, Ronald; Schmitt, Manfred; Harbeck, Nadia

    2003-01-01

    Adequate staging procedures are the prerequisite for individualized therapy concepts in cancer, particularly in the adjuvant setting. Molecular staging markers tend to characterize specific, fundamental disease processes to a greater extent than conventional staging markers. At the biological level, the course of the disease will almost certainly involve interactions between multiple underlying processes. Since new therapeutic strategies tend to target specific processes as well, their impact will also involve interactions. Hence, assessment of the prognostic impact of new markers and their utilization for prediction of response to therapy will require increasingly sophisticated statistical tools that are capable of detecting and modeling complicated interactions. Because they are designed to model arbitrary interactions, neural networks offer a promising approach to improved staging. However, the typical clinical data environment poses severe challenges to high-performance survival modeling using neural nets, particularly the key problem of maintaining good generalization. Nonetheless, it turns out that by using newly developed methods to minimize unnecessary complexity in the neural network representation of disease course, it is possible to obtain models with high predictive performance. This performance has been validated on both simulated and real patient data sets. There are important applications for design of studies involving targeted therapy concepts and for identification of the improvement in decision support resulting from new staging markers. In this article, advantages of advanced statistical methods such as neural networks for definition of new staging models will be illustrated using breast cancer as an example.

  11. The statistical analysis techniques to support the NGNP fuel performance experiments

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
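
    A hedged sketch of the control-charting step described above: flag readings that drift outside Shewhart 3-sigma limits established from an in-control baseline period. NDMAS internals are not reproduced; the temperatures, limits, and simulated drift below are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      baseline = rng.normal(1000.0, 4.0, 200)          # in-control fuel temps (C)
      center, sigma = baseline.mean(), baseline.std(ddof=1)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      new = rng.normal(1000.0, 4.0, 100)
      new[60:] -= 25.0                                 # simulated thermocouple drift
      alarms = np.flatnonzero((new > ucl) | (new < lcl))
      print(f"limits = [{lcl:.1f}, {ucl:.1f}] C; first alarm at reading {alarms[0]}")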

  12. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  13. Statistical techniques for signal generation: the Australian experience.

    PubMed

    Purcell, Patrick; Barty, Simon

    2002-01-01

    National voluntary reporting systems generate large volumes of clinical data pertinent to drug safety. Currently, descriptive statistical techniques are used to assist in the detection of drug safety 'signals'. Australian data have been coded according to guidelines formulated almost 30 years ago, which have resulted in many drugs that are not associated with an adverse drug reaction, or 'innocent bystander' drugs, being recorded as 'suspected' in individual reports. In this paper we explore the application of an iterative probability filtering algorithm titled 'PROFILE'. This serves to identify the 'signals' and remove the 'innocent bystander' drugs, thus providing a clearer view of the drugs most likely to have caused the reactions. Reaction terms analysed include neutropenia, agranulocytosis, hypotension, hypertension, myocardial infarction, neuroleptic malignant syndrome, and rectal haemorrhage. In this version of PROFILE, Fisher's exact test has been used as the statistical tool, but other methods could be used in future. Advantages and limitations of the method and its assumptions are discussed, together with the rationale underlying the method and suggestions for further enhancements.
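
    A hedged sketch of the core disproportionality step: a 2x2 Fisher exact test for one drug-reaction pair against all other reports. The counts are invented, and only the single-test building block is shown; PROFILE's iterative filtering logic is summarised in the trailing comment rather than implemented.

      from scipy.stats import fisher_exact

      # Reports:             with reaction   without reaction
      table = [[30, 970],    # suspected drug
               [120, 98880]] # all other drugs
      odds_ratio, p = fisher_exact(table, alternative="greater")
      print(f"odds ratio = {odds_ratio:.1f}, one-sided p = {p:.2e}")
      # PROFILE-style filtering would iterate: re-test and drop 'innocent
      # bystander' drugs whose association disappears once strongly associated
      # co-reported drugs are removed from the report set.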

  14. Multidirectional mobilities: Advanced measurement techniques and applications

    NASA Astrophysics Data System (ADS)

    Ivarsson, Lars Holger

    Today high noise-and-vibration comfort has become a quality sign of products in sectors such as the automotive industry, aircraft, components, households and manufacturing. Consequently, already in the design phase of products, tools are required to predict the final vibration and noise levels. These tools have to be applicable over a wide frequency range with sufficient accuracy. During recent decades a variety of tools have been developed such as transfer path analysis (TPA), input force estimation, substructuring, coupling by frequency response functions (FRF) and hybrid modelling. While these methods have a well-developed theoretical basis, their application combined with experimental data often suffers from a lack of information concerning rotational DOFs. In order to measure response in all 6 DOFs (including rotation), a sensor has been developed, whose special features are discussed in the thesis. This transducer simplifies the response measurements, although in practice the excitation of moments appears to be more difficult. Several excitation techniques have been developed to enable measurement of multidirectional mobilities. For rapid and simple measurement of the loaded mobility matrix, a MIMO (Multiple Input Multiple Output) technique is used. The technique has been tested and validated on several structures of different complexity. A second technique for measuring the loaded 6-by-6 mobility matrix has been developed. This technique employs a model of the excitation set-up, and with this model the mobility matrix is determined from sequential measurements. Measurements on "real" structures show that both techniques give results of similar quality, and both are recommended for practical use. As a further step, a technique for measuring the unloaded mobilities is presented. It employs the measured loaded mobility matrix in order to calculate compensation forces and moments, which are later applied in order to compensate for the loading of the

  15. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  16. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so an optical power margin can be book-kept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
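
    A hedged illustration of statistical power budgeting in this spirit: independently sampled losses (in dB) along one metrology beam path are summed in a Monte Carlo run, and the probability of meeting a required delivered power is reported. All loss distributions and power levels are invented, not SIM's actual budget values.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000
      # Each mechanism: nominal loss plus an uncertainty/degradation term (dB)
      losses = (rng.normal(0.5, 0.1, n)      # coupling efficiency
                + rng.normal(1.2, 0.2, n)    # design/split efficiency
                + rng.uniform(0.0, 0.6, n)   # misalignment over life
                + rng.normal(0.3, 0.05, n))  # material attenuation

      p_in_dbm, p_req_dbm = 0.0, -3.0        # injected and required power
      delivered = p_in_dbm - losses
      confidence = np.mean(delivered >= p_req_dbm)
      print(f"mean delivered = {delivered.mean():.2f} dBm; "
            f"P(meets requirement) = {confidence:.3f}")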

  17. Advances in laparoscopic urologic surgery techniques

    PubMed Central

    Abdul-Muhsin, Haidar M.; Humphreys, Mitchell R.

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  18. Advances in laparoscopic urologic surgery techniques.

    PubMed

    Abdul-Muhsin, Haidar M; Humphreys, Mitchell R

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  19. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
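
    A sketch of the locally weighted pair-wise statistic described: a Gaussian kernel in wind speed-direction space weights each observation, and the weighted Pearson correlation between two pollutants is evaluated at one grid point. The pollutant series, bandwidths, and grid point are synthetic; the openair-style surface would repeat this over a full grid.

      import numpy as np

      def weighted_pearson(x, y, w):
          w = w / w.sum()
          mx, my = np.sum(w * x), np.sum(w * y)
          cov = np.sum(w * (x - mx) * (y - my))
          return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

      rng = np.random.default_rng(6)
      ws = rng.uniform(0, 10, 2000)                      # wind speed (m/s)
      wd = rng.uniform(0, 360, 2000)                     # wind direction (deg)
      bc = rng.lognormal(0.0, 0.5, 2000)                 # black carbon
      pm25 = 2.0 * bc + rng.lognormal(0.5, 0.4, 2000)    # PM2.5, partly BC-driven

      ws0, wd0, h_ws, h_wd = 4.0, 200.0, 1.0, 20.0       # grid point and bandwidths
      ang = np.minimum(np.abs(wd - wd0), 360 - np.abs(wd - wd0))   # wrap direction
      w = np.exp(-0.5 * ((ws - ws0) / h_ws) ** 2 - 0.5 * (ang / h_wd) ** 2)
      print(f"weighted r at ({ws0} m/s, {wd0} deg) = {weighted_pearson(bc, pm25, w):.2f}")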

  20. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  1. Advanced optical imaging techniques for neurodevelopment.

    PubMed

    Wu, Yicong; Christensen, Ryan; Colón-Ramos, Daniel; Shroff, Hari

    2013-12-01

    Over the past decade, developmental neuroscience has been transformed by the widespread application of confocal and two-photon fluorescence microscopy. Even greater progress is imminent, as recent innovations in microscopy now enable imaging with increased depth, speed, and spatial resolution; reduced phototoxicity; and in some cases without external fluorescent probes. We discuss these new techniques and emphasize their dramatic impact on neurobiology, including the ability to image neurons at depths exceeding 1mm, to observe neurodevelopment noninvasively throughout embryogenesis, and to visualize neuronal processes or structures that were previously too small or too difficult to target with conventional microscopy.

  2. Advanced Optical Imaging Techniques for Neurodevelopment

    PubMed Central

    Wu, Yicong; Christensen, Ryan; Colón-Ramos, Daniel; Shroff, Hari

    2013-01-01

    Over the past decade, developmental neuroscience has been transformed by the widespread application of confocal and two-photon fluorescence microscopy. Even greater progress is imminent, as recent innovations in microscopy now enable imaging with increased depth, speed, and spatial resolution; reduced phototoxicity; and in some cases without external fluorescent probes. We discuss these new techniques and emphasize their dramatic impact on neurobiology, including the ability to image neurons at depths exceeding 1 mm, to observe neurodevelopment noninvasively throughout embryogenesis, and to visualize neuronal processes or structures that were previously too small or too difficult to target with conventional microscopy. PMID:23831260

  3. Advanced ultrasonic techniques for local tumor hyperthermia.

    PubMed

    Lele, P P

    1989-05-01

    Scanned, intensity-modulated, focused ultrasound (SIMFU) presently is the modality of choice for localized, controlled heating of deep as well as superficial tumors noninvasively. With the present SIMFU system, it was possible to heat 88 per cent of deep tumors up to 12 cm in depth and 15 cm in diameter, to 43 degrees C in 3 to 4 minutes. The infiltrative tumor margins could be heated to the desired therapeutic temperature. The temperature outside the treatment field fell off sharply. Excellent objective responses were obtained without local or systemic toxicity. Multi-institutional clinical trials of local hyperthermia by this promising technique are clearly warranted.

  4. Air pollution monitoring by advanced spectroscopic techniques.

    PubMed

    Hodgeson, J A; McClenny, W A; Hanst, P L

    1973-10-19

    The monitoring requirements related to air pollution are many and varied. The molecules of concern differ greatly in their chemical and physical properties, in the nature of their environment, and in their concentration ranges. Furthermore, the application may have specific requirements such as rapid response time, ultrasensitivity, multipollutant capability, or capability for remote measurements. For these reasons, no single spectroscopic technique appears to offer a panacea for all monitoring needs. Instead we have attempted to demonstrate in the above discussion that, regardless of the difficulty and complexity of the monitoring problems, spectroscopy offers many tools by which such problems may be solved.

  5. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  6. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300 °C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 µm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.
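
    A hedged sketch of the measurement step: threshold a char cross-section image, isolate the largest particle, and report its area and porosity (pore pixels inside the particle). A synthetic ring-shaped particle stands in for the Char Atlas micrographs; the paper's augmented identification algorithm is not reproduced.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(10)
      img = np.zeros((200, 200))
      yy, xx = np.mgrid[:200, :200]
      img[(yy - 100) ** 2 + (xx - 100) ** 2 < 70 ** 2] = 1.0      # char wall material
      img[(yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2] = 0.0      # internal pore
      img += rng.normal(0, 0.05, img.shape)                       # imaging noise

      binary = img > 0.5                                          # global threshold
      labels, n = ndimage.label(binary)
      sizes = ndimage.sum(binary, labels, range(1, n + 1))
      particle = labels == (np.argmax(sizes) + 1)                 # largest particle
      filled = ndimage.binary_fill_holes(particle)                # particle + pores
      porosity = 1.0 - particle.sum() / filled.sum()
      print(f"particle area = {filled.sum()} px, porosity = {porosity:.2f}")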

  7. What's Funny about Statistics? A Technique for Reducing Student Anxiety.

    ERIC Educational Resources Information Center

    Schacht, Steven; Stewart, Brad J.

    1990-01-01

    Studied the use of humorous cartoons to reduce the anxiety levels of students in statistics classes. Used the Mathematics Anxiety Rating Scale (MARS) to measure the level of student anxiety before and after a statistics course. Found that there was a significant reduction in levels of mathematics anxiety after the course. (SLM)

  8. Advances in Testing the Statistical Significance of Mediation Effects

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.

    2006-01-01

    P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
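
    A sketch of one normal-theory mediation test of the kind under discussion, the Sobel z: the indirect effect a*b with a standard error assembled from the two regressions. The data are simulated so the true mediation structure is known; this is an illustration, not the article's simulation design.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      n = 200
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(size=n)            # mediator model: a = 0.5
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # outcome model:  b = 0.4

      def ols(X, y):
          # Ordinary least squares with an intercept; returns coefficients and SEs
          X = np.column_stack([np.ones(len(y)), X])
          beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
          sigma2 = res[0] / (len(y) - X.shape[1])
          se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
          return beta, se

      a_coefs, a_se = ols(x, m)
      b_coefs, b_se = ols(np.column_stack([m, x]), y)
      a, sa = a_coefs[1], a_se[1]
      b, sb = b_coefs[1], b_se[1]
      z = a * b / np.sqrt(b ** 2 * sa ** 2 + a ** 2 * sb ** 2)   # Sobel statistic
      p = 2 * stats.norm.sf(abs(z))
      print(f"indirect effect ab = {a * b:.3f}, Sobel z = {z:.2f}, p = {p:.1e}")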

  9. Recent advances in DNA sequencing techniques

    NASA Astrophysics Data System (ADS)

    Singh, Rama Shankar

    2013-06-01

    Successful mapping of the draft human genome in 2001 and the more recent mapping of the human microbiome genome in 2012 have relied heavily on the parallel processing of second-generation/Next Generation Sequencing (NGS) DNA machines, at a cost of several million dollars and long computer processing times. These have been mainly biochemical approaches. Here a system analysis approach is used to review these techniques by identifying the requirements, specifications, test methods, error estimates, repeatability, reliability and trends in cost reduction. The first-generation, NGS and third-generation Single Molecule Real Time (SMRT) detection sequencing methods are reviewed. Based on the National Human Genome Research Institute (NHGRI) data, the achieved cost reductions of 1.5 times per year from Sep. 2001 to July 2007, 7 times per year from Oct. 2007 to Apr. 2010, and 2.5 times per year from July 2010 to Jan. 2012 are discussed.

  10. Laparoscopic ureteral reimplantation: a simplified dome advancement technique.

    PubMed

    Lima, Guilherme C; Rais-Bahrami, Soroush; Link, Richard E; Kavoussi, Louis R

    2005-12-01

    Laparoscopic Boari flap reimplantation has been used to treat long distal ureteral strictures. This technique requires extensive bladder mobilization and complex intracorporeal suturing. This article demonstrates a novel laparoscopic bladder dome advancement approach for ureteral reimplantation. This technique obviates the need for bladder pedicle dissection and simplifies the required suturing.

  11. Evaluation of Advanced Retrieval Techniques in an Experimental Online Catalog.

    ERIC Educational Resources Information Center

    Larson, Ray R.

    1992-01-01

    Discusses subject searching problems in online library catalogs; explains advanced information retrieval (IR) techniques; and describes experiments conducted on a test collection database, CHESHIRE (California Hybrid Extended SMART for Hypertext and Information Retrieval Experimentation), which was created to evaluate IR techniques in online…

  12. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirable effects associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  13. Correlation techniques and measurements of wave-height statistics

    NASA Technical Reports Server (NTRS)

    Guthart, H.; Taylor, W. C.; Graf, K. A.; Douglas, D. G.

    1972-01-01

    Statistical measurements of wave height fluctuations have been made in a wind wave tank. The power spectral density function of temporal wave height fluctuations evidenced second-harmonic components and an f^-5 power-law decay beyond the second harmonic. The observations of second-harmonic effects agreed very well with a theoretical prediction. From the wave statistics, surface drift currents were inferred and compared to experimental measurements with satisfactory agreement. Measurements were made of the two-dimensional correlation coefficient at 15 deg increments in angle with respect to the wind vector. An estimate of the two-dimensional spatial power spectral density function was also made.
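
    A sketch of the spectral check described: estimate the wave-height power spectral density with Welch's method and fit the high-frequency log-log slope, which for these measurements decayed as f^-5 beyond the second harmonic. The record below is shaped noise standing in for the tank data, constructed so that the PSD tail has the expected slope.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(9)
      fs, n = 100.0, 60_000                      # sampling rate (Hz), samples
      # Synthetic record with an f^-5 PSD tail (amplitude shaped as f^-2.5)
      white = rng.normal(size=n)
      freqs0 = np.fft.rfftfreq(n, 1 / fs)
      shape = np.where(freqs0 > 2.0, (freqs0 / 2.0) ** (-2.5), 1.0)
      eta = np.fft.irfft(np.fft.rfft(white) * shape, n)

      f, pxx = signal.welch(eta, fs=fs, nperseg=4096)
      band = (f > 4.0) & (f < 20.0)
      slope = np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)[0]
      print(f"fitted high-frequency slope = {slope:.1f}  (expect about -5)")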

  14. Advances in gamma titanium aluminides and their manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Kothari, Kunal; Radhakrishnan, Ramachandran; Wereley, Norman M.

    2012-11-01

    Gamma titanium aluminides display attractive properties for high temperature applications. For over a decade in the 1990s, the attractive properties of titanium aluminides were outweighed by difficulties encountered in processing and machining at room temperature. But advances in manufacturing technologies, deeper understanding of titanium aluminides microstructure, deformation mechanisms, and advances in micro-alloying, has led to the production of gamma titanium aluminide sheets. An in-depth review of key advances in gamma titanium aluminides is presented, including microstructure, deformation mechanisms, and alloy development. Traditional manufacturing techniques such as ingot metallurgy and investment casting are reviewed and advances via powder metallurgy based manufacturing techniques are discussed. Finally, manufacturing challenges facing gamma titanium aluminides, as well as avenues to overcome them, are discussed.

  15. 75 FR 44015 - Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... COMMISSION Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing... importation of certain semiconductor products made by advanced lithography techniques and products containing... certain semiconductor products made by advanced lithography techniques or products containing same...

  16. Advanced liner-cooling techniques for gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Riddlebaugh, S. M.

    1985-01-01

    Component research for advanced small gas turbine engines is currently underway at the NASA Lewis Research Center. As part of this program, a basic reverse-flow combustor geometry was maintained while different advanced liner wall cooling techniques were investigated. Performance and liner cooling effectiveness of the experimental combustor configuration featuring counter-flow film-cooled panels are presented and compared with two previously reported combustors featuring splash film-cooled liner walls and transpiration-cooled liner walls (Lamilloy).

  17. Advanced regenerative-cooling techniques for future space transportation systems

    NASA Technical Reports Server (NTRS)

    Wagner, W. R.; Shoji, J. M.

    1975-01-01

    A review of regenerative-cooling techniques applicable to advanced planned engine designs for space booster and orbit transportation systems has established the status of the key elements of this cooling mode. This work is presented in terms of gas-side, coolant-side, and wall-conduction heat transfer, and chamber life fatigue-margin considerations. Preliminary heat transfer and trade analyses are described, performed using developed techniques that combine channel wall construction with advanced, high-strength, high-thermal-conductivity materials (NARloy-Z or Zr-Cu alloys) in high-heat-flux regions, together with lightweight steel tubular nozzle wall construction. Advanced cooling techniques such as oxygen cooling and dual-mode hydrocarbon/hydrogen fuel operation, and their limitations, are indicated for the regenerative-cooling approach.

  18. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    SciTech Connect

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. The results also suggest that regression models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
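
    Of the three techniques named above, control charting is the simplest to sketch. The fragment below flags readings that fall outside 3-sigma Shewhart limits derived from an in-control baseline; the temperatures, variances, and simulated failure are invented for illustration, and this is not the SAS-based NDMAS implementation.

```python
import numpy as np

def control_limits(baseline):
    """3-sigma Shewhart control limits from an in-control baseline."""
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return mu - 3 * sigma, mu + 3 * sigma

def out_of_control(readings, lcl, ucl):
    """Indices of readings outside the control limits."""
    return np.where((readings < lcl) | (readings > ucl))[0]

rng = np.random.default_rng(1)
baseline = 1000.0 + 5.0 * rng.standard_normal(200)   # in-control period
readings = 1000.0 + 5.0 * rng.standard_normal(100)
readings[42] = 1040.0                                # simulated TC failure
lcl, ucl = control_limits(baseline)
print(out_of_control(readings, lcl, ucl))            # flags index 42
```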

  19. Bi-maxillary advancement surgery: Technique, indications and results.

    PubMed

    Olivi, Pierre; Garcia, Claude

    2014-06-01

    Esthetic analysis of the face in some patients presenting a dental Class II can reveal the need for maxillo-mandibular advancement surgery. In these cases, mandibular advancement alone would provide a result that is satisfactory from the occlusal viewpoint but esthetically displeasing. With bi-maxillary advancement, the impact of nasal volume is reduced and the nasolabial relationship is corrected. The sub-mandibular length is increased, thus creating a better-defined cervico-mental angle. This treatment technique, involving a prior mandibular procedure, has the advantage of restoring patients' dental occlusion while optimizing their facial esthetics.

  20. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy, and fast technology for diagnosing cardiac diseases. As with other ultrasound images, echocardiographic images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included for statistical analysis. Statistical parameters such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and root mean square error (RMSE) are calculated for performance measurement. A further consideration is that blurring may occur during speckle noise removal, so the preferred filter should be able to enhance edges while removing noise.
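
    The two error measures named here are short enough to state directly; a minimal sketch, assuming 8-bit grayscale images held as NumPy arrays (SNR is defined analogously, with the reference image's energy in the numerator):

```python
import numpy as np

def rmse(reference, filtered):
    """Root mean square error between reference and despeckled images."""
    diff = reference.astype(float) - filtered.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(reference, filtered, peak=255.0):
    """Peak signal-to-noise ratio in dB (peak = 255 for 8-bit images)."""
    return 20.0 * np.log10(peak / rmse(reference, filtered))
```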

  1. New Dynamical-Statistical Techniques for Wind Power Prediction

    NASA Astrophysics Data System (ADS)

    Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.

    2012-04-01

    The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high-resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used for estimation of the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values, and the advantages that non-linear, non-polynomial functions can have in the successful control of power curve characteristics. This methodology is developed within the framework of the FP7 projects WAUDIT and MARINA PLATFORM.
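
    Kalman post-processing of NWP output is often implemented as a low-order filter that tracks the systematic forecast error. The sketch below is a minimal scalar version of that idea, with a random-walk bias state and invented noise variances q and r; it is not the authors' actual formulation.

```python
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
    """Track the systematic forecast bias with a scalar Kalman filter
    (random-walk state; forecast-minus-observation as the measurement)."""
    b, p = 0.0, 1.0                      # bias estimate and its variance
    corrected = np.empty(len(forecasts))
    for t, (f, y) in enumerate(zip(forecasts, observations)):
        p += q                           # predict: bias drifts as a random walk
        k = p / (p + r)                  # Kalman gain
        b += k * ((f - y) - b)           # update with the latest error
        p *= 1.0 - k
        corrected[t] = f - b             # debiased forecast
    return corrected
```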

  2. Blood species identification for forensic purposes using Raman spectroscopy combined with advanced statistical analysis.

    PubMed

    Virkler, Kelly; Lednev, Igor K

    2009-09-15

    Forensic analysis has become one of the fastest growing areas of analytical chemistry in recent years. The ability to determine the species of origin of a body fluid sample is a very important and crucial part of a forensic investigation. We introduce here a new technique which utilizes a modern analytical method based on the combination of Raman spectroscopy and advanced statistics to analyze the composition of blood traces from different species. Near-infrared (NIR) Raman spectroscopy was used to analyze multiple dry samples of human, canine, and feline blood for the ultimate application to forensic species identification. All of the spectra were combined into a single data matrix, and the number of principal components that described the system was determined using multiple statistical methods such as significant factor analysis (SFA), principal component analysis (PCA), and several cross-validation methods. Of the six principal components that were determined to be present, the first three, which contributed over 90% to the spectral data of the system, were used to form a three-dimensional scores plot that clearly showed significant separation between the three species groups. Ellipsoids representing a 99% confidence interval surrounding each species group showed no overlap. This technique using Raman spectroscopy is nondestructive and quick and can potentially be performed at the scene of a crime.
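
    A minimal sketch of the scores-plot step, using scikit-learn and a random matrix as a stand-in for the spectral data matrix (rows are spectra, columns are wavenumber channels); the data, shapes, and component count here are assumptions for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
spectra = rng.standard_normal((90, 1024))    # placeholder data matrix

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)          # 3-D coordinates for each spectrum
print(pca.explained_variance_ratio_.sum())   # variance retained by 3 components
```

    In the paper's setting, plotting the three columns of scores and grouping the points by species is what produces the separated clusters and the confidence ellipsoids around them.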

  3. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    SciTech Connect

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-12-15

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscopy (SEM), and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces makes it possible to accurately determine the composite stiffness from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.
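
    Statistical nanoindentation is typically analyzed by deconvoluting the histogram of moduli from a large indent grid into phase distributions. A minimal sketch with a Gaussian mixture, on synthetic moduli whose phase means, spreads, and fractions are invented for illustration:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
moduli = np.concatenate([
    rng.normal(20.0, 2.0, 300),   # low-stiffness hydrate-like phase
    rng.normal(30.0, 3.0, 200),   # high-stiffness hydrate-like phase
    rng.normal(90.0, 8.0, 50),    # residual clinker-like phase
]).reshape(-1, 1)                 # indentation moduli in GPa

gmm = GaussianMixture(n_components=3, random_state=0).fit(moduli)
print(gmm.means_.ravel())         # mean modulus of each deconvoluted phase
print(gmm.weights_)               # estimated phase fractions
```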

  4. Statistical techniques for the characterization of partially observed epidemics.

    SciTech Connect

    Safta, Cosmin; Ray, Jaideep; Crary, David; Cheng, Karen

    2010-11-01

    Statistical techniques appear promising for constructing an integrated, automated detect-and-characterize capability for epidemics. Working off biosurveillance data, such a capability provides information on the particular ongoing outbreak, with potential use in crisis management, planning, and resource allocation. The parameter estimation capability is ideal for providing the input parameters to an agent-based model: index cases, time of infection, and infection rate. Non-communicable diseases are easier to characterize than communicable ones: a small anthrax outbreak can be characterized well with 7-10 days of data post-detection, plague takes longer, and large attacks are very easy.
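
    As a toy version of the parameter estimation described above, the sketch below recovers an epidemic growth rate from a short post-detection window of daily case counts via a least-squares fit in log space; the data and model are hypothetical and far simpler than an agent-based characterization.

```python
import numpy as np

def fit_exponential_growth(days, cases):
    """Fit cases ~ c0 * exp(r * t); returns (r, c0)."""
    r, log_c0 = np.polyfit(days, np.log(cases), 1)
    return r, np.exp(log_c0)

days = np.arange(1, 11)                                   # 10 days post-detection
cases = np.array([2, 3, 5, 8, 12, 19, 30, 46, 70, 110])   # invented counts
r, c0 = fit_exponential_growth(days, cases)
print(r, c0)
```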

  5. Advances in assessing geomorphic plausibility in statistical susceptibility modelling

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2014-05-01

    The quality, reliability, and applicability of landslide susceptibility maps are regularly deduced directly by interpreting quantitative model performance measures. These quantitative estimates are usually calculated for an independent test sample of a landslide inventory. Numerous studies demonstrate that totally unbiased landslide inventories are rarely available. We assume that such biases are also inherent in the test sample used to quantitatively validate the models. Therefore we suppose that the explanatory power of statistical performance measures is limited by the quality of the inventory used to calculate these statistics. To investigate this assumption, we generated and validated 16 statistical susceptibility models by using two landslide inventories of differing qualities for the Rhenodanubian Flysch zone of Lower Austria (1,354 km²). The ALS-based (Airborne Laser Scan) inventory (n=6,218) was mapped purposely for susceptibility modelling from a high-resolution hillshade and exhibits a high positional accuracy. The less accurate building ground register (BGR; n=681) provided by the Geological Survey of Lower Austria represents reported damaging events and shows a substantially lower completeness. Both inventories exhibit differing systematic biases regarding land cover. For instance, due to human impact on the visibility of geomorphic structures (e.g. planation), few ALS landslides could be mapped on settlements and pastures (ALS-mapping bias). In contrast, damaging events were frequently reported for settlements and pastures (BGR-report bias). Susceptibility maps were calculated by applying four multivariate classification methods, namely generalized linear model, generalized additive model, random forest and support vector machine, separately for both inventories and two sets of explanatory variables (with and without land cover). Quantitative validation was performed by calculating the area under the receiver operating characteristic curve (AUROC)
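
    Quantitative validation of the kind described here reduces to scoring each test location with the modelled susceptibility and comparing against the (possibly biased) inventory labels. A minimal sketch with scikit-learn, on invented labels and scores:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
y_true = rng.integers(0, 2, 500)          # 1 = landslide in the test inventory
y_score = np.clip(0.4 * y_true + 0.6 * rng.random(500), 0.0, 1.0)  # susceptibility

print(roc_auc_score(y_true, y_score))     # AUROC of the susceptibility model
```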

  6. Using Classroom Assessment Techniques in an Introductory Statistics Class

    ERIC Educational Resources Information Center

    Goldstein, Gary S.

    2007-01-01

    College instructors often provide students with only summative evaluations of their work, typically in the form of exam scores or paper grades. Formative evaluations, such as classroom assessment techniques (CATs), are rarer in higher education and provide an ongoing evaluation of students' progress. In this article, the author summarizes the use…

  7. Getting at the Factors Underlying Trends Using Statistical Decomposition Techniques.

    ERIC Educational Resources Information Center

    Story, Sherie; Swanson, David

    This paper presents and illustrates a technique that can be used to analyze the internal and external influences on community college enrollment trends and staffing patterns in a way that quantitatively expresses the amount of increase or decrease attributable to various competing factors. Introductory material relates the rationale behind the…

  8. Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials

    SciTech Connect

    J. J. Einerson

    2005-05-01

    Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
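
    To make the flavor of such calculations concrete, the sketch below sizes an accept-on-zero attributes sampling plan so the false-acceptance risk stays below a target; the defect fraction, risk level, and plan are hypothetical, not the handbook's actual specifications.

```python
from scipy.stats import binom

def min_sample_size(p_bad=0.02, c=0, beta=0.05, n_max=10000):
    """Smallest sample size n such that a lot with defect fraction p_bad
    is accepted (observing <= c defects) with probability at most beta."""
    for n in range(c + 1, n_max):
        if binom.cdf(c, n, p_bad) <= beta:
            return n
    raise ValueError("no n <= n_max meets the risk target")

print(min_sample_size())   # -> 149: P(accept | 2% defective) <= 5%
```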

  9. Advanced Marketing Core Curriculum. Test Items and Assessment Techniques.

    ERIC Educational Resources Information Center

    Smith, Clifton L.; And Others

    This document contains duties and tasks, multiple-choice test items, and other assessment techniques for Missouri's advanced marketing core curriculum. The core curriculum begins with a list of 13 suggested textbook resources. Next, nine duties with their associated tasks are given. Under each task appears one or more citations to appropriate…

  10. Statistical analysis of heartbeat data with wavelet techniques

    NASA Astrophysics Data System (ADS)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analyses were performed on the raw signals. FFT-based coherence and phase were also calculated between blood pressure and raw ECG signals. Finally, RR intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. A correlation was found between the health status of the animals and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
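
    The FFT-based coherence and cross-spectral phase mentioned here are available directly in SciPy; a minimal sketch on synthetic ECG-like and blood-pressure-like signals (sampling rate, frequencies, and noise levels are invented):

```python
import numpy as np
from scipy.signal import coherence, csd

fs = 500.0                                    # Hz, assumed sampling rate
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(5)
ecg = np.sin(2 * np.pi * 6.0 * t) + 0.5 * rng.standard_normal(t.size)
bp = np.sin(2 * np.pi * 6.0 * t - 0.8) + 0.5 * rng.standard_normal(t.size)

f, coh = coherence(ecg, bp, fs=fs, nperseg=2048)   # magnitude-squared coherence
f, pxy = csd(ecg, bp, fs=fs, nperseg=2048)         # cross-spectral density
phase = np.angle(pxy)                              # cross-spectral phase
```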

  11. Some Bayesian statistical techniques useful in estimating frequency and density

    USGS Publications Warehouse

    Johnson, D.H.

    1977-01-01

    This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
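
    For the frequency-of-occurrence case, Bayesian limits take a compact form: with a uniform Beta(1, 1) prior, the posterior for the occupancy probability is Beta(1 + x, 1 + n - x). A minimal sketch (the prior choice and data are illustrative, not the paper's exact derivation):

```python
from scipy.stats import beta

def bayes_frequency_interval(occupied, total, level=0.95):
    """Central credible interval for frequency of occurrence under a
    uniform Beta(1, 1) prior."""
    a, b = 1 + occupied, 1 + total - occupied
    tail = (1.0 - level) / 2.0
    return beta.ppf(tail, a, b), beta.ppf(1.0 - tail, a, b)

# e.g. species found on 12 of 50 sample plots:
print(bayes_frequency_interval(occupied=12, total=50))
```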

  12. Geochemical portray of the Pacific Ridge: New isotopic data and statistical techniques

    NASA Astrophysics Data System (ADS)

    Hamelin, Cédric; Dosso, Laure; Hanan, Barry B.; Moreira, Manuel; Kositsky, Andrew P.; Thomas, Marion Y.

    2011-02-01

    Samples collected during the PACANTARCTIC 2 cruise fill a sampling gap from 53°S to 41°S along the Pacific Antarctic Ridge (PAR). Analysis of Sr, Nd, Pb, Hf, and He isotope compositions of these new samples is shown together with published data from 66°S to 53°S and from the EPR. Recent advances in analytical mass spectrometry techniques have generated a spectacular increase in the number of multidimensional isotopic data for oceanic basalts. Working with such multidimensional datasets calls for a new approach to data interpretation, preferably based on statistical analysis techniques. Principal Component Analysis (PCA) is a powerful mathematical tool for studying this type of dataset; its purpose is to reduce the number of dimensions by keeping only those characteristics that contribute most to the variance. Using this technique, it becomes possible to have a statistical picture of the geochemical variations along the entire Pacific Ridge from 70°S to 10°S. The incomplete sampling of the ridge previously led to the identification of a large-scale division of the south Pacific mantle at the latitude of Easter Island. The PCA method applied here to the completed dataset reveals a different geochemical profile. Along the Pacific Ridge, a large-scale bell-shaped variation with an extremum at about 38°S latitude is interpreted as a progressive change in the geochemical characteristics of the depleted matrix of the mantle. This Pacific Isotopic Bump (PIB) is also noticeable in the along-axis variation of the He isotopic ratio. The linear correlation observed between He and heavy radiogenic isotopes, together with the result of the PCA calculation, suggests that the large-scale variation is unrelated to plume-ridge interactions in the area and should rather be attributed to the partial melting of a marble-cake assemblage.

  13. Regularized versus non-regularized statistical reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Denisova, N. V.

    2011-08-01

    An important feature of positron emission tomography (PET) and single photon emission computed tomography (SPECT) is the stochastic property of real clinical data. Statistical algorithms such as ordered subsets-expectation maximization (OSEM) and maximum a posteriori (MAP) are a direct consequence of the stochastic nature of the data. The principal difference between these two algorithms is that OSEM is a non-regularized approach, while MAP is a regularized algorithm. From the theoretical point of view, reconstruction problems belong to the class of ill-posed problems and should be treated using regularization. Regularization introduces an additional unknown regularization parameter into the reconstruction procedure as compared with non-regularized algorithms. However, a comparison of the non-regularized OSEM and regularized MAP algorithms with fixed regularization parameters has shown very minor differences between reconstructions. This problem is analyzed in the present paper. To improve the reconstruction quality, a method of local regularization is proposed based on a spatially adaptive regularization parameter. The MAP algorithm with local regularization was tested in reconstruction of the Hoffman brain phantom.
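
    For concreteness, here is a toy version of the non-regularized update at the heart of ML-EM (and of OSEM, which applies it subset by subset), using a dense NumPy system matrix; this is a sketch of the algorithmic skeleton, not a clinical implementation.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """ML-EM reconstruction: A is the (bins x voxels) system matrix,
    y the measured counts; returns the voxel activity estimate."""
    x = np.ones(A.shape[1])                  # uniform initial image
    sens = np.maximum(A.sum(axis=0), 1e-12)  # sensitivity (column sums)
    for _ in range(n_iter):
        proj = np.maximum(A @ x, 1e-12)      # forward projection
        x *= (A.T @ (y / proj)) / sens       # multiplicative EM update
    return x
```

    A MAP variant adds the derivative of a regularizing prior to the denominator of the update, which is exactly where the regularization parameter (fixed or, as proposed here, spatially adaptive) enters.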

  14. Statistical classification techniques in high energy physics (SDDT algorithm)

    NASA Astrophysics Data System (ADS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-08-01

    We present our proposal of a supervised binary divergence decision tree with a nested separation method based on generalized linear models. A key insight we provide is clustering driven by only a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb^-1 recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. Our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test the homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through divergence tests.
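
    The divergence-driven variable selection described above can be illustrated with a histogram-based, symmetrized KL-type divergence computed per variable; the binning, smoothing constant, and data layout below are assumptions of this sketch, not the SDDT implementation.

```python
import numpy as np

def sym_kl_divergence(sig, bkg, bins=40):
    """Symmetrized KL-type divergence between the binned signal and
    background distributions of one physical variable."""
    lo = min(sig.min(), bkg.min())
    hi = max(sig.max(), bkg.max())
    p, _ = np.histogram(sig, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(bkg, bins=bins, range=(lo, hi), density=True)
    p, q = p + 1e-12, q + 1e-12               # avoid log(0)
    return float(np.sum((p - q) * np.log(p / q)))

def rank_variables(signal, background):
    """Rank variables (columns) by divergence, most separating first."""
    d = [sym_kl_divergence(signal[:, j], background[:, j])
         for j in range(signal.shape[1])]
    return np.argsort(d)[::-1]
```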

  15. The statistical multifragmentation model: Origins and recent advances

    NASA Astrophysics Data System (ADS)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine the probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed the SMM predictions made over 10 years before, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  16. Computing aerodynamic sound using advanced statistical turbulence theories

    NASA Technical Reports Server (NTRS)

    Hecht, A. M.; Teske, M. E.; Bilanin, A. J.

    1981-01-01

    It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of Q_ij(ξ_k, τ), the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher-order terms. The governing equations for Q_ij are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional constant-shear mean flow is then assumed. The required closure form for Q_ij is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation-time-dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of Q_ij are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming normal joint probability.

  17. Statistical, connectionist, and fuzzy inference techniques for image classification

    NASA Astrophysics Data System (ADS)

    Israel, Steven A.; Kasabov, Nikola K.

    1997-07-01

    A spectral classification comparison was performed using four different classifiers: the parametric maximum likelihood classifier and three nonparametric classifiers (neural networks, fuzzy rules, and fuzzy neural networks). The input image data is a Système Pour l'Observation de la Terre (SPOT) satellite image of Otago Harbour near Dunedin, New Zealand. The SPOT image data contain three spectral bands in the green, red, and visible infrared portions of the electromagnetic spectrum. The specific area contains intertidal vegetation species above and below the waterline. Of specific interest is eelgrass (Zostera novazelandica), which is a biotic indicator of environmental health. The mixed cover types observed in an in situ field survey are difficult to classify because of subjectivity and water's preferential absorption of the visible infrared spectrum. In this analysis, each of the classifiers was applied to the data in two different testing procedures. In the first test procedure, the reference data were divided into training and test sets by area. Although this is an efficient data handling technique, the classifier is not presented with all of the subtle microclimate variations. In the second test procedure, the same reference areas were amalgamated and randomly sorted into training and test data. The amalgamation and sorting were performed external to the analysis software. For the first testing procedure, the highest testing accuracy, 89%, was obtained through the use of fuzzy inference. In the second testing procedure, the maximum likelihood classifier and the fuzzy neural networks provided the best results. Although their testing accuracies were similar, the latter algorithm has additional features, such as rule extraction, explanation, and fine tuning of individual classes.

  18. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    NASA Technical Reports Server (NTRS)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI), require increasingly dense high-power electronics. Enabling these higher power densities, while maintaining or even improving hardware reliability, requires integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  19. An Advanced Time Averaging Modelling Technique for Power Electronic Circuits

    NASA Astrophysics Data System (ADS)

    Jankuloski, Goce

    For stable and efficient performance of power converters, a good mathematical model is needed. This thesis presents a new modelling technique for DC/DC and DC/AC Pulse Width Modulated (PWM) converters. The new model is more accurate than the existing modelling techniques such as State Space Averaging (SSA) and Discrete Time Modelling. Unlike the SSA model, the new modelling technique, the Advanced Time Averaging Model (ATAM) includes the averaging dynamics of the converter's output. In addition to offering enhanced model accuracy, application of linearization techniques to the ATAM enables the use of conventional linear control design tools. A controller design application demonstrates that a controller designed based on the ATAM outperforms one designed using the ubiquitous SSA model. Unlike the SSA model, ATAM for DC/AC augments the system's dynamics with the dynamics needed for subcycle fundamental contribution (SFC) calculation. This allows for controller design that is based on an exact model.

  20. Technology development of fabrication techniques for advanced solar dynamic concentrators

    NASA Technical Reports Server (NTRS)

    Richter, Scott W.

    1991-01-01

    The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scalable, and long-lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high-quality optical surfaces, acceptable for further coating with vapor deposited optical films. The selected materials to obtain a high-quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.

  2. Advanced Morphological and Functional Magnetic Resonance Techniques in Glaucoma

    PubMed Central

    Mastropasqua, Rodolfo; Agnifili, Luca; Mattei, Peter A.; Caulo, Massimo; Fasanella, Vincenzo; Navarra, Riccardo; Mastropasqua, Leonardo; Marchini, Giorgio

    2015-01-01

    Glaucoma is a multifactorial disease that is the leading cause of irreversible blindness. Recent data have documented that glaucoma is not limited to the retinal ganglion cells but also extends to the posterior visual pathway. The diagnosis is based on the presence of signs of glaucomatous optic neuropathy and consistent functional visual field alterations. Unfortunately, these functional alterations often become evident only when a significant amount of the nerve fibers that compose the optic nerve has been irreversibly lost. Advanced morphological and functional magnetic resonance (MR) techniques (morphometry, diffusion tensor imaging, arterial spin labeling, and functional connectivity) may provide a means for observing modifications induced by this fiber loss, within the optic nerve and the visual cortex, at an earlier stage. The aim of this systematic review was to determine whether the use of these advanced MR techniques could offer the possibility of diagnosing glaucoma at an earlier stage than is currently possible. PMID:26167474

  3. Systematic and Statistical Errors Associated with Nuclear Decay Constant Measurements Using the Counting Technique

    NASA Astrophysics Data System (ADS)

    Koltick, David; Wang, Haoyu; Liu, Shih-Chieh; Heim, Jordan; Nistor, Jonathan

    2016-03-01

    Typical nuclear decay constants are measured at an accuracy level of 10^-2. There are numerous applications, such as tests of unconventional theories, dating of materials, and long-term inventory evolution, which require decay-constant accuracies at a level of 10^-4 to 10^-5. The statistical and systematic errors associated with precision measurements of decays using the counting technique are presented. Precision requires high count rates, which introduces time-dependent dead-time and pile-up corrections. An approach to overcome these issues by continuous recording of the detector current is presented. Other systematic corrections discussed include the time-dependent dead time due to background radiation, control of target motion and radiation flight-path variation due to environmental conditions, and time-dependent effects caused by scattered events. The incorporation of blind experimental techniques can help make the measurement independent of past results. A spectrometer design and data analysis that can accomplish these goals are reviewed. The author would like to thank TechSource, Inc. and Advanced Physics Technologies, LLC. for their support in this work.
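
    As an illustration of the dead-time correction mentioned above, a minimal sketch assuming the standard non-paralyzable detector model (the rates and dead time are invented):

```python
def deadtime_correct(measured_rate, tau):
    """Non-paralyzable dead-time correction: with true rate n and dead
    time tau, m = n / (1 + n * tau), so n = m / (1 - m * tau)."""
    loss = 1.0 - measured_rate * tau
    if loss <= 0.0:
        raise ValueError("measured rate inconsistent with dead time")
    return measured_rate / loss

# 5e5 counts/s observed with a 1 microsecond dead time:
print(deadtime_correct(5e5, 1e-6))   # -> 1e6 counts/s true rate
```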

  4. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper shows advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots, and security applications. The cost of the measurement system is extremely high; therefore a simulation tool was designed. The simulation gives an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis-Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
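
    The AABB test at the core of such a simulator is the classic slab method: intersect the ray with the three pairs of axis-aligned planes and keep the overlap of the parameter intervals. A minimal sketch (assuming a beam direction with no zero components):

```python
import numpy as np

def ray_aabb_intersect(origin, direction, box_min, box_max):
    """Slab-method intersection of a ray with an axis-aligned bounding
    box; returns the hit distance along the ray, or None for a miss."""
    inv = 1.0 / direction                  # assumes no zero components
    t1 = (box_min - origin) * inv
    t2 = (box_max - origin) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

hit = ray_aabb_intersect(np.zeros(3), np.array([1.0, 1e-3, 1e-3]),
                         np.array([5.0, -1.0, -1.0]), np.array([6.0, 1.0, 1.0]))
print(hit)   # -> 5.0, the range the simulated LRF would report
```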

  5. Three-dimensional hybrid grid generation using advancing front techniques

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Noack, Ralph W.

    1995-01-01

    A new 3-dimensional hybrid grid generation technique has been developed, based on ideas of advancing fronts for both structured and unstructured grids. In this approach, structured grids are first generated independently around individual components of the geometry. Fronts are initialized on these structured grids and advanced outward so that new cells are extracted directly from the structured grids. Employing typical advancing front techniques, cells are rejected if they intersect the existing front or fail other criteria. When no more viable structured cells exist, further cells are advanced in an unstructured manner to close off the overall domain, resulting in a grid of 'hybrid' form. There are two primary advantages to the hybrid formulation. First, generating blocks with limited regard to topology eliminates the bottleneck encountered when a multiple-block system is used to fully encapsulate a domain. Individual blocks may be generated free of external constraints, which will significantly reduce the generation time. Secondly, grid points near the body (presumably with high aspect ratio) will still maintain a structured (non-triangular or non-tetrahedral) character, thereby maximizing grid quality and solution accuracy near the surface.

  6. Full Endoscopic Spinal Surgery Techniques: Advancements, Indications, and Outcomes

    PubMed Central

    Yue, James J.; Long, William

    2015-01-01

    Advancements in both surgical instrumentation and full endoscopic spine techniques have resulted in positive clinical outcomes in the treatment of cervical, thoracic, and lumbar spine pathologies. Endoscopic techniques impart minimal approach-related disruption of non-pathologic spinal anatomy and function while concurrently maximizing functional visualization and correction of pathological tissues. An advanced understanding of the applicable functional neuroanatomy, in particular the neuroforamen, is essential for successful outcomes. Additionally, an understanding of the varying types of disc prolapse pathology in relation to the neuroforamen will result in better surgical outcomes. Indications for lumbar endoscopic spine surgery include disc herniations, spinal stenosis, infections, medial branch rhizotomy, and interbody fusion. Limitations are based on both non-spine and spine-related findings. A high-riding iliac wing, a more posteriorly located retroperitoneal cavity, and an overly distal or proximally migrated herniated disc are all relative contraindications to lumbar endoscopic spinal surgery techniques. Modifications in scope size and visual field-of-view angulation have enabled both anterior and posterior cervical decompression. Endoscopic burrs, electrocautery, and focused laser technology allow for the least invasive spinal surgical techniques in all age groups and across varying body habitus. Complications include, among others, dural tears, dysesthesia, nerve injury, and infection. PMID:26114086

  7. Full Endoscopic Spinal Surgery Techniques: Advancements, Indications, and Outcomes.

    PubMed

    Yue, James J; Long, William

    2015-01-01

    Advancements in both surgical instrumentation and full endoscopic spine techniques have resulted in positive clinical outcomes in the treatment of cervical, thoracic, and lumbar spine pathologies. Endoscopic techniques impart minimal approach-related disruption of non-pathologic spinal anatomy and function while concurrently maximizing functional visualization and correction of pathological tissues. An advanced understanding of the applicable functional neuroanatomy, in particular the neuroforamen, is essential for successful outcomes. Additionally, an understanding of the varying types of disc prolapse pathology in relation to the neuroforamen will result in better surgical outcomes. Indications for lumbar endoscopic spine surgery include disc herniations, spinal stenosis, infections, medial branch rhizotomy, and interbody fusion. Limitations are based on both non-spine and spine-related findings. A high-riding iliac wing, a more posteriorly located retroperitoneal cavity, and an overly distal or proximally migrated herniated disc are all relative contraindications to lumbar endoscopic spinal surgery techniques. Modifications in scope size and visual field-of-view angulation have enabled both anterior and posterior cervical decompression. Endoscopic burrs, electrocautery, and focused laser technology allow for the least invasive spinal surgical techniques in all age groups and across varying body habitus. Complications include, among others, dural tears, dysesthesia, nerve injury, and infection. PMID:26114086

  8. The origins of bioethics: advances in resuscitation techniques.

    PubMed

    Niebroj, L

    2008-12-01

    In recent years there has been an increasing interest in meta-bioethical issues. This turn in research focus is regarded as a sign of the maturation of bioethics as a distinct area of academic inquiry. The role of historic-philosophical reflection is often emphasized. It should be noted that there is rather common agreement that the future of bioethics lies in critical reflection on its past, in particular on the very origins of the discipline. Sharing Caplan's opinion, advances in medical technologies, especially the introduction of respirators and artificial heart machines, are considered among the main developments that started bioethics. Using methods of historical as well as meta-ethical research, this article aims at describing the role of advances in resuscitation techniques in the emergence of bioethics and at exploring how bioethical reflection has been shaped by technological developments. A brief historical analysis permits one to say that there is a close bond between the emergence of bioethics and the introduction of sophisticated resuscitation technologies into medical practice. The meta-ethical reflection reveals that advances in resuscitation techniques not only initiated bioethics in the second half of the 20th century but also influenced its evolution by (i) posing the question of justice in health care, (ii) altering commonly accepted ontological notions of human corporeality, and (iii) reconsidering the very purpose of medicine.

  9. Advanced aeroservoelastic stabilization techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Chan, Samuel Y.; Cheng, Peter Y.; Myers, Thomas T.; Klyde, David H.; Magdaleno, Raymond E.; Mcruer, Duane T.

    1992-01-01

    Advanced high-performance vehicles, including Single-Stage-To-Orbit (SSTO) hypersonic flight vehicles, that are statically unstable require higher-bandwidth flight control systems to compensate for the instability, resulting in interactions between the flight control system, the engine/propulsion dynamics, and the low-frequency structural modes. Military specifications, such as MIL-F-9490D and MIL-F-87242, tend to limit treatment of structural modes to conventional gain stabilization techniques. The conventional gain stabilization techniques, however, introduce low-frequency effective time delays which can be troublesome from a flying qualities standpoint. These time delays can be alleviated by appropriate blending of gain and phase stabilization techniques (referred to as Hybrid Phase Stabilization or HPS) for the low-frequency structural modes. The potential of using HPS for compensating structural mode interaction was previously explored. It was shown that effective time delay was significantly reduced with the use of HPS; however, the HPS design was seen to have greater residual response than a conventional gain-stabilized design. Additional work performed to advance and refine the HPS design procedure, to further develop residual response metrics as a basis for alternative structural stability specifications, and to develop strategies for validating HPS design and specification concepts in manned simulation is presented. Stabilization design sensitivity to structural uncertainties and aircraft-centered requirements are also assessed.

  10. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  11. Testing aspects of advanced coherent electron cooling technique

    SciTech Connect

    Litvinenko, V.; Jing, Y.; Pinayev, I.; Wang, G.; Samulyak, R.; Ratner, D.

    2015-05-03

    An advanced version of Coherent-electron Cooling (CeC), based on the micro-bunching instability, has been proposed. This approach promises a significant increase in the bandwidth of the CeC system and, therefore, a significant shortening of cooling time in high-energy hadron colliders. In this paper we present our plans for simulating and testing the key aspects of this proposed technique using the set-up of the coherent-electron-cooling proof-of-principle experiment at BNL.

  12. Recent advances in UHV techniques for particle accelerators

    SciTech Connect

    M. G. Rao

    1995-01-01

    The ultrahigh vacuum (UHV) requirements for storage rings and accelerators, and the development of the science and technology of UHV for particle accelerators and magnetic fusion devices, have recently been reviewed by N.B. Mistry and H.F. Dylla, respectively. In this paper, the latest developments in the advancement of UHV techniques for the vacuum integrity of the Continuous Electron Beam Accelerator Facility (CEBAF), and for successfully dealing with the synchrotron-radiation-related beam line vacuum problem encountered in the design of the SSC, are reviewed. The review includes developments in an extreme-sensitivity He leak detection technique based on the dynamic adsorption and desorption of He, operation of ionization gauges at LHe temperatures, metal sponges for the effective cryopumping of H2 and He to pressures better than 10^-14 torr, and low-cost, high-He-sensitivity RGAs. The details of a new extreme-sensitivity He leak detector system are also discussed here.

  13. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  14. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  15. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  16. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  17. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; Marconcini, Mattia; Tilton, James C.; Trianni, Giovanna

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data, and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.

  18. Advanced bronchoscopic techniques in diagnosis and staging of lung cancer.

    PubMed

    Zaric, Bojan; Stojsic, Vladimir; Sarcev, Tatjana; Stojanovic, Goran; Carapic, Vladimir; Perin, Branislav; Zarogoulidis, Paul; Darwiche, Kaid; Tsakiridis, Kosmas; Karapantzos, Ilias; Kesisis, Georgios; Kougioumtzi, Ioanna; Katsikogiannis, Nikolaos; Machairiotis, Nikolaos; Stylianaki, Aikaterini; Foroulis, Christophoros N; Zarogoulidis, Konstantinos

    2013-09-01

    The role of advanced bronchoscopic diagnostic techniques in the detection and staging of lung cancer has steeply increased in recent years. Bronchoscopic imaging techniques have become widely available and easy to use. Technical improvements have led to the merging of technologies, with autofluorescence or narrow band imaging incorporated into one bronchoscope. New tools, such as autofluorescence imaging (AFI), narrow band imaging (NBI), and Fuji intelligent chromo endoscopy (FICE), have found their place in respiratory endoscopy suites. The development of endobronchial ultrasound (EBUS) has improved minimally invasive mediastinal staging and the diagnosis of peripheral lung lesions. Linear EBUS has proven to be complementary to mediastinoscopy. This technique is now available in almost all high-volume centers performing bronchoscopy. Radial EBUS with mini-probes and guiding sheaths provides accurate diagnosis of peripheral pulmonary lesions. Combining EBUS-guided procedures with rapid on-site cytology (ROSE) increases the diagnostic yield even more. Electromagnetic navigation (EMN) technology is also widely used for the diagnosis of peripheral lesions. Future development will certainly lead to new improvements in technology and the creation of new sophisticated tools for research in respiratory endoscopy. Broncho-microscopy, alveoloscopy, and optical coherence tomography are some of the new research techniques emerging from rapid technological development.

  19. Measuring the microbiome: perspectives on advances in DNA-based techniques for exploring microbial life

    PubMed Central

    Bunge, John; Gilbert, Jack A.; Moore, Jason H.

    2012-01-01

    This article reviews recent advances in ‘microbiome studies’: molecular, statistical and graphical techniques to explore and quantify how microbial organisms affect our environments and ourselves given recent increases in sequencing technology. Microbiome studies are moving beyond mere inventories of specific ecosystems to quantifications of community diversity and descriptions of their ecological function. We review the last 24 months of progress in this sort of research, and anticipate where the next 2 years will take us. We hope that bioinformaticians will find this a helpful springboard for new collaborations with microbiologists. PMID:22308073

  20. Recalibration of CFS seasonal precipitation forecasts using statistical techniques for bias correction

    NASA Astrophysics Data System (ADS)

    Bliefernicht, Jan; Laux, Patrick; Siegmund, Jonatan; Kunstmann, Harald

    2013-04-01

    The development and application of statistical techniques with a special focus on the recalibration of meteorological or hydrological forecasts, to eliminate the bias between forecasts and observations, has received a great deal of attention in recent years. One reason is that retrospective forecasts are nowadays available, which allows for proper training and validation of these kinds of techniques. The objective of this presentation is to propose several statistical techniques of differing degrees of complexity and to evaluate and compare their performance for a recalibration of seasonal ensemble forecasts of monthly precipitation. The techniques selected in this study range from straightforward normal-score and quantile-quantile transformations and local scaling to more sophisticated and novel statistical techniques such as the Copula-based methodology recently proposed by Laux et al. (2011). The seasonal forecasts are derived from the Climate Forecast System Version 2. This version is the current coupled ocean-atmosphere general circulation model of the U.S. National Centers for Environmental Prediction, used to provide forecasts up to nine months ahead. The CFS precipitation forecasts are compared to monthly precipitation observations from the Global Precipitation Climatology Centre. The statistical techniques are tested for semi-arid regions in West Africa and the Indian subcontinent, focusing on large-scale river basins such as the Ganges and the Volta basins. In both regions seasonal precipitation forecasts are a crucial source of information for the prediction of hydro-meteorological extremes, in particular droughts. The evaluation is done using retrospective CFS ensemble forecasts from 1982 to 2009. The training of the statistical techniques is done in cross-validation mode. The outcome of this investigation illustrates large systematic differences between forecasts and observations, in particular for the Volta basin in West Africa. The selection of straightforward
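
    Of the techniques listed, the quantile-quantile transformation is the most compact to sketch: each forecast value is mapped through its quantile in the forecast climatology onto the observed climatology. The sketch below uses empirical distributions and synthetic gamma-distributed monthly precipitation; the training period and distribution shapes are invented stand-ins for the CFS hindcasts and GPCC observations.

```python
import numpy as np

def quantile_map(forecast, fcst_clim, obs_clim):
    """Empirical quantile-quantile bias correction of forecast values."""
    fcst_sorted = np.sort(fcst_clim)
    obs_sorted = np.sort(obs_clim)
    # non-exceedance probability of each forecast value in its own climatology
    ranks = np.searchsorted(fcst_sorted, forecast) / float(len(fcst_sorted))
    return np.quantile(obs_sorted, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(6)
fcst_clim = rng.gamma(2.0, 40.0, 28 * 12)   # hindcast climatology, mm/month
obs_clim = rng.gamma(2.0, 55.0, 28 * 12)    # observed climatology, mm/month
corrected = quantile_map(rng.gamma(2.0, 40.0, 24), fcst_clim, obs_clim)
```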

  1. Advanced Techniques for Removal of Retrievable Inferior Vena Cava Filters

    SciTech Connect

    Iliescu, Bogdan; Haskal, Ziv J.

    2012-08-15

    Inferior vena cava (IVC) filters have proven valuable for the prevention of primary or recurrent pulmonary embolism in selected patients with, or at high risk for, venous thromboembolic disease. Their use has become commonplace, and the numbers implanted increase annually. During the last 3 years in the United States, the percentage of annually placed optional filters, i.e., filters that can remain as permanent filters or potentially be retrieved, has consistently exceeded that of permanent filters. In parallel, the complications of long- or short-term filtration have become increasingly evident to physicians, regulatory agencies, and the public. Most filter removals are uneventful, with a high degree of success. When routine filter-retrieval techniques prove unsuccessful, progressively more advanced tools and skill sets must be used to enhance filter-retrieval success. These techniques should be used with caution to avoid damage to the filter or the cava during retrieval. This review describes the complex techniques for filter retrieval, including the use of additional snares, guidewires, angioplasty balloons, and mechanical and thermal approaches, and illustrates their specific applications.

  2. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research

    PubMed Central

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. Research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Accurate results and inferences are not possible without proper validation with various statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge about the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analyses, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and also to highlight the areas where statistical errors are most commonly committed, so as to encourage better statistical practices. PMID:26702217

  3. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    SciTech Connect

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shortening the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new, complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award winner, identifies behavioral equivalence classes in MPI jobs and highlights elements whose behavior diverges from their class, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two
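
    The first ingredient above can be illustrated in miniature. The sketch below groups hypothetical per-rank call stacks into equivalence classes and surfaces the smallest class, in the spirit of STAT's behavior grouping; it is not the tool's actual implementation, and all trace data and names are invented.

    ```python
    from collections import defaultdict

    def group_by_stack_trace(traces):
        """Group process ranks by identical call stacks; tiny groups are
        often the first hint of divergent (possibly faulty) behavior."""
        classes = defaultdict(list)
        for rank, trace in traces.items():
            classes[tuple(trace)].append(rank)
        return classes

    # Invented traces gathered from four MPI ranks
    traces = {
        0: ["main", "solve", "MPI_Allreduce"],
        1: ["main", "solve", "MPI_Allreduce"],
        2: ["main", "solve", "MPI_Allreduce"],
        3: ["main", "io_flush", "write"],      # the outlier worth inspecting
    }
    for stack, ranks in sorted(group_by_stack_trace(traces).items(),
                               key=lambda kv: len(kv[1])):
        print(f"{len(ranks)} rank(s) at {' -> '.join(stack)}: {ranks}")
    ```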

  4. A survey of image processing techniques and statistics for ballistic specimens in forensic science.

    PubMed

    Gerules, George; Bhatia, Sanjiv K; Jackson, Daniel E

    2013-06-01

    This paper provides a review of recent investigations on the image processing techniques used to match spent bullets and cartridge cases. It is also, to a lesser extent, a review of the statistical methods that are used to judge the uniqueness of fired bullets and spent cartridge cases. We review 2D and 3D imaging techniques as well as many of the algorithms used to match these images. We also provide a discussion of the strengths and weaknesses of these methods for both image matching and statistical uniqueness. The goal of this paper is to be a reference for investigators and scientists working in this field.

  5. Techniques for developing approximate optimal advanced launch system guidance

    NASA Technical Reports Server (NTRS)

    Feeley, Timothy S.; Speyer, Jason L.

    1991-01-01

    An extension to the authors' previous technique used to develop a real-time guidance scheme for the Advanced Launch System is presented. The approach is to construct an optimal guidance law based upon an asymptotic expansion associated with a small physical parameter, epsilon. The trajectory of a rocket modeled as a point mass is considered, with the flight restricted to an equatorial plane while reaching an orbital altitude at orbital injection speeds. The dynamics of this problem can be separated into primary effects due to thrust and gravitational forces, and perturbation effects which include the aerodynamic forces and the remaining inertial forces. An analytic solution to the reduced-order problem represented by the primary dynamics is possible. The Hamilton-Jacobi-Bellman or dynamic programming equation is expanded in an asymptotic series where the zeroth-order term (epsilon = 0) can be obtained in closed form.
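
    Schematically, and in generic notation rather than the authors' exact formulation, the construction reads as follows, with f_0 the primary (thrust and gravity) dynamics and epsilon*f_1 the aerodynamic and remaining inertial perturbations:

    ```latex
    % HJB equation with the small parameter \varepsilon multiplying the
    % aerodynamic and remaining inertial (perturbation) terms:
    0 = \min_{u}\Big[ V_t + V_x^{\top}\big(f_0(x,u) + \varepsilon f_1(x,u)\big) + L(x,u) \Big]
    % Asymptotic expansion of the optimal return function:
    V(x,t;\varepsilon) = V^{(0)}(x,t) + \varepsilon\,V^{(1)}(x,t) + \mathcal{O}(\varepsilon^{2})
    % The zeroth-order problem (\varepsilon = 0), driven by thrust and gravity
    % alone, is the one available in closed form.
    ```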

  6. Neurocysticercosis: evaluation with advanced magnetic resonance techniques and atypical forms.

    PubMed

    do Amaral, Lázaro Luís Faria; Ferreira, Rafael Martins; da Rocha, Antônio José; Ferreira, Nelson Paes Diniz Fortes

    2005-04-01

    Neurocysticercosis (NCC) is the most common helminthic infection of the central nervous system, but its diagnosis remains difficult. The purpose of this article is to perform a critical analysis of the literature and show our experience in the evaluation of NCC. We discuss the advanced MR technique applications such as diffusion and perfusion-weighted imaging, spectroscopy, cisternography with FLAIR, and supplemental O2 and 3D-CISS. The typical manifestations of NCC are described; emphasis is given to the unusual presentations. The atypical forms of neurocysticercosis were divided into: intraventricular, subarachnoid, spinal, orbital, and intraparenchymatous. Special attention was also given to reactivation of previously calcified lesions and neurocysticercosis associated with mesial temporal sclerosis.

  7. COAL AND CHAR STUDIES BY ADVANCED EMR TECHNIQUES

    SciTech Connect

    R. Linn Belford; Robert B. Clarkson; Mark J. Nilges; Boris M. Odintsov; Alex I. Smirnov

    2001-04-30

    Advanced electron magnetic resonance (EMR) as well as nuclear magnetic resonance (NMR) methods have been used to examine properties of coals, chars, and molecular species related to constituents of coal. During the span of this grant, progress was made on the construction of, and applications to coals and chars of, two high-frequency EMR systems particularly appropriate for such studies (48 GHz and 95 GHz electron magnetic resonance spectrometers); on new low-frequency dynamic nuclear polarization (DNP) experiments to examine the interaction between water and the surfaces of suspended char particulates in slurries; and on a variety of proton nuclear magnetic resonance (NMR) techniques to measure characteristics of the water directly in contact with the surfaces and pore spaces of carbonaceous particulates.

  8. Advanced Fibre Bragg Grating and Microfibre Bragg Grating Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Chung, Kit Man

    Fibre Bragg gratings (FBGs) have become a very important technology for communication systems and fibre optic sensing. Typically, FBGs are less than 10 mm long and are fabricated using fused silica uniform phase masks, which become more expensive for longer lengths or non-uniform pitches. Generally, interfering UV laser beams are employed to make long or complex FBGs, and this technique introduces critical precision and control issues. In this work, we demonstrate an advanced FBG fabrication system that enables the writing of long and complex gratings in optical fibres with virtually any apodisation profile, local phase and Bragg wavelength, using a novel optical design in which the incident angles of two UV beams onto an optical fibre can be adjusted simultaneously by moving just one optical component, instead of the two optics employed in earlier configurations, to vary the grating pitch. The key advantage of the grating fabrication system is that complex gratings can be fabricated by controlling the linear movements of two translation stages. In addition to the study of this advanced grating fabrication technique, we also focus on the inscription of FBGs written in optical fibres with a cladding diameter of several tens of microns. Fabrication of microfibres was investigated using a sophisticated tapering method. We also propose a simple but practical technique to filter out the higher-order modes reflected from the FBG written in microfibres via a linear taper region, while the fundamental mode re-couples to the core. By using this technique, reflection from the microfibre Bragg grating (MFBG) can be effectively single mode, simplifying the demultiplexing and demodulation processes. The MFBG exhibits high sensitivity to contact force, and an MFBG-based force sensor was also constructed and tested to investigate its suitability for use as an invasive surgery device. Performance of the contact force sensor packaged in a conforming elastomer material compares favourably to one

  9. Multiple advanced surgical techniques to treat acquired seminal duct obstruction

    PubMed Central

    Jiang, Hong-Tao; Yuan, Qian; Liu, Yu; Liu, Zeng-Qin; Zhou, Zhen-Yu; Xiao, Ke-Feng; Yang, Jiang-Gen

    2014-01-01

    The aim of this study was to evaluate the outcomes of multiple advanced surgical treatments (i.e. microsurgery, laparoscopic surgery and endoscopic surgery) for acquired obstructive azoospermia. We analyzed the surgical outcomes of 51 consecutive patients with suspected acquired obstructive azoospermia enrolled at our center between January 2009 and May 2013. Modified vasoepididymostomy, laparoscopically assisted vasovasostomy and transurethral incision of the ejaculatory duct with holmium laser were chosen and performed based on the different obstruction sites. The mean postoperative follow-up time was 22 months (range: 9 months to 52 months). Semen analyses were initiated at four postoperative weeks, followed by trimonthly (months 3, 6, 9 and 12) semen analyses, until no sperm was found at 12 months or until pregnancy was achieved. Patency was defined as >10,000 sperm ml−1 of semen. The obstruction sites, postoperative patency and natural pregnancy rate were recorded. Of 51 patients, 47 underwent bilateral or unilateral surgical reconstruction; the other four patients could not be treated with surgical reconstruction because of obstruction of the pelvic vas or intratesticular tubules. The reconstruction rate was 92.2% (47/51), and the patency rate and natural pregnancy rate were 89.4% (42/47) and 38.1% (16/42), respectively. No severe complications were observed. Using multiple advanced surgical techniques, a more extensive range of seminal duct obstructions was accessible and correctable; thus, favorable patency and pregnancy rates can be achieved. PMID:25337841

  10. Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?

    PubMed Central

    Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie

    2012-01-01

    A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often-used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that when they were, it was typically by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, about the robustness of the techniques with regard to the assumptions, and about how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
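
    As a concrete illustration of the kind of assumption checking the participants rarely performed, the sketch below runs normality and equal-variance checks before a two-sample comparison. It is a minimal sketch with synthetic data, not the study's protocol; graphical checks are often preferable to formal pre-tests.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    group_a = rng.normal(10.0, 2.0, size=40)   # fictitious measurements
    group_b = rng.normal(11.0, 2.0, size=40)

    # Formal pre-tests: normality per group (Shapiro-Wilk), equal variances (Levene)
    print("Shapiro A p =", stats.shapiro(group_a).pvalue)
    print("Shapiro B p =", stats.shapiro(group_b).pvalue)
    print("Levene    p =", stats.levene(group_a, group_b).pvalue)

    # If assumptions look doubtful, fall back on robust alternatives
    print("Welch t-test   p =", stats.ttest_ind(group_a, group_b, equal_var=False).pvalue)
    print("Mann-Whitney U p =", stats.mannwhitneyu(group_a, group_b).pvalue)
    ```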

  11. Advances in the Rising Bubble Technique for discharge measurement

    NASA Astrophysics Data System (ADS)

    Hilgersom, Koen; Luxemburg, Willem; Willemsen, Geert; Bussmann, Luuk

    2014-05-01

    Already in the 19th century, d'Auria described a discharge measurement technique that applies floats to find the depth-integrated velocity (d'Auria, 1882). The basis of this technique was that the horizontal distance that the float travels on its way to the surface is the image of the velocity profile integrated over depth. Viol and Semenov (1964) improved this method by using air bubbles as floats, but distances were still measured manually until Sargent (1981) introduced a technique that could derive the distances from two photographs simultaneously taken from each side of the river bank. Recently, modern image processing techniques proved to further improve the applicability of the method (Hilgersom and Luxemburg, 2012). In the 2012 article, controlling and determining the rising velocity of an air bubble still appeared a major challenge for the application of this method. Ever since, laboratory experiments with different nozzle and tube sizes have led to advances in our self-made equipment, enabling us to produce individual air bubbles with a more constant rising velocity. We also introduced an underwater camera to determine the rising velocity on site, since it depends on water temperature and contamination and is therefore site-specific. Camera measurements of the rising velocity proved successful in laboratory and field settings, although some improvements to the setup are necessary to also capture air bubbles at depths where little daylight penetrates. References D'Auria, L.: Velocity of streams; A new method to determine correctly the mean velocity of any perpendicular in rivers and canals, (The) American Engineers, 3, 1882. Hilgersom, K.P. and Luxemburg, W.M.J.: Technical Note: How image processing facilitates the rising bubble technique for discharge measurement, Hydrology and Earth System Sciences, 16(2), 345-356, 2012. Sargent, D.: Development of a viable method of stream flow measurement using the integrating float technique, Proceedings of
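
    The geometric principle behind the technique can be sketched in a few lines: a bubble rising at constant speed spends dz/v_rise in each layer, so its total horizontal drift equals the depth-integrated velocity divided by the rise velocity. The numbers below are illustrative, not field data.

    ```python
    import numpy as np

    def unit_discharge(v_rise, drift):
        """Depth-integrated velocity q [m^2/s] at one vertical: a bubble rising
        at constant v_rise [m/s] spends dz / v_rise in each layer, so its total
        horizontal drift L satisfies integral(u dz) = v_rise * L."""
        return v_rise * np.asarray(drift)

    # Illustrative numbers: drift measured at five verticals [m], spaced 2 m
    # apart, with a site-specific rise velocity of 0.22 m/s
    q = unit_discharge(0.22, [0.4, 0.9, 1.3, 1.1, 0.5])
    Q = np.sum((q[:-1] + q[1:]) / 2.0) * 2.0   # trapezoidal rule across the width
    print(f"Q = {Q:.2f} m^3/s")
    ```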

  12. Advanced microscopy techniques resolving complex precipitates in steels

    NASA Astrophysics Data System (ADS)

    Saikaly, W.; Soto, R.; Bano, X.; Issartel, C.; Rigaut, G.; Charaï, A.

    1999-06-01

    Scanning electron microscopy as well as analytical transmission electron microscopy techniques such as high resolution imaging, electron diffraction, energy dispersive X-ray spectrometry (EDX), parallel electron energy loss spectroscopy (PEELS) and elemental mapping via a Gatan Imaging Filter (GIF) have been used to study complex precipitation in commercial dual-phase steels microalloyed with titanium. Titanium nitrides, titanium carbosulfides, titanium carbonitrides and titanium carbides were characterized in this study. Both carbon extraction replicas and thin foils were used as sample preparation techniques. On both the microscopic and nanometric scales, it was found that a large amount of precipitation occurred heterogeneously on already existing inclusions/precipitates. CaS inclusions (1 to 2 μm), already present in the liquid steel, acted as nucleation sites for TiN precipitating upon the steel's solidification. In addition, TiC nucleated on existing smaller TiN (around 30 to 50 nm). Despite the complexity of such alloys, the statistical analysis conducted on the non-equilibrium samples was found to be in rather good agreement with the theoretical equilibrium calculations. Heterogeneous precipitation must have played a role in bringing these results closer together.

  13. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. Advances and challenges in the attribution of climate impacts using statistical inference

    NASA Astrophysics Data System (ADS)

    Hsiang, S. M.

    2015-12-01

    We discuss recent advances, challenges, and debates in the use of statistical models to infer and attribute climate impacts, such as distinguishing effects of "climate" vs. "weather," accounting for simultaneous environmental changes along multiple dimensions, evaluating multiple sources of uncertainty, accounting for adaptation, and simulating counterfactual economic or social trajectories. We relate these ideas to recent findings linking temperature to economic productivity/violence and tropical cyclones to economic growth.

  15. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
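
    In the same spirit, a minimal Monte Carlo tolerance analysis can be sketched as follows; the component statistics and the response model are invented placeholders, not the original program's.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000                                 # simulated random samples

    # Hypothetical component statistics (placeholders, not the original program's)
    gain = rng.normal(2.00, 0.05, n)            # amplifier gain, nominal 2.0
    offset = rng.normal(0.0, 0.01, n)           # sensor offset [V]
    misalign = rng.uniform(-0.5, 0.5, n)        # mounting angle error [deg]

    # System response to a 1.0 V input, including a small alignment loss
    output = gain * 1.0 * np.cos(np.deg2rad(misalign)) + offset

    print(f"mean = {output.mean():.4f} V, std = {output.std():.4f} V")
    print("99% interval:", np.percentile(output, [0.5, 99.5]))
    ```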

  16. Nuclear Technology. Course 26: Metrology. Module 27-7, Statistical Techniques in Metrology.

    ERIC Educational Resources Information Center

    Espy, John; Selleck, Ben

    This seventh in a series of eight modules for a course titled Metrology focuses on descriptive and inferential statistical techniques in metrology. The module follows a typical format that includes the following sections: (1) introduction, (2) module prerequisites, (3) objectives, (4) notes to instructor/student, (5) subject matter, (6) materials…

  17. Use of statistical techniques to account for parameter uncertainty in landslide tsunami generation

    NASA Astrophysics Data System (ADS)

    Salmanidou, Dimitra; Guillas, Serge; Georgiopoulou, Aggeliki; Dias, Frederic

    2016-04-01

    Landslide tsunamis constitute complex phenomena, the nature of which is governed by varying rheological and geomorphological parameters. In an attempt to better understand these mechanisms, statistical methods can be used to quantify uncertainty and carry out sensitivity analyses. One such method is the statistical emulation of the numerical code used to model a phenomenon. In comparison to numerical simulators, statistical emulators have the advantage of being faster and less expensive to run. In this study we implement a Bayesian calibration which allows us to build a statistical surrogate of the numerical simulators used to model submarine sliding and tsunami generation in the Rockall Bank Slide Complex, NE Atlantic Ocean. For the parameter selection and numerical simulations of the event we make use of a sophisticated sampling technique (Latin Hypercube Sampling). The posterior distributions of the parameters and the predictions made with the emulator are provided.
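
    A minimal sketch of the Latin Hypercube design step, using scipy's quasi-Monte Carlo module; the parameter names and ranges are illustrative assumptions, not the study's values.

    ```python
    import numpy as np
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=3, seed=7)
    unit = sampler.random(n=50)            # 50 stratified points in [0, 1]^3

    # Hypothetical parameter ranges (slide volume [m^3], basal friction
    # coefficient [-], initiation depth [m]) -- illustrative only
    lower, upper = [1e8, 0.05, 500.0], [5e9, 0.30, 2000.0]
    design = qmc.scale(unit, lower, upper)

    print(design[:3])                      # each row: inputs for one simulator run
    ```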

  18. Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.

    2015-02-01

    This paper discusses a methodology to evaluate the precision and the accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in only a few papers, where generally the assumption is made that errors follow a normal distribution. In fact, this hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process. Finally, non-parametric statistical models are applied, in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL is tested here, i.e. the VMX-450 Mobile Laser Scanning System. The test area is the historic city centre of Trento (Italy), selected in order to assess the system performance in dealing with a challenging historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a marked lack of symmetry, which leads to the conclusion that the standard normal parameters are not adequate to assess this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
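
    Typical non-parametric accuracy measures for such asymmetric error distributions include the median, the normalized median absolute deviation (NMAD), and sample quantiles of the absolute errors. A minimal sketch with synthetic check-point errors (the estimators are standard; the data are invented):

    ```python
    import numpy as np

    def robust_error_stats(errors):
        """Non-parametric accuracy measures that tolerate asymmetric outliers:
        median (robust bias) and NMAD (robust spread), plus absolute-error quantiles."""
        errors = np.asarray(errors)
        median = np.median(errors)
        nmad = 1.4826 * np.median(np.abs(errors - median))  # ~sigma if Gaussian
        q68, q95 = np.percentile(np.abs(errors), [68.3, 95.0])
        return median, nmad, q68, q95

    # Simulated check-point errors [m] with a skewed tail of gross errors
    rng = np.random.default_rng(3)
    errs = np.concatenate([rng.normal(0.00, 0.02, 950), rng.exponential(0.3, 50)])
    print("median=%.3f  NMAD=%.3f  |e|_68=%.3f  |e|_95=%.3f" % robust_error_stats(errs))
    ```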

  19. A review of hemorheology: Measuring techniques and recent advances

    NASA Astrophysics Data System (ADS)

    Sousa, Patrícia C.; Pinho, Fernando T.; Alves, Manuel A.; Oliveira, Mónica S. N.

    2016-02-01

    Significant progress has been made over the years on the topic of hemorheology, not only in terms of the development of more accurate and sophisticated techniques, but also in terms of understanding the phenomena associated with blood components, their interactions and impact upon blood properties. The rheological properties of blood are strongly dependent on the interactions and mechanical properties of red blood cells, and a variation of these properties can bring further insight into the human health state and can be an important parameter in clinical diagnosis. In this article, we provide both a reference for hemorheological research and a resource regarding the fundamental concepts in hemorheology. This review is aimed at those starting in the field of hemodynamics, where blood rheology plays a significant role, but also at those in search of the most up-to-date findings (both qualitative and quantitative) in hemorheological measurements and novel techniques used in this context, including technical advances under more extreme conditions such as in large amplitude oscillatory shear flow or under extensional flow, which impose large deformations comparable to those found in the microcirculatory system and in diseased vessels. Given the impressive rate of increase in the available knowledge on blood flow, this review is also intended to identify areas where current knowledge is still incomplete, and which have the potential for new, exciting and useful research. We also discuss the most important parameters that can lead to an alteration of blood rheology, and which as a consequence can have a significant impact on the normal physiological behavior of blood.

  20. Advances in Poly(4-aminodiphenylaniline) Nanofibers Preparation by Electrospinning Technique.

    PubMed

    Della Pina, C; Busacca, C; Frontera, P; Antonucci, P L; Scarpino, L A; Sironi, A; Falletta, E

    2016-05-01

    Polyaniline (PANI) nanofibers are drawing a great deal of interest from academia and industry due to their multiple applications, especially in the biomedical field. PANI nanofibers were first successfully electrospun by MacDiarmid and co-workers at the beginning of the millennium, and since then many efforts have been addressed to improving their quality. However, traditional PANI prepared from aniline monomer shows some drawbacks, such as the presence of toxic (i.e., benzidine) and inorganic (salts and metals) co-products that complicate polymer post-treatment, and low solubility in common organic solvents, which makes its processing by the electrospinning technique difficult. Some industrial sectors, such as the medical and biomedical sectors, need to employ materials free from toxic and polluting species. In this regard, the oxidative polymerization of N-(4-aminophenyl)aniline, the aniline dimer, to produce poly(4-aminodiphenylaniline), P4ADA, a kind of PANI, represents an innovative alternative to the traditional synthesis, because the obtained polymer is free from carcinogenic and/or polluting co-products and is, moreover, more soluble than traditional PANI. This latter feature can be exploited to obtain P4ADA nanofibers by the electrospinning technique. In this paper we report the advances obtained in P4ADA nanofiber electrospinning. A comparison among polyethylene oxide (PEO), polymethyl methacrylate (PMMA) and polystyrene (PS) as the second polymer to facilitate the electrospinning process is shown. In order to increase the conductivity of P4ADA nanofibers, two strategies were adopted and compared: selective removal of the insulating binder from the electrospun nanofibers by a rinsing treatment, and optimization of the minimum amount of binder necessary for the electrospinning process. Moreover, the effect of the PEO/P4ADA weight ratio on the fiber morphology and conductivity was highlighted. PMID:27483933

  2. Advanced Techniques for Power System Identification from Measured Data

    SciTech Connect

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    2008-11-25

    Time-synchronized measurements provide rich information for estimating a power system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data, focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and on testing those techniques on field-measured data and through simulation. Experimental data from the western area power system was provided by PNNL and Bonneville Power Administration (BPA) for both ambient conditions and signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field-measured data. Subspace-based methods have been used to improve previous results from block processing
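
    As a toy stand-in for the block-processing methods investigated, the sketch below fits an autoregressive model to a synthetic ringdown and converts the dominant pole into a mode frequency and damping ratio; it is illustrative only, not the project's algorithms.

    ```python
    import numpy as np

    fs = 10.0                                   # PMU-like sampling rate [Hz]
    t = np.arange(0.0, 12.0, 1.0 / fs)
    # Synthetic ringdown: one 0.7 Hz mode, continuous-time pole -0.25 + j*2*pi*0.7
    x = np.exp(-0.25 * t) * np.cos(2 * np.pi * 0.7 * t)
    x += np.random.default_rng(2).normal(0.0, 0.005, t.size)

    p = 4                                       # autoregressive model order
    A = np.column_stack([x[p - 1 - k : len(x) - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)

    z = np.roots(np.concatenate(([1.0], -a)))   # discrete-time poles
    z_dom = z[np.argmax(np.abs(z))]             # dominant (least-damped) pole
    s = np.log(z_dom) * fs                      # back to continuous time
    print(f"frequency = {abs(s.imag) / (2 * np.pi):.2f} Hz, "
          f"damping ratio = {-s.real / abs(s):.3f}")   # true: 0.70 Hz, 0.057
    ```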

  3. Nanocrystalline materials: recent advances in crystallographic characterization techniques.

    PubMed

    Ringe, Emilie

    2014-11-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask 'how are nanoshapes created?', 'how does the shape relate to the atomic packing and crystallography of the material?', 'how can we control and characterize the external shape and crystal structure of such small nanocrystals?'. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed.

  4. Achieving miniature sensor systems via advanced packaging techniques

    NASA Astrophysics Data System (ADS)

    Hartup, David C.; Bobier, Kevin; Demmin, Jeffrey

    2005-05-01

    Demands for miniaturized networked sensors that can be deployed in large quantities dictate that the packages be small and cost effective. In order to accomplish these objectives, system developers generally apply advanced packaging techniques to proven systems. A partnership of Nova Engineering and Tessera begins with a baseline of Nova's Unattended Ground Sensors (UGS) technology and utilizes Tessera's three-dimensional (3D) Chip-Scale Packaging (CSP), Multi-Chip Packaging (MCP), and System-in-Package (SIP) innovations to enable novel methods for fabricating compact, vertically integrated sensors utilizing digital, RF, and micro-electromechanical systems (MEMS) devices. These technologies, applied to a variety of sensors and integrated radio architectures, enable diverse multi-modal sensing networks with wireless communication capabilities. Sensors including imaging, accelerometers, acoustical, inertial measurement units, and gas and pressure sensors can be utilized. The greatest challenge to high density, multi-modal sensor networks is the ability to test each component prior to integration, commonly called Known Good Die (KGD) testing. In addition, the mix of multi-sourcing and high technology magnifies the challenge of testing at the die level. Utilizing Tessera proprietary CSP, MCP, and SIP interconnection methods enables fully testable, low profile stacking to create multi-modal sensor radios with high yield.

  5. Removing baseline flame's spectrum by using advanced recovering spectrum techniques.

    PubMed

    Arias, Luis; Sbarbaro, Daniel; Torres, Sergio

    2012-09-01

    In this paper, a novel automated algorithm to estimate and remove the continuous baseline from measured flame spectra is proposed. The algorithm estimates the continuous background based on prior information obtained from a learning database of continuous flame spectra. The discontinuous flame emission is then calculated by subtracting the estimated continuous baseline from the measured spectrum. The key idea underlying the learning database is that continuous flame emissions are predominant in the sooty regions, in the absence of discontinuous radiation. The proposed algorithm was tested using natural gas and bio-oil flame spectra at different combustion conditions, and the goodness-of-fit coefficient (GFC) quality metric was used to quantify the performance of the estimation process. Additionally, the commonly used first derivative method (FDM) for baseline removal was applied to the same testing spectra in order to compare and evaluate the proposed technique. The achieved results show that the proposed method is a very attractive tool for designing advanced combustion monitoring strategies for discontinuous emissions. PMID:22945158

  6. Development of advanced strain diagnostic techniques for reactor environments.

    SciTech Connect

    Fleming, Darryn D.; Holschuh, Thomas Vernon,; Miller, Timothy J.; Hall, Aaron Christopher; Urrea, David Anthony,; Parma, Edward J.,

    2013-02-01

    The following research was performed as a Laboratory Directed Research and Development (LDRD) initiative at Sandia National Laboratories. The long-term goals of the program include sophisticated diagnostics for advanced fuels testing in nuclear reactors for the Department of Energy (DOE) Gen IV program, with the future capability to provide real-time, in situ measurement of strain in fuel rod cladding during operation at any research or power reactor in the United States. By quantifying the stress and strain in fuel rods, it is possible to significantly improve fuel rod design and, consequently, the performance and lifetime of the cladding. During the past year of this program, two sets of experiments were performed: small-scale tests to ensure reliability of the gages, and reactor pulse experiments involving the most viable samples in the Annular Core Research Reactor (ACRR), located onsite at Sandia. Strain measurement techniques that can provide useful data in the extreme environment of a nuclear reactor core are needed to characterize nuclear fuel rods. This report documents the progression of solutions to this issue that were explored for feasibility in FY12 at Sandia National Laboratories, Albuquerque, NM.

  8. Curve fitting and modeling with splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of its use.
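
    A minimal sketch of backward knot elimination for a least-squares B-spline fit, analogous in spirit to the paper's FORTRAN programs; the data, threshold, and stopping rule here are invented.

    ```python
    import numpy as np
    from scipy.interpolate import LSQUnivariateSpline

    rng = np.random.default_rng(5)
    x = np.linspace(0.0, 10.0, 200)
    y = np.sin(x) + rng.normal(0.0, 0.1, x.size)

    knots = list(np.linspace(1.0, 9.0, 9))          # generous initial knot set
    threshold = 1.05                                # allowed relative RSS growth

    def rss(kts):
        spl = LSQUnivariateSpline(x, y, kts, k=3)   # least-squares B-spline fit
        return float(np.sum((spl(x) - y) ** 2))

    current = rss(knots)
    while len(knots) > 1:
        trials = [rss(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
        best = int(np.argmin(trials))
        if trials[best] > threshold * current:      # every removal hurts too much
            break
        current = trials[best]
        del knots[best]                             # drop the least useful knot

    print("retained knots:", np.round(knots, 2))
    ```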

  9. Using Candy Samples To Learn about Sampling Techniques and Statistical Data Evaluation

    NASA Astrophysics Data System (ADS)

    Canaes, Larissa S.; Brancalion, Marcel L.; Rossi, Adriana V.; Rath, Susanne

    2008-08-01

    A classroom exercise for undergraduate and beginning graduate students that takes about one class period is proposed and discussed. It is an easy, interesting exercise that demonstrates important aspects of sampling techniques (sample amount, particle size, and the representativeness of the sample in relation to the bulk material). The exercise also explores a simple statistical approach to commonly used parameters (mean, median, standard deviation, errors, quartiles, confidence limits); the presentation of results using histogram, box-plot and whisker plot; and the related tests (normality, outliers, significance) using parametric and non-parametric statistical methods.
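
    The exercise's statistical summary can be reproduced in a few lines; the candy masses below are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Invented masses of one candy sample [g]
    sample = np.array([2.31, 2.45, 2.28, 2.52, 2.40, 2.38, 2.61, 2.35, 2.47, 2.99])

    mean, median, sd = sample.mean(), np.median(sample), sample.std(ddof=1)
    q1, q3 = np.percentile(sample, [25, 75])

    # 95% confidence limits for the mean (t distribution)
    half = stats.t.ppf(0.975, df=sample.size - 1) * sd / np.sqrt(sample.size)

    # 1.5*IQR rule: the outlier flag behind the box-and-whisker view
    iqr = q3 - q1
    outliers = sample[(sample < q1 - 1.5 * iqr) | (sample > q3 + 1.5 * iqr)]

    print(f"mean {mean:.3f}, median {median:.3f}, sd {sd:.3f}")
    print(f"95% CI for the mean: [{mean - half:.3f}, {mean + half:.3f}]")
    print("flagged outliers:", outliers)
    ```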

  10. Student Estimates of Probability and Uncertainty in Advanced Laboratory and Statistical Physics Courses

    NASA Astrophysics Data System (ADS)

    Mountcastle, Donald B.; Bucy, Brandon R.; Thompson, John R.

    2007-11-01

    Equilibrium properties of macroscopic systems are highly predictable as n, the number of particles, approaches and exceeds Avogadro's number; theories of statistical physics depend on these results. Typical pedagogical devices used in statistical physics textbooks to introduce entropy (S) and multiplicity (ω) (where S = k ln(ω)) include flipping coins and/or other equivalent binary events, repeated n times. Prior to instruction, our statistical mechanics students usually gave reasonable answers about the probabilities, but not the relative uncertainties, of the predicted outcomes of such events. However, they reliably predicted that the uncertainty in a measured continuous quantity (e.g., the amount of rainfall) does decrease as the number of measurements increases. Typical textbook presentations assume that students understand that the relative uncertainty of binary outcomes will similarly decrease as the number of events increases. This is at odds with our findings, even though most of our students had previously completed mathematics courses in statistics, as well as an advanced electronics laboratory course that included statistical analysis of distributions of dart scores as n increased.
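
    The point at issue is easy to demonstrate: for n fair coin flips the head count has mean n/2 and standard deviation sqrt(n)/2, so the relative uncertainty falls as 1/sqrt(n). A quick simulation (illustrative, not the study's materials):

    ```python
    import numpy as np

    # sd/mean = 1/sqrt(n) shrinks as n grows -- the result students reliably
    # predicted for rainfall measurements but not for binary events.
    rng = np.random.default_rng(11)
    for n in (10, 1_000, 100_000):
        heads = rng.binomial(n, 0.5, size=10_000)   # 10,000 repeated experiments
        print(f"n={n:>7}: sd/mean = {heads.std() / heads.mean():.4f}"
              f"  (theory {1 / np.sqrt(n):.4f})")
    ```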

  11. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near-optimum coding efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard-form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled-data applications such as imaging and high-quality audio, but the second-stage adaptive coder can also be used separately with any source that can be externally preprocessed into the standard form. This generic functionality assures that these techniques, and their recent high-speed implementations, should be equally applicable outside of NASA.
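
    The preprocessor's role can be illustrated with the classic combination of one-step prediction and sign interleaving, which maps slowly varying samples to small nonnegative symbols; this is a generic sketch of that idea, not NASA's chip-level implementation.

    ```python
    def preprocess(samples):
        """One-step prediction plus sign interleaving: residuals d are mapped to
        2d (d >= 0) or -2d - 1 (d < 0), yielding the nonnegative 'standard form'
        symbols an adaptive entropy coder expects."""
        out, prev = [], 0
        for s in samples:
            d = s - prev
            prev = s
            out.append(2 * d if d >= 0 else -2 * d - 1)
        return out

    # Slowly varying data (e.g., image or audio samples) become small symbols
    print(preprocess([100, 101, 103, 102, 102, 110]))  # [200, 2, 4, 1, 0, 16]
    ```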

  12. Hybrid inverse lithography techniques for advanced hierarchical memories

    NASA Astrophysics Data System (ADS)

    Xiao, Guangming; Hooker, Kevin; Irby, Dave; Zhang, Yunqiang; Ward, Brian; Cecil, Tom; Hall, Brett; Lee, Mindy; Kim, Dave; Lucas, Kevin

    2014-03-01

    Traditional segment-based model-based OPC methods have been the mainstream mask layout optimization techniques in volume production for memory and embedded memory devices for many device generations. These techniques have been continually optimized over time to meet the ever increasing difficulties of memory and memory periphery patterning. There are a range of difficult issues for patterning embedded memories successfully. These difficulties include the need for a very high level of symmetry and consistency (both within memory cells themselves and between cells) due to circuit effects such as noise margin requirements in SRAMs. Memory cells and access structures consume a large percentage of area in embedded devices so there is a very high return from shrinking the cell area as much as possible. This aggressive scaling leads to very difficult resolution, 2D CD control and process window requirements. Additionally, the range of interactions between mask synthesis corrections of neighboring areas can extend well beyond the size of the memory cell, making it difficult to fully take advantage of the inherent designed cell hierarchy in mask pattern optimization. This is especially true for non-traditional (i.e., less dependent on geometric rule) OPC/RET methods such as inverse lithography techniques (ILT) which inherently have more model-based decisions in their optimizations. New inverse methods such as model-based SRAF placement and ILT are, however, well known to have considerable benefits in finding flexible mask pattern solutions to improve process window, improve 2D CD control, and improve resolution in ultra-dense memory patterns. They also are known to reduce recipe complexity and provide native MRC compliant mask pattern solutions. Unfortunately, ILT is also known to be several times slower than traditional OPC methods due to the increased computational lithographic optimizations it performs. In this paper, we describe and present results for a methodology to

  14. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    PubMed

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1]. PMID:27408926
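
    The three parameters can be computed with elementary formulas; the sketch below uses invented figures (motor rating, running hours, tariff, efficiencies), not the paper's data.

    ```python
    def motor_savings(p_kw, hours, eff_std, eff_hem, tariff, extra_cost):
        """Annual energy saving [kWh], cost saving, and simple payback [years]."""
        e_saving = p_kw * hours * (1.0 / eff_std - 1.0 / eff_hem)  # input-energy gap
        c_saving = e_saving * tariff
        return e_saving, c_saving, extra_cost / c_saving

    # Invented figures: 15 kW motor, 6000 h/yr, efficiencies 89% vs 93%,
    # tariff 0.12 per kWh, 450 price premium for the high-efficiency motor
    e, c, payback = motor_savings(15.0, 6000, 0.89, 0.93, 0.12, 450.0)
    print(f"saves {e:.0f} kWh/yr ({c:.0f} per year); payback = {payback:.1f} years")
    ```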

  16. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  17. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore

  18. Advanced Techniques for Simulating the Behavior of Sand

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2009-12-01

    research is to simulate the look and behavior of sand, this work will go beyond simple particle collision. In particular, we can continue to use our parallel algorithms not only on single particles but on particle “clumps” that consist of multiple combined particles. Since sand is typically not spherical in nature, these particle “clumps” help to simulate the coarse nature of sand. In a simulation environment, multiple combined particles could be used to simulate the polygonal and granular nature of sand grains. Thus, a diversity of sand particles can be generated. The interaction between these particles can then be parallelized using GPU hardware. As such, this research will investigate different graphics and physics techniques and determine the tradeoffs in performance and visual quality for sand simulation. An enhanced sand model through the use of high performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques that can ultimately be used for their vehicle simulator. As an added benefit of this work, advancements in simulating sand can also benefit scientists here on earth, especially in regard to understanding landslides and debris flows.

  19. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    SciTech Connect

    Park, Jinyong; Balasingham, P; McKenna, Sean Andrew; Pinnaduwa H.S.W. Kulatilake

    2004-09-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  20. Weldability and joining techniques for advanced fossil energy system alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Liu, W.; Yang, D.; Zhou, G.; Morrison, M.

    1998-05-01

    These efforts address the basic understanding of the weldability and fabricability of the advanced high-temperature alloys needed to effect increases in the efficiency of the next generation of fossil energy power plants. The work was divided into three tasks: the first dealt with the welding and fabrication behavior of 310HCbN (HR3C); the second covered studies aimed at understanding the weldability of a newly developed 310TaN high-temperature stainless steel (a modification of 310 stainless); and the third addressed the cladding of austenitic tubing with iron aluminide using the GTAW process. Task 1 consisted of microstructural studies on 310HCbN and the development of a tube weldability test, which has applications to production welding techniques as well as laboratory weldability assessments. In addition, ex-service 310HCbN, which showed fireside erosion and cracking at the attachment weld locations, was evaluated. Task 2 addressed the behavior of the newly developed 310TaN modification of standard 310 stainless steel and showed that its weldability was excellent and that the sensitization potential was minimal under normal welding and fabrication conditions. The microstructural evolution during elevated-temperature testing was characterized, and the second-phase particles evolved upon aging were identified. Task 3 details the investigation undertaken to clad 310HCbN tubing with iron aluminide and to develop the welding conditions necessary to provide a crack-free cladding. The work showed that both a preheat and a post-heat were necessary for crack-free deposits, and the effect of a third element on the cracking potential was defined, together with the effect of the aluminum level, for optimum weldability.

  1. Investigation of joining techniques for advanced austenitic alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Kikuchi, Y.; Shi, C.; Gill, T.P.S.

    1991-05-01

    Modified Alloys 316 and 800H, designed for high temperature service, have been developed at Oak Ridge National Laboratory. Assessment of the weldability of the advanced austenitic alloys has been conducted at the University of Tennessee. Four aspects of weldability of the advanced austenitic alloys were included in the investigation.

  2. Evaluation of drug-polymer solubility curves through formal statistical analysis: comparison of preparation techniques.

    PubMed

    Knopp, Matthias Manne; Olesen, Niels Erik; Holm, Per; Löbmann, Korbinian; Holm, René; Langguth, Peter; Rades, Thomas

    2015-01-01

    In this study, the influence of the preparation technique (ball milling, spray drying, and film casting) of a supersaturated amorphous dispersion on the quality of solubility determinations of indomethacin in polyvinylpyrrolidone was investigated by means of statistical analysis. After annealing of the amorphous dispersions above the crystallization temperature for 2 h, the solubility curve was derived from the glass transition temperature of the demixed material using the Gordon-Taylor relationship and fitting with the Flory-Huggins model. The study showed that the predicted solubility from the ball-milled mixtures was not consistent with those from spray drying and film casting, indicating fundamental differences between the preparation techniques. Through formal statistical analysis, the best combination of fit to the Flory-Huggins model and reproducibility of the measurements was analyzed. Ball milling provided the best reproducibility of the three preparation techniques; however, an analysis of residuals revealed a systematic error. In contrast, film casting demonstrated a good fit to the model but poor reproducibility of the measurements. Therefore, this study recommends that techniques such as spray drying or potentially film casting (if experimental reproducibility can be improved) should be used to prepare the amorphous dispersions when performing solubility measurements of this kind.
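
    The Tg-to-composition step underlying the solubility curve can be sketched by inverting the Gordon-Taylor equation. A minimal illustration; the parameter values below are hypothetical, not the study's measurements, and the subsequent Flory-Huggins fit is omitted.

    ```python
    # Invert the Gordon-Taylor equation to estimate the drug weight fraction
    # of a demixed amorphous phase from its measured glass transition
    # temperature. All numerical values are hypothetical.
    def gordon_taylor_w_drug(Tg, Tg_drug, Tg_poly, K):
        """Drug weight fraction w1 from Gordon-Taylor:
        Tg = (w1*Tg_drug + K*(1-w1)*Tg_poly) / (w1 + K*(1-w1))."""
        return K * (Tg_poly - Tg) / ((Tg - Tg_drug) + K * (Tg_poly - Tg))

    # Hypothetical indomethacin/PVP parameters (kelvin)
    Tg_drug, Tg_poly, K = 315.0, 441.0, 0.32
    for Tg_measured in [340.0, 360.0, 380.0]:
        w1 = gordon_taylor_w_drug(Tg_measured, Tg_drug, Tg_poly, K)
        print(f"Tg = {Tg_measured:.0f} K -> drug weight fraction ~ {w1:.2f}")
    ```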

  3. Selecting statistical or machine learning techniques for regional landslide susceptibility modelling by evaluating spatial prediction

    NASA Astrophysics Data System (ADS)

    Goetz, Jason; Brenning, Alexander; Petschko, Helene; Leopold, Philip

    2015-04-01

    With so many techniques now available for landslide susceptibility modelling, it can be challenging to decide which technique to apply. Generally speaking, the criteria for model selection should be tied closely to the end users' purpose, which could be spatial prediction, spatial analysis or both. In our research, we focus on comparing the spatial predictive abilities of landslide susceptibility models. We illustrate how spatial cross-validation, a statistical approach for assessing spatial prediction performance, can be applied with the area under the receiver operating characteristic curve (AUROC) as a prediction measure for model comparison. Several machine learning and statistical techniques are evaluated for prediction in Lower Austria: support vector machine, random forest, bundling with penalized linear discriminant analysis, logistic regression, weights of evidence, and the generalized additive model. In addition to predictive performance, the importance of predictor variables in each model was estimated using spatial cross-validation by calculating the change in AUROC performance when variables are randomly permuted. The susceptibility modelling techniques were tested in three areas of interest in Lower Austria, which have unique geologic conditions associated with landslide occurrence. Overall, for the majority of comparisons, we found few practically or even statistically significant differences in AUROCs; that is, the models' prediction performances were very similar. Therefore, in addition to prediction, the ability to interpret models for spatial analysis and the qualitative properties of the prediction surface (map) are considered and discussed. The measure of variable importance provided some insight into model behaviour for prediction, in particular for "black-box" models. However, there were no clear patterns across the areas of interest as to why certain variables were given more importance than others.
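
    A minimal sketch of spatial cross-validation with AUROC and permutation-based variable importance, assuming scikit-learn and k-means spatial blocking as the fold-construction rule (one of several possible choices; the paper's exact partitioning may differ). Data are synthetic.

    ```python
    # Spatial cross-validation: folds are formed from spatial blocks so test
    # data are spatially separated from training data; AUROC scores each fold,
    # and permuting each predictor measures its importance as the AUROC drop.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(500, 2))       # point coordinates
    X = rng.normal(size=(500, 4))                 # terrain predictors (slope, etc.)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

    folds = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(xy)
    aucs, importance = [], np.zeros(X.shape[1])
    for k in range(5):
        train, test = folds != k, folds == k
        model = LogisticRegression().fit(X[train], y[train])
        auc = roc_auc_score(y[test], model.predict_proba(X[test])[:, 1])
        aucs.append(auc)
        for j in range(X.shape[1]):               # permutation variable importance
            Xp = X[test].copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            importance[j] += auc - roc_auc_score(y[test], model.predict_proba(Xp)[:, 1])

    print(f"spatial CV AUROC: {np.mean(aucs):.3f}; importance (dAUROC): {importance / 5}")
    ```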

  4. High-resolution climate simulations for Central Europe: An assessment of dynamical and statistical downscaling techniques

    NASA Astrophysics Data System (ADS)

    Miksovsky, J.; Huth, R.; Halenka, T.; Belda, M.; Farda, A.; Skalak, P.; Stepanek, P.

    2009-12-01

    To bridge the resolution gap between the outputs of global climate models (GCMs) and finer-scale data needed for studies of the climate change impacts, two approaches are widely used: dynamical downscaling, based on application of regional climate models (RCMs) embedded into the domain of the GCM simulation, and statistical downscaling (SDS), using empirical transfer functions between the large-scale data generated by the GCM and local measurements. In our contribution, we compare the performance of different variants of both techniques for the region of Central Europe. The dynamical downscaling is represented by the outputs of two regional models run in the 10 km horizontal grid, ALADIN-CLIMATE/CZ (co-developed by the Czech Hydrometeorological Institute and Meteo-France) and RegCM3 (developed by the Abdus Salam Centre for Theoretical Physics). The applied statistical methods were based on multiple linear regression, as well as on several of its nonlinear alternatives, including techniques employing artificial neural networks. Validation of the downscaling outputs was carried out using measured data, gathered from weather stations in the Czech Republic, Slovakia, Austria and Hungary for the end of the 20th century; series of daily values of maximum and minimum temperature, precipitation and relative humidity were analyzed. None of the regional models or statistical downscaling techniques could be identified as the universally best one. For instance, while most statistical methods misrepresented the shape of the statistical distribution of the target variables (especially in the more challenging cases such as estimation of daily precipitation), RCM-generated data often suffered from severe biases. It is also shown that further enhancement of the simulated fields of climate variables can be achieved through a combination of dynamical downscaling and statistical postprocessing. This can not only be used to reduce biases and other systematic flaws in the generated time

  5. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Petrova, Irina; Löw, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-04-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) / maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provides additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.
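
    The shared starting point of the two approaches is easy to demonstrate: both the EOF and the network view are derived from the same correlation matrix. An illustrative sketch on synthetic data; the 0.5 link threshold is an assumption, not the study's choice.

    ```python
    # Both EOF analysis and climate-network analysis start from the same
    # correlation matrix of a gridded field: EOFs are its eigenvectors,
    # while the network thresholds it into an adjacency matrix.
    import numpy as np

    rng = np.random.default_rng(1)
    field = rng.normal(size=(240, 50))          # 240 time steps x 50 grid points
    C = np.corrcoef(field, rowvar=False)        # 50 x 50 correlation matrix

    # EOF view: leading eigenvector of the correlation matrix
    evals, evecs = np.linalg.eigh(C)
    eof1 = evecs[:, -1]                         # leading EOF pattern
    explained = evals[-1] / evals.sum()

    # Climate-network view: threshold the same matrix into an adjacency matrix
    A = (np.abs(C) > 0.5) & ~np.eye(C.shape[0], dtype=bool)
    degree = A.sum(axis=1)                      # degree field (higher-order structure)
    print(f"EOF1 explains {explained:.1%}; mean network degree {degree.mean():.1f}")
    ```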

  6. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Petrova, Irina; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-11-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP)/maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provides additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.

  7. Assessment technique for acne treatments based on statistical parameters of skin thermal images.

    PubMed

    Padilla-Medina, J Alfredo; León-Ordoñez, Francisco; Prado-Olivarez, Juan; Vela-Aguirre, Noe; Ramírez-Agundis, Agustin; Díaz-Carmona, Javier

    2014-04-01

    Acne vulgaris, as an inflammatory disease with an excessive production of subdermal fat, modifies the dynamics of the bloodstream, and consequently the temperature, of the affected skin zone. A high percentage of this heat interchange is manifested as electromagnetic radiation at far-infrared wavelengths, which can be captured with a thermal imaging camera. A technique based on thermal image analysis for assessing the efficacy of acne vulgaris treatments is described. The procedure is based on computing statistical parameters of thermal images captured from the affected skin zone undergoing acne treatment. The proposed technique was used to determine the skin thermal behavior for different acne severity levels at different acne treatment stages. Infrared images of acne skin zones on eight patients, diagnosed with acne vulgaris and receiving one specific acne treatment, were registered weekly for 11 weeks. The infrared images were captured until no further improvement in the affected zones was detected. The obtained results suggest a direct relationship between the used statistical parameters, particularly first- and second-order statistics, and the acne vulgaris severity level in the affected zones.
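
    A minimal sketch of the two families of image statistics mentioned here, using SciPy and scikit-image (graycomatrix/graycoprops per recent scikit-image versions); the thermal image is a synthetic stand-in.

    ```python
    # First-order statistics come from the pixel-intensity histogram;
    # second-order statistics come from a gray-level co-occurrence matrix
    # (GLCM) describing how neighboring pixel values co-occur.
    import numpy as np
    from scipy.stats import skew
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(2)
    thermal = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image

    # First-order statistics of the skin-temperature distribution
    first_order = {"mean": thermal.mean(), "std": thermal.std(),
                   "skewness": skew(thermal.ravel())}

    # Second-order statistics from the co-occurrence matrix
    glcm = graycomatrix(thermal, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    second_order = {prop: graycoprops(glcm, prop)[0, 0]
                    for prop in ("contrast", "homogeneity", "correlation", "energy")}
    print(first_order, second_order)
    ```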

  8. Statistical adaptive reversible steganographic technique using bicubic interpolation and difference expansion

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Chi; Tsai, Chwei-Shyong; Yang, Wen-Lung; Tsai, Yi-Chang; Yu, Shyr-Shen

    2010-08-01

    The reversible steganographic technique allows the extraction of secret messages and the restoration of the original image without any distortion from the embedded image. In this work, a statistical adaptive reversible steganographic technique, consisting of two parts, is proposed to improve difference expansion (DE)-based schemes. First, bicubic interpolation is adopted for pixel prediction to obtain more embeddable pixels; since the differences are generated between the accurately predicted value and the original value, the quality of the differences is also considered. Second, a statistical adaptive reversible embedding algorithm is proposed to overcome the restriction on embedding capacity under single-layer embedding. The relationship between the complexity of the neighboring pixels and the difference distribution of the image is generalized as a variance condition in statistics. With the maximum modifiable degree of the predicted pixel, the proposed scheme provides a suitable embedding capacity for all embeddable pixels with less additional information. The experimental results demonstrate the advantages of the proposed scheme and show that it provides high capacity with good visual quality for the embedded image.
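
    The difference-expansion step that such schemes build on can be shown compactly. A Tian-style sketch on an integer pixel pair, with overflow and location-map handling omitted for brevity; this is the generic DE operation, not the paper's full adaptive algorithm.

    ```python
    # Difference expansion: the difference of a pixel pair is doubled and the
    # secret bit appended; the integer mean is preserved, so the original pair
    # can be restored exactly (reversibility).
    def de_embed(x, y, bit):
        """Embed one bit into the pixel pair (x, y); returns the marked pair."""
        l, h = (x + y) // 2, x - y          # integer mean and difference
        h2 = 2 * h + bit                    # expand difference, append secret bit
        return l + (h2 + 1) // 2, l - h2 // 2

    def de_extract(x2, y2):
        """Recover the bit and the original pair from a marked pair."""
        l, h2 = (x2 + y2) // 2, x2 - y2
        bit, h = h2 & 1, h2 >> 1            # arithmetic shift == floor(h2/2)
        return bit, (l + (h + 1) // 2, l - h // 2)

    marked = de_embed(100, 98, 1)           # -> (102, 97)
    assert de_extract(*marked) == (1, (100, 98))
    ```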

  9. Recent advances in biosensor techniques for environmental monitoring.

    PubMed

    Rogers, K R

    2006-05-24

    Biosensors for environmental applications continue to show advances and improvements in areas such as sensitivity, selectivity and simplicity. In addition to detecting and measuring specific compounds or compound classes such as pesticides, hazardous industrial chemicals, toxic metals, and pathogenic bacteria, biosensors and bioanalytical assays have been designed to measure biological effects such as cytotoxicity, genotoxicity, biological oxygen demand, pathogenic bacteria, and endocrine disruption effects. This article is intended to discuss recent advances in the area of biosensors for environmental applications.

  10. Probabilistic and numerical techniques in the study of statistical theories of turbulence. Final Technical Report

    SciTech Connect

    Ellis, Richard S.; Turkington, B.

    2003-06-09

    In this research project we made fundamental advances in a number of problems arising in statistical equilibrium theories of turbulence. Here are the highlights. In most cases the mathematical analysis was supplemented by numerical calculations. (a) Maximum entropy principles. We analyzed in a unified framework the Miller-Robert continuum model of equilibrium states in an ideal fluid and a modification of that model due to Turkington. (b) Equivalence and nonequivalence of ensembles. We gave a complete analysis of the equivalence and nonequivalence of the microcanonical, canonical, and mixed ensembles at the level of equilibrium macrostates for a large class of models of turbulence. (c) Nonlinear stability of flows. We refined the well known Arnold stability theorems by proving the nonlinear stability of steady mean flows for the quasi-geostrophic potential vorticity equation in the case when the ensembles are nonequivalent. (d) Geophysical application. The theories developed in items (a), (b), and (c) were applied to predict the emergence and persistence of coherent structures in the active weather layer of the Jovian atmosphere. This is the first work in which sophisticated statistical theories are synthesized with actual observations data from the Voyager and Galileo space missions. (e) Nonlinear dispersive waves. For a class of nonlinear Schroedinger equations we demonstrated that the self-organization of solutions into a ground-state solitary wave immersed in fine-scale fluctuations is a relaxation into statistical equilibrium.

  11. Application of statistical downscaling technique for the production of wine grapes (Vitis vinifera L.) in Spain

    NASA Astrophysics Data System (ADS)

    Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.

    2012-04-01

    Climate and soil are two of the most important limiting factors for agricultural production. Climate change has now been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools for simulating the most relevant aspects of the climate expected for the 21st century under climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) precludes proper simulation of smaller-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been extensively tested in national and international settings, yielding excellent results for future climate modelling. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape-growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, especially grapes, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts of climatic change in the different regions, mainly those under protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
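
    A highly simplified sketch of analogue downscaling: for each target day, the most similar historical days in large-scale predictor space are found and the local variable is estimated from their station records. The FIC two-step method is more elaborate; all data below are synthetic.

    ```python
    # Analogue downscaling sketch: rank historical days by similarity of the
    # large-scale predictor fields, then estimate the local variable from the
    # station observations of the best analogues.
    import numpy as np

    rng = np.random.default_rng(3)
    hist_fields = rng.normal(size=(3650, 20))   # 10 yr of daily large-scale predictors
    hist_station = hist_fields[:, 0] * 2 + rng.normal(size=3650)  # local temperature

    def analogue_predict(target_field, n_analogues=25):
        """Step 1: rank historical days by Euclidean distance in predictor space.
        Step 2 (simplified): average the station values of the best analogues."""
        d = np.linalg.norm(hist_fields - target_field, axis=1)
        best = np.argsort(d)[:n_analogues]
        return hist_station[best].mean()

    gcm_day = rng.normal(size=20)               # one scenario day from a GCM
    print(f"downscaled local value: {analogue_predict(gcm_day):.2f}")
    ```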

  12. Multivariate statistical techniques for the assessment of seasonal variations in surface water quality of pasture ecosystems.

    PubMed

    Ajorlo, Majid; Abdullah, Ramdzani B; Yusoff, Mohd Kamil; Halim, Ridzwan Abd; Hanif, Ahmad Husni Mohd; Willms, Walter D; Ebrahimian, Mahboubeh

    2013-10-01

    This study investigates the applicability of multivariate statistical techniques, including cluster analysis (CA), discriminant analysis (DA), and factor analysis (FA), for the assessment of seasonal variations in the surface water quality of tropical pastures. The study was carried out in the TPU catchment, Kuala Lumpur, Malaysia. The dataset consisted of 1 year of monitoring of 14 parameters at six sampling sites. The CA yielded two groups of similarity between the sampling sites at the temporal scale, i.e., less polluted (LP) and moderately polluted (MP). Fecal coliform (FC), NO3, DO, and pH were significantly related to the stream grouping in the dry season, whereas NH3, BOD, Escherichia coli, and FC were significantly related to the stream grouping in the rainy season. The best predictors for distinguishing clusters at the temporal scale were FC, NH3, and E. coli. FC, E. coli, and BOD with strong positive loadings were introduced as the first varifactors in the dry season, which indicates a biological source of variability. EC with a strong positive loading and DO with a strong negative loading were introduced as the first varifactors in the rainy season, which represents a physicochemical source of variability. Multivariate statistical techniques proved effective for classifying and processing large water quality datasets and for identifying the major sources of water pollution in tropical pastures.
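
    A compact sketch of the CA-then-DA sequence, assuming SciPy and scikit-learn: Ward-linkage clustering groups the samples, and linear discriminant analysis ranks the parameters that best separate the groups. The water-quality matrix is synthetic.

    ```python
    # Hierarchical cluster analysis groups sampling occasions; discriminant
    # analysis then identifies which parameters drive the separation.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(4)
    X = rng.normal(size=(60, 14))               # 60 samples x 14 quality parameters
    X[:30, 0] += 2.0                            # e.g. elevated fecal coliform upstream

    # Cluster analysis: Ward linkage on standardized data, cut into 2 groups
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    groups = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")

    # Discriminant analysis: which parameters drive the LP/MP separation?
    lda = LinearDiscriminantAnalysis().fit(Xs, groups)
    ranked = np.argsort(-np.abs(lda.coef_[0]))
    print("most discriminating parameter indices:", ranked[:3])
    ```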

  13. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    ERIC Educational Resources Information Center

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different editions…

  14. Advanced Millimeter-Wave Security Portal Imaging Techniques

    SciTech Connect

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-04-01

    Millimeter-wave imaging is rapidly gaining acceptance for passenger screening at airports and other secured facilities. This paper details a number of techniques developed over the last several years including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, as well as high frequency high bandwidth techniques. Implementation of some of these methods will increase the cost and complexity of the mm-wave security portal imaging systems. RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in the mm-wave portal imaging systems.

  15. A statistical technique for processing radio interferometer data. [using maximum likelihood algorithm

    NASA Technical Reports Server (NTRS)

    Papadopoulos, G. D.

    1975-01-01

    The output of a radio interferometer is the Fourier transform of the object under investigation. Due to the limited coverage of the Fourier plane, the reconstruction of the image of the source is blurred by the beam of the synthesized array. A maximum-likelihood processing technique is described which uses the statistical properties of the received noise-like signals. This technique has been used extensively in the processing of large-aperture seismic arrays. This inversion method results in a synthesized beam that is more uniform, has lower sidelobes, and higher resolution than the normal Fourier transform methods. The maximum-likelihood method algorithm was applied successfully to very long baseline and short baseline interferometric data.
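
    The contrast between the conventional Fourier beam and a Capon-type maximum-likelihood estimate (the seismic-array method this record alludes to) can be sketched for a one-dimensional array. Geometry, source, and noise level are synthetic assumptions.

    ```python
    # Conventional (Fourier) beamforming vs. a Capon/maximum-likelihood
    # estimate built from the inverse sample covariance of the array data;
    # the ML spectrum typically shows a sharper peak and lower sidelobes.
    import numpy as np

    rng = np.random.default_rng(5)
    pos = rng.uniform(-50, 50, size=8)              # antenna positions (wavelengths)
    theta0 = 0.2                                    # true source direction (sine space)
    a0 = np.exp(2j * np.pi * pos * theta0)
    signal = rng.normal(size=200) + 1j * rng.normal(size=200)
    noise = 0.5 * (rng.normal(size=(8, 200)) + 1j * rng.normal(size=(8, 200)))
    snap = a0[:, None] * signal + noise             # 8 antennas x 200 snapshots
    R = snap @ snap.conj().T / 200                  # sample covariance
    Rinv = np.linalg.inv(R)

    thetas = np.linspace(-1, 1, 401)
    A = np.exp(2j * np.pi * np.outer(pos, thetas))  # steering vectors
    fourier = np.einsum('it,ij,jt->t', A.conj(), R, A).real
    ml = 1.0 / np.einsum('it,ij,jt->t', A.conj(), Rinv, A).real  # Capon/ML

    print("Fourier peak:", thetas[fourier.argmax()], " ML peak:", thetas[ml.argmax()])
    ```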

  16. The adaptive statistical iterative reconstruction-V technique for radiation dose reduction in abdominal CT: comparison with the adaptive statistical iterative reconstruction technique

    PubMed Central

    Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun

    2015-01-01

    Objective: To investigate whether reduced radiation dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically competent features when compared with the currently used routine radiation dose CT images reconstructed with ASIR. Methods: 27 consecutive patients (mean body mass index: 23.55 kg m−2) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at a noise index level of 21.45 with automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at a noise index level of 30. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. Results: At the follow-up study, the mean dose reduction relative to the currently used common radiation dose was 35.37% (range: 19–49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced radiation dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered to be worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than for 40% ASIR and FBP, respectively. Conclusion: Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. Advances in knowledge: This study represents the first

  17. Performance of Statistical Temporal Downscaling Techniques of Wind Speed Data Over Aegean Sea

    NASA Astrophysics Data System (ADS)

    Gokhan Guler, Hasan; Baykal, Cuneyt; Ozyurt, Gulizar; Kisacik, Dogan

    2016-04-01

    Wind speed data is a key input for many meteorological and engineering applications. Many institutions provide wind speed data with temporal resolutions ranging from one hour to twenty-four hours. Higher temporal resolution is generally required for some applications, such as reliable wave hindcasting studies. One solution for generating wind data at high sampling frequencies is to use statistical downscaling techniques to interpolate values at finer sampling intervals from the available data. In this study, the major aim is to assess the temporal downscaling performance of nine statistical interpolation techniques by quantifying the inherent uncertainty due to the selection of different techniques. For this purpose, hourly 10-m wind speed data taken from 227 data points over the Aegean Sea between 1979 and 2010, with a spatial resolution of approximately 0.3 degrees, are analyzed from the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis database. Additionally, hourly 10-m wind speed data from two in-situ measurement stations between June 2014 and June 2015 are considered to understand the effect of dataset properties on the uncertainty generated by the interpolation technique. The nine statistical interpolation techniques selected are: w0 (left constant) interpolation, w6 (right constant) interpolation, averaging step function interpolation, linear interpolation, 1D Fast Fourier Transform interpolation, 2nd and 3rd degree Lagrange polynomial interpolation, cubic spline interpolation, and piecewise cubic Hermite interpolating polynomials. The original data are down-sampled to 6 hours (i.e., wind speeds at the 0th, 6th, 12th and 18th hours of each day are selected), the 6-hourly data are then temporally downscaled to hourly data (i.e., the wind speeds at each hour between the intervals are computed) using the nine interpolation techniques, and finally the original data are compared with the temporally downscaled data. A penalty point system based on
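
    The assessment procedure can be sketched with SciPy: down-sample an hourly series to 6-hourly, interpolate back, and score each technique against the original. Four of the nine listed techniques are shown; RMSE stands in for the record's penalty-point system, and the wind series is synthetic.

    ```python
    # Down-sample hourly wind speed to 6-hourly, interpolate back to hourly
    # with several techniques, and score each against the withheld original.
    import numpy as np
    from scipy.interpolate import interp1d, CubicSpline, PchipInterpolator

    t = np.arange(240)                              # 10 days of hourly time steps
    wind = (6 + 2 * np.sin(2 * np.pi * t / 24)
            + np.random.default_rng(6).normal(0, 0.5, t.size))
    t6, w6 = t[::6], wind[::6]                      # keep only 0, 6, 12, 18 h values

    methods = {
        "left constant": interp1d(t6, w6, kind="previous"),
        "linear": interp1d(t6, w6, kind="linear"),
        "cubic spline": CubicSpline(t6, w6),
        "pchip": PchipInterpolator(t6, w6),
    }
    for name, f in methods.items():
        valid = t <= t6[-1]                         # stay inside the interpolation range
        rmse = np.sqrt(np.mean((f(t[valid]) - wind[valid]) ** 2))
        print(f"{name:14s} RMSE = {rmse:.3f} m/s")
    ```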

  18. Recent advances in microscopic techniques for visualizing leukocytes in vivo

    PubMed Central

    Jain, Rohit; Tikoo, Shweta; Weninger, Wolfgang

    2016-01-01

    Leukocytes are inherently motile and interactive cells. Recent advances in intravital microscopy approaches have enabled a new vista of their behavior within intact tissues in real time. This brief review summarizes the developments enabling the tracking of immune responses in vivo. PMID:27239292

  19. Bricklaying Curriculum: Advanced Bricklaying Techniques. Instructional Materials. Revised.

    ERIC Educational Resources Information Center

    Turcotte, Raymond J.; Hendrix, Laborn J.

    This curriculum guide is designed to assist bricklaying instructors in providing performance-based instruction in advanced bricklaying. Included in the first section of the guide are units on customized or architectural masonry units; glass block; sills, lintels, and copings; and control (expansion) joints. The next two units deal with cut,…

  20. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  1. Raman spectroscopy coupled with advanced statistics for differentiating menstrual and peripheral blood.

    PubMed

    Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K

    2014-01-01

    Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically (Raman mapping) acquired from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time-consuming and expensive.
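
    A minimal sketch of classification with cross-validation for spectra of this kind, assuming scikit-learn; an SVM stands in for the paper's unspecified model, folds are grouped by donor, and the spectra are synthetic.

    ```python
    # Donor-wise cross-validation so spectra from the same stain never span
    # train and test, avoiding an optimistic bias in sensitivity/specificity.
    import numpy as np
    from sklearn.model_selection import GroupKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    spectra = rng.normal(size=(400, 300))        # 400 mapped spectra x 300 wavenumbers
    donor = np.repeat(np.arange(40), 10)         # 10 spectra per dried stain (40 donors)
    label = (donor < 20).astype(int)             # 0 = peripheral, 1 = menstrual
    spectra[label == 1, 50:60] += 0.8            # hypothetical discriminating band

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    scores = cross_val_score(clf, spectra, label, groups=donor,
                             cv=GroupKFold(n_splits=5))
    print(f"donor-wise CV accuracy: {scores.mean():.2f}")
    ```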

  2. A Superposition Technique for Deriving Photon Scattering Statistics in Plane-Parallel Cloudy Atmospheres

    NASA Technical Reports Server (NTRS)

    Platnick, S.

    1999-01-01

    Photon transport in a multiple scattering medium is critically dependent on scattering statistics, in particular the average number of scatterings. A superposition technique is derived to accurately determine the average number of scatterings encountered by reflected and transmitted photons within arbitrary layers in plane-parallel, vertically inhomogeneous clouds. As expected, the resulting scattering number profiles are highly dependent on cloud particle absorption and solar/viewing geometry. The technique uses efficient adding and doubling radiative transfer procedures, avoiding traditional time-intensive Monte Carlo methods. Derived superposition formulae are applied to a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Cloud remote sensing techniques that use solar reflectance or transmittance measurements generally assume a homogeneous plane-parallel cloud structure. The scales over which this assumption is relevant, in both the vertical and horizontal, can be obtained from the superposition calculations. Though the emphasis is on photon transport in clouds, the derived technique is applicable to any scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers in the atmosphere.

  3. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
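
    Both checks discussed here reduce to a few lines. A sketch of the four-to-one ratio test and a basic Shewhart control chart; all numbers are hypothetical.

    ```python
    # Four-to-one rule: the instrument specification should be at least four
    # times the calibrating standard's uncertainty; otherwise the full
    # calibration uncertainty must be computed. Plus basic Shewhart limits.
    import numpy as np

    tolerance = 0.20            # instrument specification (e.g. lbf)
    std_uncertainty = 0.06      # uncertainty of the calibrating standard
    ratio = tolerance / std_uncertainty
    print(f"ratio = {ratio:.1f}:1 -> "
          f"{'OK' if ratio >= 4 else 'compute full uncertainty'}")

    # Shewhart X chart limits from historical force readings
    readings = np.array([10.02, 9.98, 10.01, 10.03, 9.97, 10.00, 10.04, 9.99])
    mean, sigma = readings.mean(), readings.std(ddof=1)
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
    print(f"control limits: [{lcl:.3f}, {ucl:.3f}]")
    ```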

  4. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  5. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  6. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  7. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
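
    A toy sketch of the ODE-modelling-plus-inference workflow the chapter surveys, using SciPy: a two-species activation model is simulated and one rate parameter is recovered by least squares. The equations are illustrative, not the Wnt pathway model.

    ```python
    # Simulate a minimal ODE model and recover a parameter by least squares,
    # the basic loop behind the parameter-estimation methods discussed above.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def rhs(t, y, k):                      # dx/dt = k - x;  dw/dt = x - w
        x, w = y
        return [k - x, x - w]

    def simulate(k, t_eval):
        sol = solve_ivp(rhs, (0, t_eval[-1]), [0.0, 0.0], args=(k,), t_eval=t_eval)
        return sol.y[1]                    # observe only the downstream species

    t_obs = np.linspace(0, 10, 25)
    data = simulate(2.0, t_obs) + np.random.default_rng(8).normal(0, 0.05, t_obs.size)

    fit = least_squares(lambda p: simulate(p[0], t_obs) - data, x0=[0.5])
    print(f"recovered k = {fit.x[0]:.3f} (true 2.0)")
    ```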

  8. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine. PMID:26677193

  9. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
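
    A sketch of the analysis chain described here, assuming statsmodels: test points spread over many conditions, a quadratic response-surface fit, and an ANOVA table for the fitted terms. The underlying response is synthetic, and the design below is random rather than D-optimal.

    ```python
    # Distributed testing sketch: one point per condition across the factor
    # space, a quadratic response-surface (RSM) fit, and ANOVA on its terms.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(9)
    n = 40
    df = pd.DataFrame({"x1": rng.uniform(-1, 1, n), "x2": rng.uniform(-1, 1, n)})
    df["y"] = 3 + 2 * df.x1 - 1.5 * df.x2 + 1.2 * df.x1 ** 2 + rng.normal(0, 0.2, n)

    # Response-surface model: full quadratic in the two factors
    model = smf.ols("y ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
    print(anova_lm(model, typ=2))          # ANOVA table for the fitted surface
    print(model.params)                    # parametric model for prediction/augmentation
    ```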

  10. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques.

    PubMed

    Wallace, Jack; Champagne, Pascale; Monnier, Anne-Charlotte

    2015-01-01

    A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), "heavy" metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling the five criteria parameters (set as dependent variables), on a statistically significant level: conductivity, dissolved oxygen (DO), nitrite (NO2(-)), organic nitrogen (N), oxidation reduction potential (ORP), pH, sulfate and total volatile solids (TVS). The criteria parameters and the significant explanatory parameters were most important in modeling the dynamics of the passive treatment system during the study period. Such techniques and
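
    A condensed sketch of the PCA-plus-PLS analysis, assuming scikit-learn; the 21-month water-chemistry matrix is replaced by synthetic data, and column indices stand in for the named parameters.

    ```python
    # PCA summarizes shared variation among the water-quality parameters;
    # PLS regression then models a criterion parameter from the others.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(10)
    X = rng.normal(size=(90, 8))              # explanatory parameters (DO, pH, ORP, ...)
    y = X @ rng.normal(size=8) + rng.normal(0, 0.3, 90)   # a criterion parameter (e.g. COD)

    Xs = StandardScaler().fit_transform(X)
    pca = PCA().fit(Xs)
    print(f"PC1 explains {pca.explained_variance_ratio_[0]:.1%} of the variation")

    pls = PLSRegression(n_components=3).fit(Xs, y)
    print(f"PLS model R^2 for the criterion parameter: {pls.score(Xs, y):.2f}")
    ```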

  11. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques.

    PubMed

    Wallace, Jack; Champagne, Pascale; Monnier, Anne-Charlotte

    2015-01-01

    A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), "heavy" metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling the five criteria parameters (set as dependent variables), on a statistically significant level: conductivity, dissolved oxygen (DO), nitrite (NO2(-)), organic nitrogen (N), oxidation reduction potential (ORP), pH, sulfate and total volatile solids (TVS). The criteria parameters and the significant explanatory parameters were most important in modeling the dynamics of the passive treatment system during the study period. Such techniques and

  12. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently, we tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results of the application of ICA algorithms and their modifications to the DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
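
    The separation idea can be sketched with scikit-learn's FastICA: overlapping Esum-like components mixed in different proportions are recovered as statistically independent sources. Peak shapes and mixing are illustrative, not DANCE data or the collaboration's modified algorithm.

    ```python
    # Mix two overlapping "Esum-like" spectral components in different
    # proportions across runs, then separate them with FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    e = np.linspace(0, 10, 500)                       # summed gamma-ray energy axis
    src1 = np.exp(-0.5 * ((e - 6.5) / 0.4) ** 2)      # isotope A cascade, Q ~ 6.5 MeV
    src2 = np.exp(-0.5 * ((e - 6.8) / 0.5) ** 2)      # isotope B, similar Q-value
    S = np.c_[src1, src2]

    rng = np.random.default_rng(11)
    A = rng.uniform(0.3, 1.0, size=(5, 2))            # 5 runs with different admixtures
    X = S @ A.T + rng.normal(0, 0.01, size=(500, 5))  # observed spectra

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(X)                  # statistically independent components
    print("recovered component array shape:", recovered.shape)
    ```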

  13. Backscattered Electron Microscopy as an Advanced Technique in Petrography.

    ERIC Educational Resources Information Center

    Krinsley, David Henry; Manley, Curtis Robert

    1989-01-01

    Three uses of this method with sandstone, desert varnish, and granite weathering are described. Background information on this technique is provided. Advantages of this type of microscopy are stressed. (CW)

  14. Electroextraction and electromembrane extraction: Advances in hyphenation to analytical techniques

    PubMed Central

    Oedit, Amar; Ramautar, Rawi; Hankemeier, Thomas

    2016-01-01

    Electroextraction (EE) and electromembrane extraction (EME) are sample preparation techniques that both require an electric field that is applied over a liquid‐liquid system, which enables the migration of charged analytes. Furthermore, both techniques are often used to pre‐concentrate analytes prior to analysis. In this review an overview is provided of the body of literature spanning April 2012–November 2015 concerning EE and EME, focused on hyphenation to analytical techniques. First, the theoretical aspects of concentration enhancement in EE and EME are discussed to explain extraction recovery and enrichment factor. Next, overviews are provided of the techniques based on their hyphenation to LC, GC, CE, and direct detection. These overviews cover the compounds and matrices, experimental aspects (i.e. donor volume, acceptor volume, extraction time, extraction voltage, and separation time) and the analytical aspects (i.e. limit of detection, enrichment factor, and extraction recovery). Techniques that were either hyphenated online to analytical techniques or show high potential with respect to online hyphenation are highlighted. Finally, the potential future directions of EE and EME are discussed. PMID:26864699

  15. Advanced millimeter-wave security portal imaging techniques

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-03-01

    Millimeter-wave (mm-wave) imaging is rapidly gaining acceptance as a security tool to augment conventional metal detectors and baggage x-ray systems for passenger screening at airports and other secured facilities. This acceptance indicates that the technology has matured; however, many potential improvements can yet be realized. The authors have developed a number of techniques over the last several years including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, and high-frequency high-bandwidth techniques. All of these may improve the performance of new systems; however, some of these techniques will increase the cost and complexity of the mm-wave security portal imaging systems. Reducing this cost may require the development of novel array designs. In particular, RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in the mm-wave portal imaging systems. High-frequency, high-bandwidth designs are difficult to achieve with conventional mm-wave electronic devices, and RF photonic devices may be a practical alternative. In this paper, the mm-wave imaging techniques developed at PNNL are reviewed and the potential for implementing RF photonic mm-wave array designs is explored.

  16. A Modified Moore Approach to Teaching Mathematical Statistics: An Inquiry Based Learning Technique to Teaching Mathematical Statistics

    ERIC Educational Resources Information Center

    McLoughlin, M. Padraig M. M.

    2008-01-01

    The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…

  17. Pilot-scale investigation of drinking water ultrafiltration membrane fouling rates using advanced data analysis techniques.

    PubMed

    Chen, Fei; Peldszus, Sigrid; Peiris, Ramila H; Ruhl, Aki S; Mehrez, Renata; Jekel, Martin; Legge, Raymond L; Huck, Peter M

    2014-01-01

    A pilot-scale investigation of the performance of biofiltration as a pre-treatment to ultrafiltration for drinking water treatment was conducted between 2008 and 2010. The objective of this study was to further understand the fouling behaviour of ultrafiltration at pilot scale and assess the utility of different foulant monitoring tools. Various fractions of natural organic matter (NOM) and colloidal/particulate matter of raw water, biofilter effluents, and membrane permeate were characterized by employing two advanced NOM characterization techniques: liquid chromatography - organic carbon detection (LC-OCD) and fluorescence excitation-emission matrices (FEEM) combined with principal component analysis (PCA). A framework of fouling rate quantification and classification was also developed and utilized in this study. In cases such as the present one where raw water quality and therefore fouling potential vary substantially, such classification can be considered essential for proper data interpretation. The individual and combined contributions of various NOM fractions and colloidal/particulate matter to hydraulically reversible and irreversible fouling were investigated using various multivariate statistical analysis techniques. Protein-like substances and biopolymers were identified as major contributors to both reversible and irreversible fouling, whereas colloidal/particulate matter can alleviate the extent of irreversible fouling. Humic-like substances contributed little to either reversible or irreversible fouling at low level fouling rates. The complementary nature of FEEM-PCA and LC-OCD for assessing the fouling potential of complex water matrices was also illustrated by this pilot-scale study.
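
    A brief sketch of the FEEM-PCA step, assuming scikit-learn: each excitation-emission matrix is unfolded into a vector and PCA summarizes the sample set. Sizes and data are synthetic; a PARAFAC-style trilinear decomposition would be a common alternative.

    ```python
    # Unfold each fluorescence excitation-emission matrix (FEEM) into one row
    # and apply PCA across samples; the scores feed fouling-rate analysis.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(12)
    n_samples, n_ex, n_em = 30, 20, 40
    eems = rng.normal(size=(n_samples, n_ex, n_em))   # stack of EEMs
    eems[:15, 5:8, 10:15] += 2.0                      # protein-like peak in half the samples

    unfolded = eems.reshape(n_samples, n_ex * n_em)   # one row per water sample
    scores = PCA(n_components=3).fit_transform(unfolded)
    print("PC scores shape:", scores.shape)
    ```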

  18. Nondestructive Evaluation of Thick Concrete Using Advanced Signal Processing Techniques

    SciTech Connect

    Clayton, Dwight A; Barker, Alan M; Santos-Villalobos, Hector J; Albright, Austin P; Hoegh, Kyle; Khazanovich, Lev

    2015-09-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [1]. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations.

  19. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    SciTech Connect

    Wallace, Jack; Champagne, Pascale; Monnier, Anne-Charlotte

    2015-01-15

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 Water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling

  20. Brain development in preterm infants assessed using advanced MRI techniques.

    PubMed

    Tusor, Nora; Arichi, Tomoki; Counsell, Serena J; Edwards, A David

    2014-03-01

    Infants who are born preterm have a high incidence of neurocognitive and neurobehavioral abnormalities, which may be associated with impaired brain development. Advanced magnetic resonance imaging (MRI) approaches, such as diffusion MRI (d-MRI) and functional MRI (fMRI), provide objective and reproducible measures of brain development. Indices derived from d-MRI can be used to provide quantitative measures of preterm brain injury. Although fMRI of the neonatal brain is currently a research tool, future studies combining d-MRI and fMRI have the potential to assess the structural and functional properties of the developing brain and its response to injury.

  1. Application of advanced coating techniques to rocket engine components

    NASA Technical Reports Server (NTRS)

    Verma, S. K.

    1988-01-01

    The materials problem in the space shuttle main engine (SSME) is reviewed. Potential coatings and the method of their application for improved life of SSME components are discussed. A number of advanced coatings for turbine blade components and disks are being developed and tested in a multispecimen thermal fatigue fluidized bed facility at IIT Research Institute. This facility is capable of producing severe strains of the degree present in blade and disk components of the SSME. The potential coating systems and the current efforts at IITRI toward life extension of the SSME components are summarized.

  2. Transcranial Doppler: Techniques and advanced applications: Part 2

    PubMed Central

    Sharma, Arvind K.; Bathala, Lokesh; Batra, Amit; Mehndiratta, Man Mohan; Sharma, Vijay K.

    2016-01-01

    Transcranial Doppler (TCD) is the only diagnostic tool that can provide continuous information about cerebral hemodynamics in real time and over extended periods. In the previous paper (Part 1), we have already presented the basic ultrasound physics pertaining to TCD, insonation methods, and various flow patterns. This article describes various advanced applications of TCD such as detection of right-to-left shunt, emboli monitoring, vasomotor reactivity (VMR), monitoring of vasospasm in subarachnoid hemorrhage (SAH), monitoring of intracranial pressure, its role in stroke prevention in sickle cell disease, and as a supplementary test for confirmation of brain death. PMID:27011639

  3. Tunnel monitoring with an advanced InSAR technique

    NASA Astrophysics Data System (ADS)

    Rabus, Bernhard; Eppler, Jayson; Sharma, Jayanti; Busler, Jennifer

    2012-06-01

    The detection and monitoring of subsurface excavations has a variety of applications in both the civil and defense domains. We have developed a novel InSAR method (Homogeneous Distributed Scatterer (HDS)-InSAR) that exploits both persistent point scatterers and coherent distributed scatterers by using adaptive multilooking of statistically homogeneous pixel neighborhoods. In order to enhance the detection of small-scale structures in low-SNR environments, a matched parametric spatio-temporal model is fit to the deformation signal. We illustrate the performance of our new method for the city of Vancouver over the last nine years using InSAR stacks of RADARSAT-1 and RADARSAT-2 data.

  4. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  5. Advances in reduction techniques for tire contact problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1995-01-01

    Some recent developments in reduction techniques, as applied to predicting the tire contact response and evaluating the sensitivity coefficients of the different response quantities, are reviewed. The sensitivity coefficients measure the sensitivity of the contact response to variations in the geometric and material parameters of the tire. The tire is modeled using a two-dimensional laminated anisotropic shell theory with the effects of variation in geometric and material parameters, transverse shear deformation, and geometric nonlinearities included. The contact conditions are incorporated into the formulation by using a perturbed Lagrangian approach with the fundamental unknowns consisting of the stress resultants, the generalized displacements, and the Lagrange multipliers associated with the contact conditions. The elemental arrays are obtained by using a modified two-field, mixed variational principle. For the application of reduction techniques, the tire finite element model is partitioned into two regions. The first region consists of the nodes that are likely to come in contact with the pavement, and the second region includes all the remaining nodes. The reduction technique is used to significantly reduce the degrees of freedom in the second region. The effectiveness of the computational procedure is demonstrated by a numerical example of the frictionless contact response of the space shuttle nose-gear tire, inflated and pressed against a rigid flat surface. Also, the research topics which have high potential for enhancing the effectiveness of reduction techniques are outlined.

  6. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  7. Single Molecule Techniques for Advanced in situ Hybridization

    SciTech Connect

    Hollars, C W; Stubbs, L; Carlson, K; Lu, X; Wehri, E

    2003-02-03

    One of the most significant achievements of modern science is the completion of the human genome sequence in the year 2000. Despite this monumental accomplishment, researchers have only begun to understand the relationships between this three-billion-nucleotide genetic code and the regulation and control of gene and protein expression within each of the millions of different types of highly specialized cells. Several methodologies have been developed for the analysis of gene and protein expression in situ, yet despite these advancements, the pace of such analyses is extremely limited. Because information regarding the precise timing and location of gene expression is a crucial component in the discovery of new pharmacological agents for the treatment of disease, there is an enormous incentive to develop technologies that accelerate the analytical process. Here we report on the use of plasmon resonant particles as advanced probes for in situ hybridization. These probes are used for the detection of low levels of gene-probe response and demonstrate a detection method that enables precise, simultaneous localization within a cell of the points of expression of multiple genes or proteins in a single sample.

  8. Hydrogeochemical assessment of groundwater quality in a river delta using multivariate statistical techniques

    NASA Astrophysics Data System (ADS)

    Matiatos, Ioannis; Paraskevopoulou, Vasiliki; Botsou, Fotini; Dassenakis, Manolis; Lazogiannis, Konstantinos; Ghionis, George; Poulos, Serafim

    2016-04-01

    Knowledge of the factors controlling the regional groundwater quality regime is important for the planning and management of groundwater resources. This work applies conventional hydrogeochemical and multivariate statistical techniques to identify the main factors and mechanisms controlling the hydrogeochemistry of groundwater in the deltaic environment of River Pinios (Thessaly), as well as possible areas of interaction between groundwater and surface water bodies. Hierarchical Cluster Analysis (HCA) and Principal Components Analysis (PCA) are performed using a data set of physical-chemical parameters from surface water and groundwater sites. HCA is used to group surface water and groundwater monitoring sites by similarity in hydrochemistry, in order to indicate areas of groundwater-surface water interaction, while PCA aims to identify the factors responsible for the hydrogeochemical characteristics of the water bodies in the river delta (e.g., water-rock interaction, seawater intrusion, anthropogenic activities).
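
    A minimal sketch of the HCA/PCA workflow named above, assuming a hypothetical matrix `chem` of physico-chemical measurements (sites by variables); Ward linkage and a three-cluster cut are illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    chem = rng.random((25, 12))          # hypothetical: 25 monitoring sites, 12 parameters

    Z = StandardScaler().fit_transform(chem)            # standardize before clustering
    tree = linkage(Z, method="ward")                    # Ward linkage, common in hydrochemistry
    groups = fcluster(tree, t=3, criterion="maxclust")  # cut the dendrogram into 3 site groups

    pca = PCA(n_components=2).fit(Z)                    # factors behind the hydrochemistry
    print(groups)
    print(pca.explained_variance_ratio_)
    ```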

  9. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, a time-consuming process. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques known for their ability to detect small shifts in the data (tabular CUSUM, standardized CUSUM, and EWMA) are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days. PMID:26737425
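
    The EWMA chart that performed best above can be sketched as follows; the smoothing weight `lam`, the control-limit width `L`, the 30-day baseline, and the simulated transfer times are all illustrative assumptions.

    ```python
    import numpy as np

    def ewma_alarms(times, lam=0.2, L=3.0, baseline=30):
        """Indices where the EWMA statistic leaves its control limits."""
        mu, sigma = times[:baseline].mean(), times[:baseline].std()
        limit = L * sigma * np.sqrt(lam / (2 - lam))   # steady-state EWMA control limit
        z, alarms = mu, []
        for i, x in enumerate(times):
            z = lam * x + (1 - lam) * z                # exponentially weighted running mean
            if abs(z - mu) > limit:
                alarms.append(i)
        return alarms

    rng = np.random.default_rng(3)
    # simulated transfer times: 60 stable days, then a 2-second slowdown per transfer
    times = np.concatenate([rng.normal(10, 1, 60), rng.normal(12, 1, 30)])
    print(ewma_alarms(times)[:1])   # first day on which the shift is flagged
    ```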

  10. Evaluating machine learning and statistical prediction techniques for landslide susceptibility modeling

    NASA Astrophysics Data System (ADS)

    Goetz, J. N.; Brenning, A.; Petschko, H.; Leopold, P.

    2015-08-01

    Statistical and, more recently, machine learning prediction methods have been gaining popularity in the field of landslide susceptibility modeling. In particular, these data-driven approaches show promise when tackling the challenge of mapping landslide-prone areas for large regions, which may not have sufficient geotechnical data for physically-based methods. Currently, there is no best method for empirical susceptibility modeling. Therefore, this study presents a comparison of traditional statistical and novel machine learning models applied for regional scale landslide susceptibility modeling. These methods were evaluated by spatial k-fold cross-validation estimation of the predictive performance, assessment of variable importance for gaining insights into model behavior, and by the appearance of the prediction (i.e. susceptibility) map. The modeling techniques applied were logistic regression (GLM), generalized additive models (GAM), weights of evidence (WOE), the support vector machine (SVM), random forest classification (RF), and bootstrap aggregated classification trees (bundling) with penalized discriminant analysis (BPLDA). These modeling methods were tested for three areas in the province of Lower Austria, Austria. The areas are characterized by different geological and morphological settings. Random forest and bundling classification techniques had the overall best predictive performances. However, the performances of all modeling techniques were for the most part not significantly different from each other; depending on the area of interest, the overall median estimated area under the receiver operating characteristic curve (AUROC) differences ranged from 2.9 to 8.9 percentage points, and the differences in the overall median estimated true positive rate (TPR) measured at a 10% false positive rate (FPR) ranged from 11 to 15 percentage points. The relative importance of each predictor was generally different between the modeling methods. However, slope angle, surface roughness and plan
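
    A minimal sketch of the evaluation strategy, comparing two of the named classifiers by cross-validated AUROC on synthetic data; note that the study used spatial k-fold cross-validation, for which the plain KFold splitter here is only a placeholder.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import KFold, cross_val_score

    # placeholder data: terrain attributes vs. landslide presence/absence
    X, y = make_classification(n_samples=500, n_features=8, random_state=4)
    cv = KFold(n_splits=5, shuffle=True, random_state=4)  # spatial CV would split by location

    for name, model in [("GLM", LogisticRegression(max_iter=1000)),
                        ("RF", RandomForestClassifier(random_state=4))]:
        auroc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
        print(name, round(auroc.mean(), 3))
    ```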

  11. Developments and advances concerning the hyperpolarisation technique SABRE.

    PubMed

    Mewis, Ryan E

    2015-10-01

    To overcome the inherent sensitivity issue in NMR and MRI, hyperpolarisation techniques are used. Signal Amplification By Reversible Exchange (SABRE) is a hyperpolarisation technique that utilises parahydrogen, a molecule that possesses a nuclear singlet state, as the source of polarisation. A metal complex is required to break the singlet order of parahydrogen and, by doing so, facilitates polarisation transfer to analyte molecules ligated to the same complex through the J-coupled network that exists. The increased signal intensities that the analyte molecules possess as a result of this process have led to investigations probing both their potential as MRI contrast agents and the fundamental processes underpinning the polarisation transfer mechanism. As well as discussing literature relevant to both of these areas, the chemical structure of the complex, the physical constraints of the polarisation transfer process, and the successes of implementing SABRE at low and high magnetic fields are discussed. PMID:26264565

  12. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…

  13. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
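
    The simplest method mentioned above, linear regression of phenotype on marker genotypes, can be sketched as follows; the 0/1/2 genotype coding, the simulated QTL at marker 3, and all data are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 200
    geno = rng.integers(0, 3, size=(n, 10)).astype(float)  # 0/1/2 allele counts, 10 markers
    pheno = 0.8 * geno[:, 3] + rng.normal(0, 1, n)         # marker 3 is linked to a simulated QTL

    for m in range(geno.shape[1]):
        A = np.column_stack([np.ones(n), geno[:, m]])      # intercept + additive marker effect
        beta, *_ = np.linalg.lstsq(A, pheno, rcond=None)
        print(m, round(beta[1], 2))                        # effect estimate per marker
    ```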

  14. Advanced techniques for characterization of ion beam modified materials

    SciTech Connect

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  15. Advanced techniques for characterization of ion beam modified materials

    DOE PAGES

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  16. Advanced materials and techniques for fibre-optic sensing

    NASA Astrophysics Data System (ADS)

    Henderson, Philip J.

    2014-06-01

    Fibre-optic monitoring systems came of age in about 1999 upon the emergence of the world's first significant commercialising company - a spin-out from the UK's collaborative MAST project. By using embedded fibre-optic technology, the MAST project successfully measured transient strain within high-performance composite yacht masts. Since then, applications have extended from smart composites into civil engineering, energy, military, aerospace, medicine and other sectors. Fibre-optic sensors come in various forms, and may be subject to embedment, retrofitting, and remote interrogation. The unique challenges presented by each implementation require careful scrutiny before widespread adoption can take place. Accordingly, various aspects of design and reliability are discussed spanning a range of representative technologies that include resonant microsilicon structures, MEMS, Bragg gratings, advanced forms of spectroscopy, and modern trends in nanotechnology. Keywords: Fibre-optic sensors, fibre Bragg gratings, MEMS, MOEMS, nanotechnology, plasmon.

  17. Recent advances in bioprinting techniques: approaches, applications and future prospects.

    PubMed

    Li, Jipeng; Chen, Mingjiao; Fan, Xianqun; Zhou, Huifang

    2016-01-01

    Bioprinting technology shows potential in tissue engineering for the fabrication of scaffolds, cells, tissues and organs reproducibly and with high accuracy. Bioprinting technologies are mainly divided into three categories, inkjet-based bioprinting, pressure-assisted bioprinting and laser-assisted bioprinting, based on their underlying printing principles. These various printing technologies have their advantages and limitations. Bioprinting utilizes biomaterials, cells or cell factors as a "bioink" to fabricate prospective tissue structures. Biomaterial parameters such as biocompatibility, cell viability and the cellular microenvironment strongly influence the printed product. Various printing technologies have been investigated, and great progress has been made in printing various types of tissue, including vasculature, heart, bone, cartilage, skin and liver. This review introduces basic principles and key aspects of some frequently used printing technologies. We focus on recent advances in three-dimensional printing applications, current challenges and future directions. PMID:27645770

  18. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1994-01-01

    The effort, which was focused on the research and development of advanced materials for use in Thermal Protection Systems (TPS), has involved chemical and physical testing of refractory ceramic tiles, fabrics, threads and fibers. This testing has included determination of the optical properties, thermal shock resistance, high temperature dimensional stability, and tolerance to environmental stresses. Materials have also been tested in the Arc Jet 2 x 9 Turbulent Duct Facility (TDF), the 1 atmosphere Radiant Heat Cycler, and the Mini-Wind Tunnel Facility (MWTF). A significant part of the effort hitherto has gone towards modifying and upgrading the test facilities so that meaningful tests can be carried out. Another important effort during this period has been the creation of a materials database. Computer systems administration and support have also been provided. These are described in greater detail below.

  19. Advanced techniques for constrained internal coordinate molecular dynamics.

    PubMed

    Wagner, Jeffrey R; Balaraman, Gouthaman S; Niesen, Michiel J M; Larsen, Adrien B; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-04-30

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle, and torsional coordinates instead of a Cartesian coordinate representation. Freezing high-frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed to make the CICMD method robust and widely usable. In this article, we have designed a new framework for (1) initializing velocities for nonindependent CICMD coordinates, (2) efficient computation of center of mass velocity during CICMD simulations, (3) using advanced integrators such as Runge-Kutta, Lobatto, and adaptive CVODE for CICMD simulations, and (4) cancelling out the "flying ice cube effect" that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this article, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse-graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided "freezing and thawing" of degrees of freedom in the molecule on the fly during molecular dynamics simulations and is shown to fold four proteins to their native topologies. With these advancements, we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion.
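
    As a rough illustration of the explicit integrator family named above, the following is a generic classical fourth-order Runge-Kutta step applied to a toy oscillator; it is not the GNEIMO implementation, which operates on constrained internal coordinates.

    ```python
    import numpy as np

    def rk4_step(f, t, q, h):
        """One classical fourth-order Runge-Kutta step for dq/dt = f(t, q)."""
        k1 = f(t, q)
        k2 = f(t + h / 2, q + h / 2 * k1)
        k3 = f(t + h / 2, q + h / 2 * k2)
        k4 = f(t + h, q + h * k3)
        return q + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # toy system: a harmonic oscillator in (position, velocity) coordinates
    f = lambda t, q: np.array([q[1], -q[0]])
    q, h = np.array([1.0, 0.0]), 0.01
    for step in range(1000):
        q = rk4_step(f, step * h, q, h)
    print(np.round(q, 3))   # ~ (cos 10, -sin 10)
    ```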

  20. Statistically Optimal Approximations of Astronomical Signals: Implications to Classification and Advanced Study of Variable Stars

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.

    2016-06-01

    We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observations, which is a common case for large photometric surveys. Our previous methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as "running" or "sub-interval" local approximations, were reviewed in 2003ASPC..292..391A. For an approximation of the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree, with initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial fit (TP) or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows the determination of the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars were studied in our group using these methods in the frame of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).
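
    A minimal sketch of a trigonometric polynomial fit at a trial period follows; the degree is selected here by BIC as a simple stand-in for the authors' statistically optimal criterion, and the arrays `t` and `mag` are hypothetical observation times and magnitudes.

    ```python
    import numpy as np

    def tp_fit(t, mag, period, max_deg=8):
        """Least-squares trigonometric polynomial fit; degree chosen by BIC."""
        best = None
        n = len(t)
        for s in range(1, max_deg + 1):
            cols = [np.ones_like(t)]
            for k in range(1, s + 1):
                w = 2 * np.pi * k * t / period
                cols += [np.cos(w), np.sin(w)]
            A = np.column_stack(cols)
            coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
            rss = np.sum((mag - A @ coef) ** 2)
            bic = n * np.log(rss / n) + A.shape[1] * np.log(n)
            if best is None or bic < best[0]:
                best = (bic, s, coef)
        return best

    rng = np.random.default_rng(6)
    t = np.sort(rng.uniform(0, 100, 300))   # irregularly spaced observation times
    mag = 10 + 0.5 * np.sin(2 * np.pi * t / 7.3) + rng.normal(0, 0.05, 300)
    print(tp_fit(t, mag, period=7.3)[1])    # selected degree (1 for a pure sinusoid)
    ```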

  1. Advances in dental veneers: materials, applications, and techniques.

    PubMed

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment of unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetics of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers.

  2. Advances in dental veneers: materials, applications, and techniques

    PubMed Central

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment of unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetics of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers. PMID:23674920

  3. The emerging role of advanced neuroimaging techniques for brain metastases.

    PubMed

    Nowosielski, Martha; Radbruch, Alexander

    2015-06-01

    Brain metastases are an increasingly encountered and frightening manifestation of systemic cancer. On the one hand, more effective therapeutic strategies for the primary tumor are resulting in longer patient survival; on the other, better brain tumor detection has resulted from the increased availability and development of more precise brain imaging methods. This review focuses on the emerging role of functional neuroimaging techniques, magnetic resonance imaging (MRI) as well as positron emission tomography (PET), in establishing diagnosis, in monitoring treatment response (with an emphasis on new targeted as well as immunomodulatory therapies), and in predicting prognosis in patients with brain metastases.

  4. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.

  5. The updated statistical inversion technique to the evaluation of Umkehr observations

    NASA Technical Reports Server (NTRS)

    Frolov, Alexander D.; Obrazcov, Sergey P.

    1994-01-01

    In the present study the standard Umkehr retrieval method for estimating the vertical distribution of ozone was updated using a statistical approach to the mathematical inversion scheme. The vertical ozone profile covariance matrix was used as a priori information for the inverse problem. A new method of organizing the ozonesonde data according to air mass types helped to improve the quality of the covariance matrix. A retrieval method was developed using an eigenvector technique, and an optimal vertical ozone profile resolution was determined from an analysis of the mathematical inversion scheme based on the same technique. The radiative transfer calculation accounted for multiple scattering and atmospheric sphericity. Retrievals using actual Umkehr Dobson spectrophotometer observations were also performed to compare the standard and updated methods against concurrent ozonesonde data at Boulder, USA. The comparison revealed that the present method has some advantages in both resolution and accuracy over the standard one, especially for the atmospheric layers below the ozone maximum.
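
    The statistical inversion step can be sketched as a minimum-variance linear retrieval with an a priori covariance, as below; the weighting functions `K`, the covariances `Sa` and `Se`, and the zero a priori mean are all illustrative assumptions rather than the operational Umkehr quantities.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_layers, n_obs = 9, 12
    K = rng.random((n_obs, n_layers))   # placeholder Umkehr weighting functions
    Sa = 0.5 * np.eye(n_layers)         # a priori profile covariance (from ozonesonde data)
    Se = 0.01 * np.eye(n_obs)           # measurement noise covariance

    x_true = rng.random(n_layers)
    y = K @ x_true + rng.multivariate_normal(np.zeros(n_obs), Se)

    # minimum-variance (optimal estimation) gain; zero a priori mean assumed for brevity
    G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)
    x_hat = G @ y
    print(np.round(x_hat - x_true, 2))   # retrieval error per layer
    ```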

  6. Statistically-constrained shallow text marking: techniques, evaluation paradigm and results

    NASA Astrophysics Data System (ADS)

    Murphy, Brian; Vogel, Carl

    2007-02-01

    We present three natural language marking strategies based on fast and reliable shallow parsing techniques, and on widely available lexical resources: lexical substitution, adjective conjunction swaps, and relativiser switching. We test these techniques on a random sample of the British National Corpus. Individual candidate marks are checked for goodness of structural and semantic fit, using both lexical resources, and the web as a corpus. A representative sample of marks is given to 25 human judges to evaluate for acceptability and preservation of meaning. This establishes a correlation between corpus-based felicity measures and perceived quality, and makes qualified predictions. Grammatical acceptability correlates with our automatic measure strongly (Pearson's r = 0.795, p = 0.001), allowing us to account for about two thirds of the variability in human judgements. A moderate but statistically insignificant (Pearson's r = 0.422, p = 0.356) correlation is found with judgements of meaning preservation, indicating that the contextual window of five content words used for our automatic measure may need to be extended.

  7. Ultrasonic Technique for Experimental Investigation of Statistical Characteristics of Grid Generated Turbulence.

    NASA Astrophysics Data System (ADS)

    Andreeva, Tatiana; Durgin, William

    2001-11-01

    This paper focuses on ultrasonic measurements of a grid-generated turbulent flow using the travel-time technique. In the present work, an attempt is made to describe a turbulent flow by means of the statistics of ultrasound propagation time, in combination with the Kolmogorov (2/3)-power law. This research has two objectives. The first is to demonstrate the application of the travel-time ultrasonic technique for data acquisition in grid-generated turbulence produced in a wind tunnel. The second is to use the experimental data to verify or refute the analytically obtained expression for travel-time dispersion as a function of velocity fluctuation metrics; the theoretical analysis and derivation of that formula are based on Kolmogorov theory. A series of experiments was conducted at different wind speeds and distances from the grid, giving rise to different values of the dimensional turbulence characteristic coefficient K. Theoretical analysis based on the experimental data reveals a strong dependence of the turbulence characteristic K on the mean wind velocity. Tabulated values of the turbulence characteristic coefficient may be used for further understanding of the effect of turbulence on sound propagation.
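
    A minimal sketch of how a second-order structure function is estimated from a velocity record and compared against the Kolmogorov 2/3 slope; the signal `u` is a synthetic placeholder, so its fitted slope will not match the inertial-range value.

    ```python
    import numpy as np

    def structure_function(u, max_lag=200):
        """Second-order structure function D(r) = <(u(x+r) - u(x))^2>."""
        lags = np.arange(1, max_lag)
        D = np.array([np.mean((u[r:] - u[:-r]) ** 2) for r in lags])
        return lags, D

    rng = np.random.default_rng(8)
    u = np.cumsum(rng.normal(0, 1, 10_000)) * 0.01   # placeholder signal, not real turbulence

    lags, D = structure_function(u)
    slope = np.polyfit(np.log(lags[5:50]), np.log(D[5:50]), 1)[0]
    print(round(slope, 2))   # a Kolmogorov inertial range would give a slope near 2/3
    ```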

  8. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
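
    As a generic illustration of the kind of Markov unreliability computation such tools automate, the sketch below solves a small continuous-time Markov model by matrix exponential; the three-state system and its rates are hypothetical and unrelated to the flight control system analyzed.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # hypothetical 3-state model: 2 processors up -> 1 up -> system failed (absorbing)
    lam, mu = 1e-4, 1e-2   # per-hour failure and repair rates (toy values)
    Q = np.array([[-2 * lam,  2 * lam,      0.0],
                  [      mu, -(mu + lam),   lam],
                  [     0.0,  0.0,          0.0]])

    p0 = np.array([1.0, 0.0, 0.0])       # start with both processors working
    p = p0 @ expm(Q * 10_000.0)          # state probabilities after 10,000 hours
    print(f"unreliability: {p[2]:.3e}")  # probability of having reached the failed state
    ```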

  9. Advanced terahertz techniques for quality control and counterfeit detection

    NASA Astrophysics Data System (ADS)

    Ahi, Kiarash; Anwar, Mehdi

    2016-04-01

    This paper reports our invented methods for the detection of counterfeit electronics. These versatile techniques are also handy in quality control applications. Terahertz pulsed laser systems are capable of giving the material characteristics and thus make it possible to distinguish between the materials used in authentic components and their counterfeit clones. Components with material defects can also be distinguished in this manner. In this work, different refractive indices and absorption coefficients were observed for counterfeit components compared to their authentic counterparts. The existence of unexpected ingredient materials was detected in counterfeit components by Fourier transform analysis of the transmitted terahertz pulse. The thicknesses of different layers are obtainable by analyzing the reflected terahertz pulse, and the existence of unexpected layers is also detectable in this manner. Recycled, sanded, and blacktopped counterfeit electronic components were detected as a result of these analyses. Counterfeit ICs with die dislocations were detected by depicting the terahertz raster-scanning data in a coordinate plane, which gives terahertz images. In the same manner, raster scanning of the reflected pulse gives terahertz images of the surfaces of the components, which were used to investigate contaminant materials and sanded points on the surfaces. The results of the latter technique reveal the recycled counterfeit components.

  10. Comparison of three advanced chromatographic techniques for cannabis identification.

    PubMed

    Debruyne, D; Albessard, F; Bigot, M C; Moulin, M

    1994-01-01

    The development of chromatography technology, with the increasing availability of easier-to-use mass spectrometers combined with gas chromatography (GC), the use of diode-array or programmable variable-wavelength ultraviolet absorption detectors in conjunction with high-performance liquid chromatography (HPLC), and the availability of scanners capable of reading thin-layer chromatography (TLC) plates in the ultraviolet and visible regions, has made for easier, quicker and more positive identification of cannabis samples that standard analytical laboratories are occasionally required to undertake in the effort to combat drug addiction. At laboratories that do not possess the technique of GC combined with mass spectrometry, which provides an irrefutable identification, the following procedure involving HPLC or TLC techniques may be used: identification of the chromatographic peaks corresponding to each of the three main cannabis constituents (cannabidiol (CBD), delta-9-tetrahydrocannabinol (delta-9-THC) and cannabinol (CBN)) by comparison with published data, in conjunction with a specific absorption spectrum for each of those constituents obtained between 200 and 300 nm. The collection of the fractions corresponding to the three major cannabinoids at the HPLC system outlet and the cross-checking of their identity in the GC process with flame ionization detection can further corroborate the identification and minimize possible errors due to interference.

  11. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers and seven of the students' parents were interviewed. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these

  12. Recent Advances in Spaceborne Precipitation Radar Measurement Techniques and Technology

    NASA Technical Reports Server (NTRS)

    Im, Eastwood; Durden, Stephen L.; Tanelli, Simone

    2006-01-01

    NASA is currently developing advanced instrument concepts and technologies for future spaceborne atmospheric radars, with an over-arching objective of making such instruments more capable in supporting future science needs and more cost effective. Two such examples are the Second-Generation Precipitation Radar (PR-2) and the Nexrad-In-Space (NIS). PR-2 is a 14/35-GHz dual-frequency rain radar with a deployable 5-meter, wide-swath scanned membrane antenna, a dual-polarized/dual-frequency receiver, and a realtime digital signal processor. It is intended for Low Earth Orbit (LEO) operations to provide greatly enhanced rainfall profile retrieval accuracy while consuming only a fraction of the mass of the current TRMM Precipitation Radar (PR). NIS is designed to be a 35-GHz Geostationary Earth Orbiting (GEO) radar for providing hourly monitoring of the life cycle of hurricanes and tropical storms. It uses a 35-m, spherical, lightweight membrane antenna and Doppler processing to acquire 3-dimensional information on the intensity and vertical motion of hurricane rainfall.

  13. Coal and Coal Constituent Studies by Advanced EMR Techniques

    SciTech Connect

    Alex I. Smirnov; Mark J. Nilges; R. Linn Belford; Robert B. Clarkson

    1998-03-31

    Advanced electron magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. We have achieved substantial progress on upgrading the high-field (HF) EMR (W-band, 95 GHz) spectrometers that are especially advantageous for such studies. In particular, we have built a second W-band instrument (Mark II) in addition to our Mark I. Briefly, Mark II features: (i) an Oxford custom-built 7 T superconducting magnet scannable from 0 to 7 T at up to 0.5 T/min; (ii) a water-cooled coaxial solenoid with up to ±550 G scan under digital (15-bit resolution) computer control; (iii) a custom-engineered precision feedback circuit driving this solenoid, based on an Ultrastab 860R sensor with linearity better than 5 ppm and resolution of 0.05 ppm; (iv) an Oxford CF 1200 cryostat for variable-temperature studies from 1.8 to 340 K. During this grant period we have completed several key upgrades of both Mark I and Mark II, particularly the microwave bridge, W-band probehead, and computer interfaces. We utilize these improved instruments for HF EMR studies of spin-spin interaction and of the existence of different paramagnetic species in carbonaceous solids.

  14. Advanced Cell Culture Techniques for Cancer Drug Discovery

    PubMed Central

    Lovitt, Carrie J.; Shelper, Todd B.; Avery, Vicky M.

    2014-01-01

    Human cancer cell lines are an integral part of drug discovery practices. However, modeling the complexity of cancer utilizing these cell lines on standard plastic substrata does not accurately represent the tumor microenvironment. Research into developing advanced tumor cell culture models in a three-dimensional (3D) architecture that more precisely characterizes the disease state has been undertaken by a number of laboratories around the world. These 3D cell culture models are particularly beneficial for investigating mechanistic processes and drug resistance in tumor cells. In addition, a range of molecular mechanisms deconstructed by studying cancer cells in 3D models suggest that tumor cells cultured in two-dimensional monolayer conditions do not respond to cancer therapeutics/compounds in a similar manner. Recent studies have demonstrated the potential of utilizing 3D cell culture models in drug discovery programs; however, it is evident that further research is required for the development of more complex models that incorporate the majority of the cellular and physical properties of a tumor. PMID:24887773

  15. Advanced coding techniques for few mode transmission systems.

    PubMed

    Okonkwo, Chigo; van Uden, Roy; Chen, Haoshuo; de Waardt, Huug; Koonen, Ton

    2015-01-26

    We experimentally verify the advantage of employing advanced coding schemes such as space-time coding and 4-dimensional modulation formats to enhance the transmission performance of a 3-mode transmission system. The performance gain of space-time block codes for extending the optical signal-to-noise ratio (OSNR) tolerance in multiple-input multiple-output optical coherent spatial division multiplexing transmission systems, with respect to single-mode transmission performance, is evaluated. By exploiting the spatial diversity that few-mode fibers offer, significant OSNR gains of 3.2, 4.1, 4.9, and 6.8 dB over single-mode-fiber back-to-back performance are demonstrated at the hard-decision forward error correction limit for DP-QPSK, 8 QAM, 16 QAM, and 32 QAM, respectively. Furthermore, by employing 4D constellations, 6 × 28 Gbaud 128 set-partitioned quadrature amplitude modulation is shown to outperform conventional 8 QAM transmission performance, whilst carrying an additional 0.5 bit/symbol.

  16. Advanced fabrication techniques for hydrogen-cooled engine structures

    NASA Technical Reports Server (NTRS)

    Buchmann, O. A.; Arefian, V. V.; Warren, H. A.; Vuigner, A. A.; Pohlman, M. J.

    1985-01-01

    Described is a program for the development of coolant passage geometries, material systems, and joining processes that will produce long-life hydrogen-cooled structures for scramjet applications. Tests were performed to establish basic material properties, and samples were constructed and evaluated to substantiate fabrication processes and inspection techniques. Results of the study show that the basic goal of increasing the life of hydrogen-cooled structures by two orders of magnitude relative to that of the Hypersonic Research Engine can be reached with available means. Estimated life is 19000 cycles for the channels and 16000 cycles for pin-fin coolant passage configurations using Nickel 201. Additional research is required to establish the fatigue characteristics of dissimilar-metal coolant passages (Nickel 201/Inconel 718) and to investigate the embrittling effects of the hydrogen coolant.

  17. Advanced Process Monitoring Techniques for Safeguarding Reprocessing Facilities

    SciTech Connect

    Orton, Christopher R.; Bryan, Samuel A.; Schwantes, Jon M.; Levitskaia, Tatiana G.; Fraga, Carlos G.; Peper, Shane M.

    2010-11-30

    The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of weapons-grade nuclear material are not diverted from these facilities. For large-throughput nuclear facilities, it is difficult to satisfy the IAEA safeguards accountancy goal for detection of abrupt diversion. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). Leveraging new on-line nondestructive assay (NDA) process monitoring techniques in conjunction with the traditional and highly precise DA methods may provide an additional measure of nuclear material accountancy, potentially resulting in a more timely, cost-effective and resource-efficient means of safeguards verification at such facilities. By monitoring process control measurements (e.g. flowrates, temperatures, or concentrations of reagents, products or wastes), abnormal plant operations can be detected. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including both the Multi-Isotope Process (MIP) Monitor and a spectroscopy-based monitoring system, to potentially reduce the time and resource burden associated with current techniques. The MIP Monitor uses gamma spectroscopy and multivariate analysis to identify off-normal conditions in process streams. The spectroscopic monitor continuously measures chemical compositions of the process streams, including actinide metal ions (U, Pu, Np), selected fission products, and major cold flowsheet chemicals, using UV-Vis, near-IR and Raman spectroscopy. This paper provides an overview of our methods and reports our ongoing efforts to develop and demonstrate the technologies.

  18. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of great importance; it could save large economic losses due to crops damaged by fungal diseases, prevent unnecessary soil fumigation or the use of fungicides and bactericides, and thus prevent considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means clustering, were applied to the spectra after preprocessing. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
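
    The genus-level PCA/LDA chain reported above can be sketched as a pipeline; the placeholder `spectra` and `genus` arrays and the five-fold cross-validation are illustrative, not the study's actual spectra or validation scheme.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    spectra = rng.random((90, 500))   # placeholder FTIR-ATR spectra (samples x wavenumbers)
    genus = np.repeat([0, 1, 2], 30)  # three genera, 30 spectra each

    clf = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, spectra, genus, cv=5)
    print(round(acc.mean(), 3))       # cf. the 99.7% genus-level rate reported above
    ```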

  19. Silicon and germanium crystallization techniques for advanced device applications

    NASA Astrophysics Data System (ADS)

    Liu, Yaocheng

    Three-dimensional architectures are believed to be one of the possible approaches to reduce interconnect delay in integrated circuits. Metal-induced crystallization (MIC) can produce reasonably high-quality Si crystals with low-temperature processing, enabling the monolithic integration of multilevel devices and circuits. A two-step MIC process was developed to make single-crystal Si pillars on insulator by forming a single-grain NiSi2 template in the first step and crystallizing the amorphous Si by NiSi2-mediated solid-phase epitaxy (SPE) in the second step. A transmission electron microscopy study clearly showed the quality improvement over the traditional MIC process. Another crystallization technique developed is rapid melt growth (RMG) for the fabrication of Ge crystals and Ge-on-insulator (GeOI) substrates. Ge is an important semiconductor with high carrier mobility and excellent optoelectronic properties. GeOI substrates are particularly desired to achieve high device performances and to solve the process problems traditionally associated with bulk Ge wafers. High-quality Ge crystals and GeOI structures were grown on Si substrates using the novel rapid melt growth technique that integrates the key elements in Czochralski growth: seeding, melting, epitaxy and defect necking. Growth velocity and nucleation rate were calculated to determine the RMG process window. Self-aligned microcrucibles were created to hold the Ge liquid during the RMG annealing. Material characterization showed a very low defect density in the RMG GeOI structures. The Ge films are relaxed, with their orientations controlled by the Si substrates. P-channel MOSFETs and p-i-n photodetectors were fabricated with the GeOI substrates. The device properties are comparable to those obtained with bulk Ge wafers, indicating that the RMG GeOI substrates are well suited for device fabrication. A new theory, growth-induced barrier lowering (GIBL), is proposed to understand the defect generation in

  20. Advanced Manufacturing Techniques Demonstrated for Fabricating Developmental Hardware

    NASA Technical Reports Server (NTRS)

    Redding, Chip

    2004-01-01

    NASA Glenn Research Center's Engineering Development Division has been working in support of innovative gas turbine engine systems under development by Glenn's Combustion Branch. These one-of-a-kind components require operation under extreme conditions. High-temperature ceramics were chosen for fabrication because of the hostile operating environment. During the designing process, it became apparent that traditional machining techniques would not be adequate to produce the small, intricate features for the conceptual design, which was to be produced by stacking over a dozen thin layers with many small features that would then be aligned and bonded together into a one-piece unit. Instead of using traditional machining, we produced computer models in Pro/ENGINEER (Parametric Technology Corporation (PTC), Needham, MA) to the specifications of the research engineer. The computer models were exported in stereolithography standard (STL) format and used to produce full-size rapid prototype polymer models. These semi-opaque plastic models were used for visualization and design verification. The computer models were also exported in International Graphics Exchange Specification (IGES) format and sent to Glenn's Thermal/Fluids Design & Analysis Branch and Applied Structural Mechanics Branch for profiling heat transfer and mechanical strength analysis.

  1. Simulation of advanced techniques for an ion propulsion rocket system

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, R.

    2016-07-01

    The ion propulsion rocket system is expected to become popular with the development of deuterium/argon gas and hexagonal-shape magnetohydrodynamic (MHD) techniques, because the power is generated indirectly from the ionization chamber; the design thrust is 1.2 N at 40 kW of electric power with high efficiency. The proposed work studies MHD power generation through the ionization of deuterium gas and the combination of two gaseous ion species (deuterium ions + argon ions) at the acceleration stage. The IPR consists of three parts: (1) a hexagonal-shape MHD-based power generator fed by the ionization chamber, (2) an ion accelerator, and (3) an exhaust nozzle. Initially, an energy of about 1312 kJ/mol is required to ionize the deuterium gas. The ionized deuterium gas passes from the RF ionization chamber to the nozzle through the MHD generator with enhanced velocity; a voltage is then generated across the two pairs of electrodes in the MHD section, and thrust is produced by mixing deuterium and argon ions at the acceleration stage. The simulation of the IPR system has been carried out in MATLAB. Comparison of the simulation results with the theoretical and previous results shows that the proposed method achieves the design thrust value at 40 kW of power for the simulated IPR system.

  2. Advances in Current Rating Techniques for Flexible Printed Circuits

    NASA Technical Reports Server (NTRS)

    Hayes, Ron

    2014-01-01

    Twist Capsule Assemblies are power transfer devices commonly used in spacecraft mechanisms that require electrical signals to be passed across a rotating interface. Flexible printed circuits (flex tapes, see Figure 2) are used to carry the electrical signals in these devices. Determining the current rating for a given trace (conductor) size can be challenging. Because of the thermal conditions present in this environment, the most appropriate approach is to assume that the only means by which heat is removed from the trace is through the conductor itself, so that when the flex tape is long the temperature rise in the trace can be extreme. While this technique represents a worst-case thermal situation that yields conservative current ratings, this conservatism may lead to overly cautious designs when not all traces are used at their full rated capacity. A better understanding of how individual traces behave when they are not all in use is the goal of this research. In the testing done in support of this paper, a representative flex tape used for a flight Solar Array Drive Assembly (SADA) application was tested by energizing individual traces (conductors in the tape) in a vacuum chamber and measuring the tape temperatures using both fine-gauge thermocouples and infrared thermographic imaging. We find that traditional derating schemes used for bundles of wires do not apply to the configuration tested. We also determine that single active traces located in the center of a flex tape operate at lower temperatures than those on the outside edges.
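
    The worst-case, conduction-only model described above has a classic closed form for a uniformly heated trace whose ends are held at ambient: dT_max = I^2 * rho * L^2 / (8 * k * A^2). The short Python sketch below evaluates it; the copper properties are standard handbook values, but the trace dimensions and current are hypothetical examples rather than values from the paper.

    ```python
    # Worst-case trace temperature rise assuming heat leaves the trace
    # only by conduction along the copper to both ends (held at ambient),
    # per the conduction-only model described above. Dimensions below are
    # hypothetical examples, not values from the paper.
    RHO_CU = 1.72e-8   # copper resistivity, ohm*m (room temperature)
    K_CU = 400.0       # copper thermal conductivity, W/(m*K)

    def max_trace_temp_rise(current_a, length_m, width_m, thickness_m):
        """Peak rise of a 1-D trace with uniform Joule heating:
        dT = I^2 * rho * L^2 / (8 * k * A^2)."""
        area = width_m * thickness_m              # cross-section, m^2
        q_vol = current_a**2 * RHO_CU / area**2   # volumetric heating, W/m^3
        return q_vol * length_m**2 / (8.0 * K_CU)

    # Example: 1 A through a 0.1 m long, 0.25 mm x 35 um trace; the
    # result (~700 K) shows how extreme the conduction-only bound gets.
    print(f"{max_trace_temp_rise(1.0, 0.1, 0.25e-3, 35e-6):.0f} K")
    ```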

  3. Recent advances in techniques for tsetse-fly control*

    PubMed Central

    MacLennan, K. J. R.

    1967-01-01

    With the advent of modern persistent insecticides, it has become possible to utilize some of the knowledge that has accumulated on the ecology and bionomics of Glossina and to devise more effective techniques for the control and eventual extermination of these species. The present article, based on experience of the tsetse fly problem in Northern Nigeria, points out that the disadvantages of control techniques—heavy expenditure of money and manpower and undue damage to the biosystem—can now largely be overcome by basing the application of insecticides on knowledge of the habits of the particular species of Glossina in a particular environment. Two factors are essential to the success of a control project: the proper selection of sites for spraying (the concept of restricted application) and the degree of persistence of the insecticide used. Reinfestation from within or outside the project area must also be taken into account. These and other aspects are discussed in relation to experience gained from a successful extermination project carried out in the Sudan vegetation zone and from present control activities in the Northern Guinea vegetation zone. PMID:5301739

  4. Advanced pattern-matching techniques for autonomous acquisition

    NASA Astrophysics Data System (ADS)

    Narendra, P. M.; Westover, B. L.

    1981-01-01

    The key objective of this effort is the development of pattern-matching algorithms which can impart autonomous acquisition capability to precision-guided munitions such as Copperhead and Hellfire. Autonomous acquisition through pattern matching holds the promise of eliminating laser designation and enhancing firepower by multiple target prioritization. The pattern-matching approach being developed under this program is based on a symbolic pattern-matching framework, which is suited to the autonomous acquisition scenario. It is based on matching a symbolic representation derived from the two images, and it can accommodate the stringent pattern-matching criteria established by the scenario: enormous differences in scene perspective, aspect and range between the two sensors, differences in sensor characteristics and illumination, and scene changes such as target motion and obscuration from one viewpoint to the other. This report contains a description of an efficient branch-and-bound technique for symbolic pattern matching. Also presented are the results of applying a simulation of the algorithm to pairs of FLIR images of military vehicles in cluttered environments as well as pairs of images from different sensors (FLIR and silicon TV). The computational requirements are analyzed toward real-time implementation, and avenues of future work are recommended.
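
    As a generic illustration of the branch-and-bound idea mentioned above, the Python sketch below assigns symbolic features of one image to features of another while pruning any partial assignment whose accumulated cost already exceeds the best complete match found so far. The feature representation and cost function are hypothetical stand-ins, not the report's actual symbolic framework.

    ```python
    # A minimal branch-and-bound matcher for symbolic features; the
    # feature attributes and cost function here are hypothetical.
    import math

    def match_cost(fa, fb):
        # dissimilarity between two features (e.g., area/intensity attrs)
        return sum((x - y) ** 2 for x, y in zip(fa, fb))

    def branch_and_bound(feats_a, feats_b):
        best = {"cost": math.inf, "assign": None}

        def recurse(i, used, partial, cost):
            if cost >= best["cost"]:
                return  # prune: partial cost already exceeds best bound
            if i == len(feats_a):
                best["cost"], best["assign"] = cost, list(partial)
                return
            # try candidate correspondences for feature i, cheapest first
            cands = sorted(
                (j for j in range(len(feats_b)) if j not in used),
                key=lambda j: match_cost(feats_a[i], feats_b[j]),
            )
            for j in cands:
                recurse(i + 1, used | {j}, partial + [j],
                        cost + match_cost(feats_a[i], feats_b[j]))

        recurse(0, frozenset(), [], 0.0)
        return best["assign"], best["cost"]

    # toy example: (area, mean intensity) attributes of segmented regions
    a = [(4.0, 0.9), (2.5, 0.4)]
    b = [(2.4, 0.5), (4.1, 0.8), (1.0, 0.1)]
    print(branch_and_bound(a, b))
    ```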

  5. Advanced signal processing technique for damage detection in steel tubes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (Lead Zirconate Titanate) transducers in either transmission or reflection mode. In this study guided waves are excited and detected in the transmission mode and the phase change of the propagating wave modes is recorded. In most other studies reported in the literature, the change in the received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase, while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bond has deteriorated, the received signal strength is altered but the phase remains the same, and thus false positive predictions of damage can be avoided.
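
    The abstract does not spell out the feature-extraction step, but one common way to obtain the phase of a received guided-wave signal is through the analytic signal computed with a Hilbert transform. The sketch below, with synthetic tonebursts standing in for PZT recordings, is an assumption-labeled illustration of that idea, not the paper's exact method.

    ```python
    # Phase extraction via the analytic signal (Hilbert transform); an
    # assumed feature-extraction choice, with synthetic "received" data.
    import numpy as np
    from scipy.signal import hilbert

    fs = 5e6                          # sampling rate, Hz
    t = np.arange(0, 200e-6, 1 / fs)
    f0 = 200e3                        # guided-wave mode frequency, Hz

    def toneburst(phase_shift):
        env = np.exp(-((t - 60e-6) / 20e-6) ** 2)  # Gaussian envelope
        return env * np.cos(2 * np.pi * f0 * t + phase_shift)

    baseline = toneburst(0.0)
    damaged = toneburst(0.35)         # damage modeled as a phase delay

    def phase_at_peak(sig):
        analytic = hilbert(sig)                  # analytic signal
        peak = np.argmax(np.abs(analytic))       # envelope maximum
        return np.angle(analytic[peak])

    dphi = phase_at_peak(damaged) - phase_at_peak(baseline)
    print(f"phase change: {dphi:.3f} rad")
    ```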

  6. Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.

    NASA Astrophysics Data System (ADS)

    Busch, Nathan Adams

    1995-01-01

    The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved, and in part to the lack of sufficiently well developed theoretical techniques. Understanding the complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins based upon the theory of associating fluids was developed. The multipole electrostatic potential for each protein used in this study was characterized for net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized for their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding graphs generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay results. The isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions which were skewed to higher complex size. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations were formed by the "folding" of linear chains until the ends are within binding distance. By comparing single antigen/two antibody systems which differ only in their respective binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs. The

  7. A statistical technique for defining rainfall forecast probabilities in southern Africa

    NASA Astrophysics Data System (ADS)

    Husak, G. J.; Magadzire, T.

    2010-12-01

    Probabilistic forecasts are currently produced by many climate centers and by just as many processes. These models use a number of inputs to generate the probability of rainfall falling in the lower/middle/upper tercile (frequently termed "below-normal"/"normal"/"above-normal") of the historical rainfall distribution. Generation of forecasts for a season may be the result of a dynamic climate model, a statistical model, a consensus of a panel of experts, or a combination of some of the aforementioned techniques, among others. This last method is the one most commonly accepted in Southern Africa, resulting from the Southern Africa Regional Climate Outlook Forum (SARCOF). While it has been noted that there is a reasonable chance of polygons having a dominant tercile of 60% probability or more, this has seldom been the case. Indeed, over the last four years, the SARCOF process has not produced any polygons with such a likelihood. In fact, the terciles in all of the SARCOFs since 2007 have been some combination of 40%, 35% and 25%. Discussions with SARCOF scientists suggest that the SARCOF process is currently using the probabilistic format to define relatively qualitative, rank-ordered outcomes in the format "most likely", "second-most likely" and "least likely" terciles. While this rank-ordered classification has its merits, it limits the sort of downstream quantitative statistical analysis that could potentially be of assistance to various decision makers. In this study we build a simple statistical model to analyze the probabilistic outcomes for the coming rainfall season, and analyze their resulting probabilities. The prediction model takes a similar approach to that already used in the SARCOF process: namely, using available historic rainfall data and SST information to create a linear regression between rainfall and SSTs, define a likely rainfall outcome, and analyze the cross-validation errors over the most recent 30 years. The cross
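
    A minimal sketch of the kind of prediction model described above: regress seasonal rainfall on an SST index, use leave-one-out cross-validation residuals to set the forecast spread, and convert the forecast into tercile probabilities under an assumed Gaussian error distribution. All data here are synthetic stand-ins, not SARCOF inputs.

    ```python
    # Regression-based tercile probabilities with cross-validated errors;
    # the rainfall and SST series are synthetic illustrations.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sst = rng.normal(size=30)                      # 30 years of SST index
    rain = 100 + 20 * sst + rng.normal(0, 15, 30)  # seasonal rainfall, mm

    # leave-one-out cross-validation residuals of the linear regression
    resid = []
    for k in range(30):
        m = np.ones(30, bool); m[k] = False
        slope, intercept, *_ = stats.linregress(sst[m], rain[m])
        resid.append(rain[k] - (intercept + slope * sst[k]))
    sigma = np.std(resid, ddof=1)                  # forecast error spread

    slope, intercept, *_ = stats.linregress(sst, rain)
    forecast = intercept + slope * 1.2             # next season's SST = 1.2

    lo, hi = np.percentile(rain, [100 / 3, 200 / 3])  # historical terciles
    p_below = stats.norm.cdf(lo, forecast, sigma)
    p_above = 1 - stats.norm.cdf(hi, forecast, sigma)
    print(p_below, 1 - p_below - p_above, p_above)
    ```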

  8. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Perrin, Marshall; Poyneer, Lisa; Pueyo, Laurent; Savransky, Dmitry; Soummer, Remi

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.
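
    For intuition about the Hotelling observer mentioned above, the sketch below builds the linear template w = S^-1 (mean_signal - mean_background) from simulated image vectors and scores detection by the empirical area under the ROC curve. The data model and dimensions are illustrative assumptions, not the blind data challenge described in the paper.

    ```python
    # A minimal Hotelling-observer sketch on simulated image vectors;
    # the noise model and faint "planet" signal are illustrative only.
    import numpy as np

    rng = np.random.default_rng(1)
    npix, ntrain = 64, 2000
    bg = rng.normal(0.0, 1.0, (ntrain, npix))   # background-only images
    sig = np.zeros(npix); sig[30:34] = 0.8      # faint planet signal
    pl = bg + sig                               # signal-present images

    S = np.cov(bg, rowvar=False)                # covariance estimate
    w = np.linalg.solve(S, pl.mean(0) - bg.mean(0))  # Hotelling template

    t0, t1 = bg @ w, pl @ w                     # test statistics
    # empirical AUC: chance a signal case outranks a background case
    auc = (t1[:, None] > t0[None, :]).mean()
    print(f"AUC = {auc:.3f}")
    ```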

  9. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter; Frazin, Richard

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012

  10. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs

    PubMed Central

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2015-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012. PMID:26347393

  11. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  12. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  13. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advanced imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  14. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  15. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  16. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  17. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers and seven of the students' parents were interviewed. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these

  18. Endoscopic therapy for early gastric cancer: Standard techniques and recent advances in ESD

    PubMed Central

    Kume, Keiichiro

    2014-01-01

    The technique of endoscopic submucosal dissection (ESD) is now a well-known endoscopic therapy for early gastric cancer. ESD was introduced to resect large specimens of early gastric cancer in a single piece. ESD can provide precision of histologic diagnosis and can also reduce the recurrence rate. However, the drawback of ESD is its technical difficulty, and, consequently, it is associated with a high rate of complications, the need for advanced endoscopic techniques, and a lengthy procedure time. Various advances in the devices and techniques used for ESD have contributed to overcoming these drawbacks. PMID:24914364

  19. Assessment of water quality using multivariate statistical techniques in the coastal region of Visakhapatnam, India.

    PubMed

    Pati, Sangeeta; Dash, Mihir K; Mukherjee, C K; Dash, B; Pokhrel, S

    2014-10-01

    The present study was intended to develop a Water Quality Index (WQI) for the coastal water of Visakhapatnam, India from multiple measured water quality parameters using different multivariate statistical techniques. Cluster analysis was used to classify the data set into three major groups based on similar water quality characteristics. Discriminant analysis was used to generate a discriminant function for developing a WQI. Discriminant analysis gave the best result for analyzing the seasonal variation of water quality. It helped in data reduction and identified the most discriminant parameters responsible for seasonal variation of water quality. Coastal water was classified into good, average, and poor quality considering the WQI and the nutrient load. The predictive capacity of the WQI was verified with random samples taken from coastal areas. The high concentration of ammonia in surface water during winter was attributed to nitrogen fixation by the phytoplankton bloom resulting from the East India Coastal Current. This study brings out the fact that water quality in the coastal region depends not only on the discharge from different pollution sources but also on the presence of different current patterns. It also illustrates the usefulness of the WQI for analyzing complex nutrient data, assessing coastal water, and identifying different pollution sources, considering reasons for seasonal variation of water quality.
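
    As a small illustration of the discriminant-analysis step described above, the sketch below classifies samples by season from a matrix of water-quality parameters and inspects which parameters carry the discrimination. Scikit-learn's LinearDiscriminantAnalysis is assumed here as a stand-in for the authors' implementation, and the data matrix is synthetic.

    ```python
    # Discriminant analysis on a synthetic water-quality matrix: classify
    # samples by season and read off the most discriminating parameters.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(7)
    seasons = np.repeat([0, 1, 2], 40)        # three seasonal groups
    X = rng.normal(size=(120, 6))             # six quality parameters
    X[:, 0] += 1.5 * seasons                  # parameter 0 varies by season

    lda = LinearDiscriminantAnalysis().fit(X, seasons)
    print("training accuracy:", round(lda.score(X, seasons), 2))
    # larger |coefficient| -> parameter contributes more to discrimination
    print("coefficient magnitudes:\n", np.abs(lda.coef_).round(2))
    ```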

  20. Statistical techniques for modeling extreme price dynamics in the energy market

    NASA Astrophysics Data System (ADS)

    Mbugua, L. N.; Mwita, P. N.

    2013-02-01

    Extreme events have a large impact throughout engineering, science and economics, because they often lead to failure and losses, and extraordinary occurrences are by nature rarely observed. In this context, this paper focuses on appropriate statistical methods combining a quantile regression approach with extreme value theory to model the excesses. This plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when one knows little about the functional form of the object being estimated. The conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the problem of inadequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
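
    A minimal peaks-over-threshold sketch in the spirit of the paper: fit a generalized Pareto distribution (GPD) to the excesses over a high quantile and read off a tail quantile. The return series here is a synthetic heavy-tailed stand-in rather than fuel-market data, and the paper's nonparametric quantile regression stage is omitted.

    ```python
    # Peaks-over-threshold: model excesses over a high threshold with the
    # generalized Pareto distribution. The series is synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    returns = stats.t.rvs(df=4, size=5000, random_state=rng)  # heavy tails

    u = np.quantile(returns, 0.95)        # threshold at the 95% quantile
    excess = returns[returns > u] - u     # exceedances over the threshold

    shape, _, scale = stats.genpareto.fit(excess, floc=0)  # GPD, loc = 0

    # tail quantile (return level) estimate, e.g. the 99.9% quantile
    p = 0.999
    zeta = excess.size / returns.size     # empirical exceedance rate
    q = u + stats.genpareto.ppf(1 - (1 - p) / zeta, shape, loc=0, scale=scale)
    print(f"xi={shape:.2f}, sigma={scale:.2f}, 99.9% quantile={q:.2f}")
    ```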

  1. [Our experience with the treatment of high perianal fistulas with the mucosal flap advancement technique].

    PubMed

    Marino, Giuseppe; Greco, Ettore; Gasparrini, Marcello; Romanzi, Aldo; Ottaviani, Maurizio; Nasi, Stefano; Pasquini, Giorgio

    2004-01-01

    The authors present their experience with the treatment of high transsphincteric anal fistulas with the mucosal flap advancement technique. This technique, though by no means easy to perform, allows fistulas to be treated in a single surgical session, in contrast to the technique in which a seton is used or to the less well known transposition techniques, with the same long-term results in terms of continence and recurrence rate. After a brief overview of the problem, from the points of view of both aetiopathogenesis and classification, the principal surgical treatment techniques are described, and the results and complications observed in the authors' own case series are presented. PMID:15038659

  2. Comparative study of four advanced 3d-conformal radiation therapy treatment planning techniques for head and neck cancer.

    PubMed

    Herrassi, Mohamed Yassine; Bentayeb, Farida; Malisan, Maria Rosa

    2013-04-01

    For the head-and-neck cancer bilateral irradiation, intensity-modulated radiation therapy (IMRT) is the most reported technique as it enables both target dose coverage and organ-at-risk (OAR) sparing. However, during the last 20 years, three-dimensional conformal radiotherapy (3DCRT) techniques have been introduced, which are tailored to improve the classic shrinking field technique, as regards both planning target volume (PTV) dose conformality and sparing of OAR's, such as parotid glands and spinal cord. In this study, we tested experimentally in a sample of 13 patients, four of these advanced 3DCRT techniques, all using photon beams only and a unique isocentre, namely Bellinzona, Forward-Planned Multisegments (FPMS), ConPas, and field-in-field (FIF) techniques. Statistical analysis of the main dosimetric parameters of PTV and OAR's DVH's as well as of homogeneity and conformity indexes was carried out in order to compare the performance of each technique. The results show that the PTV dose coverage is adequate for all the techniques, with the FPMS techniques providing the highest value for D95%; on the other hand, the best sparing of parotid glands is achieved using the FIF and ConPas techniques, with a mean dose of 26 Gy to parotid glands for a PTV prescription dose of 54 Gy. After taking into account both PTV coverage and parotid sparing, the best global performance was achieved by the FIF technique with results comparable to that of IMRT plans. This technique can be proposed as a valid alternative when IMRT equipment is not available or patient is not suitable for IMRT treatment.

  3. Comparative study of four advanced 3d-conformal radiation therapy treatment planning techniques for head and neck cancer

    PubMed Central

    Herrassi, Mohamed Yassine; Bentayeb, Farida; Malisan, Maria Rosa

    2013-01-01

    For the head-and-neck cancer bilateral irradiation, intensity-modulated radiation therapy (IMRT) is the most reported technique as it enables both target dose coverage and organ-at-risk (OAR) sparing. However, during the last 20 years, three-dimensional conformal radiotherapy (3DCRT) techniques have been introduced, which are tailored to improve the classic shrinking field technique, as regards both planning target volume (PTV) dose conformality and sparing of OAR’s, such as parotid glands and spinal cord. In this study, we tested experimentally in a sample of 13 patients, four of these advanced 3DCRT techniques, all using photon beams only and a unique isocentre, namely Bellinzona, Forward-Planned Multisegments (FPMS), ConPas, and field-in-field (FIF) techniques. Statistical analysis of the main dosimetric parameters of PTV and OAR’s DVH’s as well as of homogeneity and conformity indexes was carried out in order to compare the performance of each technique. The results show that the PTV dose coverage is adequate for all the techniques, with the FPMS techniques providing the highest value for D95%; on the other hand, the best sparing of parotid glands is achieved using the FIF and ConPas techniques, with a mean dose of 26 Gy to parotid glands for a PTV prescription dose of 54 Gy. After taking into account both PTV coverage and parotid sparing, the best global performance was achieved by the FIF technique with results comparable to that of IMRT plans. This technique can be proposed as a valid alternative when IMRT equipment is not available or patient is not suitable for IMRT treatment. PMID:23776314

  4. Evaluation of river water quality variations using multivariate statistical techniques: Sava River (Croatia): a case study.

    PubMed

    Marinović Ruždjak, Andrea; Ruždjak, Domagoj

    2015-04-01

    For the evaluation of seasonal and spatial variations and the interpretation of a large and complex water quality dataset obtained during a 7-year monitoring program of the Sava River in Croatia, different multivariate statistical techniques were applied in this study. Basic statistical properties and correlations of 18 water quality parameters (variables) measured at 18 sampling sites (a total of 56,952 values) were examined. Correlations between air temperature and some water quality parameters were found, in agreement with previous studies of the relationship between climatic and hydrological parameters. Principal component analysis (PCA) was used to explore the most important factors determining the spatiotemporal dynamics of the Sava River. PCA determined a reduced number of seven principal components that explain over 75 % of the data set variance. The results revealed that parameters related to temperature and organic pollutants (CODMn and TSS) were the most important parameters contributing to water quality variation. PCA of seasonal subsets confirmed this result and showed that the importance of parameters changes from season to season. PCA of the four seasonal data subsets yielded six PCs with eigenvalues greater than one, explaining 73.6 % (spring), 71.4 % (summer), 70.3 % (autumn), and 71.3 % (winter) of the total variance. To check the influence of outliers in a data set whose distribution strongly deviates from the normal one, two robust estimates of the covariance matrix were calculated and subjected to PCA, in addition to the standard principal component analysis algorithm. PCA in both cases yielded seven principal components explaining 75 % of the total variance, and the results do not differ significantly from those obtained by the standard PCA algorithm. With the implementation of the robust PCA algorithm, it is demonstrated that the usage of the standard algorithm is justified for data sets with small numbers of missing data
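
    A compact sketch of the robustness check described above: compare explained variance from PCA on the standard sample covariance with that from an eigendecomposition of a robust covariance estimate. The minimum covariance determinant (MCD) is used here as one plausible robust estimator, and the data matrix is a synthetic stand-in with planted outliers, not the Sava River dataset.

    ```python
    # Standard PCA versus PCA on a robust (MCD) covariance estimate,
    # on synthetic data with a few gross outliers.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.covariance import MinCovDet

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 8))
    X[:5] += 15.0                        # a few gross outliers

    # standard PCA
    pca = PCA().fit(X)
    print("standard:", pca.explained_variance_ratio_[:3].round(2))

    # robust PCA: eigendecomposition of the MCD covariance estimate
    C = MinCovDet(random_state=0).fit(X).covariance_
    evals = np.sort(np.linalg.eigvalsh(C))[::-1]
    print("robust:  ", (evals[:3] / evals.sum()).round(2))
    ```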

  5. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of the morphology and evolution of the microstructure during processing, and their relation to properties, requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and in the matrix of a layered structure or a functionally gradient material, and their variation, are among the parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials, including the stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  6. Remote Sensing of Precipitation Using Multiparameter Radar: Statistics, Processing Algorithms and Analysis Techniques.

    NASA Astrophysics Data System (ADS)

    Liu, Li.

    With the advent of the multiparameter weather radar, i.e., dual-polarization, dual-frequency, Doppler radar, radar meteorologists have been able to study physical processes in precipitation in more detail, and the quantitative measurement of rainfall as well as the identification of different types of hydrometeors have become possible. However, the effects of propagation through the rain medium must be carefully considered whenever dual-polarization techniques are considered. The correction of propagation effects such as attenuation, differential attenuation and differential propagation phase in precipitation is very important for quantitative interpretation of echo powers at high frequencies. In this dissertation, a simplified scattering matrix with propagation effects is described. A number of parameters are derived based on the covariance matrix of the scattering element array. The processing techniques for estimating some specific parameters, such as K_dp, A_x and intrinsic LDR using the CSU-CHILL and CP-2 radar measurements, are discussed. Recent research has suggested that the copolar correlation coefficient termed rho_hv(0) can be used to identify large hail and improve polarization estimates of rainfall. The typical measured values of rho_hv(0) at S-band vary between 0.8 and 1.0. For applications to hail identification the required accuracy should be within ±0.01, while for rainfall improvement a higher accuracy is necessary, e.g., within ±0.001. We discuss the statistics of several estimators of rho_hv(0) using the Gaussian spectrum approximation, from both an analytical approach and via simulations. The standard deviation and bias in rho_hv(0) are computed as a function of the number of samples, Doppler spectral width and mean rho_hv(0). The effect of finite signal-to-noise ratio (SNR) and phase noise are also studied via simulations. Time series data collected with the CSU-CHILL radar are analyzed and compared with the simulations. Antenna

  7. Enhancing Local Climate Projections of Precipitation: Assets and Limitations of Quantile Mapping Techniques for Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Kotlarski, Sven; Schär, Christoph

    2015-04-01

    The Swiss CH2011 scenarios provide a portfolio of climate change scenarios for the region of Switzerland, specifically tailored for use in climate impact research. Although widely applied by a variety of end-users, these scenarios are subject to several limitations related to the underlying delta change methodology. Examples are difficulties to appropriately account for changes in the spatio-temporal variability of meteorological fields and for changes in extreme events. The recently launched ELAPSE project (Enhancing local and regional climate change projections for Switzerland) is connected to the EU COST Action VALUE (www.value-cost.eu) and aims at complementing CH2011 by further scenario products, including a bias-corrected version of daily scenarios at the site scale. For this purpose the well-established empirical quantile mapping (QM) methodology is employed. Here, daily temperature and precipitation output of 15 GCM-RCM model chains of the ENSEMBLES project is downscaled and bias-corrected to match observations at weather stations in Switzerland. We consider established QM techniques based on all empirical quantiles or linear interpolation between the empirical percentiles. In an attempt to improve the downscaling of extreme precipitation events, we also apply a parametric approximation of the daily precipitation distribution by a dynamically weighted mixture of a Gamma distribution for the bulk and a Pareto distribution for the right tail for the first time in the context of QM. All techniques are evaluated and intercompared in a cross-validation framework. The statistical downscaling substantially improves virtually all considered distributional and temporal characteristics as well as their spatial distribution. The empirical methods have in general very similar performances. The parametric method does not show an improvement over the empirical ones. Critical sites and seasons are highlighted and discussed. Special emphasis is placed on investigating the
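
    A minimal version of the empirical quantile-mapping variant based on linear interpolation between empirical percentiles, as evaluated above. The daily series are synthetic stand-ins for station observations and RCM output, and the calibration/validation split of the cross-validation framework is omitted for brevity.

    ```python
    # Empirical quantile mapping: correct model values by mapping them
    # through paired model/observed empirical percentiles. Series are
    # synthetic stand-ins for observations and RCM output.
    import numpy as np

    rng = np.random.default_rng(4)
    obs = rng.gamma(2.0, 4.0, 3000)      # observed daily precipitation
    mod = rng.gamma(2.0, 5.5, 3000)      # biased model precipitation

    def quantile_map(x, model_ref, obs_ref, n_q=99):
        """Map x through percentile pairs of the calibration series
        (linear interpolation between the empirical percentiles)."""
        q = np.linspace(1, 99, n_q)
        mq = np.percentile(model_ref, q)
        oq = np.percentile(obs_ref, q)
        return np.interp(x, mq, oq)      # extrapolation held constant

    corrected = quantile_map(mod, mod, obs)
    print(obs.mean(), mod.mean(), corrected.mean())
    ```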

  8. GIS-based statistical mapping technique for block-and-ash pyroclastic flow and surge hazards

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, C.; Voight, B.; Hidayat, D.; Schilling, S.

    2008-12-01

    Assessments of pyroclastic flow (PF) hazards are commonly based on mapping of PF and surge deposits and estimations of inundation limits, and/or computer models of varying degrees of sophistication. In volcanic crises a PF hazard map may be sorely needed, but limited time, exposures, or safety aspects may preclude fieldwork, and insufficient time or baseline data may be available for reliable dynamic simulations. We have developed a statistically constrained simulation model for block-and-ash PFs to estimate potential areas of inundation by adapting methodology from Iverson et al. (1998) for lahars. The predictive equations for block-and-ash PFs are calibrated with data from many volcanoes and given by A = (0.05-0.1)V^(2/3), B = (35-40)V^(2/3), where A is the cross-sectional area of inundation, B is the planimetric area and V is the deposit volume. The proportionality coefficients were obtained from regression analyses and comparison of simulations to mapped deposits. The method embeds the predictive equations in a GIS program coupled with DEM topography, using the LAHARZ program of Schilling (1998). Although the method is objective and reproducible, any PF hazard zone so computed should be considered as an approximate guide only, due to uncertainties in the coefficients applicable to individual PFs, DEM details, and release volumes. Gradational nested hazard maps produced by these simulations reflect these uncertainties. The model does not explicitly consider dynamic behavior, which can be important. Surge impacts must be extended beyond PF hazard zones, and we have explored several approaches to do this. The method has been used to supply PF hazard maps in two crises: Merapi 2006 and Montserrat 2006-2007. We have also compared our hazard maps to actual recent PF deposits and to maps generated by several other model techniques.
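
    The predictive equations quoted above can be applied directly; the short sketch below evaluates A = cV^(2/3) and B = dV^(2/3) for a given deposit volume, with coefficients chosen from the stated ranges (the specific volume is only an example).

    ```python
    # The calibrated scaling relations quoted above in usable form:
    # cross-sectional area A and planimetric area B scale with V^(2/3).
    # Coefficients may be chosen anywhere in the stated ranges.
    def pf_inundation(volume_m3, c_a=0.05, c_b=35.0):
        """Return (A, B) in m^2 for a block-and-ash PF deposit volume."""
        v23 = volume_m3 ** (2.0 / 3.0)
        return c_a * v23, c_b * v23

    # e.g. a 5 x 10^6 m^3 flow
    A, B = pf_inundation(5e6)
    print(f"A = {A:.0f} m^2, B = {B:.0f} m^2")
    ```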

  9. Statistical techniques for detecting the intergalactic magnetic field from large samples of extragalactic Faraday rotation data

    SciTech Connect

    Akahori, Takuya; Gaensler, B. M.; Ryu, Dongsu E-mail: bryan.gaensler@sydney.edu.au

    2014-08-01

    Rotation measure (RM) grids of extragalactic radio sources have been widely used for studying cosmic magnetism. However, their potential for exploring the intergalactic magnetic field (IGMF) in filaments of galaxies is unclear, since other Faraday-rotation media such as the radio source itself, intervening galaxies, and the interstellar medium of our Galaxy are all significant contributors. We study statistical techniques for discriminating the Faraday rotation of filaments from other sources of Faraday rotation in future large-scale surveys of radio polarization. We consider a 30° × 30° field of view toward the south Galactic pole, while varying the number of sources detected in both present and future observations. We select sources located at high redshifts and toward which depolarization and optical absorption systems are not observed, so as to reduce the RM contributions from the sources and intervening galaxies. It is found that a high-pass filter can satisfactorily reduce the RM contribution from the Galaxy, since the angular scale of this component toward high Galactic latitudes would be much larger than that expected for the IGMF. Present observations do not yet provide a sufficient source density to be able to estimate the RM of filaments. However, from the proposed approach with forthcoming surveys, we predict significant residuals of RM that should be ascribable to filaments. The predicted structure of the IGMF down to scales of 0.1° should be observable with data from the Square Kilometre Array, if we achieve selections of sources toward which sightlines do not contain intervening galaxies and RM errors are less than a few rad m^-2.

  10. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques.

    PubMed

    Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat

    2008-02-11

    Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction as it used only two parameters (Zn and Cd), affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.

  11. Advances in high-resolution imaging – techniques for three-dimensional imaging of cellular structures

    PubMed Central

    Lidke, Diane S.; Lidke, Keith A.

    2012-01-01

    A fundamental goal in biology is to determine how cellular organization is coupled to function. To achieve this goal, a better understanding of organelle composition and structure is needed. Although visualization of cellular organelles using fluorescence or electron microscopy (EM) has become a common tool for the cell biologist, recent advances are providing a clearer picture of the cell than ever before. In particular, advanced light-microscopy techniques are achieving resolutions below the diffraction limit and EM tomography provides high-resolution three-dimensional (3D) images of cellular structures. The ability to perform both fluorescence and electron microscopy on the same sample (correlative light and electron microscopy, CLEM) makes it possible to identify where a fluorescently labeled protein is located with respect to organelle structures visualized by EM. Here, we review the current state of the art in 3D biological imaging techniques with a focus on recent advances in electron microscopy and fluorescence super-resolution techniques. PMID:22685332

  12. Modulation/demodulation techniques for satellite communications. Part 2: Advanced techniques. The linear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory is presented for deducing and predicting the performance of transmitter/receivers for bandwidth-efficient modulations suitable for use on the linear satellite channel. The underlying principle used is the development of receiver structures based on the maximum-likelihood decision rule. The performance prediction tools, e.g., channel cutoff rate and bit error probability transfer function bounds, are applied to these modulation/demodulation techniques.

  13. POC-Scale Testing of an Advanced Fine Coal Dewatering Equipment/Technique

    SciTech Connect

    Karekh, B K; Tao, D; Groppo, J G

    1998-08-28

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean coal product to a moisture level of 20% will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal could be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 45 months beginning September 30, 1994. This report discusses technical progress made during the quarter from January 1 - March 31, 1998.

  14. Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996. Statistics in Brief.

    ERIC Educational Resources Information Center

    Heaviside, Sheila; And Others

    The "Survey of Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996" collected information from 911 regular United States public elementary and secondary schools regarding the availability and use of advanced telecommunications, and in particular, access to the Internet, plans to obtain Internet access, use of…

  15. Modulation/demodulation techniques for satellite communications. Part 3: Advanced techniques. The nonlinear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory for deducing and predicting the performance of transmitter/receivers for bandwidth-efficient modulations suitable for use on the nonlinear satellite channel is presented. The underlying principle used throughout is the development of receiver structures based on the maximum-likelihood decision rule and approximations to it. The bit error probability transfer function bounds developed in great detail in Part 4 are applied to these modulation/demodulation techniques. The effects of various degrees of receiver mismatch are considered both theoretically and by numerous illustrative examples.

  16. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
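
    A sketch of the estimation idea described above: minimize the Anderson-Darling statistic of a three-parameter Weibull over (shape, scale, location) with Powell's method, here via scipy.optimize. The failure data and starting values are synthetic and ad hoc rather than taken from the paper.

    ```python
    # Estimate three-parameter Weibull parameters by minimizing the
    # Anderson-Darling EDF statistic with Powell's method. The failure
    # data are synthetic stand-ins.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    def ad_statistic(params, x):
        shape, scale, loc = params
        if shape <= 0 or scale <= 0 or loc >= x.min():
            return np.inf                  # infeasible parameters
        f = weibull_min.cdf(np.sort(x), shape, loc=loc, scale=scale)
        f = np.clip(f, 1e-10, 1 - 1e-10)
        n = x.size
        i = np.arange(1, n + 1)
        # A^2 = -n - (1/n) * sum (2i-1)[ln F(x_(i)) + ln(1 - F(x_(n+1-i)))]
        return -n - np.mean((2 * i - 1) * (np.log(f) + np.log(1 - f[::-1])))

    rng = np.random.default_rng(5)
    x = weibull_min.rvs(2.5, loc=50.0, scale=300.0, size=60,
                        random_state=rng)  # synthetic failure stresses

    res = minimize(ad_statistic, x0=[1.5, x.std(), 0.9 * x.min()],
                   args=(x,), method="Powell")
    print("shape, scale, loc =", np.round(res.x, 2))
    ```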

  17. Statistical Analysis of RTS Noise and Low Frequency Noise in 1M MOSFETs Using an Advanced TEG

    NASA Astrophysics Data System (ADS)

    Abe, K.; Sugawa, S.; Watabe, S.; Miyamoto, N.; Teramoto, A.; Toita, M.; Kamata, Y.; Shibusawa, K.; Ohmi, T.

    2007-07-01

    In this paper, we developed an advanced Test Element Group (TEG) which can measure Random Telegraph Signal (RTS) noise in over 10^6 nMOSFETs of various gate sizes with high accuracy in a very short time. We measured and analyzed this noise statistically; as a result, we confirmed that the appearance probabilities and noise intensities of RTS in the TEG depend on gate size.

  18. Application of Advanced Magnetic Resonance Imaging Techniques in Evaluation of the Lower Extremity

    PubMed Central

    Braun, Hillary J.; Dragoo, Jason L.; Hargreaves, Brian A.; Levenston, Marc E.; Gold, Garry E.

    2012-01-01

    Synopsis: This article reviews current magnetic resonance imaging techniques for imaging the lower extremity, focusing on imaging of the knee, ankle, and hip joints. Recent advancements in MRI include imaging at 7 Tesla, using multiple receiver channels, T2* imaging, and metal suppression techniques, allowing more detailed visualization of complex anatomy, evaluation of morphological changes within articular cartilage, and imaging around orthopedic hardware. PMID:23622097

  19. Clinical decision support systems for brain tumor characterization using advanced magnetic resonance imaging techniques.

    PubMed

    Tsolaki, Evangelia; Kousi, Evanthia; Svolos, Patricia; Kapsalaki, Efthychia; Theodorou, Kyriaki; Kappas, Constastine; Tsougos, Ioannis

    2014-04-28

    In recent years, advanced magnetic resonance imaging (MRI) techniques, such as magnetic resonance spectroscopy, diffusion weighted imaging, diffusion tensor imaging and perfusion weighted imaging have been used in order to resolve demanding diagnostic problems such as brain tumor characterization and grading, as these techniques offer a more detailed and non-invasive evaluation of the area under study. In the last decade a great effort has been made to import and utilize intelligent systems in the so-called clinical decision support systems (CDSS) for automatic processing, classification, evaluation and representation of MRI data in order for advanced MRI techniques to become a part of the clinical routine, since the amount of data from the aforementioned techniques has gradually increased. Hence, the purpose of the current review article is two-fold. The first is to review and evaluate the progress that has been made towards the utilization of CDSS based on data from advanced MRI techniques. The second is to analyze and propose the future work that has to be done, based on the existing problems and challenges, especially taking into account the new imaging techniques and parameters that can be introduced into intelligent systems to significantly improve their diagnostic specificity and clinical application.

  20. Softform for facial rejuvenation: historical review, operative techniques, and recent advances.

    PubMed

    Miller, P J; Levine, J; Ahn, M S; Maas, C S; Constantinides, M

    2000-01-01

    The deep nasolabial fold and other facial furrows and wrinkles have challenged the facial plastic surgeon. A variety of techniques have been used in the past to correct these troublesome defects. Advances in the last five years in new materials and design have created a subcutaneous implant that has excellent properties. This article reviews the development and use of Softform facial implant.

  1. Traditional Materials and Techniques Used as Instructional Devices in an Advanced Business Spanish Conversation Class.

    ERIC Educational Resources Information Center

    Valdivieso, Jorge

    Spanish language training at the Thunderbird Graduate School of International Management is discussed, focusing on the instructional materials and classroom techniques used in advanced Spanish conversation classes. While traditional materials (dialogues, dictation, literature, mass media, video- and audiotapes) and learning activities (recitation,…

  2. Recognizing and Managing Complexity: Teaching Advanced Programming Concepts and Techniques Using the Zebra Puzzle

    ERIC Educational Resources Information Center

    Crabtree, John; Zhang, Xihui

    2015-01-01

    Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…

  3. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  4. Fabrication of advanced electrochemical energy materials using sol-gel processing techniques

    NASA Technical Reports Server (NTRS)

    Chu, C. T.; Chu, Jay; Zheng, Haixing

    1995-01-01

    Advanced materials play an important role in electrochemical energy devices such as batteries, fuel cells, and electrochemical capacitors. They are being used as both electrodes and electrolytes. Sol-gel processing is a versatile solution technique used in the fabrication of ceramic materials with tailored stoichiometry, microstructure, and properties. The application of sol-gel processing in the fabrication of advanced electrochemical energy materials will be presented. The potential of sol-gel derived materials for electrochemical energy applications will be discussed along with some examples of successful applications. Sol-gel derived metal oxide electrode materials such as V2O5 cathodes have been demonstrated in solid-state thin film batteries; solid electrolyte materials such as beta-alumina for advanced secondary batteries were prepared by the sol-gel technique long ago; and high surface area transition metal compounds for capacitive energy storage applications can also be synthesized with this method.

  5. Detection and Sizing of Fatigue Cracks in Steel Welds with Advanced Eddy Current Techniques

    NASA Astrophysics Data System (ADS)

    Todorov, E. I.; Mohr, W. C.; Lozev, M. G.

    2008-02-01

    Butt-welded specimens were fatigued to produce cracks in the weld heat-affected zone. Advanced eddy current (AEC) techniques were used to detect and size the cracks through a coating. AEC results were compared with magnetic particle and phased-array ultrasonic techniques. Validation through destructive crack measurements was also conducted. Factors such as geometry, surface treatment, and crack tightness interfered with depth sizing. AEC inspection techniques have the potential to provide more accurate and complete flaw sizing data for manufacturing and in-service inspections.

  6. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
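
    A minimal sketch of the approach described above, with invented correlation times and variances standing in for the paper's Allan-variance-derived parameters: five first-order Gauss-Markov (AR(1)) processes are simulated and summed into a frequency-error series, which is then integrated into a range-like error.

    ```python
    # Hedged sketch (not the paper's code): a clock-noise spectrum approximated
    # by a sum of five first-order Gauss-Markov (AR(1)) processes.
    # The correlation times and variances below are illustrative placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    dt = 1.0                                      # sample interval, s
    n = 10_000                                    # number of samples
    taus = np.array([1e1, 1e2, 1e3, 1e4, 1e5])    # correlation times, s (assumed)
    sigmas = np.array([1.0, 0.5, 0.3, 0.2, 0.1])  # process std devs (assumed)

    def gauss_markov(tau, sigma, n, dt, rng):
        """Simulate a first-order Gauss-Markov process x[k+1] = phi*x[k] + w[k]."""
        phi = np.exp(-dt / tau)
        w_std = sigma * np.sqrt(1.0 - phi**2)     # keeps stationary variance sigma^2
        x = np.empty(n)
        x[0] = sigma * rng.standard_normal()
        for k in range(n - 1):
            x[k + 1] = phi * x[k] + w_std * rng.standard_normal()
        return x

    # The modeled frequency error is the sum of the component processes;
    # integrating it gives a range-like (phase) error.
    freq_error = sum(gauss_markov(t, s, n, dt, rng) for t, s in zip(taus, sigmas))
    range_error = np.cumsum(freq_error) * dt
    print(freq_error.std(), range_error[-1])
    ```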

  7. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  8. Advanced imaging techniques for assessment of structure, composition and function in biofilm systems.

    PubMed

    Neu, Thomas R; Manz, Bertram; Volke, Frank; Dynes, James J; Hitchcock, Adam P; Lawrence, John R

    2010-04-01

    Scientific imaging represents an important and accepted research tool for the analysis and understanding of complex natural systems. Apart from traditional microscopic techniques such as light and electron microscopy, new advanced techniques have been established, including laser scanning microscopy (LSM), magnetic resonance imaging (MRI) and scanning transmission X-ray microscopy (STXM). These new techniques allow in situ analysis of the structure, composition, processes and dynamics of microbial communities. The three techniques open up quantitative analytical imaging possibilities that were, until a few years ago, impossible. The microscopic techniques represent powerful tools for examination of the mixed environmental microbial communities usually encountered in the form of aggregates and films. As a consequence, LSM, MRI and STXM are being used to study complex microbial biofilm systems. This mini-review provides a short outline of the more recent applications, with the intention of stimulating new research and imaging approaches in microbiology.

  9. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining the long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques that could meet, or could be modified to meet, the requirements. Areas of refinement or change were recommended for the improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  10. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
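
    Since the abstract names binary logistic regression among the covered techniques, a minimal, self-contained illustration on synthetic data (none of it from the book or its CD-ROM) may help; the statsmodels library is assumed to be available.

    ```python
    # Binary logistic regression on synthetic data: fit, inspect coefficients,
    # and convert them to odds ratios, the usual interpretation aid.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.normal(size=(200, 2))                        # two predictors
    logit_p = 0.5 + 1.2 * x[:, 0] - 0.8 * x[:, 1]        # true linear predictor
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))  # binary outcome

    model = sm.Logit(y, sm.add_constant(x)).fit(disp=False)
    print(model.summary())        # coefficients, standard errors, z-tests
    print(np.exp(model.params))   # odds ratios
    ```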

  11. STATISTICAL TECHNIQUES FOR DETERMINATION AND PREDICTION OF FUNDAMENTAL FISH ASSEMBLAGES OF THE MID-ATLANTIC HIGHLANDS

    EPA Science Inventory

    A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...

  12. Theory and analysis of statistical discriminant techniques as applied to remote sensing data

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1973-01-01

    Classification of remote earth resources sensing data according to normed exponential density statistics is reported. The use of density models appropriate for several physical situations provides an exact solution for the probabilities of classifications associated with the Bayes discriminant procedure even when the covariance matrices are unequal.
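
    For Gaussian classes with unequal covariance matrices, the Bayes discriminant procedure described above is quadratic discriminant analysis. The following sketch illustrates this with two synthetic "spectral band" features; the class means and covariances are invented.

    ```python
    # Quadratic discriminant analysis = Bayes rule for Gaussian classes with
    # unequal covariance matrices. Synthetic two-band "remote sensing" data.
    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    rng = np.random.default_rng(2)
    class0 = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 0.5]], size=300)
    class1 = rng.multivariate_normal([2, 1], [[0.4, -0.2], [-0.2, 1.5]], size=300)
    X = np.vstack([class0, class1])
    y = np.repeat([0, 1], 300)

    qda = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)
    print(qda.score(X, y))            # training accuracy
    print(qda.predict_proba(X[:3]))   # posterior class probabilities
    ```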

  13. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
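
    A brief sketch of the setting the article addresses, with all data synthetic: many collinear predictors, few observations, a random forest fit, and permutation importances showing how predictive weight spreads across the collinear set. Sample sizes and noise levels are illustrative assumptions.

    ```python
    # Random forest regression with n = 60 observations and 20 predictors
    # that are collinear by construction; permutation importance then shows
    # the importance shared across the correlated predictors.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(3)
    n, p = 60, 20
    base = rng.normal(size=(n, 1))               # one latent ability factor
    X = base + 0.3 * rng.normal(size=(n, p))     # collinear predictors
    y = base.ravel() + 0.5 * rng.normal(size=n)  # outcome driven by the factor

    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
    imp = permutation_importance(rf, X, y, n_repeats=30, random_state=0)
    print(imp.importances_mean.round(3))
    ```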

  14. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  15. Nondestructive Characterization by Advanced Synchrotron Light Techniques: Spectromicroscopy and Coherent Radiology

    PubMed Central

    Margaritondo, Giorgio; Hwu, Yeukuang; Je, Jung Ho

    2008-01-01

    The advanced characteristics of synchrotron light have led in recent years to the development of a series of new experimental techniques to investigate chemical and physical properties on a microscopic scale. Although originally developed for materials science and biomedical research, such techniques find increasing applications in other domains and could be quite useful for the study and conservation of cultural heritage. Specifically, they can nondestructively provide detailed chemical composition information that can be useful for the identification of specimens, for the discovery of historical links based on the sources of chemical raw materials and on chemical processes, for the analysis of damage, its causes and remedies, and for many other issues. Likewise, morphological and structural information on a microscopic scale is useful for the identification, study and preservation of many different cultural and historical specimens. We concentrate here on two classes of techniques. The first is photoemission spectromicroscopy, the result of the advanced evolution of photoemission techniques like ESCA (Electron Spectroscopy for Chemical Analysis). By combining high lateral resolution with spectroscopy, photoemission spectromicroscopy can deliver fine chemical information on a microscopic scale in a nondestructive fashion. The second class of techniques exploits the high lateral coherence of modern synchrotron sources, a byproduct of the quest for high brightness or brilliance. We will see that such techniques now push radiology into the submicron scale and the submillisecond time domain. Furthermore, they can be implemented in a tomographic mode, increasing the information and becoming potentially quite useful for the analysis of cultural heritage specimens.

  16. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation imaging. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  17. Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance

    NASA Technical Reports Server (NTRS)

    Ladbury, R.

    2010-01-01

    We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.

  18. Under the influence of Malthus's law of population growth: Darwin eschews the statistical techniques of Adolphe Quetelet.

    PubMed

    Ariew, André

    2007-03-01

    Charles Darwin, James Clerk Maxwell, and Francis Galton were all aware, by various means, of Adolphe Quetelet's pioneering work in statistics. Darwin, Maxwell, and Galton all had reason to be interested in Quetelet's work: they were all working on some instance of how large-scale regularities emerge from individual events that vary from one another; all were rejecting the divine interventionist theories of their contemporaries; and Quetelet's techniques provided them with a way forward. Maxwell and Galton both explicitly endorse Quetelet's techniques in their work; Darwin does not incorporate any of the statistical ideas of Quetelet, although natural selection after the twentieth-century synthesis has. Why not Darwin? My answer is that by the time Darwin encountered Malthus's law of excess reproduction he had all he needed to reason about large-scale regularities in extinctions, speciation, and adaptation. He didn't need Quetelet.

  19. Statistical Technique for Intermediate and Long-Range Estimation of 13-Month Smoothed Solar Flux and Geomagnetic Index

    NASA Technical Reports Server (NTRS)

    Niehuss, K. O.; Euler, H. C., Jr.; Vaughan, W. W.

    1996-01-01

    This report documents the Marshall Space Flight Center (MSFC) 13-month smoothed solar flux (F(sub 10.7)) and geomagnetic index (A(sub p)) intermediate (months) and long-range (years) statistical estimation technique, referred to as the MSFC Lagrangian Linear Regression Technique (MLLRT). Estimates of future solar activity are needed as updated input to upper atmosphere density models used for satellite and spacecraft orbital lifetime predictions. An assessment of the MLLRT computer program's products is provided for 5-year periods from the date estimates were made. This was accomplished for a number of past solar cycles.
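
    The MLLRT itself is not reproduced here, but the 13-month smoothed index it estimates is conventionally a centered 13-point running mean with half-weight endpoints. A sketch under that assumption, applied to a toy monthly F10.7 series:

    ```python
    # Standard 13-month centered smoothing (half weights on the two end
    # months) applied to a synthetic monthly solar-flux series. Only the
    # smoothing convention is shown, not the MLLRT regression.
    import numpy as np

    def smooth_13_month(monthly):
        """Centered 13-point running mean with half weights on the ends."""
        monthly = np.asarray(monthly, dtype=float)
        weights = np.array([0.5] + [1.0] * 11 + [0.5]) / 12.0
        return np.convolve(monthly, weights, mode="valid")  # loses 6 months per end

    f107 = 70 + 80 * np.abs(np.sin(np.arange(132) * np.pi / 132))  # toy cycle
    print(smooth_13_month(f107)[:5].round(1))
    ```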

  20. A Multivariate Technique for Evaluating the Statistical Homogeneity of Jointed Rock Masses

    NASA Astrophysics Data System (ADS)

    Li, Yanyan; Wang, Qing; Chen, Jianping; Song, Shengyuan; Ruan, Yunkai; Zhang, Qi

    2015-09-01

    Various approaches have been developed for identifying statistically homogeneous regions or structural domains in a jointed rock mass based on joint orientations or other joint parameters; however, few studies have integrated both. In this paper, nine parameters are considered for this identification, namely: orientation, spacing, aperture, roughness, trace length, trace type, filling, groundwater condition, and weathering. A statistical parameter, the correlation coefficient, is used to quantify the degree of similarity between the joint parameters collected from four adjacent adits at the Songta dam site on the upper reaches of the Nu River in southwest China. Based on the analytic hierarchy process, the weights of the parameters are obtained. The overall homogeneity of the rock mass around the studied regions is determined based on the homogeneity index.
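
    A hedged sketch of the paper's core idea, with invented histograms, parameter subset, and AHP weights rather than the Songta data: correlate the frequency distributions of each joint parameter between two regions, then combine the per-parameter similarities with the AHP weights into a single homogeneity index.

    ```python
    # Weighted correlation-coefficient similarity between two sampling regions.
    # Histograms and AHP weights are illustrative placeholders.
    import numpy as np

    params = ["orientation", "spacing", "aperture", "roughness", "trace_length"]
    weights = np.array([0.35, 0.20, 0.15, 0.15, 0.15])   # assumed AHP weights

    rng = np.random.default_rng(4)
    bins = np.linspace(-3.5, 3.5, 11)                    # common bin edges
    region_a = {p: np.histogram(rng.normal(0.0, 1, 200), bins=bins)[0] for p in params}
    region_b = {p: np.histogram(rng.normal(0.1, 1, 200), bins=bins)[0] for p in params}

    def corr(u, v):
        """Pearson correlation between two frequency vectors."""
        return np.corrcoef(u, v)[0, 1]

    similarities = np.array([corr(region_a[p], region_b[p]) for p in params])
    homogeneity_index = float(weights @ similarities)
    print(dict(zip(params, similarities.round(2))), round(homogeneity_index, 2))
    ```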

  1. Development of low-cost test techniques for advancing film cooling technology

    NASA Astrophysics Data System (ADS)

    Soechting, F. O.; Landis, K. K.; Dobrowolski, R.

    1987-06-01

    A program for studying advanced film hole geometries that will provide improved film effectiveness levels relative to those reported in the literature is described. A planar wind tunnel was used to conduct flow visualization studies on different film hole shapes, followed by film effectiveness measurements. The most promising geometries were then tested in a two-dimensional cascade to define the film effectiveness distributions, while duplicating a turbine airfoil curvature, Mach number, and acceleration characteristics. The test techniques are assessed and typical results are presented. It was shown that smoke flow visualization is an excellent low-cost technique for observing film coolant-to-mainstream characteristics and that reusable liquid crystal sheets provide an accurate low-cost technique for measuring near-hole film effectiveness contours. Cascade airfoils constructed using specially developed precision fabrication techniques provided high-quality film effectiveness data.

  2. Advances in the surface modification techniques of bone-related implants for last 10 years

    PubMed Central

    Qiu, Zhi-Ye; Chen, Cen; Wang, Xiu-Mei; Lee, In-Seop

    2014-01-01

    At the time of implanting bone-related implants into the human body, a variety of biological responses to the material surface occur, depending on the surface chemistry and physical state. The commonly used biomaterials (e.g. titanium and its alloys, Co–Cr alloy, stainless steel, polyetheretherketone, ultra-high molecular weight polyethylene and various calcium phosphates) have many drawbacks, such as lack of biocompatibility and improper mechanical properties. As surface modification is a very promising technology to overcome such problems, a variety of surface modification techniques have been investigated. This review paper covers recent advances in surface modification techniques for bone-related materials, including physicochemical coating, radiation grafting, plasma surface engineering, ion beam processing and surface patterning techniques. The contents are organized by technique type and applicable materials, and typical examples are also described. PMID:26816626

  3. Event Screening Using a Cepstral F-Statistic Technique to Identify Depth Phases

    NASA Astrophysics Data System (ADS)

    Bonner, J. L.; Reiter, D. T.; Shumway, R. H.

    2001-05-01

    The depth of a seismic event is one of the most important criteria for screening events as either explosions or earthquakes. Unfortunately, depth is also notoriously difficult to determine accurately. Methods used to determine focal depth include waveform modeling, beamforming and cepstral methods for detecting depth phases such as pP and sP. To improve depth estimation using cepstral methods we focused on three primary goals: (1) formulating a method for determining the statistical significance of peaks in the cepstrum, (2) testing the method on synthetic data as well as earthquake data with well-determined hypocenters, and (3) evaluating the method as an operational analysis tool for determining event depths using varied datasets at both teleseismic and regional distances. We have formulated a cepstral F-statistic by using a classical approach to detecting a signal in a number of stationarily correlated time series. The method is particularly suited for regional array analysis; however, it can also be applied to three-component data. Tests on synthetic data show the method works best when the P wave arrival has a signal-to-noise ratio (SNR) greater than about 8 to 10, with the depth phase exhibiting an SNR greater than about 2 to 4. These SNR requirements were validated using events from the Hindu Kush region of Afghanistan with well-determined depths, as recorded on arrays at teleseismic distances. To test the operational capabilities of this method as a tool for event screening at a data center, we analyzed 61 events located by the pIDC and/or the National Earthquake Information Center (NEIC). Our method determined statistically significant depths for 41 of 61 events; 10 of the remaining events had low SNR at the recording arrays, while another 10 were either too shallow for analysis or did not exhibit depth phases. The method determined depths between 12 and 90 km for 7 of 17 events which the pIDC had fixed to 0 km. The scatter
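
    The array-based cepstral F-statistic is beyond a short sketch, but the cepstral idea underlying it is compact: an echo such as pP trailing P by tau seconds produces a peak at quefrency tau in the power cepstrum. A synthetic single-trace illustration with invented signal parameters:

    ```python
    # Power cepstrum of a synthetic P + delayed pP trace: the echo delay
    # appears as a peak at the corresponding quefrency. All values invented.
    import numpy as np

    fs = 40.0                           # sampling rate, Hz
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(5)
    wavelet = np.exp(-5 * t) * np.sin(2 * np.pi * 2 * t)   # crude P pulse
    tau = 4.0                                              # pP - P delay, s
    trace = wavelet + 0.6 * np.roll(wavelet, int(tau * fs))
    trace = trace + 0.05 * rng.standard_normal(t.size)     # additive noise

    # Power cepstrum: inverse FFT of the log power spectrum.
    spectrum = np.abs(np.fft.rfft(trace)) ** 2
    cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))
    quefrency = np.arange(cepstrum.size) / fs

    lo, hi = int(1 * fs), int(8 * fs)                      # search 1-8 s window
    peak = quefrency[lo + np.argmax(cepstrum[lo:hi])]
    print(f"cepstral peak near {peak:.2f} s (true delay {tau} s)")
    ```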

  4. Advanced semiconductor diagnosis by multidimensional electron-beam-induced current technique.

    PubMed

    Chen, J; Yuan, X; Sekiguchi, T

    2008-01-01

    We present advanced semiconductor diagnosis using the electron-beam-induced current (EBIC) technique. By varying parameters such as temperature, accelerating voltage (V(acc)), bias voltage, and stressing time, it is possible to extend EBIC application from conventional defect characterization to advanced device diagnosis. As an electron beam can excite a certain volume even beneath the surface passive layer, EBIC can be effectively employed to diagnose complicated devices with hybrid structures. Three topics were selected to demonstrate EBIC applications. First, the recombination activities of grain boundaries and their interaction with Fe impurity in photovoltaic multicrystalline Si (mc-Si) are clarified by temperature-dependent EBIC. Second, the detection of dislocations between strained-Si and the SiGe virtual substrate is shown to overcome the limitation of the depletion region. Third, the observation of leakage sites in high-k gate dielectrics is demonstrated for the characterization of advanced hybrid device structures.

  5. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that can be used in both laboratories and some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the setup procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate PUMA 500 industrial robot. The data were statistically analyzed to provide insight into the performance of the systems and components of the robot. The same technique was also used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots, such as the ASEA, the high-speed controller, and the Automated Radiator Inspection Device (ARID). Recommendations were also made to develop and establish other calibration techniques that would be more suitable for site calibration environments and robot certification.

  6. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique, from high-resolution electrochemical imaging via nanoscale probes to large-scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is the development of more sophisticated probes beyond conventional micro-disc electrodes, which are usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes, particularly those enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, the review concentrates on recent progress and new developments towards multifunctional imaging.

  7. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    1998-09-01

    Froth flotation is an effective and efficient process for the recovery of ultra-fine (minus 74 µm) clean coal. Economical dewatering of an ultra-fine clean-coal product to a 20% moisture level will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 36 months beginning September 30, 1994. This report discusses technical progress made during the quarter from July 1 - September 30, 1997.

  8. The investigation of advanced remote sensing techniques for the measurement of aerosol characteristics

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Becher, J.

    1979-01-01

    Advanced remote sensing techniques and inversion methods for the measurement of characteristics of aerosol and gaseous species in the atmosphere were investigated. Of particular interest were the physical and chemical properties of aerosols, such as their size distribution, number concentration, and complex refractive index, and the vertical distribution of these properties on a local as well as global scale. Remote sensing techniques for monitoring of tropospheric aerosols were developed as well as satellite monitoring of upper tropospheric and stratospheric aerosols. Computer programs were developed for solving multiple scattering and radiative transfer problems, as well as inversion/retrieval problems. A necessary aspect of these efforts was to develop models of aerosol properties.

  9. Advanced digital modulation: Communication techniques and monolithic GaAs technology

    NASA Technical Reports Server (NTRS)

    Wilson, S. G.; Oliver, J. D., Jr.; Kot, R. C.; Richards, C. R.

    1983-01-01

    Communications theory and practice are merged with state-of-the-art technology in IC fabrication, especially monolithic GaAs technology, to examine the general feasibility of a number of advanced technology digital transmission systems. Satellite-channel models with (1) superior throughput, perhaps 2 Gbps; (2) attractive weight and cost; and (3) high RF power and spectrum efficiency are discussed. Transmission techniques possessing reasonably simple architectures capable of monolithic fabrication at high speeds were surveyed. This included a review of amplitude/phase shift keying (APSK) techniques and the continuous-phase-modulation (CPM) methods, of which MSK represents the simplest case.

  10. Combined preputial advancement and phallopexy as a revision technique for treating paraphimosis in a dog.

    PubMed

    Wasik, S M; Wallace, A M

    2014-11-01

    A 7-year-old neutered male Jack Russell terrier-cross was presented for signs of recurrent paraphimosis, despite previous surgical enlargement of the preputial ostium. Revision surgery was performed using a combination of preputial advancement and phallopexy, which resulted in complete and permanent coverage of the glans penis by the prepuce, and at 1 year postoperatively, no recurrence of paraphimosis had been observed. The combined techniques allow preservation of the normal penile anatomy, are relatively simple to perform and provide a cosmetic result. We recommend this combination for the treatment of paraphimosis in the dog, particularly when other techniques have failed. PMID:25348145

  11. Development of advanced electron holographic techniques and application to industrial materials and devices.

    PubMed

    Yamamoto, Kazuo; Hirayama, Tsukasa; Tanji, Takayoshi

    2013-06-01

    The development of a transmission electron microscope equipped with a field emission gun paved the way for electron holography to be put to practical use in various fields. In this paper, we review three advanced electron holography techniques: on-line real-time electron holography, three-dimensional (3D) tomographic holography and phase-shifting electron holography, which are becoming important techniques for materials science and device engineering. We also describe some applications of electron holography to the analysis of industrial materials and devices: GaAs compound semiconductors, solid oxide fuel cells and all-solid-state lithium ion batteries.

  12. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition starts with a mission analysis to identify the high-level control system requirements and the functions necessary to satisfy the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies and, in particular, design-for-validation philosophies.

  13. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physics of degradation.

  14. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...

  15. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  16. Integrating Organic Matter Structure with Ecosystem Function using Advanced Analytical Chemistry Techniques

    NASA Astrophysics Data System (ADS)

    Boot, C. M.

    2012-12-01

    Microorganisms are the primary transformers of organic matter in terrestrial and aquatic ecosystems. The structure of organic matter controls its bioavailability, and researchers have long sought to link the chemical characteristics of the organic matter pool to its lability. To date this effort has relied primarily on low-resolution descriptive characteristics (e.g., organic matter content, carbon-to-nitrogen ratio, aromaticity). Recently, however, progress in linking these two important ecosystem components has been made using high-resolution tools (e.g., nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS)-based techniques). A series of experiments will be presented that highlight the application of high-resolution techniques in a variety of terrestrial and aquatic ecosystems, with the focus on how these data explicitly provide the foundation for integrating organic matter structure into our concept of ecosystem function. The talk will highlight results from a series of experiments including: an MS-based metabolomics and fluorescence excitation-emission matrix approach evaluating seasonal and vegetation-based changes in dissolved organic matter (DOM) composition from arctic soils; Fourier transform ion cyclotron resonance (FTICR) MS and MS metabolomics analysis of DOM from three lakes in an alpine watershed; and the transformation of 13C-labeled glucose tracked with NMR during a rewetting experiment on Colorado grassland soils. These data will be synthesized to illustrate how the application of advanced analytical techniques provides novel insight into our understanding of organic matter processing in a wide range of ecosystems.

  17. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    Groppo, J.G.; Parekh, B.K.; Rawls, P.

    1995-11-01

    Froth flotation is an effective and efficient process for the recovery of ultra-fine (minus 74 µm) clean coal. Economical dewatering of an ultra-fine clean coal product to a 20 percent moisture level will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20 percent or lower moisture using either conventional or advanced dewatering techniques. As the contract title suggests, the main focus of the program is on proof-of-concept testing of a dewatering technique for a fine clean coal product. The coal industry is reluctant to use the advanced fine coal recovery technology due to the non-availability of an economical dewatering process. In fact, in a recent survey conducted by U.S. DOE and Battelle, dewatering of fine clean coal was identified as the number one priority for the coal industry. This project will attempt to demonstrate an efficient and economical fine clean coal slurry dewatering process.

  18. Advancing the Science of Spatial Neglect Rehabilitation: An Improved Statistical Approach with Mixed Linear Modeling

    PubMed Central

    Goedert, Kelly M.; Boston, Raymond C.; Barrett, A. M.

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks, leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients may suffer from missing, unbalanced data and small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not address these issues well. Here we review an alternative, mixed linear modeling (MLM), that is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity, it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect. PMID
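
    A minimal sketch of the mixed-linear-model approach the authors advocate, on simulated recovery data with per-patient random intercepts and slopes; the variable names and effect sizes are invented, and the formula mirrors common statsmodels usage rather than the authors' exact models.

    ```python
    # Mixed linear model: fixed effect of time plus random intercept and
    # random slope per patient, capturing heterogeneous baselines and
    # recovery trajectories. Simulated data only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    rows = []
    for pid in range(20):                     # 20 simulated patients
        base = rng.normal(50, 10)             # heterogeneous baseline severity
        slope = rng.normal(-3, 1.5)           # heterogeneous recovery rate
        for week in range(6):
            rows.append({"patient": pid, "week": week,
                         "score": base + slope * week + rng.normal(0, 2)})
    df = pd.DataFrame(rows)

    model = smf.mixedlm("score ~ week", df, groups=df["patient"],
                        re_formula="~week").fit()
    print(model.summary())
    ```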

  19. Using Candy Samples to Learn about Sampling Techniques and Statistical Data Evaluation

    ERIC Educational Resources Information Center

    Canaes, Larissa S.; Brancalion, Marcel L.; Rossi, Adriana V.; Rath, Susanne

    2008-01-01

    A classroom exercise for undergraduate and beginning graduate students that takes about one class period is proposed and discussed. It is an easy, interesting exercise that demonstrates important aspects of sampling techniques (sample amount, particle size, and the representativeness of the sample in relation to the bulk material). The exercise…

  20. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypotheses pertaining to various aspects of geostatistical investigations has been slow to develop. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
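
    As one illustration of the flavor of such modifications (not Borgman's specific formulation), the sketch below deflates the effective sample size of a one-sample t test by the lag-1 autocorrelation, a common way of adapting the classical test to correlated data.

    ```python
    # One-sample t test with an autocorrelation-adjusted effective sample
    # size n_eff = n * (1 - r1) / (1 + r1), appropriate under an AR(1)-like
    # correlation structure. Illustrative, not a geostatistical production tool.
    import numpy as np
    from scipy import stats

    def autocorr_adjusted_t(x, mu0=0.0):
        x = np.asarray(x, dtype=float)
        n = x.size
        r1 = np.corrcoef(x[:-1], x[1:])[0, 1]      # lag-1 autocorrelation
        n_eff = n * (1 - r1) / (1 + r1)            # effective sample size
        t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n_eff))
        p = 2 * stats.t.sf(abs(t), df=max(n_eff - 1, 1))
        return t, p, n_eff

    rng = np.random.default_rng(7)
    z = rng.normal(size=200)
    series = np.convolve(z, np.ones(5) / 5, mode="valid") + 0.1  # correlated data
    print(autocorr_adjusted_t(series))
    ```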

  1. An assessment of water quality in the Coruh Basin (Turkey) using multivariate statistical techniques.

    PubMed

    Bilgin, Ayla

    2015-11-01

    The purpose of this study was to assess the impact of 24 water parameters, measured semi-annually between 2011 and 2013 in the Coruh Basin (Turkey), on water quality. The study utilised analysis of variance (ANOVA), principal component analysis (PCA) and factor analysis (FA) methods. The water-quality data were obtained from a total of four sites by the 26th Regional Directorate of the State Hydraulic Works (DSI). ANOVA was carried out to identify the differences between the parameters at the different measuring sites. The variables were classified using factor analysis, and at the end of the ANOVA test it was established that there was a statistically significant difference, in terms of water quality, between the downstream and upstream waste waters released by the Black Sea copper companies, while no statistically significant difference was observed between the Murgul and Borcka Dams. It was determined through factor analysis that five factors explained 81.3% of the total variance. It was concluded that domestic, industrial and agricultural activities, in combination with physicochemical properties, were factors affecting the quality of the water in the Coruh Basin. PMID:26514804
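
    A compact sketch of this workflow on invented data: a one-way ANOVA comparing a single parameter across four hypothetical sites, then PCA on a standardized samples-by-parameters matrix to see how much variance a few components explain.

    ```python
    # One-way ANOVA across sites for one parameter, then PCA on the full
    # standardized data matrix. Site labels and values are synthetic.
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(8)
    sites = {s: rng.normal(loc, 1.0, size=12)      # e.g., semi-annual BOD values
             for s, loc in zip("ABCD", [5.0, 5.2, 8.0, 8.1])}
    print(stats.f_oneway(*sites.values()))         # do the sites differ?

    X = rng.normal(size=(48, 24))                  # 48 samples x 24 parameters
    X[:, :5] += 2 * rng.normal(size=(48, 1))       # shared "pollution" factor
    pca = PCA(n_components=5).fit(StandardScaler().fit_transform(X))
    print(pca.explained_variance_ratio_.cumsum().round(2))
    ```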

  2. Advanced Time-Resolved Fluorescence Microscopy Techniques for the Investigation of Peptide Self-Assembly

    NASA Astrophysics Data System (ADS)

    Anthony, Neil R.

    The ubiquitous cross-beta sheet peptide motif is implicated in numerous neurodegenerative diseases, while at the same time it offers remarkable potential for constructing isomorphic high-performance bionanomaterials. Despite an emerging understanding of the complex folding landscape of cross-beta structures in determining disease etiology and final structure, we lack knowledge of the critical initial stages of nucleation and growth. In this dissertation, I advance our understanding of these key stages in the cross-beta nucleation and growth pathways using cutting-edge microscopy techniques. In addition, I present a new combined time-resolved fluorescence analysis technique with the potential to advance our current understanding of the subtle molecular-level interactions that play a pivotal role in peptide self-assembly. Using the central nucleating core of Alzheimer's amyloid-beta protein, Abeta(16-22), as a model system, and utilizing electron, time-resolved, and non-linear microscopy, I capture the initial and transient nucleation stages of peptide assembly into the cross-beta motif. In addition, I have characterized the nucleation pathway, from monomer to paracrystalline nanotubes, in terms of morphology and fluorescence lifetime, corroborating the predicted desolvation process that occurs prior to cross-beta nucleation. Concurrently, I have identified unique heterogeneous cross-beta domains contained within individual nanotube structures, which have potential bionanomaterials applications. Finally, I describe a combined fluorescence theory and analysis technique that dramatically increases the sensitivity of current time-resolved techniques. Together these studies demonstrate the potential of advanced microscopy techniques for the identification and characterization of the cross-beta folding pathway, which will further our understanding of both amyloidogenesis and bionanomaterials.

  3. Chorus wave-normal statistics in the Earth's radiation belts from ray tracing technique

    NASA Astrophysics Data System (ADS)

    Breuillard, H.; Zaliznyak, Y.; Krasnoselskikh, V.; Agapitov, O.; Artemyev, A.; Rolland, G.

    2012-08-01

    Discrete ELF/VLF (extremely low frequency/very low frequency) chorus emissions are among the most intense electromagnetic plasma waves observed in the radiation belts and in the outer terrestrial magnetosphere. These waves play a crucial role in the dynamics of radiation belts, and are responsible for the loss and the acceleration of energetic electrons. The objective of our study is to reconstruct the realistic distribution of chorus wave-normals in the radiation belts for all magnetic latitudes. To achieve this aim, data from the electric and magnetic field measurements onboard the Cluster satellite are used to determine the wave-vector distribution of the chorus signal around the equator region. The propagation of such a wave packet is then modeled using a three-dimensional ray tracing technique, which employs K. Rönnmark's WHAMP to solve the hot plasma dispersion relation along the wave packet trajectory. The observed chorus wave distributions close to the wave source are first fitted to form the initial conditions, which are then propagated numerically through the inner magnetosphere in the frame of the WKB approximation. The ray tracing technique allows one to reconstruct wave packet properties (electric and magnetic fields, width of the wave packet in k-space, etc.) along the propagation path. The calculations show the spatial spreading of the signal energy due to propagation in the inhomogeneous and anisotropic magnetized plasma. Comparison of the wave-normal distribution obtained from the ray tracing technique with Cluster observations up to 40° latitude demonstrates the reliability of our approach and the applied numerical schemes.

  4. A statistical technique for determining rainfall over land employing Nimbus-6 ESMR measurements

    NASA Technical Reports Server (NTRS)

    Rodgers, E.; Siddalingaiah, H.; Chang, A. T. C.; Wilheit, T. T.

    1978-01-01

    At 37 GHz, the frequency at which the Nimbus 6 Electrically Scanning Microwave Radiometer (ESMR 6) measures upwelling radiance, it was shown theoretically that the atmospheric scattering and the relative polarization independence of the radiances emerging from hydrometeors make it possible to remotely monitor active rainfall over land. In order to verify these theoretical findings experimentally and to develop an algorithm to monitor rainfall over land, the digitized ESMR 6 measurements were examined statistically. Horizontally and vertically polarized brightness temperature pairs (TH, TV) from ESMR 6 were sampled for areas of rainfall over land, as determined from the rain recording stations and the WSR 57 radar, and for areas of wet and dry ground (whose thermodynamic temperatures were greater than 5 C) over the southeastern United States. These three categories of brightness temperatures were found to be significantly different, in the sense that the chances that the mean vectors of any two populations coincided were less than 1 in 100.

  5. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    PubMed

    Benson, Nsikak U; Asuquo, Francis E; Williams, Akan B; Essien, Joseph P; Ekong, Cyril I; Akpabio, Otobong; Olajire, Abaas A

    2016-01-01

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934

  6. A statistical comparison of two carbon fiber/epoxy fabrication techniques

    NASA Technical Reports Server (NTRS)

    Hodge, A. J.

    1991-01-01

    A statistical comparison of the compression strengths of specimens fabricated by either a platen press or an autoclave was performed on IM6/3501-6 carbon/epoxy composites of 16-ply (0, +45, 90, -45)S2 lay-up configuration. The samples were cured with the same parameters and processing materials. It was found that the autoclaved panels were thicker than the platen-press-cured samples. Two hundred samples of each type of cure process were compression tested. The autoclaved samples had an average strength of 450 MPa (65.5 ksi), while the press-cured samples had an average strength of 370 MPa (54.0 ksi). A Weibull analysis of the data showed that there is only a 30% probability that the two types of cure systems yield specimens that can be considered from the same family.
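
    A sketch of a two-parameter Weibull strength analysis of this kind on simulated strengths (MPa); the Weibull modulus and sample sizes are invented, and the location parameter is fixed at zero, as is usual for strength data.

    ```python
    # Maximum-likelihood fits of a two-parameter Weibull distribution to two
    # simulated strength populations, reporting modulus and characteristic
    # strength for each. Values are illustrative, not the report's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    autoclave = stats.weibull_min.rvs(c=12, scale=450, size=200, random_state=rng)
    press = stats.weibull_min.rvs(c=12, scale=370, size=200, random_state=rng)

    for name, data in [("autoclave", autoclave), ("press", press)]:
        shape, loc, scale = stats.weibull_min.fit(data, floc=0)
        print(f"{name}: Weibull modulus = {shape:.1f}, "
              f"characteristic strength = {scale:.0f} MPa")
    ```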

  7. Discrimination of nylon polymers using attenuated total reflection mid-infrared spectra and multivariate statistical techniques.

    PubMed

    Enlow, Elizabeth M; Kennedy, Jennifer L; Nieuwland, Alexander A; Hendrix, James E; Morgan, Stephen L

    2005-08-01

    Nylons are an important class of synthetic polymers, from an industrial as well as a forensic perspective. A spectroscopic method, such as Fourier transform infrared (FT-IR) spectroscopy, is necessary to determine the nylon subclass (e.g., nylon 6 or nylon 6,6). Library searching using absolute difference and absolute derivative difference algorithms gives inconsistent results for identifying nylon subclasses. The objective of this study was to evaluate the usefulness of peak ratio analysis and multivariate statistics for the identification of nylon subclasses using attenuated total reflection (ATR) spectral data. Many nylon subclasses could not be distinguished by the ratio of the N-H vibrational stretch to the sp3 C-H2 vibrational stretch intensities. Linear discriminant analysis, however, provided a graphical visualization of the differences between nylon subclasses and was able to correctly classify a set of 270 spectra from eight different subclasses with 98.5% cross-validated accuracy.
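
    A sketch of the classification workflow on synthetic stand-ins for the ATR FT-IR spectra (eight invented classes with Gaussian structure): linear discriminant analysis scored by leave-one-out cross-validation, mirroring the kind of cross-validated accuracy reported above.

    ```python
    # LDA with leave-one-out cross-validation on synthetic "spectra":
    # eight classes, 30 samples each, 100 spectral points per sample.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(10)
    n_classes, per_class, n_points = 8, 30, 100
    centers = rng.normal(size=(n_classes, n_points))
    X = np.vstack([c + 0.4 * rng.normal(size=(per_class, n_points))
                   for c in centers])
    y = np.repeat(np.arange(n_classes), per_class)

    scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
    print(f"cross-validated accuracy: {scores.mean():.3f}")
    ```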

  8. Sensitivity of Statistical Downscaling Techniques to Reanalysis Choice and Implications for Regional Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Manzanas, R., Sr.; Brands, S.; San Martin, D., Sr.; Gutiérrez, J. M., Sr.

    2014-12-01

    This work shows that local-scale climate projections obtained by means of statistical downscaling are sensitive to the choice of reanalysis used for calibration. To this aim, a generalized linear model (GLM) approach is applied to downscale daily precipitation in the Philippines. First, the GLMs are trained and tested, under a cross-validation scheme, separately for two distinct reanalyses (ERA-Interim and JRA-25) for the period 1981-2000. When the observed and downscaled time series are compared, the attained performance is found to be sensitive to the reanalysis considered if climate-change-signal-bearing variables (temperature and/or specific humidity) are included in the predictor field. Moreover, the performance differences are shown to correspond with the disagreement found between the raw predictors from the two reanalyses. Second, the regression coefficients calibrated with either ERA-Interim or JRA-25 are subsequently applied to the output of a global climate model (MPI-ECHAM5) in order to assess the sensitivity of local-scale climate change projections (up to 2100) to reanalysis choice. In this case, the differences detected in present climate conditions are considerably amplified, leading to "delta-change" estimates differing by up to 35% (on average for the entire country) depending on the reanalysis used for calibration. Reanalysis choice is therefore shown to contribute importantly to the uncertainty of local-scale climate change projections and, consequently, should be treated with the same care as other well-known sources of uncertainty, e.g., the choice of the GCM and/or downscaling method. Implications of the results for the entire tropics, as well as for the Model Output Statistics downscaling approach, are also briefly discussed.
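
    Daily-precipitation GLM downscaling is commonly implemented as a two-part model: a binomial GLM for wet/dry occurrence plus a Gamma GLM with log link for wet-day amounts. The sketch below follows that convention on synthetic predictors; it is not the authors' exact specification.

    ```python
    # Two-part precipitation GLM: logistic occurrence model plus Gamma
    # amounts model. Predictors imitate standardized reanalysis variables.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 2000
    Z = sm.add_constant(rng.normal(size=(n, 3)))     # reanalysis-like predictors
    b_occ = np.array([0.2, 0.8, -0.5, 0.3])
    b_amt = np.array([1.0, 0.4, 0.1, -0.2])
    occ = rng.binomial(1, 1 / (1 + np.exp(-(Z @ b_occ))))
    amounts = rng.gamma(shape=2.0, scale=np.exp(Z @ b_amt) / 2.0)  # mean exp(Zb)

    occurrence = sm.GLM(occ, Z, family=sm.families.Binomial()).fit()
    wet = occ == 1
    amount = sm.GLM(amounts[wet], Z[wet],
                    family=sm.families.Gamma(link=sm.families.links.Log())).fit()
    print(occurrence.params.round(2), amount.params.round(2))
    ```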

  9. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  10. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    Preloading was used as an accelerated testing methodology in constant stress-rate ('dynamic fatigue') testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates, in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, a SiC whisker-reinforced silicon nitride composite and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism associated with such a phenomenon of considerable strength increase or decrease.

  15. Applications of Advanced Nondestructive Measurement Techniques to Address Safety of Flight Issues on NASA Spacecraft

    NASA Technical Reports Server (NTRS)

    Prosser, Bill

    2016-01-01

    Advanced nondestructive measurement techniques are critical for ensuring the reliability and safety of NASA spacecraft. Techniques such as infrared thermography, THz imaging, X-ray computed tomography, and backscatter X-ray are used to detect indications of damage in spacecraft components and structures. Additionally, sensor and measurement systems are integrated into spacecraft to provide structural health monitoring that detects damaging events during flight, such as debris impacts during launch and ascent, micrometeoroid and orbital debris strikes, or excessive loading due to anomalous flight conditions. A number of examples are provided of how these nondestructive measurement techniques have been applied to resolve safety-critical inspection concerns for the Space Shuttle, the International Space Station (ISS), and a variety of launch vehicles and unmanned spacecraft.

  16. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    NASA Astrophysics Data System (ADS)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
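
    The bilateral filter singled out above can be sketched as follows for a strain field resampled on a regular grid; the actual tool operates on a triangulated irregular network, and all parameter values here are illustrative.

      import numpy as np

      def bilateral_filter(field, sigma_s=2.0, sigma_r=0.1, radius=3):
          # Edge-preserving smoothing of a 2D float strain field: each value is
          # replaced by a neighbourhood average weighted both by spatial distance
          # (sigma_s, in pixels) and by strain difference (sigma_r), so noise is
          # suppressed while sharp discontinuities at cracks are preserved.
          out = np.empty_like(field)
          rows, cols = field.shape
          for i in range(rows):
              for j in range(cols):
                  i0, i1 = max(i - radius, 0), min(i + radius + 1, rows)
                  j0, j1 = max(j - radius, 0), min(j + radius + 1, cols)
                  patch = field[i0:i1, j0:j1]
                  yy, xx = np.mgrid[i0:i1, j0:j1]
                  w = np.exp(-((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma_s ** 2)
                             - (patch - field[i, j]) ** 2 / (2 * sigma_r ** 2))
                  out[i, j] = (w * patch).sum() / w.sum()
          return out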

  17. Techniques for Statistically Scrutinizing Stochastic Model Assumptions Using a Single Noisily Measured Trajectory

    NASA Astrophysics Data System (ADS)

    Calderon, Christopher

    2014-03-01

    The increased spatial and temporal resolution afforded by recent single-molecule experiments has inspired researchers to consider new techniques for quantifying molecular-level kinetics. Many researchers have contributed methods for improving the quality of estimators characterizing single-molecule kinetics; however, techniques for checking the consistency of the implicit distributional assumptions behind an assumed stochastic model against a single experimental trajectory are under-developed. In this talk, likelihood-based goodness-of-fit testing and other model-based hypothesis tests accounting for the complexities of single-molecule trajectory analysis (heterogeneity, transient kinetic regime shifts, measurement noise, etc.) are discussed. The utility of the testing procedures is demonstrated on (i) single particle tracking (SPT) experiments characterizing mRNA motion in the cytoplasm of yeast cells and (ii) protein kinetics in the primary cilium of mammalian cells. In both cases, the testing procedures facilitated the discovery of new kinetic signatures of molecular-motor-facilitated transport not accounted for in traditional SPT models. NSF SBIR Award #:1314897.
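
    As a toy version of such a consistency check, the sketch below tests one noisy 1D trajectory against a pure-diffusion null model; the procedures in the talk additionally handle measurement noise, heterogeneity, and regime shifts, which this sketch does not.

      import numpy as np
      from scipy import stats

      def diffusion_gof(traj, dt):
          # Under pure diffusion, increments are i.i.d. Gaussian with variance
          # 2*D*dt. Estimate D, standardize the increments, and KS-test them
          # against N(0, 1). Because D is estimated from the same data, the
          # p-value is only approximate; a parametric bootstrap would calibrate
          # the test properly.
          dx = np.diff(traj)
          d_hat = dx.var() / (2.0 * dt)
          z = dx / np.sqrt(2.0 * d_hat * dt)
          return d_hat, stats.kstest(z, "norm")

      rng = np.random.default_rng(1)
      path = np.cumsum(rng.normal(0.0, np.sqrt(2 * 0.5 * 0.01), 2000))  # D = 0.5
      print(diffusion_gof(path, dt=0.01))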

  18. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored either by in vivo sampling coupled to analytical methods or by implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages in spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of detectable compounds. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single-cell and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in their application to catecholamines, indoleamines, and amino acids have been prominent, and improvements in the stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  19. Improved equilibrium reconstructions by advanced statistical weighting of the internal magnetic measurements

    NASA Astrophysics Data System (ADS)

    Murari, A.; Gelfusa, M.; Peluso, E.; Gaudio, P.; Mazon, D.; Hawkes, N.; Point, G.; Alper, B.; Eich, T.

    2014-12-01

    In a tokamak, the configuration of the magnetic fields remains the key element in improving performance and maximising the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally, in the least-squares minimisation phase of the algorithms used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method of choosing the weights given to the internal measurements of the magnetic fields for improved equilibrium reconstructions is presented in this paper. The approach is based on various statistical indicators applied to the residuals, the differences between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since an inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe.
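
    A minimal sketch of the underlying idea, for a generic linear reconstruction problem: weights are not fixed equal but re-estimated per diagnostic from the statistics of its own residuals. The grouping and iteration scheme below are illustrative assumptions, not the paper's algorithm.

      import numpy as np

      def reweighted_fit(A, b, groups, n_iter=5):
          # Solve b ~ A @ x where each measurement belongs to a diagnostic
          # identified by 'groups'. Each diagnostic's weight is iteratively set
          # to the inverse variance of its own residuals, so a diagnostic with
          # few but highly informative channels (e.g. polarimetry) is not
          # drowned out by diagnostics with many channels.
          groups = np.asarray(groups)
          w = np.ones(len(b))
          for _ in range(n_iter):
              sw = np.sqrt(w)
              x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
              r = b - A @ x
              for g in np.unique(groups):
                  m = groups == g
                  w[m] = 1.0 / max(r[m].var(), 1e-12)
          return x, w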

  20. STEAM - Statistical Template Estimation for Abnormality Mapping: A personalized DTI analysis technique with applications to the screening of preterm infants.

    PubMed

    Booth, Brian G; Miller, Steven P; Brown, Colin J; Poskitt, Kenneth J; Chau, Vann; Grunau, Ruth E; Synnes, Anne R; Hamarneh, Ghassan

    2016-01-15

    We introduce the STEAM DTI analysis engine: a whole-brain voxel-based analysis technique for the examination of diffusion tensor images (DTIs). Our STEAM analysis technique consists of two parts. First, we introduce a collection of statistical templates that represent the distribution of DTIs for a normative population. These templates include various diffusion measures, from the full tensor, to fractional anisotropy, to 12 other tensor features. Second, we propose a voxel-based analysis (VBA) pipeline that is reliable enough to identify areas in individual DTI scans that differ significantly from the normative group represented in the STEAM statistical templates. We identify and justify choices in the VBA pipeline relating to multiple comparison correction, image smoothing, and dealing with non-normally distributed data. Finally, we provide a proof of concept for the utility of STEAM on a cohort of 134 very preterm infants. We generated templates from scans of 55 very preterm infants whose T1 MRI scans show no abnormalities and who have normal neurodevelopmental outcome. The remaining 79 infants were then compared to the templates using our VBA technique. We show: (a) that our statistical templates display the white matter development expected over the modeled time period, and (b) that our VBA results detect abnormalities in the diffusion measurements that relate significantly to both the presence of white matter lesions and neurodevelopmental outcomes at 18 months. Most notably, we show that STEAM produces personalized results while also being able to highlight abnormalities across the whole brain and at the scale of individual voxels. While we show the value of STEAM on DTI scans from a preterm infant cohort, STEAM can be equally applied to other cohorts as well. To facilitate this whole-brain personalized DTI analysis, we made STEAM publicly available at http://www.sfu.ca/bgb2/steam. PMID:26515903
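
    The voxel-wise comparison at the heart of such a pipeline can be caricatured in a few lines: z-scores against the normative mean and standard deviation, followed by Benjamini-Hochberg false-discovery-rate control over all voxels. This is a generic sketch, not the STEAM code itself, which also handles smoothing and non-normal data.

      import numpy as np
      from scipy.stats import norm

      def abnormality_map(subject, tmpl_mean, tmpl_sd, q=0.05):
          # Voxel-wise z-scores of one scan against a normative template,
          # with Benjamini-Hochberg FDR control across all voxels.
          z = (subject - tmpl_mean) / tmpl_sd
          p = 2.0 * norm.sf(np.abs(z)).ravel()
          order = np.argsort(p)
          m = p.size
          passed = p[order] <= q * np.arange(1, m + 1) / m
          k = passed.nonzero()[0].max() + 1 if passed.any() else 0
          thresh = p[order][k - 1] if k else -1.0     # -1 means nothing survives
          return z, (p <= thresh).reshape(subject.shape)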

  2. Quantifying heterogeneous responses of fish community size structure using novel combined statistical techniques.

    PubMed

    Marshall, Abigail M; Bigg, Grant R; van Leeuwen, Sonja M; Pinnegar, John K; Wei, Hua-Liang; Webb, Thomas J; Blanchard, Julia L

    2016-05-01

    To understand changes in ecosystems, the appropriate scale at which to study them must be determined. Large marine ecosystems (LMEs) cover thousands of square kilometres and are a useful classification scheme for ecosystem monitoring and assessment. However, averaging across LMEs may obscure intricate dynamics within them. The purpose of this study is to mathematically determine local and regional patterns of ecological change within an LME using empirical orthogonal functions (EOFs). After using EOFs to define regions with distinct patterns of change, a statistical model originating from control theory (Nonlinear AutoRegressive Moving Average with eXogenous input - NARMAX) is applied to assess potential drivers of change within these regions. We have selected spatial data sets (0.5° latitude × 1° longitude) of fish abundance from North Sea fisheries research surveys (spanning 1980-2008), as well as of temperature, oxygen, net primary production and a fishing pressure proxy, to which we apply the EOF and NARMAX methods. Two regions showed significant changes since 1980: the central North Sea displayed a decrease in community size structure, which the NARMAX model suggested was linked to changes in fishing; and the Norwegian trench region displayed an increase in community size structure which, as indicated by NARMAX results, was primarily linked to changes in sea-bottom temperature. These regions were compared to an area of no change along the eastern Scottish coast, where the model determined the community size structure was most strongly associated with net primary production. This study highlights the multifaceted effects of environmental change and fishing pressures in different regions of the North Sea. Furthermore, by highlighting this spatial heterogeneity in community size structure change, important local spatial dynamics are often overlooked when the North Sea is considered as a broad-scale, homogeneous ecosystem (as normally is the case within the political
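
    The EOF step used to define regions of coherent change reduces, in its simplest form, to an SVD of the space-time anomaly matrix, as in the generic sketch below; the NARMAX modelling stage is a separate, more involved fitting procedure not shown here.

      import numpy as np

      def eof_analysis(field, n_modes=3):
          # field: (time, space) matrix, e.g. yearly community size structure on
          # a flattened grid. Remove the time mean at each cell, then take the
          # SVD; rows of 'patterns' are spatial modes, 'pcs' their time series,
          # and 'varfrac' the fraction of variance each mode explains.
          anom = field - field.mean(axis=0)
          u, s, vt = np.linalg.svd(anom, full_matrices=False)
          varfrac = s ** 2 / (s ** 2).sum()
          return vt[:n_modes], u[:, :n_modes] * s[:n_modes], varfrac[:n_modes]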

  4. New Generation of High Resolution Ultrasonic Imaging Technique for Advanced Material Characterization: Review

    NASA Astrophysics Data System (ADS)

    Maev, R. Gr.

    The role of non-destructive material characterization and NDT is changing at a rapid rate, continuing to evolve alongside the dramatic development of novel techniques based on the principles of high-resolution imaging. The modern use of advanced optical, thermal, ultrasonic, laser-ultrasound, acoustic emission, vibration, electro-magnetic, and X-ray techniques, as well as refined measurement and signal/data processing devices, allows for the continuous generation of on-line information. As a result, real-time process monitoring can be achieved, leading to more effective and efficient control of numerous processes and greatly improving manufacturing as a whole. Indeed, concurrent quality inspection has become an attainable reality. With the advent of new materials for use in various structures, joints, and parts, however, innovative applications of modern NDT imaging techniques are necessary to monitor as many stages of manufacturing as possible. Simply put, intelligent advanced manufacturing is impossible without actively integrating modern non-destructive evaluation into the production system.

  5. Application of Advanced Atomic Force Microscopy Techniques to Study Quantum Dots and Bio-materials

    NASA Astrophysics Data System (ADS)

    Guz, Nataliia

    In recent years, there has been an increase in research into micro- and nanoscale devices as they have proliferated into diverse areas of scientific exploration. Many of the general fields of study that have greatly affected the advancement of these devices include the investigation of their properties. The sensitivity of Atomic Force Microscopy (AFM) allows the detection of charges down to a single electron in quantum dots under ambient conditions, the measurement of steric forces on the surface of the human cell brush, and the determination of cell mechanics, magnetic forces, and other important properties. Utilizing AFM methods, the fast screening of quantum dot efficiency and the differences between cancerous, normal (healthy), and precancerous (immortalized) human cells have been investigated. This research using AFM techniques can help to identify biophysical differences in cancer cells and advance our understanding of the cells' resistance to existing medicines.

  6. Impact of advanced microstructural characterization techniques on modeling and analysis of radiation damage

    SciTech Connect

    Garner, F.A.; Odette, G.R.

    1980-01-01

    The evolution of radiation-induced alterations of dimensional and mechanical properties has been shown to be a direct and often predictable consequence of radiation-induced microstructural changes. Recent advances in understanding the nature and role of each microstructural component in determining the property of interest have led to a reappraisal of the type and priority of data needed for further model development. This paper presents an overview of the types of modeling and analysis activities in progress, the insights that prompted these activities, and specific examples of successful and ongoing efforts. A review is presented of some problem areas that, in the authors' opinion, are not yet receiving sufficient attention and which may benefit from the application of advanced techniques of microstructural characterization. Guidelines based on experience gained in previous studies are also provided for the acquisition of data in a form most applicable to modeling needs.

  7. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    PubMed Central

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume per minute in a state of intense exercise. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating a person's disease risk. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite its high accuracy, practical limitations associated with the direct measurement of VO2max, such as the need for expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in the related literature in terms of two well-known metrics, namely, the multiple correlation coefficient (R) and the standard error of estimate. The survey results reveal that, with respect to the regression methods used to develop prediction models, support vector machine in general shows better performance than other methods, whereas multiple linear regression exhibits the worst performance. PMID:26346869
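
    A generic version of such a comparison, on synthetic data, using the two metrics the review reports (R and SEE). The predictor names and the data-generating model below are made up for illustration only.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_predict
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      n = 300
      X = np.column_stack([rng.uniform(15, 60, n),    # age, years
                           rng.uniform(50, 100, n),   # weight, kg
                           rng.uniform(8, 20, n)])    # hypothetical fitness score
      y = 80 - 0.4 * X[:, 0] - 0.1 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 3, n)

      for name, model in [("MLR", LinearRegression()),
                          ("SVR", make_pipeline(StandardScaler(), SVR(C=10.0)))]:
          pred = cross_val_predict(model, X, y, cv=10)     # out-of-fold predictions
          R = np.corrcoef(y, pred)[0, 1]
          see = np.sqrt(np.mean((y - pred) ** 2))
          print(f"{name}: R = {R:.3f}, SEE = {see:.2f} ml/kg/min")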

  9. Extrusion based rapid prototyping technique: an advanced platform for tissue engineering scaffold fabrication.

    PubMed

    Hoque, M Enamul; Chuan, Y Leng; Pashby, Ian

    2012-02-01

    Advances in scaffold design and fabrication technology have brought the tissue engineering field into a new era. Conventional techniques used to develop scaffolds have inherent limitations, such as a lack of control over pore morphology and architecture as well as reproducibility. Rapid prototyping (RP) technology, a layer-by-layer additive approach, offers a unique opportunity to build complex 3D architectures that overcome those limitations and can ultimately be tailored to patient-specific applications. Using RP methods, researchers have been able to customize scaffolds to closely mimic the biomechanical properties (in terms of structural integrity, strength, and microenvironment) of the organ or tissue to be repaired or replaced. This article provides an intensive description of various extrusion-based scaffold fabrication techniques and reviews their potential utility for tissue engineering (TE) applications. The extrusion-based technique extrudes molten polymer as a thin filament through a nozzle onto a platform, layer by layer, thus building a 3D scaffold. The technique allows full control over pore architecture and dimensions in the x- and y-planes; however, the pore height in the z-direction is predetermined by the diameter of the extruding nozzle rather than by the technique itself. This review attempts to assess the current state and future prospects of this technology.

  10. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in a Part of River Cauvery, Tamil Nadu, India.

    PubMed

    Hema, S; Subramani, T; Elango, L

    2014-07-01

    The study examines the water quality of the Cauvery River in the southern region of Peninsular India. Thirteen parameters, including trace elements (Cd, As, Cu, Cr, Zn and Pb), were monitored at 50 sampling points in a hydro-geochemical survey conducted along the river stretch under study. Several water quality parameters showed considerable changes due to increased runoff from the catchments and other seasonal factors. Multivariate discriminant analysis delineated a few parameters responsible for temporal variation in water quality. Factor analysis (FA) identified three factors responsible for the data structure, explaining 91% of the total variance in surface water, and allowed the grouping of selected parameters according to common features. The results indicated that point-source pollutants primarily affected the water quality of this region. This study demonstrates the necessity and usefulness of multivariate statistical techniques for the evaluation and interpretation of such data; it provides better information about water quality and supports the design of remedial measures to prevent future contamination. PMID:26563077
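
    The factor-analysis step can be approximated with a PCA on standardized parameters, as in this generic sketch; the paper's FA, including any rotation, may differ, and the data here are synthetic.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(50, 13))                  # 50 samples x 13 parameters
      X[:, :5] += 2.0 * rng.normal(size=(50, 1))     # shared 'point source' factor

      Z = StandardScaler().fit_transform(X)
      pca = PCA(n_components=3).fit(Z)
      print(pca.explained_variance_ratio_.sum())     # variance captured by 3 factors
      loadings = pca.components_.T * np.sqrt(pca.explained_variance_)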

  11. A statistical watermark detection technique without using original images for resolving rightful ownerships of digital images.

    PubMed

    Zeng, W; Liu, B

    1999-01-01

    Digital watermarking has been proposed as a means of copyright protection for multimedia data. Many existing watermarking schemes have focused on robust means of marking an image invisibly without really addressing the ends those schemes are meant to serve. This paper first discusses some scenarios in which many current watermarking schemes fail to resolve the rightful ownership of an image. The key problems are then identified, and some crucial requirements for valid invisible-watermark detection are discussed. In particular, we show that, for the particular application of resolving rightful ownership using invisible watermarks, it may be crucial to require that the original image not be directly involved in the watermark detection process. A general framework for validly detecting invisible watermarks is then proposed. Some requirements on the claimed signature/watermarks to be used for detection are discussed to prevent the existence of any counterfeit scheme. The optimal detection strategy within the framework is derived, and we show the effectiveness of this technique on some visual-model-based watermark encoding schemes. PMID:18267429
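
    To make the detection side concrete, here is a toy blind detector for an additive spread-spectrum watermark: the statistic uses only the received image and the claimed watermark, never the original. This is a generic illustrative scheme, not the paper's framework, which additionally constrains how a valid watermark may be generated so that counterfeit signatures cannot pass.

      import numpy as np

      def detect_watermark(received, claimed_wm):
          # Normalized correlation between the received image and the claimed
          # pseudo-random watermark. Under H0 (mark absent) the statistic is
          # approximately N(0, 1/N) for a random watermark, so a ~3-sigma
          # threshold fixes the false-alarm rate without the original image.
          x = received.astype(float).ravel()
          w = claimed_wm.astype(float).ravel()
          x -= x.mean()
          w -= w.mean()
          stat = float(x @ w / (np.linalg.norm(x) * np.linalg.norm(w)))
          threshold = 3.0 / np.sqrt(x.size)
          return stat, stat > threshold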

  12. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Here, process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period, and the medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hours in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control: mean LOS was 10 days, and a Weibull conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur in surgical services. PMID:20946422
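
    The control charting referred to here is typically an individuals (XmR) chart; the sketch below computes its 3-sigma limits from the average moving range. The LOS values are made up for illustration.

      import numpy as np

      def xmr_limits(x):
          # Individuals (XmR) control chart: the average moving range estimates
          # short-term variation, and 2.66 = 3 / d2 (d2 = 1.128 for n = 2)
          # converts it into 3-sigma limits for individual observations.
          x = np.asarray(x, float)
          mr = np.abs(np.diff(x)).mean()
          center = x.mean()
          return center - 2.66 * mr, center, center + 2.66 * mr

      los_days = [8, 12, 9, 15, 10, 7, 22, 11, 9, 13]   # hypothetical LOS, days
      lcl, cl, ucl = xmr_limits(los_days)
      print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")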

  13. Techniques for measurement of the thermal expansion of advanced composite materials

    NASA Technical Reports Server (NTRS)

    Tompkins, Stephen S.

    1989-01-01

    Techniques available to measure small thermal displacements in flat laminates and structural tubular elements of advanced composite materials are described. Emphasis is placed on laser interferometry and the laser interferometric dilatometer system used at the National Aeronautics and Space Administration (NASA) Langley Research Center. Thermal expansion data are presented for graphite-fiber-reinforced 6061 and 2024 aluminum laminates and for graphite-fiber-reinforced AZ91C and QH21A magnesium laminates, before and after processing to minimize or eliminate thermal strain hysteresis. Data are also presented on the effects of reinforcement volume content on the thermal expansion of silicon-carbide whisker- and particulate-reinforced aluminum.

  14. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the Pest Management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper will review the applications that can be utilized by the pest management professional and discuss the advanced techniques that may be required in conjunction with thermal imaging to locate insect and other pest infestations, moisture within structures, the verification of data and the special challenges associated with the inspection process.

  15. Advancing Our Understanding of the Link between Statistical Learning and Language Acquisition: The Need for Longitudinal Data

    PubMed Central

    Arciuli, Joanne; Torkildsen, Janne von Koss

    2012-01-01

    Mastery of language can be a struggle for some children. Amongst those that succeed in achieving this feat there is variability in proficiency. Cognitive scientists remain intrigued by this variation. A now substantial body of research suggests that language acquisition is underpinned by a child’s capacity for statistical learning (SL). Moreover, a growing body of research has demonstrated that variability in SL is associated with variability in language proficiency. Yet, there is a striking lack of longitudinal data. To date, there has been no comprehensive investigation of whether a capacity for SL in young children is, in fact, associated with language proficiency in subsequent years. Here we review key studies that have led to the need for this longitudinal research. Advancing the language acquisition debate via longitudinal research has the potential to transform our understanding of typical development as well as disorders such as autism, specific language impairment, and dyslexia. PMID:22969746

  17. Arthroscopically assisted Sauvé-Kapandji procedure: an advanced technique for distal radioulnar joint arthritis.

    PubMed

    Luchetti, Riccardo; Khanchandani, Prakash; Da Rin, Ferdinando; Borelli, Pierpaolo P; Mathoulin, Christophe; Atzei, Andrea

    2008-12-01

    Osteoarthritis of the distal radioulnar joint (DRUJ) leads to chronic wrist pain, weakness of grip strength, and limitation of motion, all of which affect the patient's quality of life. Over the years, several procedures have been used for the treatment of this condition, such as the Darrach procedure, the Bower procedure, the Sauvé-Kapandji procedure, and ulnar head replacement; however, it still remains a therapeutic challenge for hand surgeons. Despite many advances in wrist arthroscopy, arthroscopy has not previously been used for the treatment of arthritis of the DRUJ. We describe a novel technique of arthroscopically assisted Sauvé-Kapandji procedure for arthritis of the DRUJ. The advantages of this technique are its less invasive nature, preservation of the extensor retinaculum, a more anatomical position of the DRUJ, faster rehabilitation, and better cosmesis.

  18. A comparison of conventional and advanced ultrasonic inspection techniques in the characterization of TMC materials

    NASA Technical Reports Server (NTRS)

    Holland, Mark R.; Handley, Scott M.; Miller, James G.; Reighard, Mark K.

    1992-01-01

    Results obtained with a conventional ultrasonic inspection technique are compared with those obtained with more advanced ultrasonic NDE methods in the characterization of an 8-ply quasi-isotropic titanium matrix composite (TMC) specimen. Images from a conventional ultrasonic inspection of the TMC material are compared with those obtained using more sophisticated ultrasonic inspection methods. The latter techniques reveal the same potential defect indications while providing quantitative images and more quantitative information about the material's inherent properties: band-limited signal loss and slope-of-attenuation images yield quantitative data on the inherent material characteristics and defects in the TMC.

  19. Chemistry of Metal-organic Frameworks Monitored by Advanced X-ray Diffraction and Scattering Techniques.

    PubMed

    Mazaj, Matjaž; Kaučič, Venčeslav; Zabukovec Logar, Nataša

    2016-01-01

    Research on metal-organic frameworks (MOFs) has progressed rapidly in recent years due to their structural diversity and wide range of application opportunities. Continuous progress in X-ray and neutron diffraction methods enables ever more detailed insight into MOFs' structural features and contributes significantly to the understanding of their chemistry. Improved instrumentation and data processing in high-resolution X-ray diffraction methods enable the determination of new, complex MOF crystal structures in powdered form. Through neutron diffraction techniques, much knowledge about the interaction of guest molecules with the crystalline framework has been gained in the past few years. Moreover, in-situ time-resolved studies by various diffraction and scattering techniques have provided comprehensive information about crystallization kinetics, crystal growth mechanisms, and structural dynamics triggered by external physical or chemical stimuli. The review emphasizes the most relevant advanced structural studies of MOFs based on powder X-ray and neutron scattering. PMID:27640372

  1. Individual Particle Analysis of Ambient PM 2.5 Using Advanced Electron Microscopy Techniques

    SciTech Connect

    Gerald J. Keeler; Masako Morishita

    2006-12-31

    The overall goal of this project was to demonstrate a combination of advanced electron microscopy techniques that can be effectively used to identify and characterize individual particles and their sources. The specific techniques used include high-angle annular dark-field scanning transmission electron microscopy (HAADF-STEM), STEM energy-dispersive X-ray spectrometry (EDX), and energy-filtered TEM (EFTEM). A series of ambient PM2.5 samples were collected in communities in southwestern Detroit, MI (close to multiple combustion sources) and Steubenville, OH (close to several coal-fired utility boilers). High-resolution TEM (HRTEM) imaging showed a series of metal nanoparticles, including transition metals, and revealed the elemental composition of individual particles in detail. Submicron and nano-particles containing Al, Fe, Ti, Ca, U, V, Cr, Si, Ba, Mn, Ni, K and S were observed and characterized in the samples. Among the identified nano-particles, combinations of Al, Fe, Si, Ca and Ti nano-particles embedded in carbonaceous particles were observed most frequently. These particles showed characteristics very similar to those of ultrafine coal fly ash particles reported previously. By utilizing HAADF-STEM, STEM-EDX, and EF-TEM, this investigation was able to gain information on the size, morphology, structure, and elemental composition of individual nano-particles collected in Detroit and Steubenville. The results showed that the contributions of local combustion sources - including coal-fired utilities - to ultrafine particle levels were significant. Although this combination of advanced electron microscopy techniques cannot by itself identify source categories, these techniques can serve as complementary analytical tools capable of providing detailed information on individual particles.

  2. Recent Advances and New Techniques in Visualization of Ultra-short Relativistic Electron Bunches

    SciTech Connect

    Xiang, Dao; /SLAC

    2012-06-05

    Ultrashort electron bunches with rms lengths of ~1 femtosecond (fs) can be used to generate ultrashort x-ray pulses in FELs that may open up many new regimes in the ultrafast sciences. It is also envisioned that ultrashort electron bunches may excite ~TeV/m wake fields for plasma wakefield acceleration and high-field physics studies. The recent success of using a 20 pC electron beam to drive an x-ray FEL at LCLS has stimulated worldwide interest in using low-charge beams (1-20 pC) to generate ultrashort x-ray pulses (0.1-10 fs) in FELs. Accurate measurement of the length (preferably the temporal profile) of the ultrashort electron bunch is essential for understanding the physics associated with bunch compression and transportation; however, ever shorter electron bunches greatly challenge present beam diagnostic methods. In this paper we review recent advances in the measurement of ultrashort relativistic electron bunches, focusing on several techniques and their variants that provide state-of-the-art temporal resolution and are capable of breaking the 1 fs time barrier, together with methods to further improve their resolution. Techniques for measuring the beam longitudinal phase space as well as the x-ray pulse shape in an x-ray FEL are also discussed.

  3. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under severe impacts of climate change and global warming. The last two decades have shown that climate change is happening, and the first decade of the 21st century was the warmest on record over Pakistan, with the temperature reaching 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are badly affected, causing floods, cyclones and hurricanes in the region, which in turn have impacts on agriculture, water, health, etc. To cope with this situation, it is important to conduct impact assessment studies and to adopt adaptation and mitigation measures. Impact assessment studies require climate variables at higher resolution, and downscaling techniques are used to produce them; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan, chosen in part because monsoon rains contribute more than 80% of its total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions are used. The advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
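
    Of the two components combined in the proposed method, the quantile-regression part is straightforward to sketch; the statsmodels call below fits separate relations for the median and the wet tail of a synthetic precipitation series. Variable names and data are illustrative, not from the study.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      df = pd.DataFrame({"eof1": rng.normal(size=500)})   # leading EOF of GCM fields
      df["precip"] = np.exp(0.8 * df["eof1"] + rng.normal(scale=0.7, size=500))

      for q in (0.5, 0.9):                                # median and wet tail
          fit = smf.quantreg("precip ~ eof1", df).fit(q=q)
          print(q, dict(fit.params))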

  4. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fitted to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, the asymptotic properties of the extremal dependence were investigated first. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen; this approach is suitable for modeling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour extreme precipitation event (or vice versa) can be twice as great as would be estimated by assuming independent events. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
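
    The marginal step, fitting a Generalized Pareto distribution to a Partial Duration Series and reading off a return level, looks roughly as follows. The data are synthetic, and the threshold choice and declustering are simplified for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      surge = rng.gumbel(0.4, 0.12, 20_000)          # synthetic hourly sea levels, m

      u = np.quantile(surge, 0.99)                   # threshold; declustering omitted
      exc = surge[surge > u] - u                     # Partial Duration Series excesses
      shape, loc, scale = stats.genpareto.fit(exc, floc=0.0)

      # Level exceeded on average once per T years, for lam exceedances per year
      lam = len(exc) / (20_000 / (365.25 * 24))      # series treated as hourly
      T = 100.0
      rl = u + stats.genpareto.ppf(1 - 1 / (lam * T), shape, 0.0, scale)
      print(f"u = {u:.2f} m, xi = {shape:.2f}, 100-yr level ~ {rl:.2f} m")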

  5. Recent advances in molecular techniques to study microbial communities in food-associated matrices and processes.

    PubMed

    Justé, A; Thomma, B P H J; Lievens, B

    2008-09-01

    In the last two decades major changes have occurred in how microbial ecologists study microbial communities. Limitations associated with traditional culture-based methods have pushed for the development of culture-independent techniques, which are primarily based on the analysis of nucleic acids. These methods are now increasingly applied in food microbiology as well. This review presents an overview of current community profiling techniques with their (potential) applications in food and food-related ecosystems. We critically assessed both the power and limitations of these techniques and present recent advances in the field of food microbiology attained by their application. It is unlikely that a single approach will be universally applicable for analyzing microbial communities in unknown matrices. However, when screening samples for well-defined species or functions, techniques such as DNA arrays and real-time PCR have the potential to overtake current culture-based methods. Most importantly, molecular methods will allow us to surpass our current culturing limitations, thus revealing the extent and importance of the 'non-culturable' microbial flora that occurs in food matrices and production.

  6. Advanced techniques for array processing. Final report, 1 Mar 89-30 Apr 91

    SciTech Connect

    Friedlander, B.

    1991-05-30

    Array processing technology is expected to be a key element in communication systems designed for the crowded and hostile environment of the future battlefield. While advanced array processing techniques have been under development for some time, their practical use has been very limited. This project addressed some of the issues that need to be resolved for a successful transition of these promising techniques from theory into practice. The main problem studied was that of finding the directions of multiple co-channel transmitters from measurements collected by an antenna array. Two key issues related to high-resolution direction finding were addressed: the effects of system calibration errors, and the effects of correlation between the received signals due to multipath propagation. A number of useful theoretical performance analysis results were derived, and computationally efficient direction estimation algorithms were developed. These results include: self-calibration techniques for antenna arrays, sensitivity analysis for high-resolution direction finding, extensions of the root-MUSIC algorithm to arbitrary arrays and to arrays with polarization diversity, and new techniques for direction finding in the presence of multipath based on array interpolation.
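
    A minimal MUSIC direction-finding sketch for a uniform linear array shows the subspace idea at the core of the algorithms studied; root-MUSIC, array interpolation, and the calibration issues treated in the report are beyond this sketch.

      import numpy as np

      def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
          # MUSIC pseudo-spectrum for a uniform linear array.
          # X: (n_antennas, n_snapshots) complex baseband snapshots;
          # d: element spacing in wavelengths. Peaks indicate directions.
          m = X.shape[0]
          R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
          _, vecs = np.linalg.eigh(R)                # eigenvalues in ascending order
          En = vecs[:, : m - n_sources]              # noise subspace
          k = np.arange(m)
          p = []
          for th in np.deg2rad(angles):
              a = np.exp(2j * np.pi * d * k * np.sin(th))   # steering vector
              p.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
          return angles, np.asarray(p)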

  7. Advancement of an Infra-Red Technique for Whole-Field Concentration Measurements in Fluidized Beds

    PubMed Central

    Medrano, Jose A.; de Nooijer, Niek C. A.; Gallucci, Fausto; van Sint Annaland, Martin

    2016-01-01

    For a better understanding and description of the mass transport phenomena in dense multiphase gas-solids systems such as fluidized bed reactors, detailed and quantitative experimental data on the concentration profiles are required, which demands advanced non-invasive concentration monitoring techniques with high spatial and temporal resolution. A novel technique based on the selective detection of a gas component in a gas mixture using its infra-red properties has been further developed. The first-stage development was carried out using a very small sapphire reactor and CO2 as the tracer gas. Although the measuring principle was demonstrated, real application was hindered by the small reactor dimensions, a consequence of the high costs and difficult handling of large sapphire plates. In this study, a new system has been developed that allows working at much larger scales and yet with higher resolution; propane is used as the tracer gas and quartz as the reactor material. A thorough optimization and calibration of the technique is presented, which is subsequently applied to whole-field measurements with high temporal resolution. The developed technique allows the use of a relatively inexpensive configuration for the measurement of detailed concentration fields and can be applied to a large variety of important chemical engineering topics. PMID:26927127
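
    The measuring principle rests on infra-red absorption by the tracer gas; the textbook Beer-Lambert inversion below converts a transmitted-intensity image into a concentration field. In practice the paper relies on an empirical calibration, so treat this as an idealized sketch.

      import numpy as np

      def concentration_field(i_img, i0_img, epsilon, path_length):
          # Beer-Lambert per pixel: A = -log10(I / I0) = epsilon * L * C,
          # with I0 the tracer-free reference image, epsilon the molar
          # absorptivity, and L the optical path length, so C = A / (eps * L).
          a = -np.log10(np.clip(i_img / i0_img, 1e-9, None))
          return a / (epsilon * path_length)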

  8. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    PubMed Central

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss small-group apprenticeships (SGAs) as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments using both flow cytometry and laser scanning cytometry during the 1-month summer apprenticeship. In addition to effectively and efficiently teaching cell biology laboratory techniques, this course design provided an opportunity for research training, career exploration, and mentoring. Students participated in active research projects, working with a skilled interdisciplinary team of researchers in a large research institution with access to state-of-the-art instrumentation. The instructors, composed of graduate students, laboratory managers, and principal investigators, worked well together to present a real and worthwhile research experience. The students enjoyed learning cell culture techniques while contributing to active research projects. The institution's researchers were equally enthusiastic to instruct and serve as mentors. In this article, we clarify and illuminate the value of small-group laboratory apprenticeships to the institution and the students by presenting the results and experiences of seven middle and high school participants and their instructors. PMID:12587031

  9. Where in the Cell Are You? Probing HIV-1 Host Interactions through Advanced Imaging Techniques

    PubMed Central

    Dirk, Brennan S.; Van Nynatten, Logan R.; Dikeakos, Jimmy D.

    2016-01-01

    Viruses must continuously evolve to hijack the host cell machinery in order to successfully replicate and orchestrate key interactions that support their persistence. The type-1 human immunodeficiency virus (HIV-1) is a prime example of viral persistence within the host, having plagued the human population for decades. In recent years, advances in cellular imaging and molecular biology have aided the elucidation of key steps mediating the HIV-1 lifecycle and viral pathogenesis. Super-resolution imaging techniques such as stimulated emission depletion (STED) microscopy and photoactivated localization microscopy (PALM) have been instrumental in studying viral assembly and release through both cell–cell transmission and cell-free viral transmission. Moreover, powerful methods such as Förster resonance energy transfer (FRET) and bimolecular fluorescence complementation (BiFC) have shed light on the protein-protein interactions HIV-1 engages in to hijack the host cellular machinery. Specific advancements in live cell imaging, in combination with the use of multicolor viral particles, have become indispensable for unravelling the dynamic nature of these virus-host interactions. In the current review, we outline novel imaging methods that have been used to study the HIV-1 lifecycle and highlight advancements in the cell culture models developed to enhance our understanding of the HIV-1 lifecycle. PMID:27775563

  10. Management of metastatic malignant thymoma with advanced radiation and chemotherapy techniques: report of a rare case.

    PubMed

    D'Andrea, Mark A; Reddy, G Kesava

    2015-02-25

    Malignant thymomas are rare epithelial neoplasms of the anterior superior mediastinum that are typically invasive in nature and carry a high risk of relapse that may ultimately lead to death. Here we report a case of an advanced malignant thymoma that was successfully treated with neoadjuvant chemotherapy followed by surgical resection and, subsequently, advanced and novel radiation therapy techniques. A 65-year-old male was diagnosed with a stage IV malignant thymoma with multiple metastatic lesions involving the left peripheral lung and pericardium. Initial neoadjuvant chemotherapy with a cisplatin-based regimen resulted in a partial response, allowing the inoperable tumor to become operable. Following surgical resection of the residual disease, the tumor recurred within a year. The patient then underwent a course of targeted three-dimensional intensity-modulated radiation therapy (IMRT) and image-guided radiation therapy (IGRT). Five years after radiation therapy, the localized soft tissue thickening at the left upper lung anterior pleural space had resolved, and seven years after radiation therapy the tumor mass had completely resolved. No recurrences have been seen, and the patient remains well with a favorable outcome 8 years after IMRT/IGRT. Chemotherapy with targeted three-dimensional IMRT/IGRT should be considered a primary modality for the management of advanced malignant thymoma.

  11. Advanced MRI Techniques in the Evaluation of Complex Cystic Breast Lesions

    PubMed Central

    Popli, Manju Bala; Gupta, Pranav; Arse, Devraj; Kumar, Pawan; Kaur, Prabhjot

    2016-01-01

    OBJECTIVE The purpose of this research work was to evaluate complex cystic breast lesions by advanced MRI techniques and to correlate the imaging findings with histology. METHODS AND MATERIALS In a cross-sectional design from September 2013 to August 2015, 50 patients having sonographically detected complex cystic lesions of the breast were included in the study. Morphological characteristics were assessed. Dynamic contrast-enhanced MRI along with diffusion-weighted imaging and MR spectroscopy were used to further classify lesions into benign and malignant categories. All the findings were correlated with histopathology. RESULTS Of the 50 complex cystic lesions, 32 proved to be benign and 18 were malignant on histopathology. MRI features of heterogeneous enhancement on CE-MRI (13/18), Type III kinetic curve (13/18), reduced apparent diffusion coefficient (18/18), and tall choline peak (17/18) were strong predictors of malignancy. Thirteen of the 18 lesions showed a combination of Type III curve, reduced apparent diffusion coefficient value, and tall choline peak. CONCLUSIONS Advanced MRI techniques like dynamic imaging, diffusion-weighted sequences, and MR spectroscopy provide a high level of diagnostic confidence in the characterization of complex cystic breast lesions, thus allowing early diagnosis and significantly reducing patient morbidity and mortality. In our study, lesions showing heterogeneous contrast enhancement, Type III kinetic curve, diffusion restriction, and a tall choline peak were significantly associated with malignant complex cystic lesions of the breast. PMID:27330299

  12. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  13. Use of Advanced Magnetic Resonance Imaging Techniques in Neuromyelitis Optica Spectrum Disorder.

    PubMed

    Kremer, Stephane; Renard, Felix; Achard, Sophie; Lana-Peixoto, Marco A; Palace, Jacqueline; Asgari, Nasrin; Klawiter, Eric C; Tenembaum, Silvia N; Banwell, Brenda; Greenberg, Benjamin M; Bennett, Jeffrey L; Levy, Michael; Villoslada, Pablo; Saiz, Albert; Fujihara, Kazuo; Chan, Koon Ho; Schippling, Sven; Paul, Friedemann; Kim, Ho Jin; de Seze, Jerome; Wuerfel, Jens T; Cabre, Philippe; Marignier, Romain; Tedder, Thomas; van Pelt, Danielle; Broadley, Simon; Chitnis, Tanuja; Wingerchuk, Dean; Pandit, Lekha; Leite, Maria Isabel; Apiwattanakul, Metha; Kleiter, Ingo; Prayoonwiwat, Naraporn; Han, May; Hellwig, Kerstin; van Herle, Katja; John, Gareth; Hooper, D Craig; Nakashima, Ichiro; Sato, Douglas; Yeaman, Michael R; Waubant, Emmanuelle; Zamvil, Scott; Stüve, Olaf; Aktas, Orhan; Smith, Terry J; Jacob, Anu; O'Connor, Kevin

    2015-07-01

    Brain parenchymal lesions are frequently observed on conventional magnetic resonance imaging (MRI) scans of patients with neuromyelitis optica (NMO) spectrum disorder, but the specific morphological and temporal patterns distinguishing them unequivocally from lesions caused by other disorders have not been identified. This review summarizes the literature on advanced quantitative imaging measures reported for patients with NMO spectrum disorder, including proton MR spectroscopy, diffusion tensor imaging, magnetization transfer imaging, quantitative MR volumetry, and ultrahigh-field strength MRI. It was undertaken to consider the advanced MRI techniques used for patients with NMO by different specialists in the field. Although quantitative measures such as proton MR spectroscopy or magnetization transfer imaging have not reproducibly revealed diffuse brain injury, preliminary data from diffusion-weighted imaging and brain tissue volumetry indicate greater white matter than gray matter degradation. These findings could be confirmed by ultrahigh-field MRI. The use of nonconventional MRI techniques may further our understanding of the pathogenic processes in NMO spectrum disorders and may help us identify the distinct radiographic features corresponding to specific phenotypic manifestations of this disease.

  14. Use of Advanced Magnetic Resonance Imaging Techniques in Neuromyelitis Optica Spectrum Disorder

    PubMed Central

    Kremer, Stephane; Renard, Felix; Achard, Sophie; Lana-Peixoto, Marco A.; Palace, Jacqueline; Asgari, Nasrin; Klawiter, Eric C.; Tenembaum, Silvia N.; Banwell, Brenda; Greenberg, Benjamin M.; Bennett, Jeffrey L.; Levy, Michael; Villoslada, Pablo; Saiz, Albert; Fujihara, Kazuo; Chan, Koon Ho; Schippling, Sven; Paul, Friedemann; Kim, Ho Jin; de Seze, Jerome; Wuerfel, Jens T.

    2016-01-01

    Brain parenchymal lesions are frequently observed on conventional magnetic resonance imaging (MRI) scans of patients with neuromyelitis optica (NMO) spectrum disorder, but the specific morphological and temporal patterns distinguishing them unequivocally from lesions caused by other disorders have not been identified. This review summarizes the literature on advanced quantitative imaging measures reported for patients with NMO spectrum disorder, including proton MR spectroscopy, diffusion tensor imaging, magnetization transfer imaging, quantitative MR volumetry, and ultrahigh-field strength MRI. It was undertaken to consider the advanced MRI techniques used for patients with NMO by different specialists in the field. Although quantitative measures such as proton MR spectroscopy or magnetization transfer imaging have not reproducibly revealed diffuse brain injury, preliminary data from diffusion-weighted imaging and brain tissue volumetry indicate greater white matter than gray matter degradation. These findings could be confirmed by ultrahigh-field MRI. The use of nonconventional MRI techniques may further our understanding of the pathogenic processes in NMO spectrum disorders and may help us identify the distinct radiographic features corresponding to specific phenotypic manifestations of this disease. PMID:26010909

  15. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  16. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    SciTech Connect

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  18. Development of Advanced Nuclide Separation and Recovery Methods using Ion-Exchange Techniques in Nuclear Backend

    NASA Astrophysics Data System (ADS)

    Miura, Hitoshi

    The development of compact separation and recovery methods using selective ion-exchange techniques is very important for reprocessing and for the treatment of high-level liquid wastes (HLLWs) in the nuclear backend field. Selective nuclide separation techniques are effective for the volume reduction of wastes and the utilization of valuable nuclides, and they are expected to support the construction of an advanced nuclear fuel cycle system and the rationalization of waste treatment. In order to accomplish selective nuclide separation, the design and synthesis of novel adsorbents are essential for the development of compact and precise separation processes. The present paper deals with the preparation of highly functional and selective hybrid microcapsules, which enclose nano-adsorbents in alginate gel polymer matrices and are produced by sol-gel methods, together with their characterization and the clarification of their selective adsorption properties by batch and column methods. The selective separation of Cs, Pd and Re in real HLLW was further accomplished using the novel microcapsules, and an advanced nuclide separation system was proposed based on the combination of selective processes using microcapsules.

  19. Advanced intensity-modulation continuous-wave lidar techniques for ASCENDS CO2 column measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. W.; Obland, Michael D.; Meadows, Byron

    2015-10-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques that push the resolution of the lidar beyond the limit implied by the modulation bandwidth, which is shown to be useful for making tree canopy measurements.
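
    The sidelobe-suppression argument lends itself to a small numerical illustration. The sketch below (an assumption-laden toy, not the flight processing chain) builds a pseudo-random BPSK phase code in numpy and computes its circular autocorrelation, showing the sharp mainlobe and low sidelobes that make surface and cloud returns separable; the code length and seed are arbitrary choices.

    ```python
    import numpy as np

    # Minimal illustration (not the flight implementation): a pseudo-random
    # BPSK phase code has a sharp circular autocorrelation peak, which is why
    # matched filtering against it suppresses range sidelobes from clouds.
    rng = np.random.default_rng(seed=1)
    code = rng.choice([-1.0, 1.0], size=1024)       # +/-1 chips of the BPSK code

    # Circular autocorrelation via FFT: R = IFFT(FFT(x) * conj(FFT(x)))
    spectrum = np.fft.fft(code)
    autocorr = np.real(np.fft.ifft(spectrum * np.conj(spectrum))) / code.size

    peak = autocorr[0]                              # unity mainlobe after normalization
    sidelobe = np.max(np.abs(autocorr[1:]))         # worst-case range sidelobe
    print(f"mainlobe: {peak:.3f}, max sidelobe: {sidelobe:.3f} "
          f"({20 * np.log10(sidelobe / peak):.1f} dB)")
    ```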

  20. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for Column CO2 Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, J. F.; Lin, B.; Nehrir, A. R.; Obland, M. D.; Liu, Z.; Browell, E. V.; Chen, S.; Kooi, S. A.; Fan, T. F.

    2015-12-01

    Global and regional atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission and Atmospheric Carbon and Transport (ACT) - America airborne investigation are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are being investigated as a means of facilitating CO2 measurements from space and airborne platforms to meet the mission science measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud returns. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of intervening optically thin clouds, thereby minimizing bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the Earth's surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques and provides very high (sub-meter) range resolution. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These techniques are used in a new data processing architecture to support the ASCENDS CarbonHawk Experiment Simulator (ACES) and ACT-America programs.

  1. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for ASCENDS CO2 Column Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. Wallace; Obland, Michael D.; Meadows, Byron

    2015-01-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques that push the resolution of the lidar beyond the limit implied by the modulation bandwidth, which is shown to be useful for making tree canopy measurements.

  2. Advances in iterative non-uniformity correction techniques for infrared scene projection

    NASA Astrophysics Data System (ADS)

    Danielson, Tom; Franks, Greg; LaVeigne, Joe; Prewarski, Marcus; Nehring, Brian

    2015-05-01

    Santa Barbara Infrared (SBIR) is continually developing improved methods for non-uniformity correction (NUC) of its Infrared Scene Projectors (IRSPs) as part of its comprehensive efforts to achieve the best possible projector performance. The most recent step forward, Advanced Iterative NUC (AI-NUC), improves upon previous NUC approaches in several ways. The key to NUC performance is achieving the most accurate possible input drive-to-radiance output mapping for each emitter pixel. This requires many highly accurate radiance measurements of emitter output, as well as sophisticated manipulation of the resulting data set. AI-NUC expands the available radiance data set to include all measurements made of emitter output at any point. In addition, it allows the user to efficiently manage that data for use in the construction of a new NUC table that is generated from an improved fit of the emitter response curve. Not only does this improve the overall NUC by offering more statistics for interpolation than previous approaches, it also simplifies the removal of erroneous data from the set so that it does not propagate into the correction tables. AI-NUC is implemented by SBIR's IRWindows4 automated test software as part of its advanced turnkey IRSP product (the Calibration Radiometry System or CRS), which incorporates all necessary measurement, calibration and NUC table generation capabilities. By employing AI-NUC on the CRS, SBIR has demonstrated the best uniformity results on resistive emitter arrays to date.
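
    The heart of any NUC table is the per-pixel inversion of a drive-to-radiance response curve. The sketch below illustrates that generic idea only: it fits a polynomial response for each pixel of a toy emitter array and inverts the fit into a drive lookup table. The array size, power-law response, cubic fit and noise level are invented for illustration and do not represent SBIR's AI-NUC implementation.

    ```python
    import numpy as np

    # Hedged sketch of the core NUC idea: fit each emitter pixel's
    # drive-to-radiance response from calibration measurements, then invert
    # the fit to obtain the drive needed for any commanded radiance.
    rng = np.random.default_rng(0)
    drives = np.linspace(0.0, 1.0, 12)                    # commanded drive levels
    gains = rng.normal(1.0, 0.05, size=(4, 4))            # toy 4x4 emitter array
    targets = np.linspace(0.05, 0.9, 256)                 # radiances to map

    nuc_table = np.empty((4, 4, targets.size))
    for i in range(4):
        for j in range(4):
            measured = gains[i, j] * drives**1.7          # fake nonlinear response
            measured += rng.normal(0, 0.002, drives.size) # radiometric noise
            coeffs = np.polyfit(drives, measured, deg=3)  # response-curve fit
            fine = np.linspace(0.0, 1.0, 2001)
            radiance = np.polyval(coeffs, fine)
            # invert the (monotonic) fitted curve: drive as a function of radiance
            nuc_table[i, j] = np.interp(targets, radiance, fine)

    print("per-pixel drive for a mid-scale radiance:", nuc_table[:, :, 128].round(3))
    ```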

  3. Dynamic rain fade compensation techniques for the advanced communications technology satellite

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1992-01-01

    The dynamic and composite nature of propagation impairments that are incurred on earth-space communications links at frequencies in and above the 30/20 GHz Ka band necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) project by the implementation of optimal processing schemes derived through the use of the ACTS Rain Attenuation Prediction Model and nonlinear Markov filtering theory. The ACTS Rain Attenuation Prediction Model discerns climatological variations on the order of 0.5 deg in latitude and longitude in the continental U.S. The time-dependent portion of the model gives precise availability predictions for the 'spot beam' links of ACTS. Furthermore, the structure of the dynamic portion of the model, which yields performance parameters such as fade duration probabilities, is isomorphic to the state-variable approach of stochastic control theory and is amenable to the design of statistical fade processing schemes that can be made specific to the particular climatological location at which they are employed.
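
    A simple way to make the state-variable flavor of such fade models concrete is a discrete-state Markov chain fitted to an attenuation time series. The sketch below is a toy stand-in for the ACTS model: it estimates a transition matrix from synthetic fade data and derives a fade-persistence statistic from it. The state boundaries and data are invented.

    ```python
    import numpy as np

    # Illustrative sketch (not the ACTS model itself): quantize rain attenuation
    # into discrete states, estimate a Markov transition matrix from a fade time
    # series, and derive fade-persistence statistics from that matrix.
    rng = np.random.default_rng(3)
    atten_db = np.abs(np.cumsum(rng.normal(0.0, 0.3, 5000)))  # synthetic fade record
    states = np.digitize(atten_db, bins=[1.0, 3.0, 6.0])      # 4 attenuation states

    n_states = 4
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):                 # tally transitions
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    P = counts / np.where(rows == 0, 1.0, rows)               # row-stochastic matrix

    # Probability that a deep fade (state 3) persists for at least k more samples
    k = np.arange(1, 6)
    print("deep-fade persistence:", np.round(P[3, 3] ** k, 3))
    ```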

  4. Statistical Entry, Descent, and Landing Flight Reconstruction with Flush Air Data System Observations using Inertial Navigation and Monte Carlo Techniques

    NASA Astrophysics Data System (ADS)

    Lugo, Rafael Andres

    A method is introduced to consider flush air data system (FADS) pressures using a technique based on inertial navigation to reconstruct the trajectory of an atmospheric entry vehicle. The approach augments the recently-developed Inertial Navigation Statistical Trajectory and Atmosphere Reconstruction (INSTAR), which is an extension of inertial navigation that provides statistical uncertainties by utilizing Monte Carlo dispersion techniques and is an alternative to traditional statistical approaches to entry, descent, and landing trajectory and atmosphere reconstruction. The method is demonstrated using flight data from the Mars Science Laboratory (MSL) entry vehicle, which contained an inertial measurement unit and a flush air data system called the Mars Entry Atmospheric Data System (MEADS). An MSL trajectory and atmosphere solution that was updated using the landing site location in INSTAR is first presented. This solution and the corresponding uncertainties, which were obtained from Monte Carlo dispersions, are then used in a minimum variance algorithm to obtain aerodynamic estimates and uncertainties from the MEADS observations. MEADS-derived axial force coefficient and freestream density estimates and uncertainties are also obtained from the minimum variance solutions, independent of the axial force coefficients derived from computational fluid dynamics (CFD), which have relatively high a priori uncertainty. Results from probabilistic analyses of the solutions are also presented. This dissertation also introduces a method to consider correlated CFD uncertainties in INSTAR. From a priori CFD uncertainties, CFD force and pressure coefficients are dispersed in a Monte Carlo sense and carried over into the reconstructions. An analysis of the subsequent effects on the trajectory, atmosphere, and aerodynamic estimates and statistics is presented. Trajectory, atmospheric, and aerodynamic estimates compare favorably to extended Kalman filter solutions obtained by the MSL
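
    The minimum variance step can be summarized compactly: with a linearized observation model and a known measurement covariance, the weighted least-squares estimate and its formal covariance follow in closed form. The sketch below illustrates that textbook machinery on toy data; the observation matrix, noise levels and "truth" values are invented, not MEADS quantities.

    ```python
    import numpy as np

    # Hedged sketch of a minimum-variance (weighted least-squares) estimator:
    # given observations y = H x + noise with measurement covariance R, the
    # estimate and its covariance follow directly.
    rng = np.random.default_rng(7)
    x_true = np.array([1.6, 0.04])          # e.g. axial-force coefficient, density scale
    H = rng.normal(size=(40, 2))            # linearized observation model
    R = np.diag(np.full(40, 0.05**2))       # measurement noise covariance
    y = H @ x_true + rng.multivariate_normal(np.zeros(40), R)

    W = np.linalg.inv(R)                    # weight by inverse noise covariance
    cov = np.linalg.inv(H.T @ W @ H)        # formal estimate covariance
    x_hat = cov @ H.T @ W @ y               # minimum-variance estimate
    print("estimate:", x_hat, "1-sigma:", np.sqrt(np.diag(cov)))
    ```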

  5. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was 'bridging disciplines'. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work in ensuring the quality of the contributions published in these proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  6. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). Part of this project comprised 277 full-scale drop tests at three different quarries in Austria, during which key parameters of the rock fall trajectories were recorded. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for the calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. For two selected parameters, advanced calibration techniques, including the Markov chain Monte Carlo technique, maximum likelihood estimation and root mean square error (RMSE) minimization, are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95th-percentile and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
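
    As a concrete miniature of this calibration idea, the sketch below tunes two restitution-like parameters of a deliberately crude runout model by minimizing the RMSE against synthetic observations whose scatter is lognormal, echoing the error distribution reported above. The model form and parameter values are invented for illustration and are not the paper's stochastic rock fall code.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Illustrative two-parameter calibration: choose restitution-like
    # parameters that minimize the RMSE between observed and simulated
    # runout distances. The bounce model is a crude stand-in.
    rng = np.random.default_rng(5)

    def simulated_runout(params, drop_heights):
        rn, rt = params                                   # two model parameters
        return rt * drop_heights + 12.0 * rn * np.sqrt(drop_heights)

    heights = rng.uniform(5.0, 40.0, size=50)
    observed = simulated_runout([0.35, 0.8], heights) * rng.lognormal(0.0, 0.15, 50)

    def rmse(params):
        return np.sqrt(np.mean((simulated_runout(params, heights) - observed) ** 2))

    fit = minimize(rmse, x0=[0.5, 0.5], bounds=[(0.01, 0.95), (0.01, 0.95)])
    print("calibrated parameters:", fit.x.round(3), "RMSE:", round(fit.fun, 2))
    ```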

  7. Visualizing epigenetics: current advances and advantages in HDAC PET imaging techniques.

    PubMed

    Wang, C; Schroeder, F A; Hooker, J M

    2014-04-01

    Abnormal gene regulation as a consequence of flawed epigenetic mechanisms may be central to the initiation and persistence of many human diseases. However, progress in associating epigenetic dysfunction with disease and in developing therapeutic agents for treatment has been slow. Developing new methodologies to visualize chromatin-modifying enzymes and their function in the human brain would be valuable for the diagnosis of brain disorders and for drug discovery. We provide an overview of current invasive and noninvasive techniques for measuring the expression and functions of chromatin-modifying enzymes in the brain, emphasizing tools applicable to histone deacetylase (HDAC) enzymes as a leading example. The majority of current techniques are invasive and difficult to translate to what is happening within a human brain in vivo. However, recent progress in molecular imaging provides new, noninvasive ways to visualize epigenetics in the human brain. Neuroimaging tool development presents a unique set of challenges in order to identify and validate CNS radiotracers for HDACs and other histone-modifying enzymes. We summarize advances in the effort to image HDACs and HDAC inhibitory effects in the brain using positron emission tomography (PET) and highlight generalizable techniques that can be adapted to investigate other specific components of epigenetic machinery. Translational tools like neuroimaging by PET and magnetic resonance imaging provide the best way to link our current understanding of epigenetic changes with in vivo function in normal and diseased brains. These tools will be a critical addition to ex vivo methods to evaluate, and intervene in, CNS dysfunction.

  8. Measurements of the subcriticality using advanced technique of shooting source during operation of NPP reactors

    SciTech Connect

    Lebedev, G. V. Petrov, V. V.; Bobylyov, V. T.; Butov, R. I.; Zhukov, A. M.; Sladkov, A. A.

    2014-12-15

    According to the rules of nuclear safety, measurements of the subcriticality of reactors should be carried out while performing nuclear hazardous operations. An advanced shooting-source technique is proposed to meet this requirement; a pulsed neutron source (PNS) serves as the source. In order to realize this technique, it is recommended to operate the PNS at a repetition frequency of 1–20 Hz. The PNS is stopped after a steady-state (on average) number of neutrons has been achieved in the reactor volume. The change in the number of neutrons in the reactor volume is then sampled in time at intervals of ∼0.1 s. These measurements, combined with a system of point-kinetics equations, are used to calculate the sought subcriticality. The basic idea of the proposed technique is elaborated in a series of experiments on the Kvant assembly. The conditions that should be met in order to obtain a positive measurement result are formulated. A block diagram of the basic version of the experimental setup is presented, whose main element is a pulsed neutron generator.
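
    One classical way to turn post-pulse decay measurements into a reactivity value is the pulsed-neutron-source (Simmons-King) relation, in which the prompt decay constant alpha = (rho - beta)/Lambda is fitted from the decaying count rate and inverted for rho. The sketch below demonstrates only this textbook variant on simulated counts; the kinetics parameters are illustrative, and the paper's own point-kinetics processing differs in detail.

    ```python
    import numpy as np

    # Hedged sketch of the Simmons-King pulsed-neutron-source idea: after the
    # source stops, the prompt population decays as exp(alpha * t) with
    # alpha = (rho - beta) / Lambda, so a log-linear fit of the decay yields
    # the reactivity rho = beta + alpha * Lambda. Toy values, not Kvant data.
    beta, Lam = 0.0065, 5e-5                  # delayed fraction, generation time [s]
    rho_true = -0.02                          # "unknown" subcriticality to recover

    t = np.arange(0.0, 0.02, 1e-4)            # sampling after source cut-off
    alpha_true = (rho_true - beta) / Lam
    counts = np.random.default_rng(2).poisson(1e6 * np.exp(alpha_true * t))

    mask = counts > 0                          # keep bins with nonzero counts
    alpha_fit = np.polyfit(t[mask], np.log(counts[mask]), 1)[0]
    rho_est = beta + alpha_fit * Lam
    print(f"recovered reactivity: {rho_est:.4f} (true {rho_true})")
    ```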

  9. New advanced surface modification technique: titanium oxide ceramic surface implants: long-term clinical results

    NASA Astrophysics Data System (ADS)

    Szabo, Gyorgy; Kovacs, Lajos; Barabas, Jozsef; Nemeth, Zsolt; Maironna, Carlo

    2001-11-01

    The purpose of this paper is to discuss the background to advanced surface modification technologies and to present a new technique, involving the formation of a titanium oxide ceramic coating, together with relatively long-term results of its clinical utilization. Three general approaches are used to modify surfaces: the addition of material, the removal of material, and the alteration of material already present. Surface properties can also be changed without the addition or removal of material, through laser or electron-beam thermal treatment. The new technique outlined in this paper relates to the production of a corrosion-resistant, 2000-2500 Å thick ceramic oxide layer with a coherent crystalline structure on the surface of titanium implants. The layer is grown electrochemically from the bulk of the metal and is modified by heat treatment. Such oxide ceramic-coated implants have a number of advantageous properties relative to implants covered with various other coatings: a higher external hardness, a greater force of adherence between the titanium and the oxide ceramic coating, virtually perfect insulation between the organism and the metal (no possibility of metal allergy), etc. The coated implants were subjected to physical, chemical, electron-microscopic and other tests for qualitative characterization. Finally, these implants (plates and screws for maxillofacial osteosynthesis and dental root implants) were applied in surgical practice over a period of 10 years. The tests and the experience acquired demonstrated the good properties of the titanium oxide ceramic-coated implants.

  10. Measurements of the subcriticality using advanced technique of shooting source during operation of NPP reactors

    NASA Astrophysics Data System (ADS)

    Lebedev, G. V.; Petrov, V. V.; Bobylyov, V. T.; Butov, R. I.; Zhukov, A. M.; Sladkov, A. A.

    2014-12-01

    According to the rules of nuclear safety, measurements of the subcriticality of reactors should be carried out while performing nuclear hazardous operations. An advanced shooting-source technique is proposed to meet this requirement; a pulsed neutron source (PNS) serves as the source. In order to realize this technique, it is recommended to operate the PNS at a repetition frequency of 1-20 Hz. The PNS is stopped after a steady-state (on average) number of neutrons has been achieved in the reactor volume. The change in the number of neutrons in the reactor volume is then sampled in time at intervals of ~0.1 s. These measurements, combined with a system of point-kinetics equations, are used to calculate the sought subcriticality. The basic idea of the proposed technique is elaborated in a series of experiments on the Kvant assembly. The conditions that should be met in order to obtain a positive measurement result are formulated. A block diagram of the basic version of the experimental setup is presented, whose main element is a pulsed neutron generator.

  11. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    NASA Astrophysics Data System (ADS)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
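
    The lexical-approximation step can be made concrete in a few lines of code: score training-vocabulary words against the OOV token by edit distance and substitute the closest variant within a threshold. The vocabulary, threshold, and helper names below are illustrative assumptions, not the paper's implementation.

    ```python
    # Minimal sketch of lexical approximation: replace an out-of-vocabulary
    # token with its closest spelling/inflectional variant found in the
    # training vocabulary, scored here by plain Levenshtein edit distance.
    def edit_distance(a: str, b: str) -> int:
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    def approximate_oov(token, vocabulary, max_dist=2):
        best = min(vocabulary, key=lambda w: edit_distance(token, w))
        return best if edit_distance(token, best) <= max_dist else token

    vocab = {"translate", "translated", "translation", "system", "systems"}
    sentence = ["translatoin", "systems"]          # "translatoin" is a typo-like OOV
    print([w if w in vocab else approximate_oov(w, vocab) for w in sentence])
    ```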

  12. Assessment of surface water quality using multivariate statistical techniques: case study of the Nampong River and Songkhram River, Thailand.

    PubMed

    Muangthong, Somphinith; Shrestha, Sangam

    2015-09-01

    Multivariate statistical techniques such as cluster analysis (CA), principal component analysis (PCA), factor analysis (FA), and discriminant analysis (DA) were applied for the assessment of spatial and temporal variations of a large, complex water quality data set of the Nampong River and Songkhram River, generated over more than 10 years (1996-2012) by monitoring of 16 parameters at different sites. According to the water quality characteristics, hierarchical CA grouped 13 sampling sites of the Nampong River into two clusters, i.e., upper stream (US) and lower stream (LS) sites, and five sampling sites of the Songkhram River into three clusters, i.e., upper stream (US), middle stream (MS) and lower stream (LS) sites. PCA/FA applied to the data sets thus obtained yielded five latent factors explaining 69.80 and 69.32% of the total variance in the water quality data sets of the LS and US areas, respectively, in the Nampong River, and six latent factors explaining 80.80, 73.95, and 73.78% of the total variance in the water quality data sets of the LS, MS, and US areas, respectively, in the Songkhram River. This study highlights the usefulness of multivariate statistical assessment of complex databases in the identification of pollution sources to better comprehend the spatial and temporal variations for effective river water quality management.
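
    For readers who want to reproduce the general workflow, the sketch below runs hierarchical cluster analysis and PCA on a synthetic site-by-parameter matrix using scipy and scikit-learn; the data, cluster count, and factor count are invented stand-ins for the monitoring data described above.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hedged sketch of the CA/PCA workflow on a synthetic matrix
    # (rows = monitoring sites, columns = water-quality parameters).
    rng = np.random.default_rng(11)
    upstream = rng.normal(0.0, 1.0, size=(7, 16))
    downstream = rng.normal(1.5, 1.0, size=(6, 16))    # more polluted sites
    X = StandardScaler().fit_transform(np.vstack([upstream, downstream]))

    # Hierarchical cluster analysis (Ward linkage) groups similar sites
    clusters = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
    print("site clusters:", clusters)

    # PCA extracts latent factors and their explained variance
    pca = PCA(n_components=5).fit(X)
    print("variance explained by 5 factors: "
          f"{100 * pca.explained_variance_ratio_.sum():.1f}%")
    ```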

  13. Comparison of Statistical Estimation Techniques for Mars Entry, Descent, and Landing Reconstruction from MEDLI-like Data Sources

    NASA Technical Reports Server (NTRS)

    Dutta, Soumyo; Braun, Robert D.; Russell, Ryan P.; Clark, Ian G.; Striepe, Scott A.

    2012-01-01

    Flight data from an entry, descent, and landing (EDL) sequence can be used to reconstruct the vehicle's trajectory, aerodynamic coefficients and the atmospheric profile experienced by the vehicle. Past Mars missions have contained instruments that do not provide direct measurement of the freestream atmospheric conditions. Thus, the uncertainties in the atmospheric reconstruction and the aerodynamic database knowledge could not be separated. The upcoming Mars Science Laboratory (MSL) will take measurements of the pressure distribution on the aeroshell forebody during entry and will allow freestream atmospheric conditions to be partially observable. These data provide a means to separate atmospheric and aerodynamic uncertainties and are part of the MSL EDL Instrumentation (MEDLI) project. Methods to estimate the flight performance statistically using on-board measurements are demonstrated here through the use of simulated Mars data. Different statistical estimators are used to demonstrate which estimator best quantifies the uncertainties in the flight parameters. The techniques demonstrated herein are planned for application to the MSL flight dataset after the spacecraft lands on Mars in August 2012.
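
    As a minimal example of the sequential estimators compared in such studies, the sketch below runs a scalar linear Kalman filter that fuses noisy measurements and carries a formal variance alongside the estimate. All numbers are toy values, not MSL or MEDLI data.

    ```python
    import numpy as np

    # Minimal linear Kalman filter: fuse noisy measurements sequentially and
    # carry a formal uncertainty (variance) along with the state estimate.
    rng = np.random.default_rng(4)
    x_true, R, Q = 2.0, 0.25, 1e-6          # true state, meas. noise var, process noise
    z = x_true + rng.normal(0, np.sqrt(R), 100)

    x_hat, P = 0.0, 10.0                    # diffuse prior
    for zk in z:
        P = P + Q                           # predict (random-walk state model)
        K = P / (P + R)                     # Kalman gain
        x_hat = x_hat + K * (zk - x_hat)    # measurement update
        P = (1 - K) * P
    print(f"estimate {x_hat:.3f} +/- {np.sqrt(P):.3f} (true {x_true})")
    ```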

  14. Advanced statistical process control of a chemical vapor tungsten deposition process on an Applied Materials Centura reactor

    NASA Astrophysics Data System (ADS)

    Stefani, Jerry A.; Poarch, Scott; Saxena, Sharad; Mozumder, P. K.

    1994-09-01

    An advanced multivariable off-line process control system, which combines traditional Statistical Process Control (SPC) with feedback control, has been applied to the CVD tungsten process on an Applied Materials Centura reactor. The goal of the model-based controller is to compensate for shifts in the process and maintain the wafer state responses on target. In the present application the controller employs measurements made on test wafers by off-line metrology tools to track the process behavior. This is accomplished by using model-based SPC, which compares the measurements with predictions obtained from empirically derived process models. For CVD tungsten, a physically based modeling approach was employed, based on the kinetically limited H2 reduction of WF6. On detecting a statistically significant shift in the process, the controller calculates adjustments to the settings to bring the process responses back on target. To achieve this, a few additional test wafers are processed at slightly different settings than the nominal. This local experiment allows the models to be updated to reflect the current process performance. The model updates are expressed as multiplicative or additive changes in the process inputs and a change in the model constant. This approach to model updating not only tracks the present process/equipment state, it also provides some diagnostic capability regarding the cause of the process shift. The updated models are used by an optimizer to compute new settings that bring the responses back to target. The optimizer is capable of incrementally entering controllables into the strategy, reflecting the degree to which the engineer desires to manipulate each setting. The capability of the controller to compensate for shifts in the CVD tungsten process has been demonstrated. Targets for film bulk resistivity and deposition rate were maintained while satisfying constraints on film stress and WF6 conversion efficiency.
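
    The model-based SPC loop can be illustrated compactly: chart the residuals between measured responses and the model prediction, and raise a flag when an EWMA statistic crosses its control limit, which is the point at which such a controller would trigger a model update. The sketch below uses invented numbers, not the Centura process data.

    ```python
    import numpy as np

    # Hedged sketch of model-based SPC: chart residuals (measurement minus
    # model prediction) and flag a statistically significant shift with an
    # EWMA control chart. Model, data, and limits are toy values.
    rng = np.random.default_rng(9)
    model_pred = 9.0                                   # model-predicted response
    sigma = 0.15                                       # historical residual std dev
    measured = model_pred + rng.normal(0, sigma, 30)
    measured[18:] += 0.4                               # a process shift at run 18

    lam = 0.2                                          # EWMA smoothing constant
    limit = 3 * sigma * np.sqrt(lam / (2 - lam))       # asymptotic 3-sigma EWMA limit
    ewma = 0.0
    for run, m in enumerate(measured):
        ewma = lam * (m - model_pred) + (1 - lam) * ewma
        if abs(ewma) > limit:
            print(f"shift detected at run {run}; trigger model update and re-optimization")
            break
    ```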

  15. Planning and scheduling the Hubble Space Telescope: Practical application of advanced techniques

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.

    1994-01-01

    NASA's Hubble Space Telescope (HST) is a major astronomical facility that was launched in April 1990. In late 1993, the first of several planned servicing missions refurbished the telescope, including corrections for a manufacturing flaw in the primary mirror. Orbiting above the distorting effects of the Earth's atmosphere, the HST provides an unrivaled combination of sensitivity, spectral coverage and angular resolution. The HST is arguably the most complex scientific observatory ever constructed, and effective use of this valuable resource required novel approaches to astronomical observation and the development of advanced software systems, including techniques to represent scheduling preferences and constraints, a constraint satisfaction problem (CSP) based scheduler, and a rule-based planning system. This paper presents a discussion of these systems and the lessons learned from operational experience.

  16. Vibrio parahaemolyticus: a review on the pathogenesis, prevalence, and advanced molecular identification techniques

    PubMed Central

    Letchumanan, Vengadesh; Chan, Kok-Gan; Lee, Learn-Han

    2014-01-01

    Vibrio parahaemolyticus is a Gram-negative halophilic bacterium that is found in estuarine, marine and coastal environments. V. parahaemolyticus is the leading causal agent of human acute gastroenteritis following the consumption of raw, undercooked, or mishandled marine products. In rare cases, V. parahaemolyticus causes wound infection, ear infection or septicaemia in individuals with pre-existing medical conditions. V. parahaemolyticus has two hemolysin virulence factors: thermostable direct hemolysin (tdh), a pore-forming protein that contributes to the invasiveness of the bacterium in humans, and TDH-related hemolysin (trh), which plays a role similar to that of tdh in disease pathogenesis. In addition, the bacterium encodes adhesins and type III secretion systems (T3SS1 and T3SS2) to ensure its survival in the environment. This review aims to discuss V. parahaemolyticus growth and characteristics, pathogenesis, prevalence, and advances in molecular identification techniques. PMID:25566219

  17. Integrating advanced materials simulation techniques into an automated data analysis workflow at the Spallation Neutron Source

    SciTech Connect

    Borreguero Calvo, Jose M; Campbell, Stuart I; Delaire, Olivier A; Doucet, Mathieu; Goswami, Monojoy; Hagen, Mark E; Lynch, Vickie E; Proffen, Thomas E; Ren, Shelly; Savici, Andrei T; Sumpter, Bobby G

    2014-01-01

    This presentation will review developments in the integration of advanced modeling and simulation techniques into the analysis step of experimental data obtained at the Spallation Neutron Source. A workflow framework for the purpose of refining molecular mechanics force-fields against quasi-elastic neutron scattering data is presented. The workflow combines software components to submit model simulations to remote high performance computers, a message broker interface for communications between the optimizer engine and the simulation production step, and tools to convolve the simulated data with the experimental resolution. A test application shows the correction to a popular fixed-charge water model needed to account for polarization effects due to the presence of solvated ions. Future enhancements to the refinement workflow are discussed. This work is funded through the DOE Center for Accelerating Materials Modeling.
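
    The refinement loop itself reduces to a familiar pattern: simulate, convolve with the instrument resolution, score against the measurement, and let an optimizer adjust the model parameter. The sketch below caricatures that loop with a Lorentzian linewidth standing in for a force-field parameter; every function and value is an illustrative assumption, not a component of the SNS workflow.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hedged skeleton of the refinement loop: adjust a model parameter so the
    # simulated spectrum, convolved with the instrument resolution, matches
    # the measured one. Toy stand-ins throughout.
    E = np.linspace(-2.0, 2.0, 401)                      # energy transfer grid [meV]
    resolution = np.exp(-E**2 / (2 * 0.05**2))           # Gaussian resolution function
    resolution /= resolution.sum()

    def simulated(width):                                # "run" the model spectrum
        lorentz = width / np.pi / (E**2 + width**2)
        return np.convolve(lorentz, resolution, mode="same")

    measured = simulated(0.30) * (1 + np.random.default_rng(6).normal(0, 0.02, E.size))

    cost = lambda w: np.sum((simulated(w) - measured) ** 2)
    best = minimize_scalar(cost, bounds=(0.05, 1.0), method="bounded")
    print(f"refined linewidth: {best.x:.3f} meV")
    ```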

  18. Development of advanced techniques for rotorcraft state estimation and parameter identification

    NASA Technical Reports Server (NTRS)

    Hall, W. E., Jr.; Bohn, J. G.; Vincent, J. H.

    1980-01-01

    An integrated methodology for rotorcraft system identification consists of rotorcraft mathematical modeling, three distinct data processing steps, and a technique for designing inputs to improve the identifiability of the data. These elements are as follows: (1) a Kalman filter smoother algorithm which estimates states and sensor errors from error-corrupted data; gust time histories and statistics may also be estimated; (2) a model structure estimation algorithm for isolating a model which adequately explains the data; (3) a maximum likelihood algorithm for estimating the parameters and the variances of those estimates; and (4) an input design algorithm, based on a maximum likelihood approach, which provides inputs to improve the accuracy of parameter estimates. Each step is discussed, with examples applied to both flight and simulated data.
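
    Step (3) above has a compact textbook analogue: fit a model parameter by maximizing a Gaussian likelihood and read the parameter variance off the curvature (observed Fisher information) of the negative log-likelihood. In the sketch below, the first-order decay model is an invented stand-in for a rotorcraft response channel.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hedged miniature of maximum likelihood estimation with a variance
    # estimate from the curvature of the negative log-likelihood (NLL).
    rng = np.random.default_rng(8)
    t = np.linspace(0, 5, 200)
    a_true, noise = 0.8, 0.05
    y = np.exp(-a_true * t) + rng.normal(0, noise, t.size)

    def nll(theta):
        r = y - np.exp(-theta[0] * t)
        return 0.5 * np.sum(r**2) / noise**2      # Gaussian NLL (up to a constant)

    fit = minimize(nll, x0=[0.3])
    h = 1e-4                                      # numerical second derivative of NLL
    curv = (nll(fit.x + h) - 2 * nll(fit.x) + nll(fit.x - h)) / h**2
    print(f"a_hat = {fit.x[0]:.3f}, sigma(a_hat) ~ {1 / np.sqrt(curv):.4f}")
    ```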

  19. Recent advances in molecular medicine techniques for the diagnosis, prevention, and control of infectious diseases.

    PubMed

    França, R F O; da Silva, C C; De Paula, S O

    2013-06-01

    In recent years we have observed great advances in our ability to combat infectious diseases. Through the development of novel genetic methodologies, including a better understanding of pathogen biology, pathogenic mechanisms, advances in vaccine development, designing new therapeutic drugs, and optimization of diagnostic tools, significant infectious diseases are now better controlled. Here, we briefly describe recent reports in the literature concentrating on infectious disease control. The focus of this review is to describe the molecular methods widely used in the diagnosis, prevention, and control of infectious diseases with regard to the innovation of molecular techniques. Since the list of pathogenic microorganisms is extensive, we emphasize some of the major human infectious diseases (AIDS, tuberculosis, malaria, rotavirus, herpes virus, viral hepatitis, and dengue fever). As a consequence of these developments, infectious diseases will be more accurately and effectively treated; safe and effective vaccines are being developed and rapid detection of infectious agents now permits countermeasures to avoid potential outbreaks and epidemics. But, despite considerable progress, infectious diseases remain a strong challenge to human survival. PMID:23339016

  1. Identification and comparison of electrical tapes using instrumental and statistical techniques: I. Microscopic surface texture and elemental composition.

    PubMed

    Goodpaster, John V; Sturdevant, Amanda B; Andrews, Kristen L; Brun-Conti, Leanora

    2007-05-01

    Comparisons of polyvinyl chloride electrical tape typically rely upon evaluating class characteristics such as physical dimensions, surface texture, and chemical composition. Given the various techniques that are available for this purpose, a comprehensive study has been undertaken to establish an optimal analytical scheme for electrical tape comparisons. Of equal importance is the development of a quantitative means for sample discrimination. In this study, 67 rolls of black electrical tape representing 34 different nominal brands were analyzed via scanning electron microscopy and energy dispersive spectroscopy. Differences in surface roughness, calendering marks, and filler particle size were readily apparent, including between some rolls of the same nominal brand. The relative amounts of magnesium, aluminum, silicon, sulfur, lead, chlorine, antimony, calcium, titanium, and zinc varied greatly between brands and, in some cases, could be linked to the year of manufacture. For the first time, quantitative differentiation of electrical tapes was achieved through multivariate statistical techniques, with 36 classes identified within the sample population. A single-blind study was also completed where questioned tape samples were correctly associated with known exemplars. Finally, two case studies are presented where tape recovered from an improvised explosive device is compared with tape recovered from a suspect. PMID:17456089

  2. The problem of sexual imbalance and techniques of the self in the Diagnostic and Statistical Manual of Mental Disorders.

    PubMed

    Flore, Jacinthe

    2016-09-01

    This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination.

  3. Detection of methemoglobin in whole blood based on confocal micro-Raman spectroscopy and multivariate statistical techniques.

    PubMed

    Zhu, M F; Ye, X P; Huang, Y Y; Guo, Z Y; Zhuang, Z F; Liu, S H

    2014-01-01

    Raman spectroscopy has been shown to have the potential for revealing the oxygenation and spin states of hemoglobin. In this study, confocal micro-Raman spectroscopy is developed to monitor the effect of sodium nitrite on oxyhemoglobin (HbO2) in whole blood. We observe that the band at 1,638 cm(-1), which is sensitive to the oxidation state, decreases dramatically, while the 1,586 cm(-1) band (low-spin state) decreases in both methemoglobin (MetHb) and poisoned blood. Our results show that adding sodium nitrite leads to the transition from HbO2 (Fe(2+)) to MetHb (Fe(3+)) in whole blood, and that the iron atom converts from the low-spin state to the high-spin state with a delocalization from the porphyrin plane. Moreover, multivariate statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), are employed to develop effective diagnostic algorithms for classifying spectra from pure blood and poisoned blood. The diagnostic algorithms based on PCA-LDA yield a diagnostic sensitivity of 100% and specificity of 100% for separating poisoned blood from normal blood. A receiver operating characteristic (ROC) curve further confirms the effectiveness of the PCA-LDA-based diagnostic algorithm. The results from this study demonstrate that Raman spectroscopy combined with PCA-LDA algorithms has tremendous potential for the non-invasive detection of nitrite-poisoned blood. PMID:24729434
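
    The PCA-LDA pipeline described above maps directly onto a few scikit-learn calls, sketched below on synthetic "spectra"; the class means, component count, and resubstitution AUC are illustrative choices, not the study's data or validation scheme.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score

    # Hedged sketch of a PCA-LDA diagnostic pipeline on synthetic "spectra":
    # PCA reduces each spectrum to a few scores, LDA separates the classes,
    # and the ROC AUC summarizes discrimination (here on training data only).
    rng = np.random.default_rng(12)
    normal = rng.normal(0.0, 1.0, size=(40, 300))       # 40 normal-blood spectra
    poisoned = rng.normal(0.6, 1.0, size=(40, 300))     # 40 nitrite-poisoned spectra
    X = np.vstack([normal, poisoned])
    y = np.array([0] * 40 + [1] * 40)

    scores = PCA(n_components=5).fit_transform(X)       # spectral dimension reduction
    lda = LinearDiscriminantAnalysis().fit(scores, y)   # linear class boundary
    auc = roc_auc_score(y, lda.decision_function(scores))
    print(f"training ROC AUC: {auc:.3f}")
    ```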

  4. Recent advances on techniques and theories of feedforward networks with supervised learning

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Klasa, Stan

    1992-07-01

    The rediscovery and popularization of the back propagation training technique for multilayer perceptrons, as well as the invention of the Boltzmann Machine learning algorithm, has given a new boost to the study of supervised learning networks. In recent years, besides the widespread applications and the various further improvements of the classical back propagation technique, many new supervised learning models, techniques and theories have been proposed in a vast number of publications. This paper gives a systematic review of the recent advances in supervised learning techniques and theories for static feedforward networks. We summarize a great number of developments into five aspects: (1) Various improvements and variants made on the classical back propagation techniques for multilayer (static) perceptron nets, for speeding up training, avoiding local minima, increasing the generalization ability, as well as for many other interesting purposes. (2) A number of other learning methods for training multilayer (static) perceptrons, such as derivative estimation by perturbation, direct weight update by perturbation, genetic algorithms, recursive least square estimation and extended Kalman filtering, linear programming, the policy of fixing one layer while updating another, constructing networks by converting decision tree classifiers, and others. (3) Various other feedforward models which are also able to implement function approximation, probability density estimation and classification, including various models of basis function expansion (e.g., radial basis functions, restricted Coulomb energy, multivariate adaptive regression splines, trigonometric and polynomial bases, projection pursuit, basis function trees, and many others), and several other supervised learning models. (4) Models with complex structures, e.g., modular architecture, hierarchy architecture, and others. (5) A number of theoretical issues involving the universal
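
    To make aspect (1) concrete, the sketch below trains a minimal two-layer perceptron on XOR with classical back propagation in numpy; the architecture, learning rate, and iteration count are arbitrary illustrative choices.

    ```python
    import numpy as np

    # A minimal two-layer perceptron trained with classical back propagation;
    # with these settings it typically converges to the XOR mapping.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)              # XOR targets

    W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))
    W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(10000):
        h = sigmoid(X @ W1 + b1)                           # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)                # backpropagated errors
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= 0.5 * h.T @ d_out                            # gradient-descent updates
        b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

    print("network outputs for XOR inputs:", out.round(3).ravel())
    ```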

  5. Effects of operating parameters on advanced oxidation of diuron by the Fenton's reagent: a statistical design approach.

    PubMed

    Catalkaya, Ebru Cokay; Kargi, Fikret

    2007-09-01

    Advanced oxidation of diuron in aqueous solution by Fenton's reagent using FeSO(4) as the source of Fe(II) was investigated in the absence of light. The effects of the operating parameters, namely the concentrations of pesticide (diuron), H(2)O(2), and Fe(II), on oxidation of diuron were investigated by using the Box-Behnken statistical experiment design and surface response analysis. Diuron oxidation by the Fenton reagent was evaluated by determining the total organic carbon (TOC), diuron, and adsorbable organic halogen (AOX) removals. Concentration ranges of the reagents resulting in the highest level of diuron oxidation were determined. Diuron removal increased with increasing H(2)O(2) and Fe(II) concentrations up to a certain level. Diuron concentration had a more profound effect than H(2)O(2) and Fe(II) on removal of diuron, TOC, and AOX from the aqueous solution. Nearly complete (98.5%) disappearance of diuron was achieved after a 15 min reaction period. However, only 58% of diuron was mineralized after 240 min under optimal operating conditions, indicating formation of some intermediate products. The optimal H(2)O(2)/Fe(II)/diuron ratio resulting in the maximum diuron removal (98.5%) was found to be 302/38/20 (mg l(-1)).
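
    For readers unfamiliar with the design, a three-factor Box-Behnken matrix like the one used here can be constructed as in the sketch below; the concentration ranges are hypothetical stand-ins, not the paper's values:

        import itertools
        import numpy as np

        def box_behnken_3():
            """Coded three-factor Box-Behnken design: 12 edge midpoints
            (all +/-1 pairs with the third factor at 0) plus a center point."""
            runs = []
            for i, j in itertools.combinations(range(3), 2):
                for a, b in itertools.product((-1, 1), repeat=2):
                    row = [0.0, 0.0, 0.0]
                    row[i], row[j] = a, b
                    runs.append(row)
            runs.append([0.0, 0.0, 0.0])   # center point (replicate as needed)
            return np.array(runs)

        coded = box_behnken_3()
        # Hypothetical (low, high) ranges for H2O2, Fe(II), diuron in mg/l.
        ranges = np.array([(50.0, 400.0), (5.0, 50.0), (5.0, 40.0)])
        center = ranges.mean(axis=1)
        half_width = (ranges[:, 1] - ranges[:, 0]) / 2
        natural = center + coded * half_width   # design in natural units
        print(natural)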

  6. Advancements in sensing and perception using structured lighting techniques: an LDRD final report.

    SciTech Connect

    Novick, David Keith; Padilla, Denise D.; Davidson, Patrick A. Jr.; Carlson, Jeffrey J.

    2005-09-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled "Advancements in Sensing and Perception using Structured Lighting Techniques". There is an ever-increasing need for robust, autonomous ground vehicles for counterterrorism and defense missions. Although there has been nearly 30 years of government-sponsored research, it is undisputed that significant advancements in sensing and perception are necessary. We developed an innovative, advanced sensing technology for national security missions serving the Department of Energy, the Department of Defense, and other government agencies. The principal goal of this project was to develop an eye-safe, robust, low-cost, lightweight, 3D structured lighting sensor for use in broad daylight outdoor applications. The market for this technology is wide open due to the unavailability of such a sensor. Currently available laser scanners are slow, bulky and heavy, expensive, fragile, short-range, sensitive to vibration (highly problematic for moving platforms), and unreliable for outdoor use in bright sunlight conditions. Eye-safety issues are a primary concern for currently available laser-based sensors. Passive, stereo-imaging sensors are available for 3D sensing but suffer from several limitations: they are computationally intensive, require a lighted environment (natural or man-made light source), and do not work for many scenes or regions lacking texture or with ambiguous texture. Our approach leveraged the advanced capabilities of modern CCD camera technology and Center 6600's expertise in 3D world modeling, mapping, and analysis, using structured lighting. We have a diverse customer base for indoor mapping applications, and this research extends our current technology's lifecycle and opens a new market base for outdoor 3D mapping. Applications include precision mapping, autonomous navigation, dexterous manipulation, surveillance and
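
    As a back-of-the-envelope illustration of the triangulation geometry on which such a structured lighting sensor rests (the camera and projector values below are invented, not the LDRD design):

        import numpy as np

        def depth_from_stripe(u_px, focal_px, baseline_m, proj_angle_rad):
            """Range to a detected light stripe by camera/projector
            triangulation (2D case): intersect the camera ray with the
            projected light plane."""
            cam_angle = np.arctan2(u_px, focal_px)  # ray angle from optical axis
            return baseline_m / (np.tan(proj_angle_rad) + np.tan(cam_angle))

        # Hypothetical geometry: 0.3 m baseline, 800 px focal length,
        # stripe seen 120 px off-axis, light plane projected at 30 degrees.
        print(depth_from_stripe(u_px=120, focal_px=800,
                                baseline_m=0.3, proj_angle_rad=np.deg2rad(30)))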

  7. A review of ocean color remote sensing methods and statistical techniques for the detection, mapping and analysis of phytoplankton blooms in coastal and open oceans

    NASA Astrophysics Data System (ADS)

    Blondeau-Patissier, David; Gower, James F. R.; Dekker, Arnold G.; Phinn, Stuart R.; Brando, Vittorio E.

    2014-04-01

    The need for more effective environmental monitoring of the open and coastal ocean has recently led to notable advances in satellite ocean color technology and algorithm research. Data from satellite ocean color sensors are widely used for the detection, mapping and monitoring of phytoplankton blooms because earth observation provides a synoptic view of the ocean, both spatially and temporally. Algal blooms are indicators of marine ecosystem health; thus, their monitoring is a key component of effective management of coastal and oceanic resources. Since the late 1970s, a wide variety of operational ocean color satellite sensors and algorithms have been developed. The comprehensive review presented in this article captures the details of this progress and discusses the advantages and limitations of the algorithms used with the multi-spectral ocean color sensors CZCS, SeaWiFS, MODIS and MERIS. Present challenges include overcoming the severe limitations of these algorithms in coastal waters and refining detection limits in various oceanic and coastal environments. To understand the spatio-temporal patterns of algal blooms and their triggering factors, it is essential to consider the possible effects of environmental parameters, such as water temperature, turbidity, solar radiation and bathymetry. Hence, this review will also discuss the use of statistical techniques and additional datasets derived from ecosystem models or other satellite sensors to further characterize the factors triggering or limiting the development of algal blooms in coastal and open ocean waters.
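
    Many of the reviewed chlorophyll retrievals are band-ratio polynomials of the OCx family; the sketch below shows the general functional form only, with placeholder coefficients rather than any operational sensor-specific set:

        import numpy as np

        def chl_band_ratio(rrs_blue, rrs_green,
                           coeffs=(0.3, -2.9, 1.7, -0.6, -1.2)):  # placeholders
            """Chlorophyll-a (mg m-3) from a blue/green remote-sensing
            reflectance ratio via a 4th-order polynomial in log10 space."""
            r = np.log10(np.maximum(rrs_blue, 1e-6) / np.maximum(rrs_green, 1e-6))
            log_chl = sum(a * r**k for k, a in enumerate(coeffs))
            return 10.0 ** log_chl

        print(chl_band_ratio(rrs_blue=0.005, rrs_green=0.004))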

  8. Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets

    NASA Technical Reports Server (NTRS)

    Mathur, Rohit

    1997-01-01

    This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.

  9. Advancing the frontiers in nanocatalysis, biointerfaces, and renewable energy conversion by innovations of surface techniques.

    PubMed

    Somorjai, Gabor A; Frei, Heinz; Park, Jeong Y

    2009-11-25

    The challenge of chemistry in the 21st century is to achieve 100% selectivity of the desired product molecule in multipath reactions ("green chemistry") and develop renewable energy based processes. Surface chemistry and catalysis play key roles in this enterprise. Development of in situ surface techniques such as high-pressure scanning tunneling microscopy, sum frequency generation (SFG) vibrational spectroscopy, time-resolved Fourier transform infrared methods, and ambient pressure X-ray photoelectron spectroscopy enabled the rapid advancement of three fields: nanocatalysts, biointerfaces, and renewable energy conversion chemistry. In materials nanoscience, synthetic methods have been developed to produce monodisperse metal and oxide nanoparticles (NPs) in the 0.8-10 nm range with controlled shape, oxidation states, and composition; these NPs can be used as selective catalysts since chemical selectivity appears to be dependent on all of these experimental parameters. New spectroscopic and microscopic techniques have been developed that operate under reaction conditions and reveal the dynamic change of molecular structure of catalysts and adsorbed molecules as the reactions proceed with changes in reaction intermediates, catalyst composition, and oxidation states. SFG vibrational spectroscopy detects amino acids, peptides, and proteins adsorbed at hydrophobic and hydrophilic interfaces and monitors the change of surface structure and interactions with coadsorbed water. Exothermic reactions and photons generate hot electrons in metal NPs that may be utilized in chemical energy conversion. The photosplitting of water and carbon dioxide, an important research direction in renewable energy conversion, is discussed.

  10. Development of Advanced In-Situ Techniques for Chemistry Monitoring and Corrosion Mitigation in SCWO Environments

    SciTech Connect

    Macdonald, D. D.; Lvov, S. N.

    2000-03-31

    This project is developing sensing technologies and corrosion monitoring techniques for use in supercritical water oxidation (SCWO) systems to reduce the volume of mixed low-level nuclear waste by oxidizing organic components in a closed-cycle system where CO2 and other gaseous oxides are produced, leaving the radioactive elements concentrated in ash. The technique uses water at supercritical temperatures under highly oxidizing conditions, maintained by a high fugacity of molecular oxygen in the system, which causes high corrosion rates of even the most corrosion-resistant reactor materials. This project significantly addresses the corrosion shortcoming through development of (a) advanced electrodes and sensors for in situ potentiometric monitoring of pH in high subcritical and supercritical aqueous solutions; (b) an approach for evaluating the association constants for 1-1 aqueous electrolytes using a flow-through electrochemical thermocell; (c) an electrochemical noise sensor for the in situ measurement of corrosion rate in subcritical and supercritical aqueous systems; and (d) a model for estimating the effect of pressure on reaction rates, including corrosion reactions, in high subcritical and supercritical aqueous systems. The project achieved all objectives, except for installing some of the sensors into a fully operating SCWO system.
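
    The potentiometric pH monitoring developed here ultimately rests on the Nernst relation between measured cell potential and pH; a minimal sketch follows, with a hypothetical calibration potential (real cells require reference-electrode and pressure corrections not shown):

        def ph_from_potential(e_cell_v, e0_v, temp_k):
            """pH from cell potential via the Nernst slope 2.303*R*T/F."""
            R, F = 8.314, 96485.0                  # J/(mol K), C/mol
            slope = 2.303 * R * temp_k / F         # volts per pH unit
            return (e0_v - e_cell_v) / slope

        # At 623 K (a high subcritical condition) the slope is ~0.124 V/pH.
        print(ph_from_potential(e_cell_v=0.45, e0_v=0.90, temp_k=623.0))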

  11. Investigation to advance prediction techniques of the low-speed aerodynamics of V/STOL aircraft

    NASA Technical Reports Server (NTRS)

    Maskew, B.; Strash, D.; Nathman, J.; Dvorak, F. A.

    1985-01-01

    A computer program, VSAERO, has been applied to a number of V/STOL configurations with a view to advancing prediction techniques for the low-speed aerodynamic characteristics. The program couples a low-order panel method with surface streamline calculation and integral boundary layer procedures. The panel method--which uses piecewise constant source and doublet panels--includes an iterative procedure for wake shape and models the boundary layer displacement effect using the source transpiration technique. Certain improvements to a basic vortex tube jet model were installed in the code prior to evaluation. Very promising results were obtained for surface pressures near a jet issuing at 90 deg from a flat plate. A solid core model was used in the initial part of the jet with a simple entrainment model. Preliminary representation of the downstream separation zone significantly improved the correlation. The program accurately predicted the pressure distribution inside the inlet on the Grumman 698-411 design at a range of flight conditions. Furthermore, coupled viscous/potential flow calculations gave very close correlation with experimentally determined operational boundaries dictated by the onset of separation inside the inlet. Experimentally observed degradation of these operational boundaries between nacelle-alone tests and tests on the full configuration was also indicated by the calculation. Application of the program to the General Dynamics STOL fighter design was equally encouraging. Very close agreement was observed between experiment and calculation for the effects of power on pressure distribution, lift, and lift curve slope.

  12. Advancing the Frontiers in Nanocatalysis, Biointerfaces, and Renewable Energy Conversion by Innovations of Surface Techniques

    SciTech Connect

    Somorjai, G.A.; Frei, H.; Park, J.Y.

    2009-07-23

    The challenge of chemistry in the 21st century is to achieve 100% selectivity of the desired product molecule in multipath reactions ('green chemistry') and develop renewable energy based processes. Surface chemistry and catalysis play key roles in this enterprise. Development of in situ surface techniques such as high-pressure scanning tunneling microscopy, sum frequency generation (SFG) vibrational spectroscopy, time-resolved Fourier transform infrared methods, and ambient pressure X-ray photoelectron spectroscopy enabled the rapid advancement of three fields: nanocatalysts, biointerfaces, and renewable energy conversion chemistry. In materials nanoscience, synthetic methods have been developed to produce monodisperse metal and oxide nanoparticles (NPs) in the 0.8-10 nm range with controlled shape, oxidation states, and composition; these NPs can be used as selective catalysts since chemical selectivity appears to be dependent on all of these experimental parameters. New spectroscopic and microscopic techniques have been developed that operate under reaction conditions and reveal the dynamic change of molecular structure of catalysts and adsorbed molecules as the reactions proceed with changes in reaction intermediates, catalyst composition, and oxidation states. SFG vibrational spectroscopy detects amino acids, peptides, and proteins adsorbed at hydrophobic and hydrophilic interfaces and monitors the change of surface structure and interactions with coadsorbed water. Exothermic reactions and photons generate hot electrons in metal NPs that may be utilized in chemical energy conversion. The photosplitting of water and carbon dioxide, an important research direction in renewable energy conversion, is discussed.

  13. Procedural guidance using advance imaging techniques for percutaneous edge-to-edge mitral valve repair.

    PubMed

    Quaife, Robert A; Salcedo, Ernesto E; Carroll, John D

    2014-02-01

    The complexity of structural heart disease interventions such as edge-to-edge mitral valve repair requires integration of multiple highly technical imaging modalities. Real-time imaging with 3-dimensional (3D) echocardiography is a relatively new technique that, first, allows clear volumetric imaging of target structures such as the mitral valve, for both pre-procedural diagnosis and planning in patients with degenerative or functional mitral valve regurgitation. Second, it provides an intra-procedural, real-time panoramic volumetric 3D view of structural heart disease targets that facilitates eye-hand coordination while manipulating devices within the heart. X-ray fluoroscopy and real-time 3D transesophageal echocardiography (TEE) images are used in combination to display specific targets and movement of catheter-based technologies in 3D space. This integration requires at least two different image display monitors and mental fusion of the individual datasets by the operator. Combined display technology such as this allows rotation and orientation of both dataset perspectives, as needed to define targets and guide structural heart disease device procedures. The inherently easy concept of direct visual feedback and eye-hand coordination allows safe and efficient completion of MitraClip procedures. This technology is now merged into a single structural heart disease guidance mode called EchoNavigator(TM) (Philips Medical Imaging, Andover, MA). These advanced imaging techniques have revolutionized the field of structural heart disease interventions, an experience exemplified by the cooperative imaging approach used for guidance of edge-to-edge mitral valve repair procedures.

  14. EPS in Environmental Microbial Biofilms as Examined by Advanced Imaging Techniques

    NASA Astrophysics Data System (ADS)

    Neu, T. R.; Lawrence, J. R.

    2006-12-01

    Biofilm communities are highly structured associations of cellular and polymeric components which are involved in biogenic and geogenic environmental processes. Furthermore, biofilms are also important in medical (infection), industrial (biofouling) and technological (biofilm engineering) processes. The interfacial microbial communities in a specific habitat are highly dynamic and change according to the environmental parameters, affecting not only the cellular but also the polymeric constituents of the system. Through their EPS, biofilms interact with dissolved, colloidal and particulate compounds from the bulk water phase. For a long time the focus of biofilm research was on the cellular constituents, while the polymer matrix of biofilms was rather neglected. The polymer matrix is produced not only by different bacteria and archaea but also by eukaryotic micro-organisms such as algae and fungi. The mostly unidentified mixture of EPS compounds is responsible for many biofilm properties and is involved in biofilm functionality. The chemistry of the EPS matrix represents a mixture of polymers including polysaccharides, proteins, nucleic acids, neutral polymers, charged polymers, amphiphilic polymers and refractory microbial polymers. The analysis of the EPS may be done destructively by means of extraction and subsequent chemical analysis, or in situ by means of specific probes in combination with advanced imaging. In the last 15 years laser scanning microscopy (LSM) has been established as an indispensable technique for studying microbial communities. LSM with 1-photon and 2-photon excitation in combination with fluorescence techniques allows 3-dimensional investigation of fully hydrated, living biofilm systems. This approach is able to reveal data on biofilm structural features as well as biofilm processes and interactions. The fluorescent probes available allow the quantitative assessment of cellular as well as polymer distribution. For this purpose

  15. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…

  16. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in a computer, computing in the Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  17. Recent Advances in Stable Isotope Techniques for N2O Source Partitioning in Soils

    NASA Astrophysics Data System (ADS)

    Baggs, E.; Mair, L.; Mahmood, S.

    2007-12-01

    The use of 13C, 15N and 18O enables us to overcome uncertainties associated with soil C and N processes and to assess the links between species diversity and ecosystem function. Recent advances in stable isotope techniques enable determination of process rates and are fundamental for examining interactions between C and N cycles. Here we will introduce the 15N-, 18O- and 13C-enrichment techniques we have developed to distinguish between different N2O-producing processes in situ in soils, presenting selected results, and will critically assess their potential, alone and in combination with molecular techniques, to help address key research questions for soil biogeochemistry and microbial ecology. We have developed 15N- and 18O-enrichment techniques to distinguish between, and to quantify, N2O production during ammonia oxidation, nitrifier denitrification and denitrification. This provides a great advantage over natural abundance approaches as it enables quantification of N2O from each microbial source, which can be coupled with quantification of N2 production, and used to examine interactions between different processes and cycles. These approaches have also provided new insights into the N cycle and how it interacts with the C cycle. For example, we now know that ammonia oxidising bacteria significantly contribute to N2O emissions from soils, both via the traditionally accepted ammonia oxidation pathway, and also via denitrification (nitrifier denitrification), which can proceed even under aerobic conditions. We are also linking emissions from each source to the diversity and activity of relevant microbial functional groups, for example through the development and application of a specific nirK primer for the nitrite reductase in ammonia oxidising bacteria. Recently, isotopomers have been proposed as an alternative for source partitioning N2O at natural abundance levels; they offer the potential to investigate N2O production from nitrate ammonification, and overcome the
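
    At its core, isotope-based source partitioning reduces to a mass balance between end members; a minimal two-source mixing sketch (with invented delta values, purely for illustration) is:

        def source_fraction(delta_sample, delta_a, delta_b):
            """Fraction of a sample attributable to source A in a
            two-source isotope mixing model."""
            return (delta_sample - delta_b) / (delta_a - delta_b)

        # Hypothetical end members, e.g. nitrification (A) vs denitrification (B).
        f_a = source_fraction(delta_sample=12.0, delta_a=30.0, delta_b=2.0)
        print(f"fraction from source A: {f_a:.2f}")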

  18. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature-established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence that is typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent
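
    A compact sketch of the modeling approach described here, i.e. thresholding on measurement noise to estimate intermittency and then fitting a four-parameter beta distribution to the turbulent samples (numpy and scipy assumed; all data below are synthetic, not the paper's measurements):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        noise = rng.normal(0.0, 0.01, 2000)        # shot-noise-like air signal
        turb = rng.beta(2.0, 5.0, 3000) * 0.6      # turbulent mass fractions
        samples = np.concatenate([noise, turb])

        threshold = 3 * noise.std()                # noise-based cutoff
        turbulent = samples[samples > threshold]
        intermittency = turbulent.size / samples.size

        a, b, loc, scale = stats.beta.fit(turbulent)   # four-parameter fit
        print(intermittency, a, b, loc, scale)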

  19. Use of statistical and GIS techniques to assess and predict concentrations of heavy metals in soils of Lahore City, Pakistan.

    PubMed

    Alam, Nayab; Ahmad, Sajid Rashid; Qadir, Abdul; Ashraf, Muhammad Imran; Lakhan, Calvin; Lakhan, V Chris

    2015-10-01

    Soils from different land use areas in Lahore City, Pakistan, were analyzed for concentrations of the heavy metals cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb). One hundred one samples were randomly collected from six land use areas categorized as park, commercial, agricultural, residential, urban, and industrial. Each sample was analyzed in the laboratory with the tri-acid digestion method. Metal concentrations in each sample were obtained with the use of an atomic absorption spectrophotometer. The statistical techniques of analysis of variance, correlation analysis, and cluster analysis were used to analyze all data. In addition, kriging, a geostatistical procedure supported by ArcGIS, was used to model and predict the spatial concentrations of the four heavy metals (Cd, Cr, Ni, and Pb). The results demonstrated significant correlation among the heavy metals in the urban and industrial areas. The dendrogram and the results of the cluster analysis indicated that the agricultural, commercial, and park areas had high concentrations of Cr, Ni, and Pb. High concentrations of Cd and Ni were also observed in the residential and industrial areas, respectively. The maximum concentrations of both Cd and Pb exceeded world toxic limit values. The kriging method demonstrated increasing spatial diffusion of both Cd and Pb concentrations throughout and beyond the Lahore City area. PMID:26391490
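
    As an illustration of the kriging step, the sketch below interpolates a synthetic metal-concentration surface with the pykrige package (an assumed dependency; the coordinates, concentrations, and variogram choice are invented, not the study's data):

        import numpy as np
        from pykrige.ok import OrdinaryKriging

        # Synthetic sample locations and Pb concentrations (mg/kg).
        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 10, 101), rng.uniform(0, 10, 101)
        z = 50 + 5 * x - 3 * y + rng.normal(0, 2, 101)

        ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
        gridx = np.linspace(0, 10, 50)
        gridy = np.linspace(0, 10, 50)
        zhat, ss = ok.execute("grid", gridx, gridy)  # predictions, variances
        print(zhat.shape, float(ss.mean()))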

  20. Assessment of Water Quality in a Subtropical Alpine Lake Using Multivariate Statistical Techniques and Geostatistical Mapping: A Case Study

    PubMed Central

    Liu, Wen-Cheng; Yu, Hwa-Lung; Chung, Chung-En

    2011-01-01

    Concern about the water quality in Yuan-Yang Lake (YYL), a shallow, subtropical alpine lake located in north-central Taiwan, has been increasing rapidly due to natural and anthropogenic pollution. In order to understand the underlying physical and chemical processes as well as their associated spatial distribution in YYL, this study analyzes fourteen physico-chemical water quality parameters recorded at eight sampling stations during 2008–2010 by using multivariate statistical techniques and a geostatistical method. Hierarchical clustering analysis (CA) is first applied to distinguish three general water quality patterns among the stations, followed by the use of principal component analysis (PCA) and factor analysis (FA) to extract and recognize the major underlying factors contributing to the variations among the water quality measures. The spatial distribution of the identified major contributing factors is obtained by using a kriging method. Results show that four principal components, i.e., nitrogen nutrients, a meteorological factor, turbidity, and nitrate, account for 65.52% of the total variance among the water quality parameters. The spatial distribution of the principal components further confirms that nitrogen sources constitute an important pollutant contribution in YYL. PMID:21695032
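
    A minimal sketch of the hierarchical clustering (CA) step on standardized station-by-parameter data follows; the data matrix is synthetic, not the YYL measurements:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.stats import zscore

        rng = np.random.default_rng(0)
        stations = rng.normal(size=(8, 14))     # 8 stations x 14 parameters
        Z = linkage(zscore(stations, axis=0), method="ward")
        groups = fcluster(Z, t=3, criterion="maxclust")
        print(groups)                           # cluster label per station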

  1. Craniospinal Irradiation Techniques: A Dosimetric Comparison of Proton Beams With Standard and Advanced Photon Radiotherapy

    SciTech Connect

    Yoon, Myonggeun; Shin, Dong Ho; Kim, Jinsung; Kim, Jong Won; Kim, Dae Woong; Park, Sung Yong; Lee, Se Byeong; Kim, Joo Young; Park, Hyeon-Jin; Park, Byung Kiu; Shin, Sang Hoon

    2011-11-01

    Purpose: To evaluate the dosimetric benefits of advanced radiotherapy techniques for craniospinal irradiation in children with cancer. Methods and Materials: Craniospinal irradiation (CSI) using three-dimensional conformal radiotherapy (3D-CRT), tomotherapy (TOMO), and proton beam treatment (PBT) in the scattering mode was planned for each of 10 patients at our institution. Dosimetric benefits and organ-specific radiation-induced cancer risks were based on comparisons of dose-volume histograms (DVHs) and on the application of organ equivalent doses (OEDs), respectively. Results: When we analyzed the organ-at-risk volumes that received 30%, 60%, and 90% of the prescribed dose (PD), we found that PBT was superior to TOMO and 3D-CRT. On average, the doses delivered by PBT to the esophagus, stomach, liver, lung, pancreas, and kidney were 19.4 Gy, 0.6 Gy, 0.3 Gy, 2.5 Gy, 0.2 Gy, and 2.2 Gy for a PD of 36 Gy, respectively, which were significantly lower than the doses delivered by TOMO (22.9 Gy, 4.5 Gy, 6.1 Gy, 4.0 Gy, 13.3 Gy, and 4.9 Gy, respectively) and 3D-CRT (34.6 Gy, 3.6 Gy, 8.0 Gy, 4.6 Gy, 22.9 Gy, and 4.3 Gy, respectively). Although the average doses delivered by PBT to the chest and abdomen were significantly lower than those of 3D-CRT or TOMO, these differences were reduced in the head-and-neck region. OED calculations showed that the risk of secondary cancers in organs such as the stomach, lungs, thyroid, and pancreas was much higher when 3D-CRT or TOMO was used than when PBT was used. Conclusions: Compared with photon techniques, PBT showed improvements in most dosimetric parameters for CSI patients, with lower OEDs to organs at risk.

  2. Advances in turbulent mixing techniques to study microsecond protein folding reactions

    PubMed Central

    Kathuria, Sagar V.; Chan, Alexander; Graceffa, Rita; Nobrega, R. Paul; Matthews, C. Robert; Irving, Thomas C.; Perot, Blair; Bilsel, Osman

    2013-01-01

    Recent experimental and computational advances in the protein folding arena have shown that the readout of the one-dimensional sequence information into three-dimensional structure begins within the first few microseconds of folding. The initiation of refolding reactions has been achieved by several means, including temperature jumps, flash photolysis, pressure jumps and rapid mixing methods. One of the most commonly used means of initiating refolding of chemically-denatured proteins is by turbulent flow mixing with refolding dilution buffer, where greater than 99% mixing efficiency has been achieved within tens of microseconds. Successful interfacing of turbulent flow mixers with complementary detection methods, including time-resolved Fluorescence Spectroscopy (trFL), Förster Resonance Energy Transfer (FRET), Circular Dichroism (CD), Small-Angle X-ray Scattering (SAXS), Hydrogen Exchange (HX) followed by Mass Spectrometry (MS) and Nuclear Magnetic Resonance Spectroscopy (NMR), Infrared Spectroscopy (IR), and Fourier Transform IR Spectroscopy (FTIR), has made this technique very attractive for monitoring various aspects of structure formation during folding. Although continuous-flow (CF) mixing devices interfaced with trFL detection have a dead time of only 30 µs, burst-phases have been detected on this time scale during folding of peptides and of large proteins (e.g., CheY and TIM barrels). Furthermore, a major limitation of the CF mixing technique has been the requirement for large quantities of sample. In this brief communication, we will discuss the recent flurry of activity in micromachining and microfluidics, guided by computational simulations, that is likely to lead to dramatic improvements in time resolution and sample consumption for CF mixers over the next few years. PMID:23868289

  3. Application of Energy Integration Techniques to the Design of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Levri, Julie; Finn, Cory

    2000-01-01

    Exchanging heat between hot and cold streams within an advanced life support system can save energy. These savings reduce the equivalent system mass (ESM) of the system. Different system configurations are examined under steady-state conditions for various percentages of food growth and waste treatment. The scenarios investigated represent possible design options for a Mars reference mission. Reference mission definitions are drawn from the ALSS Modeling and Analysis Reference Missions Document, which includes definitions for space station evolution, Mars landers, and a Mars base. For each scenario, streams requiring heating or cooling are identified and characterized by mass flow, supply and target temperatures, and heat capacities. The Pinch Technique is applied to identify good matches for energy exchange between the hot and cold streams and to calculate the minimum external heating and cooling requirements for the system. For each pair of hot and cold streams that are matched, there will be a reduction in the amount of external heating and cooling required, and the original heating and cooling equipment will be replaced with a heat exchanger. The net cost savings can be either positive or negative for each stream pairing, and the priority for implementing each pairing can be ranked according to its potential cost savings. Using the Pinch technique, a complete system heat exchange network is developed and heat exchangers are sized to allow for calculation of ESM. The energy-integrated design typically has a lower total ESM than the original design with no energy integration. A comparison of ESM savings in each of the scenarios is made to direct future Pinch Analysis efforts.
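
    The minimum external heating and cooling targets described here come from the problem table (temperature cascade) form of pinch analysis; a compact sketch with purely illustrative stream data follows:

        # Each stream: (supply T in C, target T in C, heat capacity flow kW/K).
        hot = [(180.0, 60.0, 3.0), (150.0, 30.0, 1.5)]
        cold = [(20.0, 135.0, 2.0), (80.0, 140.0, 4.0)]
        dt_min = 10.0

        # Shift hot streams down and cold streams up by dt_min/2; encode cold
        # streams with negative CP so a plain sum gives the net surplus.
        shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot]
        shifted += [(ts + dt_min / 2, tt + dt_min / 2, -cp) for ts, tt, cp in cold]

        bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)},
                        reverse=True)
        cascade, heat = [0.0], 0.0
        for hi_t, lo_t in zip(bounds, bounds[1:]):
            net_cp = sum(cp for ts, tt, cp in shifted
                         if min(ts, tt) <= lo_t and max(ts, tt) >= hi_t)
            heat += net_cp * (hi_t - lo_t)   # interval surplus (+) / deficit (-)
            cascade.append(heat)

        q_hot_min = max(0.0, -min(cascade))   # minimum external heating
        q_cold_min = cascade[-1] + q_hot_min  # minimum external cooling
        print(q_hot_min, q_cold_min)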

  4. Analysis of deformation patterns through advanced DINSAR techniques in Istanbul megacity

    NASA Astrophysics Data System (ADS)

    Balik Sanli, F.; Calò, F.; Abdikan, S.; Pepe, A.; Gorum, T.

    2014-09-01

    As a result of Turkey's economic growth and heavy migration from rural areas, Istanbul has experienced a high urbanization rate, with severe impacts on the environment in terms of pressure on natural resources, land-cover changes and uncontrolled sprawl. As a consequence, the city became extremely vulnerable to natural and man-made hazards, inducing ground deformation phenomena that threaten buildings and infrastructures and often cause significant socio-economic losses. Therefore, the detection and monitoring of such deformation patterns is of primary importance for hazard and risk assessment as well as for the design and implementation of effective mitigation strategies. The aim of this work is to analyze the spatial distribution and temporal evolution of deformations affecting the Istanbul metropolitan area, by exploiting advanced Differential SAR Interferometry (DInSAR) techniques. In particular, we apply the Small BAseline Subset (SBAS) approach to a dataset of 43 TerraSAR-X images acquired between November 2010 and June 2012 along descending orbits, with an 11-day revisit time and a 3 m × 3 m spatial resolution. The SBAS processing allowed us to remotely detect and monitor subsidence patterns over the whole urban area as well as to provide detailed information at the scale of the single building. Such SBAS measurements, effectively integrated with ground-based monitoring data and thematic maps, allow us to explore the relationship between the detected deformation phenomena and urbanization, contributing to improved urban planning and management.

  5. Advanced real-time dynamic scene generation techniques for improved performance and fidelity

    NASA Astrophysics Data System (ADS)

    Bowden, Mark H.; Buford, James A.; Mayhall, Anthony J.

    2000-07-01

    Recent advances in real-time synthetic scene generation for hardware-in-the-loop (HWIL) testing at the U.S. Army Aviation and Missile Command (AMCOM) Aviation and Missile Research, Development, and Engineering Center (AMRDEC) improve both performance and fidelity. Modeling ground target scenarios requires tradeoffs because of limited texture memory for imagery and limited main memory for elevation data. High-resolution insets have been used in the past to provide better fidelity in specific areas, such as in the neighborhood of a target. Improvements for ground scenarios include smooth transitions for high-resolution insets, to reduce high-spatial-frequency artifacts at the borders of the inset regions, and dynamic terrain paging to support large-area databases. Transport lag through the scene generation system, including sensor emulation and interface components, has been dealt with in the past through the use of sub-window extraction from oversize scenes. This compensates for the spatial effects of transport lag but not the temporal effects. A new system has been developed and used successfully to compensate for a flashing coded beacon in the scene. Other techniques have been developed to synchronize the scene generator with the seeker under test (SUT) and to model atmospheric effects, sensor optics and electronics, and angular emissivity attenuation.

  6. Characterization techniques for the high-brightness particle beams of the Advanced Photon Source (APS)

    SciTech Connect

    Lumpkin, A.H.

    1993-08-01

    The Advanced Photon Source (APS) will be a third-generation synchrotron radiation (SR) user facility in the hard x-ray regime (10--100 keV). The design objectives for the 7-GeV storage ring include a positron beam natural emittance of 8 × 10(-9) m-rad at an average current of 100 mA. Proposed methods for measuring the transverse and longitudinal profiles will be described. Additionally, a research and development effort using an rf gun as a low-emittance source of electrons for injection into the 200- to 650-MeV linac subsystem is underway. This latter system is projected to produce electron beams with a normalized rms emittance of approximately 2π mm-mrad at peak currents near one hundred amps. This interesting characterization problem will also be briefly discussed. The combination of both source types within one laboratory facility will stimulate the development of diagnostic techniques in these parameter spaces.

  7. The development of optical microscopy techniques for the advancement of single-particle studies

    SciTech Connect

    Marchuk, Kyle

    2013-05-15

    Single particle orientation and rotational tracking (SPORT) has recently become a powerful optical microscopy tool that can expose many molecular motions. Unfortunately, there is not yet a single microscopy technique that can decipher all particle motions in all environmental conditions, thus there are limitations to current technologies. In this work, the two powerful microscopy tools of total internal reflection and interferometry are advanced to determine the position, orientation, and optical properties of metallic nanoparticles in a variety of environments. Total internal reflection is an optical phenomenon that has been applied to microscopy to produce either fluorescent or scattered light. The non-invasive far-field imaging technique is coupled with a near-field illumination scheme that allows for better axial resolution than confocal microscopy and epi-fluorescence microscopy. By controlling the incident illumination angle using total internal reflection fluorescence (TIRF) microscopy, a new type of imaging probe called “non-blinking” quantum dots (NBQDs) were super-localized in the axial direction to sub-10-nm precision. These particles were also used to study the rotational motion of microtubules being propelled by the motor protein kinesin across the substrate surface. The same instrument was modified to function under total internal reflection scattering (TIRS) microscopy to study metallic anisotropic nanoparticles and their dynamic interactions with synthetic lipid bilayers. Utilizing two illumination lasers with opposite polarization directions at wavelengths corresponding to the short and long axis surface plasmon resonance (SPR) of the nanoparticles, both the in-plane and out-of-plane movements of many particles could be tracked simultaneously. When combined with Gaussian point spread function (PSF) fitting for particle super-localization, the binding status and rotational movement could be resolved without degeneracy. TIRS microscopy was also used to

  8. The development of optical microscopy techniques for the advancement of single-particle studies

    NASA Astrophysics Data System (ADS)

    Marchuk, Kyle

    Single particle orientation and rotational tracking (SPORT) has recently become a powerful optical microscopy tool that can expose many molecular motions. Unfortunately, there is not yet a single microscopy technique that can decipher all particle motions in all environmental conditions, thus there are limitations to current technologies. In this work, the two powerful microscopy tools of total internal reflection and interferometry are advanced to determine the position, orientation, and optical properties of metallic nanoparticles in a variety of environments. Total internal reflection is an optical phenomenon that has been applied to microscopy to produce either fluorescent or scattered light. The non-invasive far-field imaging technique is coupled with a near-field illumination scheme that allows for better axial resolution than confocal microscopy and epi-fluorescence microscopy. By controlling the incident illumination angle using total internal reflection fluorescence (TIRF) microscopy, a new type of imaging probe called "non-blinking" quantum dots (NBQDs) were super-localized in the axial direction to sub-10-nm precision. These particles were also used to study the rotational motion of microtubules being propelled by the motor protein kinesin across the substrate surface. The same instrument was modified to function under total internal reflection scattering (TIRS) microscopy to study metallic anisotropic nanoparticles and their dynamic interactions with synthetic lipid bilayers. Utilizing two illumination lasers with opposite polarization directions at wavelengths corresponding to the short and long axis surface plasmon resonance (SPR) of the nanoparticles, both the in-plane and out-of-plane movements of many particles could be tracked simultaneously. When combined with Gaussian point spread function (PSF) fitting for particle super-localization, the binding status and rotational movement could be resolved without degeneracy. TIRS microscopy was also used to

  9. The Use of Multi-Component Statistical Techniques in Understanding Subduction Zone Arc Granitic Geochemical Data Sets

    NASA Astrophysics Data System (ADS)

    Pompe, L.; Clausen, B. L.; Morton, D. M.

    2015-12-01

    Multi-component statistical techniques and GIS visualization are emerging trends in understanding large data sets. Our research applies these techniques to a large igneous geochemical data set from southern California to better understand magmatic and plate tectonic processes. A set of 480 granitic samples collected by Baird from this area were analyzed for 39 geochemical elements. Of these samples, 287 are from the Peninsular Ranges Batholith (PRB) and 164 from part of the Transverse Ranges (TR). Principal component analysis (PCA) summarized the 39 variables into 3 principal components (PC) by matrix multiplication, which for the PRB are interpreted as follows: PC1, with about 30% of the variation, included mainly compatible elements and SiO2 and indicates extent of differentiation; PC2, with about 20% of the variation, included HFS elements and may indicate crustal contamination, as usually identified by Sri; PC3, with about 20% of the variation, included mainly HRE elements and may indicate magma source depth, as often displayed using REE spider diagrams and possibly Sr/Y. Several elements did not fit well in any of the three components: Cr, Ni, U, and Na2O. For the PRB, the PC1 correlation with SiO2 was r=-0.85, the PC2 correlation with Sri was r=0.80, and the PC3 correlation with Gd/Yb was r=-0.76 and with Sr/Y was r=-0.66. Extending this method to the TR, the correlations were r=-0.85, -0.21, -0.06, and -0.64, respectively. A similar extent of correlation for both areas was visually evident using GIS interpolation. PC1 seems to do well at indicating differentiation index for both the PRB and TR and correlates very well with SiO2, Al2O3, MgO, FeO*, CaO, K2O, Sc, V, and Co, but poorly with Na2O and Cr. If the crustal component is represented by Sri, PC2 correlates well, and less expensively, with this indicator in the PRB, but not in the TR. Source depth has been related to the slope on REE spidergrams, and PC3, based on only the HREE and using the Sr/Y ratios, gives a reasonable

  10. Ultra-precision geometrical measurement technique based on a statistical random phase clock combined with acoustic-optical deflection

    NASA Astrophysics Data System (ADS)

    Ekberg, Peter; Stiblert, Lars; Mattsson, Lars

    2010-12-01

    Mask writers and large-area measurement systems are key systems for the production of large liquid crystal displays (LCDs) and imaging devices. With position tolerances in the sub-µm range over square-meter-sized masks, the metrology challenges are indeed demanding. Most systems used for this type of measurement rely on a microscope camera imaging system, provided with a charge-coupled device, a complementary metal-oxide-semiconductor sensor or a time-delay-and-integration sensor to transform the optical image into a digital gray-level image. From this image, processing algorithms are used to extract information such as the location of edges. The drawback of this technique is the vast amount of data captured but never used. This paper presents a new approach for ultra-high-precision lateral measurement, at nm levels, of chrome/glass patterns separated by centimeters, so-called registration marks, on masks used for LCD manufacturing. Registration specifications demand a positioning accuracy <200 nm and critical dimensions, i.e. chrome line widths, that need to be accurate in the 80 nm range. This accuracy has to be achieved on glass masks of 2.4 × 1.6 m2 size. Our new measurement method is based on nm-precise lateral scanning of a focused laser beam combined with statistical random phase sampling of the reflected signal. The precise scanning is based on an extremely accurate time measuring device controlling an acousto-optic deflector crystal. The method has been successfully applied in measuring the 4 µm pitch of reference gratings at standard deviations σ of 0.5 nm, and registration marks separated by several cm at standard deviations of 23 nm.

  11. APPLICATION OF ADVANCED IN VITRO TECHNIQUES TO MEASURE, UNDERSTAND AND PREDICT THE KINETICS AND MECHANISMS OF XENOBIOTIC METABOLISM

    EPA Science Inventory

    We have developed a research program in metabolism that involves numerous collaborators across EPA as well as other federal and academic labs. A primary goal is to develop and apply advanced in vitro techniques to measure, understand and predict the kinetics and mechanisms of xen...

  12. Landslide detection and long-term monitoring in urban area by means of advanced interferometric techniques

    NASA Astrophysics Data System (ADS)

    Cigna, Francesca; Del Ventisette, Chiara; Liguori, Vincenzo; Casagli, Nicola

    2010-05-01

    This work aims at illustrating the potential of advanced interferometric techniques for detection and long-term monitoring of landslide ground deformations at the local scale. Space-borne InSAR (Synthetic Aperture Radar Interferometry) has been successfully exploited in recent years to measure ground deformations associated with processes with slow kinematics, such as landslides, tectonic motions, subsidence or volcanic activity, thanks to both the standard single-interferogram approach (centimeter accuracy) and advanced time-series analyses of long temporal radar satellite data stacks (millimeter accuracy), such as Persistent Scatterers Interferometry (PSI) techniques. In order to get a complete overview and an in-depth knowledge of an investigated landslide, InSAR satellite measurements can complement conventional in situ data. This methodology allows studying the spatial pattern and the temporal evolution of ground deformations, improving the spatial coverage and overcoming issues related to the installation of ground-based instrumentation and data acquisition in unstable areas. Here we describe the application of the above-mentioned methodology to the test area of Agrigento, Sicily (Italy), affected by hydrogeological risk. The town is located in southern Sicily, at the edge of the Apennine-Maghrebian thrust belt, on the Plio-Pleistocene and Miocene sediments of the Gela Nappe. Ground instabilities affect the urban area and involve the infrastructures of its NW side, such as the Cathedral, the Seminary and many private buildings. An integration of InSAR analyses and conventional field investigations (e.g., structural damage and fracture surveys) was therefore carried out, to support Regional Civil Protection authorities in emergency management and risk mitigation. The results of the InSAR analysis highlighted a general stability of the whole urban area between 1992 and 2007. However, very high deformation rates (up to 10-12 mm/y) were identified in 1992-2000 in the W slope of the

  13. Advances in the regionalization approach: geostatistical techniques for estimating flood quantiles

    NASA Astrophysics Data System (ADS)

    Chiarello, Valentina; Caporali, Enrica; Matthies, Hermann G.

    2015-04-01

    The knowledge of peak flow discharges and associated floods is of primary importance in engineering practice for the planning of water resources and risk assessment. Streamflow characteristics are usually estimated starting from measurements of river discharges at stream gauging stations. However, the lack of observations at the site of interest, as well as measurement inaccuracies, inevitably leads to the need for predictive models. Regional analysis is a classical approach to estimate river flow characteristics at sites where little or no data exist. Specific techniques are needed to regionalize the hydrological variables over the considered area. Top-kriging, or topological kriging, is a kriging interpolation procedure that takes into account the geometric organization and structure of the hydrographic network, the catchment area and the nested nature of catchments. The continuous processes in space defined for the point variables are represented by a variogram. In Top-kriging, the measurements are not point values but are defined over a non-zero catchment area. Top-kriging is applied here over the geographical space of the Tuscany Region, in Central Italy. The analysis is carried out on the discharge data of 57 consistent runoff gauges, recorded from 1923 to 2014. Top-kriging also gives an estimate of the prediction uncertainty in addition to the prediction itself. The results are validated using a cross-validation procedure implemented in the package rtop of the open-source statistical environment R. The results are compared through different error measures. Top-kriging seems to perform better in nested catchments and larger-scale catchments, but not for headwater catchments or where there is high variability between neighbouring catchments.

  14. Seismic response analysis of NAGRA-Net stations using advanced geophysical techniques

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Edwards, Benjamin; Dal Moro, Giancarlo; Keller, Lorenz; Fäh, Donat

    2015-04-01

    In cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra), the Swiss Seismological Service (SED) has recently completed the installation of ten new seismological observation stations, three of them including a co-located borehole sensor. The ultimate goal of the project is to densify the existing Swiss Digital Seismic Network (SDSNet) in northern Switzerland, in order to improve the detection of very-low-magnitude events and to improve the accuracy of future location solutions. This is strategic for unbiased monitoring of microseismicity at the locations of proposed nuclear waste repositories. To further improve the quality and usability of the recordings, a seismic characterization of the area surrounding each installation site was performed. The investigation consisted of a preliminary geological and geotechnical study, followed by a seismic site response analysis by means of state-of-the-art geophysical techniques. For the borehole stations, in particular, the characterization was performed by combining different types of active seismic methods (P-S refraction tomography, surface wave analysis, Vertical Seismic Profiling - VSP) with ambient-vibration-based approaches (wavelet decomposition, H/V spectral ratio, polarization analysis, three-component f-k analysis). The results of all analyses converged on the definition of a mean velocity profile for the site, which was later used for the computation of engineering parameters (travel-time average velocity and quarter-wavelength parameters) and the analytical SH-wave transfer function. Empirical site-amplification functions are automatically determined for any station connected to the Swiss seismic networks. They are determined by building statistical models of systematic site-specific effects in recordings of small earthquakes when compared to the Swiss stochastic ground-motion model. Computed site response is validated through comparison with these empirical
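
    One of the engineering parameters mentioned, the travel-time average velocity to a depth z (e.g. Vs30 for z = 30 m), can be computed from a layered profile as in the sketch below; the three-layer model is invented for illustration:

        def travel_time_average(thickness_m, vs_m_s, depth_m=30.0):
            """Depth divided by vertical shear-wave travel time
            through a stack of horizontal layers."""
            t, z = 0.0, 0.0
            for h, v in zip(thickness_m, vs_m_s):
                dz = min(h, depth_m - z)
                t += dz / v
                z += dz
                if z >= depth_m:
                    break
            return z / t

        # ~509 m/s for this hypothetical profile.
        print(travel_time_average([5.0, 10.0, 25.0], [250.0, 450.0, 900.0]))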

  15. [Recent advances in anastomosis techniques for esophagojejunostomy after laparoscopic total gastrectomy in gastric tumors].

    PubMed

    Li, Xi; Ke, Chongwei

    2015-05-01

    Esophagojejunal anastomosis techniques for digestive tract reconstruction in laparoscopic total gastrectomy fall into two categories: circular stapler anastomosis techniques and linear stapler anastomosis techniques. Circular stapler anastomosis techniques include the manual anastomosis method, the purse string instrument method, the Hiki improved special anvil anastomosis technique, the transorally inserted anvil (OrVil(TM)), and the reverse puncture device technique. Linear stapler anastomosis techniques include the side-to-side anastomosis technique and the Overlap side-to-side anastomosis technique. Esophagojejunal anastomosis offers a wide selection of different technologies, each with its own strengths and corresponding limitations. This article reviews research progress in esophagojejunal anastomosis after laparoscopic total gastrectomy, covering both the development of anastomosis technology and the selection among techniques.

  16. Advanced Sensing and Control Techniques to Facilitate Semi-Autonomous Decommissioning

    SciTech Connect

    Schalkoff, Robert J.

    1999-06-01

    This research is intended to advance the technology of semi-autonomous teleoperated robotics as applied to Decontamination and Decommissioning (D&D) tasks. Specifically, research leading to a prototype dual-manipulator mobile work cell is underway. This cell is supported and enhanced by computer vision, virtual reality and advanced robotics technology.

  17. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
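
    To make the two estimator families concrete, the sketch below contrasts a median-based order-statistic estimate with a single-pass threshold-and-count estimate, under an assumed exponential model for the noise power (all constants are illustrative, not HRMS system values):

        import numpy as np

        rng = np.random.default_rng(0)
        power = rng.exponential(scale=2.0, size=4096)  # spectral power bins

        # Order statistic: the median of an exponential is scale*ln(2),
        # so scale = median/ln(2); robust to strong signals in a few bins.
        est_median = np.median(power) / np.log(2)

        # Threshold-and-count: count bins below a trial threshold T and
        # invert the CDF P(x < T) = 1 - exp(-T/scale) for the scale.
        T = 1.0
        frac_below = np.count_nonzero(power < T) / power.size
        est_count = -T / np.log(1.0 - frac_below)

        print(est_median, est_count)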

  18. Groundwater quality assessment of the shallow aquifers west of the Nile Delta (Egypt) using multivariate statistical and geostatistical techniques

    NASA Astrophysics Data System (ADS)

    Masoud, Alaa A.

    2014-07-01

    Extensive urban, agricultural and industrial expansion on the western fringe of the Nile Delta of Egypt has placed a heavy load on water resources and led to groundwater quality deterioration. Documenting the spatial variation of groundwater quality and its controlling factors is vital to ensure sustainable water management and safe use. A comprehensive dataset of 451 shallow groundwater samples was collected in 2011 and 2012. On-site field measurements of total dissolved solids (TDS), electrical conductivity (EC), pH and temperature, as well as lab-based analyses of the major and trace ionic components, were performed. Groundwater types were derived and the suitability for irrigation use was evaluated. Multivariate statistical techniques of factor analysis and K-means clustering were integrated with geostatistical semi-variogram modeling to evaluate the spatial hydrochemical variations and their driving factors, as well as for hydrochemical pattern recognition. Most hydrochemical parameters showed very wide ranges: TDS (201-24,400 mg/l), pH (6.72-8.65), Na+ (28.30-7774 mg/l), and Cl- (7-12,186 mg/l), suggesting complex hydrochemical processes with multiple sources. TDS violated the limit (1200 mg/l) of the Egyptian standards for drinking water quality in many localities. Extreme concentrations of Fe2+, Mn2+, Zn2+, Cu2+ and Ni2+ are mostly related to their natural content in the water-bearing sediments and/or to contamination from industrial leakage. Very high nitrate concentrations exceeding the permissible limit (50 mg/l) peaked toward hydrologic discharge zones and are related to wastewater leakage. Three main water types, NaCl (29%), Na2SO4 (26%), and NaHCO3 (20%), formed 75% of the groundwater, dominating respectively in the saline depressions, on the sloping sides of the coastal ridges of the depressions, and in the cultivated/newly reclaimed lands intensely covered by irrigation canals. Water suitability for irrigation use clarified that the
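
    A minimal sketch of the multivariate workflow described above (standardization, factor analysis, K-means clustering), using synthetic stand-ins for the hydrochemical variables; the variable list, transformation and cluster count are illustrative assumptions, not the study's configuration.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import FactorAnalysis
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      # Columns: TDS, pH, Na+, Cl-, NO3- (synthetic values, broadly lognormal).
      X = rng.lognormal(mean=[7.0, 0.75, 5.5, 5.0, 3.0], sigma=0.6, size=(451, 5))

      Xz = StandardScaler().fit_transform(np.log(X))   # log-transform, z-score
      fa = FactorAnalysis(n_components=2, random_state=0)
      scores = fa.fit_transform(Xz)                    # factor scores per sample

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
      print("factor loadings:\n", fa.components_)
      print("cluster sizes:", np.bincount(labels))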

  19. Advanced remote sensing techniques for forestry applications: an application case in Sarawak, Malaysia

    NASA Astrophysics Data System (ADS)

    Nezry, Edmond; Yakam-Simen, Francis; Romeijn, Paul P.; Supit, Iwan; Demargne, Louis

    2001-02-01

    This paper reports the operational implementation of new techniques for the exploitation of remote sensing data (SAR and optical) in the framework of forestry applications. In particular, we present a new technique for standing timber volume estimation, based on remote sensing knowledge (SAR and optical synergy) and forestry knowledge (forest structure models). To illustrate the application of these techniques, an operational commercial case study of forest concessions in Sarawak is presented. Validation of the technique, by comparison of the remote sensing results with the customer's database, has shown it to be fairly accurate.

  20. An advanced technique for speciation of organic nitrogen in atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Samy, S.; Robinson, J.; Hays, M. D.

    2011-12-01

    threshold as water-soluble free AA, with an average concentration of 22 ± 9 ng m-3 (N=13). Following microwave-assisted gas-phase hydrolysis, the total AA concentration in the forest environment increased significantly (70 ± 35 ng m-3) and additional compounds (methionine, isoleucine) were detected above the reporting threshold. The ability to quantify AA in aerosol samples without derivatization reduces time-consuming preparation procedures while providing selective mass determination that eliminates potential interferences associated with traditional fluorescence detection. This step forward in precise mass determination, together with the use of internal standardization, improves the confidence of compound identification. With the increasing focus on WSOC (including ON) characterization in the atmospheric science community, native detection by LC-MS (Q-TOF) will play a central role in determining the most direct approach to quantify an increasing fraction of the co-extracted polar organic compounds. Method application for further characterization of atmospheric ON will be discussed. Reference: Samy, S., Robinson, J., and M.D. Hays. "An Advanced LC-MS (Q-TOF) Technique for the Detection of Amino Acids in Atmospheric Aerosols", Analytical and Bioanalytical Chemistry, 2011, DOI: 10.1007/s00216-011-5238-2

  1. Techniques Optimized for Reducing Instabilities in Advanced Nickel-Base Superalloys for Turbine Blades

    NASA Technical Reports Server (NTRS)

    MacKay, Rebecca A.; Locci, Ivan E.; Garg, anita; Ritzert, Frank J.

    2002-01-01

    is a three-phase constituent composed of TCP and stringers of gamma phase in a matrix of gamma prime. An incoherent grain boundary separates the SRZ from the gamma/gamma prime microstructure of the superalloy. The SRZ is believed to form as a result of local chemistry changes in the superalloy due to the application of the diffusion aluminide bondcoat. Locally high surface stresses also appear to promote the formation of the SRZ. Thus, techniques that change the local alloy chemistry or reduce surface stresses have been examined for their effectiveness in reducing SRZ. These SRZ-reduction steps are performed on the test specimen or the turbine blade before the bondcoat is applied. Stress-relief heat treatments developed at NASA Glenn have been demonstrated to significantly reduce the amount of SRZ that develops during subsequent high-temperature exposures. Stress-relief heat treatments reduce surface stresses by recrystallizing a thin surface layer of the superalloy. However, in alloys with very high propensities to form SRZ, stress-relief heat treatments alone do not eliminate SRZ entirely. Thus, techniques that modify the local chemistry under the bondcoat have been emphasized and optimized successfully at Glenn. One such technique is carburization, which changes the local chemistry by forming submicron carbides near the surface of the superalloy. Detailed characterizations have demonstrated that the depth and uniform distribution of these carbides are enhanced when a stress-relief treatment and an appropriate surface preparation are employed in advance of the carburization treatment. Even in alloys that have the propensity to develop a continuous SRZ layer beneath the diffusion zone, the SRZ has been completely eliminated or reduced to low, manageable levels when this combination of techniques is utilized. Now that the techniques to mitigate SRZ have been established at Glenn, TCP phase formation is being emphasized in ongoing work under the UEET Program. The

  2. Investigation of Advanced Dose Verification Techniques for External Beam Radiation Treatment

    NASA Astrophysics Data System (ADS)

    Asuni, Ganiyu Adeniyi

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) have been introduced in radiation therapy to achieve highly conformal dose distributions around the tumour while minimizing dose to surrounding normal tissues. These techniques have increased the need for comprehensive quality assurance tests to verify that customized patient treatment plans are accurately delivered during treatment. In vivo dose verification, performed during treatment delivery, confirms that the actual dose delivered is the same as the prescribed dose, helping to reduce treatment delivery errors. In vivo measurements may be accomplished using entrance or exit detectors. The objective of this project is to investigate a novel entrance detector designed for in vivo dose verification. This thesis is separated into three main investigations, focusing on a prototype entrance transmission detector (TRD) developed by IBA Dosimetry, Germany. First, contaminant electrons generated by the TRD in a 6 MV photon beam were investigated using Monte Carlo (MC) simulation. This study demonstrates that modification of the contaminant electron model in the treatment planning system is required for accurate patient dose calculation in buildup regions when using the device. Second, the ability of the TRD to accurately measure dose from IMRT and VMAT was investigated by characterising the spatial resolution of the device. This was accomplished by measuring the point spread function, with further validation provided by MC simulation. Comparisons of measured and calculated doses show that the spatial resolution of the TRD allows for measurement of clinical IMRT fields within acceptable tolerance. Finally, a new general research tool was developed to perform MC simulations for VMAT and IMRT treatments, simultaneously tracking dose deposition in both the patient CT geometry and an arbitrary planar detector system, generalized to handle either entrance or exit orientations. It was

  3. Advanced imaging techniques II: using a compound microscope for photographing point-mount specimens

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Digital imaging technology has revolutionized the practice of photographing insects for scientific study. Described herein are lighting and mounting techniques designed for imaging micro-Hymenoptera. The techniques described here are applicable to all small insects, as well as other invertebrates. The ke...

  4. Advanced Techniques for Assessment of Postural and Locomotor Ataxia, Spatial Orientation, and Gaze Stability

    NASA Technical Reports Server (NTRS)

    Wall, Conrad., III

    1999-01-01

    and quantified. We are improving this situation by applying methodologies such as nonlinear orbital stability to quantify responses and by using multivariate statistical approaches to link together the responses across separate tests. In this way we can exploit the information available and increase the ability to discriminate between normal and pathological responses. Measures of stability and orientation are compared to measures such as dynamic visual acuity and with balance function tests. The responses of normal human subjects and of patients having well documented pathophysiologies are being characterized. When these studies are completed, we should have a clearer idea about normal and abnormal patterns of eye, head, and body movements during locomotion and their stability in a wide range of environments. We plan eventually to use this information to validate the efficacy of candidate neurovestibular and neuromuscular rehabilitative techniques. Some representative studies made during this year are summarized.

  5. Beyond whole-body imaging: advanced imaging techniques of PET/MRI.

    PubMed

    Barnwell, James; Raptis, Constantine A; McConathy, Jonathan E; Laforest, Richard; Siegel, Barry A; Woodard, Pamela K; Fowler, Kathryn

    2015-02-01

    PET/MRI is a hybrid imaging modality that is gaining clinical interest with the first Food and Drug Administration-approved simultaneous imaging system recently added to the clinical armamentarium. Several advanced PET/MRI applications, such as high-resolution anatomic imaging, diffusion-weighted imaging, motion correction, and cardiac imaging, show great potential for clinical use. The purpose of this article is to highlight several advanced PET/MRI applications through case examples and review of the current literature.

  6. The investigation of advanced remote sensing, radiative transfer and inversion techniques for the measurement of atmospheric constituents

    NASA Technical Reports Server (NTRS)

    Deepak, Adarsh; Wang, Pi-Huan

    1985-01-01

    This report documents the research program on developing space- and ground-based remote sensing techniques carried out during the period from December 15, 1977 to March 15, 1985. The program involved the application of sophisticated radiative transfer codes and inversion methods to various advanced remote sensing concepts for determining atmospheric constituents, particularly aerosols. It covers detailed discussions of the solar aureole technique for monitoring columnar aerosol size distribution, the multispectral limb scattered radiance and limb attenuated radiance (solar occultation) techniques, and the upwelling scattered solar radiance method for determining aerosol and gaseous characteristics. In addition, analytical models of aerosol size distribution and simulation studies of the limb solar aureole radiance technique and of the variability of ozone at high altitudes during satellite sunrise/sunset events are described in detail.

  7. Topology for statistical modeling of petascale data.

    SciTech Connect

    Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  8. Removal of Lattice Imperfections that Impact the Optical Quality of Ti:Sapphire using Advanced Magnetorheological Finishing Techniques

    SciTech Connect

    Menapace, J A; Schaffers, K I; Bayramian, A J; Davis, P J; Ebbers, C A; Wolfe, J E; Caird, J A; Barty, C J

    2008-02-26

    Advanced magnetorheological finishing (MRF) techniques have been applied to Ti:sapphire crystals to compensate for sub-millimeter lattice distortions that occur during the crystal growing process. Precise optical corrections are made by imprinting topographical structure onto the crystal surfaces to cancel out the effects of the lattice distortion in the transmitted wavefront. This novel technique significantly improves the optical quality for crystals of this type and sets the stage for increasing the availability of high-quality large-aperture sapphire and Ti:sapphire optics in critical applications.

  9. [Advancement of colloidal gold chromatographic technique in screening of ochratoxin A].

    PubMed

    Zhou, Wei-lu; Wang, Yu-ting; Kong, Wei-jun; Yang, Mei-hua; Zhao, Ming; Ou-Yang, Zhen

    2015-08-01

    Ochratoxin A (OTA) is a toxic secondary metabolite mainly produced by Aspergillus and Penicillium species, occurring in a variety of foodstuffs and Chinese medicines. OTA is difficult to detect in practice because it occurs in trace amounts, is highly toxic, and exists in complex matrices. Among the numerous detection technologies, colloidal gold chromatographic techniques are highly sensitive, specific, cost-effective and user-friendly, and are being used increasingly for OTA screening. Recently, with the development of aptamer technology and its application in chromatography, a new colloidal gold aptamer chromatographic technique has been developed. This review elaborates the structures and principles of both the traditional and the new colloidal gold chromatographic techniques, focuses on the new colloidal gold aptamer chromatographic technique, and summarizes and compares their use in the rapid detection of OTA. Finally, to provide a reference for future work in this area, the development trends of this novel technique are discussed.

  10. Adaptations of advanced safety and reliability techniques to petroleum and other industries

    NASA Technical Reports Server (NTRS)

    Purser, P. E.

    1974-01-01

    The underlying philosophy of the general approach to failure reduction and control is presented. Safety and reliability management techniques developed in the industries which have participated in the U.S. space and defense programs are described along with adaptations to nonaerospace activities. The examples given illustrate the scope of applicability of these techniques. It is indicated that any activity treated as a 'system' is a potential user of aerospace safety and reliability management techniques.

  11. Euromech 260: Advanced non-intrusive experimental techniques in fluid and plasma flows

    NASA Astrophysics Data System (ADS)

    The following topics are discussed: coherent anti-Stokes and elastic Rayleigh scattering; elastic scattering and nonlinear dynamics; fluorescence; and molecular tracking techniques and particle image velocimetry.

  12. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). Safety Section: Modules 1-3. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    These three modules, which were developed for use by instructors in a manufacturing firm's advanced technical preparation program, contain the materials required to present the safety section of the plant's adult-oriented, job-specific competency-based training program. The 3 modules contain 12 lessons on the following topics: lockout/tagout…

  13. Techniques to Assess and Mitigate the Environmental Risk Posed by use of Airguns: Recent Advances from Academic Research Programs

    NASA Astrophysics Data System (ADS)

    Miller, P. J.; Tyack, P. L.; Johnson, M. P.; Madsen, P. T.; King, R.

    2006-05-01

    There is considerable uncertainty about the ways in which marine mammals might react to noise, the biological significance of those reactions, and the effectiveness of planning and real-time mitigation techniques. A planning tool commonly used to assess the environmental risk of acoustic activities uses simulations to predict the acoustic exposures received by animals, and translates exposure to response using a dose-response function to yield an estimate of the undesired impact on a population. Recent advances show promise for converting this planning tool into a real-time mitigation tool using Bayesian statistical methods. In this approach, being developed for use by the British Navy, the environmental risk simulation is updated continuously during field operations. The distribution of exposure, set initially from animal density, is updated in real time using animal sensing data or environmental data known to correlate with the absence or presence of marine mammals. This conditional probability of animal presence should therefore be more accurate than the prior probabilities used during planning, which enables a more accurate and quantitative assessment of both the impact of activities and the reduction of impact via mitigation decisions. Two key areas of uncertainty, in addition to animal presence/absence, are (1) how biologically relevant behaviours are affected by exposure to noise, and (2) whether animals avoid loud noise sources, which is the basis of ramp-up as a mitigation tool. With support from MMS and industry partners, we assessed the foraging behaviour and avoidance movements of 8 tagged sperm whales in the Gulf of Mexico during experimental exposure to airguns. The whale that was approached most closely prolonged a surface resting bout hours longer than typical, but resumed foraging immediately after the airguns ceased, suggesting avoidance of the deep diving necessary for foraging near active airguns. Behavioral indices of foraging rate (echolocation buzzes produced during prey
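
    The real-time updating step described above is, at its core, a Bayes-rule revision of the probability of animal presence. The toy sketch below illustrates it with invented detection probabilities; the actual tool couples this update with full exposure simulations.

      # Prior probability of presence (from planning-stage density maps, assumed).
      prior_presence = 0.10
      p_detect_given_present = 0.60  # sensor sensitivity (assumed)
      p_detect_given_absent = 0.05   # false-alarm rate (assumed)

      def posterior_presence(prior, detected):
          """Bayes' rule: update P(present) given one sensor observation."""
          like_p = p_detect_given_present if detected else 1 - p_detect_given_present
          like_a = p_detect_given_absent if detected else 1 - p_detect_given_absent
          num = like_p * prior
          return num / (num + like_a * (1 - prior))

      print("after a detection:   ", round(posterior_presence(prior_presence, True), 3))
      print("after a quiet period:", round(posterior_presence(prior_presence, False), 3))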

  14. Advanced imaging techniques for the study of plant growth and development.

    PubMed

    Sozzani, Rosangela; Busch, Wolfgang; Spalding, Edgar P; Benfey, Philip N

    2014-05-01

    A variety of imaging methodologies are being used to collect data for quantitative studies of plant growth and development from living plants. Multi-level data, from macroscopic to molecular, and from weeks to seconds, can be acquired. Furthermore, advances in parallelized and automated image acquisition enable the throughput to capture images from large populations of plants under specific growth conditions. Image-processing capabilities allow for 3D or 4D reconstruction of image data and automated quantification of biological features. These advances facilitate the integration of imaging data with genome-wide molecular data to enable systems-level modeling.

  15. Advanced karst hydrological and contaminant monitoring techniques for real-time and high resolution applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In telogenetic and soil-mantled karst aquifers, the movement of autogenic recharge through the epikarstic zone and into the regional aquifer can be a complex process and have implications for flooding, groundwater contamination, and other difficult to capture processes. Recent advances in instrument...

  16. Advance Appropriations: A Needless and Confusing Education Budget Technique. Federal Education Budget Project

    ERIC Educational Resources Information Center

    Delisle, Jason

    2007-01-01

    This report argues that advance appropriations serve no functional purpose for schools, but create a loss of transparency, comparability, and simplicity in federal education budgeting. An advance appropriation allocates spending before future budgets have been established. The approach was originally used to skirt spending limits and budget procedures in place…

  17. Application of multivariate statistical techniques for characterization of groundwater quality in the coastal aquifer of Nador, Tipaza (Algeria)

    NASA Astrophysics Data System (ADS)

    Bouderbala, Abdelkader; Remini, Boualem; Saaed Hamoudi, Abdelamir; Pulido-Bosch, Antonio

    2016-06-01

    The study focuses on the characterization of groundwater salinity in the Nador coastal aquifer (Algeria), where groundwater quality has undergone serious deterioration due to overexploitation. Groundwater sampling campaigns were carried out during the high-water and low-water periods of 2013 in order to study the evolution of groundwater hydrochemistry from the recharge area to the coast. Several kinds of statistical analysis were performed to identify the main hydrogeochemical processes occurring in the aquifer and to discriminate between different groups of groundwater. These statistical methods provide a better understanding of the aquifer hydrochemistry and reveal a hydrochemical classification of the wells, showing that the zone of higher salinity is located within the first two kilometres of the coast, where salinity increases gradually toward the sea, suggesting groundwater salinization by seawater intrusion.

  18. Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique

    SciTech Connect

    Kawano, Toshihiko

    2015-11-10

    This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: For all parameter values studied, the numerical average of MC-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree of freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
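
    For readers unfamiliar with the GOE ingredient, the sketch below samples GOE matrices and checks the mean consecutive-level-spacing ratio against the known GOE reference value (approximately 0.531); it is a generic random-matrix demonstration, not the author's S-matrix calculation.

      import numpy as np

      rng = np.random.default_rng(42)

      def goe(n):
          """Draw one n x n matrix from the Gaussian Orthogonal Ensemble."""
          m = rng.normal(size=(n, n))
          return (m + m.T) / np.sqrt(2.0)

      ratios = []
      for _ in range(200):
          ev = np.linalg.eigvalsh(goe(100))
          s = np.diff(ev)                                   # level spacings
          r = np.minimum(s[1:], s[:-1]) / np.maximum(s[1:], s[:-1])
          ratios.append(r.mean())

      print("mean spacing ratio:", round(float(np.mean(ratios)), 4),
            "(GOE reference ~0.531)")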

  19. Imaging functional blood vessels by the laser speckle imaging (LSI) technique using Q-statistics of the generalized differences algorithm.

    PubMed

    Ansari, Mohammad Zaheer; Cabrera, Humberto; Ramírez-Miquet, Evelio E

    2016-09-01

    In this work, we report on the use of the q-statistics concept to improve the performance of the intensity-histogram-based generalized differences algorithm for imaging functional blood vessel structures in the rodent window chamber of a mouse. The method uses dynamic speckle signals obtained by transilluminating the rodent window chamber to create activity maps of the vasculature. The proposed method of generalized differences with q-statistics (GDq) is sensitive to several defined parameters: the camera exposure time, the q value and the number of camera frames. An appropriate choice of q values enhances the visibility (contrast) of functional blood vessels without sacrificing spatial resolution, which is of utmost importance for in vivo vascular imaging.
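
    A sketch of a generalized-differences activity map over a stack of speckle frames follows; applying the q exponent to the absolute frame differences is one plausible reading of the GDq weighting and is an assumption here, as is the synthetic "vessel" region.

      import numpy as np

      def gd_activity(frames, q=1.0):
          """frames: (T, H, W) speckle stack -> (H, W) activity map.
          Sums |I_k - I_l|**q over all frame pairs (q=1 is plain GD)."""
          T = frames.shape[0]
          act = np.zeros(frames.shape[1:], dtype=float)
          for k in range(T):
              for l in range(k + 1, T):
                  act += np.abs(frames[k] - frames[l]) ** q
          return act / (T * (T - 1) / 2)

      rng = np.random.default_rng(3)
      stack = rng.random((32, 64, 64))                 # stand-in camera frames
      # Add extra temporal fluctuation in a strip, mimicking flow in a vessel.
      stack[:, 24:40, 30:34] += 0.5 * rng.random((32, 16, 4))
      amap = gd_activity(stack, q=1.5)
      print("background vs vessel activity:",
            round(float(amap[:10, :10].mean()), 3), round(float(amap[28, 32]), 3))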

  20. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 2. Robustness of Techniques

    SciTech Connect

    Helton, J.C.; Kleijnen, J.P.C.

    1999-03-24

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
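
    The five screening statistics listed above can all be computed with standard scipy routines. The sketch below applies them to one synthetic (input, output) scatterplot; the test function and binning choices are illustrative assumptions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      x = rng.uniform(0, 1, 500)                        # sampled input variable
      y = np.sin(3 * x) + 0.3 * rng.normal(size=500)    # model output

      r, _ = stats.pearsonr(x, y)        # (i) linear relationship
      rho, _ = stats.spearmanr(x, y)     # (ii) monotonic relationship

      # (iii) trend in central tendency across quintile bins of x
      bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
      groups = [y[bins == b] for b in range(5)]
      kw, _ = stats.kruskal(*groups)

      # (iv) trend in variability: binwise interquartile ranges
      iqrs = [stats.iqr(g) for g in groups]

      # (v) deviation from randomness: chi-square on a 5x5 grid of (x, y)
      ybins = np.digitize(y, np.quantile(y, [0.2, 0.4, 0.6, 0.8]))
      table = np.zeros((5, 5))
      np.add.at(table, (bins, ybins), 1)
      chi2, p, dof, expected = stats.chi2_contingency(table)

      print(f"r={r:.2f} rho={rho:.2f} KW={kw:.1f} "
            f"IQRs={np.round(iqrs, 2)} chi2={chi2:.1f}")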

  1. Recent advances in electronic nose techniques for monitoring of fermentation process.

    PubMed

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    Microbial fermentation processes are often sensitive to even slight changes of conditions that may result in unacceptable end-product quality. Thus, monitoring of the process is critical for discovering unfavorable deviations as early as possible and taking the appropriate measures. However, the use of traditional analytical techniques is often time-consuming and labor-intensive. In this sense, the most effective way of developing a rapid, accurate and relatively economical method for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages in non-invasive monitoring of microbial fermentation processes. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, some examples of potential applications of electronic nose monitoring are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e. sensor fusion). Finally, the electronic nose techniques are critically assessed, and their strengths and weaknesses are highlighted. In addition, on the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.

  2. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 1: Review and Comparison of Techniques

    SciTech Connect

    Kleijnen, J.P.C.; Helton, J.C.

    1999-03-24

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.

  3. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. Particularly, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of showing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal. PMID:20136233
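
    As a baseline for the enhanced-resolution methods discussed above, the sketch below computes a conventional short-time Fourier spectrogram of a synthetic downward FM sweep standing in for a bat call; the sampling rate and sweep parameters are invented for illustration.

      import numpy as np
      from scipy import signal

      fs = 250_000                        # 250 kHz sampling, typical for bat work
      t = np.arange(0, 0.005, 1 / fs)     # 5 ms call
      call = signal.chirp(t, f0=80_000, t1=t[-1], f1=25_000)  # downward FM sweep

      # Conventional STFT spectrogram; enhanced estimators refine this picture.
      f, tt, Sxx = signal.spectrogram(call, fs=fs, nperseg=256, noverlap=224)
      peak_track = f[Sxx.argmax(axis=0)]  # ridge: dominant frequency vs time
      print("start/end of ridge (Hz):", peak_track[0], peak_track[-1])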

  4. Advanced techniques for noise source identification on a large generator unit

    SciTech Connect

    Williams, R.G.D. ); Yang, S.J. )

    1993-03-01

    Power station acoustic noise assessment, which has faced increasing environmental awareness and consequently more stringent legislation for a number of years, has received an added stimulus due to the recent advent of powerful measurement and analysis techniques, including sound intensity and coherence. These experimental techniques are explained, and results for a generator unit illustrate their value in providing a unique, correlated insight into noise problems. This includes noise quantification, a full explanation of the site sound pressure level in terms of the various influences, and major noise source identification. These techniques are widely applicable and an invaluable aid to any industrial noise problem.
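
    A minimal sketch of the coherence idea follows: the magnitude-squared coherence between a reference signal on a suspected source and a far-field microphone flags the frequency bands that the source dominates. The signals here are synthetic stand-ins, not generator data.

      import numpy as np
      from scipy import signal

      fs = 8192
      t = np.arange(0, 4.0, 1 / fs)
      source = np.sin(2 * np.pi * 100 * t)      # 100 Hz machine tone (reference)
      mic = 0.8 * source + 0.6 * np.random.default_rng(0).normal(size=t.size)

      # Magnitude-squared coherence: near 1 where the source drives the mic.
      f, Cxy = signal.coherence(source, mic, fs=fs, nperseg=1024)
      band = f[np.argmax(Cxy)]
      print(f"highest coherence {Cxy.max():.2f} at {band:.0f} Hz")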


  6. Are conventional statistical techniques exhaustive for defining metal background concentrations in harbour sediments? A case study: The Coastal Area of Bari (Southeast Italy).

    PubMed

    Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio

    2015-11-01

    Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists in addressing the scope and limitations of analysing port sediments through the conventional statistical techniques (linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit the statistical equations well), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses.
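
    Of the conventional techniques named above, the iterative 2σ technique is easily sketched: the mean and standard deviation are recomputed on data clipped to mean ± 2σ until no further points are removed, and the final interval approximates the background range. The concentrations below are synthetic, not the Bari data.

      import numpy as np

      def iterative_2sigma(values, max_iter=50):
          """Iteratively clip to mean +/- 2*sd until the sample stabilizes."""
          v = np.asarray(values, dtype=float)
          for _ in range(max_iter):
              m, s = v.mean(), v.std(ddof=1)
              kept = v[(v >= m - 2 * s) & (v <= m + 2 * s)]
              if kept.size == v.size:      # converged: nothing removed
                  break
              v = kept
          return v.mean(), v.std(ddof=1)

      rng = np.random.default_rng(5)
      background = rng.normal(40.0, 8.0, 300)        # e.g. a metal, mg/kg (synthetic)
      polluted = np.concatenate([background, rng.normal(160.0, 30.0, 20)])
      m, s = iterative_2sigma(polluted)
      print(f"background estimate: {m:.1f} +/- {2 * s:.1f}")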

  7. Nde of Advanced Automotive Composite Materials that Apply Ultrasound Infrared Thermography Technique

    NASA Astrophysics Data System (ADS)

    Choi, Seung-Hyun; Park, Soo-Keun; Kim, Jae-Yeol

    The infrared thermographic nondestructive inspection technique is a quality inspection and stability assessment method used to diagnose physical characteristics and defects by detecting the infrared radiation emitted from an object without destroying it. Recently, nondestructive inspection and assessment using the ultrasound-infrared thermography technique have been widely adopted in diverse areas. The technique exploits the phenomenon that ultrasound incident on an object with cracks or defects generates local heating at the mating surfaces. The car industry increasingly uses composite materials for their light weight, strength, and environmental resistance. In this study, a car piston, one of the composite-material car parts, was subjected to ultrasound-infrared thermography for nondestructive testing. The study also examined the effects of ultrasound frequency and power in order to optimize the nondestructive inspection.

  8. Recent advances in freeze-fracture electron microscopy: the replica immunolabeling technique

    PubMed Central

    2008-01-01

    Freeze-fracture electron microscopy is a technique for examining the ultrastructure of rapidly frozen biological samples by transmission electron microscopy. Of a range of approaches to freeze-fracture cytochemistry that have been developed and tried, the most successful is the technique termed freeze-fracture replica immunogold labeling (FRIL). In this technique, samples are frozen, fractured and replicated with platinum-carbon as in standard freeze fracture, and then carefully treated with sodium dodecylsulphate to remove all the biological material except a fine layer of molecules attached to the replica itself. Immunogold labeling of these molecules permits their distribution to be seen superimposed upon high resolution planar views of membrane structure. Examples of how this technique has contributed to our understanding of lipid droplet biogenesis and function are discussed. PMID:18385807

  9. Assessment of recent advances in measurement techniques for atmospheric carbon dioxide and methane observations

    NASA Astrophysics Data System (ADS)

    Zellweger, Christoph; Emmenegger, Lukas; Firdaus, Mohd; Hatakka, Juha; Heimann, Martin; Kozlova, Elena; Spain, T. Gerard; Steinbacher, Martin; van der Schoot, Marcel V.; Buchmann, Brigitte

    2016-09-01

    Until recently, atmospheric carbon dioxide (CO2) and methane (CH4) measurements were made almost exclusively using nondispersive infrared (NDIR) absorption and gas chromatography with flame ionisation detection (GC/FID) techniques, respectively. Recently, commercially available instruments based on spectroscopic techniques such as cavity ring-down spectroscopy (CRDS), off-axis integrated cavity output spectroscopy (OA-ICOS) and Fourier transform infrared (FTIR) spectroscopy have become more widely available and affordable. This has resulted in widespread use of these techniques at many measurement stations. This paper focuses on the comparison of a CRDS "travelling instrument", used during performance audits within the Global Atmosphere Watch (GAW) programme of the World Meteorological Organization (WMO), with instruments incorporating other, more traditional techniques for measuring CO2 and CH4 (NDIR and GC/FID). We demonstrate that CRDS instruments, and likely other spectroscopic techniques, are suitable for WMO/GAW stations and allow a smooth continuation of historic CO2 and CH4 time series. Moreover, the analysis of the audit results indicates that the spectroscopic techniques have a number of advantages over the traditional methods which will lead to improved accuracy of atmospheric CO2 and CH4 measurements.

  10. An overview on in situ micronization technique - An emerging novel concept in advanced drug delivery.

    PubMed

    Vandana, K R; Prasanna Raju, Y; Harini Chowdary, V; Sushma, M; Vijay Kumar, N

    2014-09-01

    The use of drug powders containing micronized drug particles has been increasing in several pharmaceutical dosage forms to overcome dissolution and bioavailability problems. Most newly developed drugs are poorly water soluble, which limits dissolution rate and bioavailability. The dissolution rate can be enhanced by micronization of the drug particles. The properties of the micronized drug substance, such as particle size, size distribution, shape, surface properties, agglomeration behaviour and powder flow, are affected by the type of micronization technique used. Mechanical comminution, spray drying and supercritical fluid (SCF) technology are the most commonly employed techniques for producing micronized drug particles, but the characteristics of the resulting drug product cannot be controlled using these techniques. Hence, a newer technique called in situ micronization has been developed to overcome the limitations associated with the other techniques. This review summarizes the existing knowledge on in situ micronization techniques. The properties of drug substances obtained by in situ micronization are also compared.

  11. An overview on in situ micronization technique – An emerging novel concept in advanced drug delivery

    PubMed Central

    Vandana, K.R.; Prasanna Raju, Y.; Harini Chowdary, V.; Sushma, M.; Vijay Kumar, N.

    2013-01-01

    The use of drug powders containing micronized drug particles has been increasing in several pharmaceutical dosage forms to overcome dissolution and bioavailability problems. Most newly developed drugs are poorly water soluble, which limits dissolution rate and bioavailability. The dissolution rate can be enhanced by micronization of the drug particles. The properties of the micronized drug substance, such as particle size, size distribution, shape, surface properties, agglomeration behaviour and powder flow, are affected by the type of micronization technique used. Mechanical comminution, spray drying and supercritical fluid (SCF) technology are the most commonly employed techniques for producing micronized drug particles, but the characteristics of the resulting drug product cannot be controlled using these techniques. Hence, a newer technique called in situ micronization has been developed to overcome the limitations associated with the other techniques. This review summarizes the existing knowledge on in situ micronization techniques. The properties of drug substances obtained by in situ micronization are also compared. PMID:25161371

  12. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for jointly interpreting nuclear well logs (natural gamma ray, density and neutron porosity) and electrical well logs (long and short normal), in order to characterize the extended basaltic areas of southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs enable the lithological score cross-section of the studied well to be established. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and clay, the product of basalt alteration. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be applied efficiently when many wells and large volumes of well logging data with many variables must be interpreted.
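
    A minimal sketch of turning several well logs into factor score logs, in the spirit of the approach described above; the synthetic "logs", the five-variable suite and the choice of four factors are illustrative assumptions, not the Kodana data.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(11)
      depth = np.arange(0, 300.0, 0.5)
      n = depth.size
      # Columns: gamma ray, density, neutron porosity, long normal, short normal.
      logs = np.column_stack([
          rng.normal(60, 15, n), rng.normal(2.6, 0.15, n),
          rng.normal(0.2, 0.05, n), rng.normal(30, 10, n), rng.normal(25, 8, n),
      ])

      z = StandardScaler().fit_transform(logs)
      fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
      score_logs = fa.fit_transform(z)       # one score log per factor, vs depth
      print("score-log shape (samples x factors):", score_logs.shape)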

  13. Three case reports of the metabolic and electroencephalographic changes during advanced Buddhist meditation techniques.

    PubMed

    Benson, H; Malhotra, M S; Goldman, R F; Jacobs, G D; Hopkins, P J

    1990-01-01

    To examine the extent to which advanced meditative practices might alter body metabolism and the electroencephalogram (EEG), we investigated three Tibetan Buddhist monks living in the Rumtek monastery in Sikkim, India. In a study carried out in February 1988, we found that during several different meditative practices, resting metabolism (VO2) could be both raised (by up to 61%) and lowered (by up to 64%). The reduction from rest is the largest ever reported. On the EEG, marked asymmetry in alpha and beta activity between the hemispheres and increased beta activity were present. From these three case reports, we conclude that advanced meditative practices may yield different alterations in metabolism (there are also forms of meditation that increase metabolism) and that the decreases in metabolism can be striking.

  14. External Magnetic Field Reduction Techniques for the Advanced Stirling Radioisotope Generator

    NASA Technical Reports Server (NTRS)

    Niedra, Janis M.; Geng, Steven M.

    2013-01-01

    Linear alternators coupled to high efficiency Stirling engines are strong candidates for thermal-to-electric power conversion in space. However, the magnetic field emissions, both AC and DC, of these permanent magnet excited alternators can interfere with sensitive instrumentation onboard a spacecraft. Effective methods to mitigate the AC and DC electromagnetic interference (EMI) from solenoidal type linear alternators (like that used in the Advanced Stirling Convertor) have been developed for potential use in the Advanced Stirling Radioisotope Generator. The methods developed avoid the complexity and extra mass inherent in data extraction from multiple sensors or the use of shielding. This paper discusses these methods, and also provides experimental data obtained during breadboard testing of both AC and DC external magnetic field devices.

  15. Development of heat transfer enhancement techniques for external cooling of an advanced reactor vessel

    NASA Astrophysics Data System (ADS)

    Yang, Jun

    Nucleate boiling is a well-recognized means for passively removing high heat loads (up to ~10^6 W/m^2) generated by a molten reactor core under severe accident conditions while maintaining a relatively low reactor vessel temperature (<800 °C). With the upgrade and development of advanced power reactors, however, enhancing the nucleate boiling rate and its upper limit, the Critical Heat Flux (CHF), becomes the key to the success of external passive cooling of a reactor vessel undergoing a core disruption accident. In the present study, two boiling heat transfer enhancement methods have been proposed, experimentally investigated and theoretically modelled. The first method involves the use of a suitable surface coating to enhance the downward-facing boiling rate and CHF limit so as to substantially increase the possibility of the reactor vessel surviving a high thermal load attack. The second method involves the use of an enhanced vessel/insulation design to facilitate the process of steam venting through the annular channel formed between the reactor vessel and the insulation structure, which in turn further enhances both the boiling rate and the CHF limit. Among the various available surface coating techniques, metallic micro-porous layer surface coating has been identified as an appropriate coating material for use in External Reactor Vessel Cooling (ERVC), based on the overall consideration of enhanced performance, durability, and ease of manufacturing and application. Since no previous research had explored the feasibility of applying such a metallic micro-porous layer surface coating to a large, downward-facing and curved surface such as the bottom head of a reactor vessel, a series of characterization tests and experiments were performed in the present study to determine a suitable coating material composition and application method. Using the optimized metallic micro-porous surface coatings, quenching and steady-state boiling experiments were conducted in the Sub

  16. Advances in projection of climate change impacts using supervised nonlinear dimensionality reduction techniques

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Yang, Ge; Ghodsi, Ali

    2016-05-01

    One of the main challenges in climate change studies is accurate projection of the global warming impacts on the probabilistic behaviour of hydro-climate processes. Due to the complexity of climate-associated processes, identification of predictor variables from high-dimensional atmospheric variables is considered a key factor for improving climate change projections in statistical downscaling approaches. For this purpose, the present paper adopts a new approach of supervised dimensionality reduction, called "Supervised Principal Component Analysis (Supervised PCA)", for regression-based statistical downscaling. This method is a generalization of PCA, extracting a sequence of principal components of the atmospheric variables that have maximal dependence on the response hydro-climate variable. To capture the nonlinear variability between hydro-climatic response variables and predictors, a kernelized version of Supervised PCA is also applied for nonlinear dimensionality reduction. The effectiveness of the Supervised PCA methods, in comparison with some state-of-the-art algorithms for dimensionality reduction, is evaluated in the statistical downscaling of precipitation at a specific site using two nonlinear machine learning methods, Support Vector Regression and Relevance Vector Machine. The results demonstrate that the Supervised PCA methods yield a significant improvement in performance accuracy.
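
    A compact numpy sketch of Supervised PCA in the dependence-maximizing (HSIC) sense, taking the top eigenvectors of X^T H L H X with L a kernel on the response, is given below; the synthetic predictors and the linear response kernel are assumptions for illustration, not the paper's data or exact algorithm.

      import numpy as np

      rng = np.random.default_rng(2)
      n, d = 200, 30
      X = rng.normal(size=(n, d))                 # predictors (stand-in GCM fields)
      y = X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=n)   # response

      H = np.eye(n) - np.ones((n, n)) / n         # centering matrix
      L = np.outer(y, y)                          # linear kernel on the response
      Q = X.T @ H @ L @ H @ X                     # d x d target matrix
      eigval, eigvec = np.linalg.eigh(Q)
      U = eigvec[:, ::-1][:, :2]                  # top-2 supervised components

      Z = X @ U                                   # projected predictors for regression
      print("top eigenvalues:", np.round(eigval[::-1][:3], 1))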

  17. POC-SCALE TESTING OF AN ADVANCED FINE COAL DEWATERING EQUIPMENT/TECHNIQUE

    SciTech Connect

    X.H. Wang; J. Wiseman; D.J. Sung; D. McLean; William Peters; Jim Mullins; John Hugh; G. Evans; Vince Hamilton; Kenneth Robinette; Tim Krim; Michael Fleet

    1999-08-01

    Dewatering of ultra-fine (minus 150 µm) coal slurry to less than 20% moisture is difficult using the conventional dewatering techniques. The main objective of the project was to evaluate a novel surface modification technique, which utilizes the synergistic effect of metal ions and surfactants in combination, for the dewatering of ultra-fine clean-coal slurries using various dewatering techniques on a proof-of-concept (POC) scale of 0.5 to 2 tons per hour. The addition of conventional reagents and the application of the coal surface modification technique were evaluated using vacuum filtration, hyperbaric (pressure) filtration, ceramic plate filtration and screen-bowl centrifuge techniques. The laboratory and pilot-scale dewatering studies were conducted using the fine-size, clean-coal slurry produced in the column flotation circuit at the Powell Mountain Coal Company, St. Charles, VA. The pilot-scale studies were conducted at the Mayflower preparation plant in St. Charles, VA. The program consisted of nine tasks, namely, Task 1--Project Work Planning, Task 2--Laboratory Testing, Task 3--Engineering Design, Task 4--Procurement and Fabrication, Task 5--Installation and Shakedown, Task 6--System Operation, Task 7--Process Evaluation, Task 8--Equipment Removal, and Task 9--Reporting.

  18. Comparison of advanced optical imaging techniques with current otolaryngology diagnostics for improved middle ear assessment (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nolan, Ryan M.; Shelton, Ryan L.; Monroy, Guillermo L.; Spillman, Darold R.; Novak, Michael A.; Boppart, Stephen A.

    2016-02-01

    Otolaryngologists utilize a variety of diagnostic techniques to assess middle ear health. Tympanometry, audiometry, and otoacoustic emissions examine the mobility of the tympanic membrane (eardrum) and ossicles using ear canal pressure and auditory tone delivery and detection. Laser Doppler vibrometry provides non-contact vibrational measurement, and acoustic reflectometry is used to assess middle ear effusion using sonar. These technologies and techniques have advanced the field beyond the use of the standard otoscope, a simple tissue magnifier, yet the need for direct visualization of middle ear disease for superior detection, assessment, and management remains. In this study, we compared portable optical coherence tomography (OCT) and pneumatic low-coherence interferometry (LCI) systems with handheld probe delivery against standard tympanometry, audiometry, otoacoustic emissions, laser Doppler vibrometry, and acoustic reflectometry. The comparison of these advanced optical imaging techniques and current diagnostics was conducted in a case study of a subject with a history of unilateral eardrum trauma. OCT and pneumatic LCI provide novel dynamic spatiotemporal structural data of the middle ear, such as the thickness of the eardrum and quantitative detection of underlying disease pathology, which could allow for more accurate diagnosis and more appropriate management than currently possible.

  19. Advanced Endovascular Approaches in the Management of Challenging Proximal Aortic Neck Anatomy: Traditional Endografts and the Snorkel Technique

    PubMed Central

    Quatromoni, Jon G.; Orlova, Ksenia; Foley, Paul J.

    2015-01-01

    Advances in endovascular technology, and access to this technology, have significantly changed the field of vascular surgery. Nowhere is this more apparent than in the treatment of abdominal aortic aneurysms (AAAs), in which endovascular aneurysm repair (EVAR) has replaced the traditional open surgical approach in patients with suitable anatomy. However, approximately one-third of patients presenting with AAAs are deemed ineligible for standard EVAR because of anatomic constraints, the majority of which involve the proximal aneurysmal neck. To overcome these challenges, a bevy of endovascular approaches have been developed to either enhance stent graft fixation at the proximal neck or extend the proximal landing zone to allow adequate apposition to the aortic wall and thus aneurysm exclusion. This article is composed of two sections that together address new endovascular approaches for treating aortic aneurysms with difficult proximal neck anatomy. The first section will explore advancements in the traditional EVAR approach for hostile neck anatomy that maximize the use of the native proximal landing zone; the second section will discuss a technique that was developed to extend the native proximal landing zone and maintain perfusion to vital aortic branches using common, off-the-shelf components: the snorkel technique. While the techniques presented differ in terms of approach, the available clinical data, albeit limited, support the notion that they may both have roles in the treatment algorithm for patients with challenging proximal neck anatomy. PMID:26327748

  20. Advanced Endovascular Approaches in the Management of Challenging Proximal Aortic Neck Anatomy: Traditional Endografts and the Snorkel Technique.

    PubMed

    Quatromoni, Jon G; Orlova, Ksenia; Foley, Paul J

    2015-09-01

    Advances in endovascular technology, and access to this technology, have significantly changed the field of vascular surgery. Nowhere is this more apparent than in the treatment of abdominal aortic aneurysms (AAAs), in which endovascular aneurysm repair (EVAR) has replaced the traditional open surgical approach in patients with suitable anatomy. However, approximately one-third of patients presenting with AAAs are deemed ineligible for standard EVAR because of anatomic constraints, the majority of which involve the proximal aneurysmal neck. To overcome these challenges, a bevy of endovascular approaches have been developed to either enhance stent graft fixation at the proximal neck or extend the proximal landing zone to allow adequate apposition to the aortic wall and thus aneurysm exclusion. This article is composed of two sections that together address new endovascular approaches for treating aortic aneurysms with difficult proximal neck anatomy. The first section will explore advancements in the traditional EVAR approach for hostile neck anatomy that maximize the use of the native proximal landing zone; the second section will discuss a technique that was developed to extend the native proximal landing zone and maintain perfusion to vital aortic branches using common, off-the-shelf components: the snorkel technique. While the techniques presented differ in terms of approach, the available clinical data, albeit limited, support the notion that they may both have roles in the treatment algorithm for patients with challenging proximal neck anatomy.